Type-Safe NATS Messages with Protocol Buffers

Learn how to combine NATS messaging with Protocol Buffers to achieve end-to-end type safety in TypeScript, eliminating runtime serialization errors and providing compile-time guarantees for message contracts.

technical · 10 min read · By Klivvr Engineering

JSON is the default serialization format for most NATS applications, and for good reason: it is human-readable, universally supported, and requires no schema definition. But as systems grow, JSON's flexibility becomes a liability. A publisher changes a field name, and a subscriber crashes at runtime. A new required field is added, and downstream services silently ignore it. Type safety at the message boundary, not just within individual services, is what separates robust distributed systems from fragile ones.

Protocol Buffers provide a schema-driven serialization format that catches these problems at compile time rather than in production. When combined with the node-nats client library, protobuf gives you end-to-end type safety: every published message conforms to a defined schema, and every subscriber knows exactly what shape of data it will receive. This article shows how to build this integration from schema definition through to production-ready TypeScript code.

Why JSON Falls Short at Scale

Consider a typical JSON-based NATS message flow. A service publishes an order event:

// Publisher -- order-service
nc.publish("events.orders.created", jc.encode({
  orderId: "ord_123",
  userId: "usr_456",
  total: 149.99,
  currency: "USD",
  createdAt: new Date().toISOString(),
}));

And a consumer processes it:

// Consumer -- notification-service
for await (const msg of sub) {
  const order = jc.decode(msg.data) as {
    orderId: string;
    userId: string;
    total: number;
    currency: string;
    createdAt: string;
  };
 
  await sendOrderConfirmation(order.userId, order.orderId, order.total);
}

This works until it does not. The as type assertion is a lie the developer tells the compiler. There is no runtime validation that the message actually conforms to the expected type. If the publisher renames total to amount, the consumer silently receives undefined and the notification includes "Your order total: undefined."

Runtime validation libraries like Zod or io-ts can catch these errors, but they add overhead and still rely on developers manually keeping schemas synchronized across services. Protocol Buffers solve this at a structural level.
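To make the cost of that manual synchronization concrete, here is a minimal hand-rolled type guard for the order message above. It illustrates the kind of runtime check that Zod or io-ts would generate from a schema; the names are ours, chosen for this sketch only:

```typescript
// Shape of the JSON order event from the example above.
interface OrderCreated {
  orderId: string;
  userId: string;
  total: number;
  currency: string;
  createdAt: string;
}

// A hand-rolled runtime guard: every field must be checked by hand,
// and this code must be kept in sync with the publisher manually.
function isOrderCreated(value: unknown): value is OrderCreated {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.orderId === "string" &&
    typeof v.userId === "string" &&
    typeof v.total === "number" &&
    typeof v.currency === "string" &&
    typeof v.createdAt === "string"
  );
}
```

In the consumer loop you would call isOrderCreated(jc.decode(msg.data)) and log-and-skip malformed messages instead of letting undefined propagate into business logic. The maintenance burden of guards like this is exactly what a shared schema eliminates.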

Defining Message Schemas

Start by defining your message schemas in .proto files. These files serve as the single source of truth for your message contracts:

// proto/events/orders/v1/orders.proto
syntax = "proto3";
 
package events.orders.v1;
 
import "google/protobuf/timestamp.proto";
 
message OrderCreatedEvent {
  string order_id = 1;
  string user_id = 2;
  repeated OrderItem items = 3;
  Money total = 4;
  google.protobuf.Timestamp created_at = 5;
  OrderSource source = 6;
}
 
message OrderItem {
  string product_id = 1;
  int32 quantity = 2;
  Money unit_price = 3;
}
 
message Money {
  string currency_code = 1;  // ISO 4217
  int64 units = 2;           // Whole units
  int32 nanos = 3;           // Nano units (10^-9)
}
 
enum OrderSource {
  ORDER_SOURCE_UNSPECIFIED = 0;
  ORDER_SOURCE_WEB = 1;
  ORDER_SOURCE_MOBILE = 2;
  ORDER_SOURCE_API = 3;
}

This schema explicitly defines every field, its type, and its wire format. The Money type avoids floating-point precision issues that plague JSON-based financial systems. The enum provides a closed set of valid values. The timestamp uses Google's well-known type, which maps to language-native datetime types.
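The units/nanos split is easy to convert at the edges of the system. A hedged sketch of two helper functions (formatMoney and parseMoney are our own names, not part of any library; the Money shape mirrors the schema above):

```typescript
interface Money {
  currencyCode: string; // ISO 4217
  units: number;        // whole units
  nanos: number;        // fractional part, in 10^-9 units
}

// Format a Money value for display, e.g. "USD 149.99".
function formatMoney(m: Money): string {
  // Pad nanos to nine digits, then trim trailing zeros.
  const fraction = Math.abs(m.nanos).toString().padStart(9, "0").replace(/0+$/, "");
  return fraction.length > 0
    ? `${m.currencyCode} ${m.units}.${fraction}`
    : `${m.currencyCode} ${m.units}`;
}

// Build a Money value from a decimal string, avoiding float arithmetic
// entirely (note: this sketch does not handle negative amounts).
function parseMoney(currencyCode: string, decimal: string): Money {
  const [unitsPart, fractionPart = ""] = decimal.split(".");
  const nanos = Number(fractionPart.padEnd(9, "0").slice(0, 9));
  return { currencyCode, units: Number(unitsPart), nanos };
}
```

Keeping conversion at the boundary means all internal arithmetic stays in exact integer form.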

Generating TypeScript Code

Several tools can generate TypeScript code from .proto files. We recommend ts-proto because it generates idiomatic TypeScript with proper types, interfaces, and encode/decode functions:

// Install dependencies
// npm install ts-proto @bufbuild/protobuf
// npx protoc \
//   --plugin=./node_modules/.bin/protoc-gen-ts_proto \
//   --ts_proto_out=./src/generated \
//   --ts_proto_opt=outputServices=false,esModuleInterop=true \
//   proto/events/orders/v1/orders.proto

The generated code provides typed interfaces and encoding functions:

// src/generated/events/orders/v1/orders.ts (generated)
export interface OrderCreatedEvent {
  orderId: string;
  userId: string;
  items: OrderItem[];
  total: Money | undefined;
  createdAt: Date | undefined;
  source: OrderSource;
}
 
export interface OrderItem {
  productId: string;
  quantity: number;
  unitPrice: Money | undefined;
}
 
export interface Money {
  currencyCode: string;
  units: number;
  nanos: number;
}
 
export enum OrderSource {
  UNSPECIFIED = 0,
  WEB = 1,
  MOBILE = 2,
  API = 3,
}
 
export const OrderCreatedEvent = {
  encode(message: OrderCreatedEvent): Uint8Array { /* ... */ },
  decode(input: Uint8Array): OrderCreatedEvent { /* ... */ },
  fromJSON(object: any): OrderCreatedEvent { /* ... */ },
  toJSON(message: OrderCreatedEvent): unknown { /* ... */ },
};

Now your TypeScript code has compile-time knowledge of every message type. If you try to access a field that does not exist, the compiler catches it. If you pass the wrong type, the compiler catches it. One simplification in the listing above: in actual ts-proto output, encode takes an optional writer argument and returns the writer, so you call OrderCreatedEvent.encode(event).finish() to obtain the Uint8Array; a thin adapter hides this detail.

Building a Type-Safe NATS Wrapper

With generated types in hand, you can build a wrapper around the node-nats client that enforces type safety at the messaging boundary:

import { connect, NatsConnection, Subscription, Msg } from "nats";
import {
  OrderCreatedEvent,
  OrderItem,
  Money,
  OrderSource,
} from "./generated/events/orders/v1/orders";
 
// Generic type-safe publisher
interface MessageCodec<T> {
  encode(message: T): Uint8Array;
  decode(input: Uint8Array): T;
}
 
class TypedNatsClient {
  constructor(private nc: NatsConnection) {}
 
  publish<T>(subject: string, codec: MessageCodec<T>, message: T): void {
    const data = codec.encode(message);
    this.nc.publish(subject, data);
  }
 
  async request<TReq, TRes>(
    subject: string,
    reqCodec: MessageCodec<TReq>,
    resCodec: MessageCodec<TRes>,
    message: TReq,
    timeout: number = 5000
  ): Promise<TRes> {
    const data = reqCodec.encode(message);
    const response = await this.nc.request(subject, data, { timeout });
    return resCodec.decode(response.data);
  }
 
  subscribe<T>(
    subject: string,
    codec: MessageCodec<T>,
    options?: { queue?: string }
  ): AsyncIterable<{ data: T; msg: Msg }> {
    const sub = this.nc.subscribe(subject, options);
 
    return {
      [Symbol.asyncIterator]() {
        const iterator = sub[Symbol.asyncIterator]();
        return {
          async next() {
            const result = await iterator.next();
            if (result.done) {
              return { done: true, value: undefined };
            }
            const data = codec.decode(result.value.data);
            return {
              done: false,
              value: { data, msg: result.value },
            };
          },
        };
      },
    };
  }
}

Using this wrapper, your publish and subscribe calls are fully type-checked:

async function main() {
  const nc = await connect({ servers: "nats://localhost:4222" });
  const client = new TypedNatsClient(nc);
 
  // TypeScript enforces the correct shape
  client.publish("events.orders.created", OrderCreatedEvent, {
    orderId: "ord_20250812_001",
    userId: "usr_abc123",
    items: [
      {
        productId: "prod_widget",
        quantity: 2,
        unitPrice: { currencyCode: "USD", units: 49, nanos: 990_000_000 },
      },
    ],
    total: { currencyCode: "USD", units: 99, nanos: 980_000_000 },
    createdAt: new Date(),
    source: OrderSource.WEB,
  });
 
  // This would cause a compile error:
  // client.publish("events.orders.created", OrderCreatedEvent, {
  //   orderId: "ord_123",
  //   amount: 99.98,  // Error: 'amount' does not exist on OrderCreatedEvent
  // });
 
  // Subscriptions are also typed
  const orders = client.subscribe(
    "events.orders.created",
    OrderCreatedEvent,
    { queue: "notification-service" }
  );
 
  for await (const { data, msg } of orders) {
    // 'data' is typed as OrderCreatedEvent
    console.log(`Order ${data.orderId} from user ${data.userId}`);
    console.log(`Total: ${data.total?.currencyCode} ${data.total?.units}.${data.total?.nanos}`);
 
    if (msg.reply) {
      msg.respond(new Uint8Array());
    }
  }
}
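The MessageCodec interface is deliberately minimal, so non-protobuf codecs plug into the same wrapper. That is useful during gradual migration, when some subjects still carry JSON. A hedged sketch of a JSON-backed codec (JsonCodec is our own name, not part of any library):

```typescript
// A JSON-backed codec satisfying the same MessageCodec<T> shape as the
// generated protobuf codecs, so it drops into TypedNatsClient unchanged.
// Unlike protobuf, it offers compile-time typing only: the decode side
// still trusts that the wire bytes match T.
class JsonCodec<T> {
  private readonly enc = new TextEncoder();
  private readonly dec = new TextDecoder();

  encode(message: T): Uint8Array {
    return this.enc.encode(JSON.stringify(message));
  }

  decode(input: Uint8Array): T {
    return JSON.parse(this.dec.decode(input)) as T;
  }
}
```

A subject can then migrate from new JsonCodec<LegacyOrder>() to the generated OrderCreatedEvent codec with no change to the publish and subscribe call sites (LegacyOrder here is a hypothetical interim type).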

Subject-to-Type Mapping

To take type safety further, you can create a registry that maps NATS subjects to their expected message types. This prevents the common error of subscribing to a subject with the wrong decoder:

// subject-registry.ts
import { NatsConnection, Msg } from "nats";
import { OrderCreatedEvent } from "./generated/events/orders/v1/orders";
import { PaymentProcessedEvent } from "./generated/events/payments/v1/payments";
import { UserSignedUpEvent } from "./generated/events/users/v1/users";
// MessageCodec is the interface defined in the wrapper above; import it
// from wherever that module lives in your codebase.
 
// Define the mapping of subjects to message types
interface SubjectMap {
  "events.orders.created": typeof OrderCreatedEvent;
  "events.payments.processed": typeof PaymentProcessedEvent;
  "events.users.signed_up": typeof UserSignedUpEvent;
}
 
type SubjectType<S extends keyof SubjectMap> = SubjectMap[S] extends MessageCodec<infer T>
  ? T
  : never;
 
class StrictNatsClient {
  constructor(private nc: NatsConnection) {}
 
  publish<S extends keyof SubjectMap>(
    subject: S,
    codec: SubjectMap[S],
    message: SubjectType<S>
  ): void {
    const data = (codec as any).encode(message);
    this.nc.publish(subject, data);
  }
 
  subscribe<S extends keyof SubjectMap>(
    subject: S,
    codec: SubjectMap[S],
    options?: { queue?: string }
  ): AsyncIterable<{ data: SubjectType<S>; msg: Msg }> {
    // Implementation same as TypedNatsClient.subscribe
    const sub = this.nc.subscribe(subject, options);
    return {
      [Symbol.asyncIterator]() {
        const iterator = sub[Symbol.asyncIterator]();
        return {
          async next() {
            const result = await iterator.next();
            if (result.done) return { done: true, value: undefined };
            const data = (codec as any).decode(result.value.data);
            return { done: false, value: { data, msg: result.value } };
          },
        };
      },
    };
  }
}
 
// Usage: the compiler ensures subject and codec match
const client = new StrictNatsClient(nc);
 
// Correct: subject matches codec
client.publish("events.orders.created", OrderCreatedEvent, {
  orderId: "ord_001",
  userId: "usr_001",
  items: [],
  total: undefined,
  createdAt: new Date(),
  source: OrderSource.WEB,
});
 
// Compile error: PaymentProcessedEvent is not assignable to SubjectMap["events.orders.created"]
// client.publish("events.orders.created", PaymentProcessedEvent, { ... });

This pattern catches an entire category of wiring bugs at compile time. When a developer writes a new consumer, the type system guides them to the correct decoder for each subject.
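The type-level SubjectMap can be paired with a runtime registry, so callers never pass a codec at all: the subject alone selects it, and TypeScript infers the message type. A hedged sketch using JSON-based stand-in codecs to stay self-contained (in real code the registry entries would be the generated protobuf codecs):

```typescript
// Minimal codec shape, matching MessageCodec from the wrapper section.
interface Codec<T> {
  encode(message: T): Uint8Array;
  decode(input: Uint8Array): T;
}

// JSON stand-in so this sketch runs without generated code; substitute
// the real protobuf codecs (OrderCreatedEvent, etc.) in practice.
function jsonCodec<T>(): Codec<T> {
  const enc = new TextEncoder();
  const dec = new TextDecoder();
  return {
    encode: (m) => enc.encode(JSON.stringify(m)),
    decode: (b) => JSON.parse(dec.decode(b)) as T,
  };
}

// The registry is the single place where subject and codec are paired.
const registry = {
  "events.orders.created": jsonCodec<{ orderId: string; userId: string }>(),
  "events.payments.processed": jsonCodec<{ paymentId: string }>(),
} as const;

type Registry = typeof registry;

// The message type is inferred from the subject alone.
function encodeFor<S extends keyof Registry>(
  subject: S,
  message: Registry[S] extends Codec<infer T> ? T : never
): Uint8Array {
  return (registry[subject] as Codec<unknown>).encode(message);
}
```

With this shape, a publish helper needs only (subject, message); the wrong payload for a subject is a compile error, and the codec lookup cannot drift from the type map because they are the same object.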

Schema Evolution and Backward Compatibility

One of Protocol Buffers' greatest strengths is its built-in support for schema evolution. You can add new fields, deprecate old ones, and maintain backward compatibility without coordinating deployments across services:

// v1: Original schema
message OrderCreatedEvent {
  string order_id = 1;
  string user_id = 2;
  repeated OrderItem items = 3;
  Money total = 4;
  google.protobuf.Timestamp created_at = 5;
  OrderSource source = 6;
}
 
// v1.1: Added fields (backward compatible)
message OrderCreatedEvent {
  string order_id = 1;
  string user_id = 2;
  repeated OrderItem items = 3;
  Money total = 4;
  google.protobuf.Timestamp created_at = 5;
  OrderSource source = 6;
  // New fields -- old consumers ignore them, new consumers use defaults
  string promotion_code = 7;
  Money discount = 8;
  string shipping_address_id = 9;
}

Adding fields with new field numbers is always safe. Old consumers that do not know about field 7, 8, or 9 simply skip them during decoding. New consumers that receive a message without these fields see them as their default values (empty string for strings, undefined for message fields, and so on).

Removing fields requires care. Never reuse a field number for a different purpose. Instead, mark removed fields as reserved:

message OrderCreatedEvent {
  reserved 6;
  reserved "source";
 
  string order_id = 1;
  string user_id = 2;
  repeated OrderItem items = 3;
  Money total = 4;
  google.protobuf.Timestamp created_at = 5;
  // field 6 (source) removed in v2; order_channel replaces it
  OrderChannel order_channel = 10;  // 7-9 were taken by the v1.1 additions
}

Practical Tips for Protobuf with NATS

Keep your .proto files in a shared repository or package that all services depend on. When a schema changes, every service that imports it gets the updated types at compile time. CI pipelines should generate TypeScript code from proto files automatically, ensuring generated code is never stale.

Use the Money pattern (units + nanos) instead of floating-point types for financial data. A double field loses precision for values like 0.1, which is unacceptable in fintech systems. The units/nanos representation is exact and aligns with how gRPC and Google APIs handle monetary values.
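The precision problem, and the exact arithmetic the units/nanos form enables, are easy to demonstrate (addMoney is an illustrative name of our own):

```typescript
// Binary floating point cannot represent 0.1 exactly:
console.log(0.1 + 0.2 === 0.3); // false: 0.1 + 0.2 is 0.30000000000000004

type Money = { currencyCode: string; units: number; nanos: number };

const NANOS_PER_UNIT = 1_000_000_000;

// Exact addition on the units/nanos representation. This sketch uses
// plain numbers; for amounts large enough to exceed Number.MAX_SAFE_INTEGER
// in total nanos, switch to BigInt.
function addMoney(a: Money, b: Money): Money {
  if (a.currencyCode !== b.currencyCode) {
    throw new Error("currency mismatch");
  }
  const totalNanos = (a.units + b.units) * NANOS_PER_UNIT + a.nanos + b.nanos;
  return {
    currencyCode: a.currencyCode,
    units: Math.trunc(totalNanos / NANOS_PER_UNIT),
    nanos: totalNanos % NANOS_PER_UNIT,
  };
}
```

Two items at USD 49.99 sum to exactly { units: 99, nanos: 980_000_000 }, matching the total in the publish example above, with no rounding step anywhere.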

Version your proto packages (events.orders.v1, events.orders.v2) when making breaking changes. Publish to versioned subjects (events.orders.v2.created) so that consumers can migrate at their own pace. This is the same subject versioning strategy recommended for core NATS, now backed by schema-level versioning.

Consider generating validation code alongside your types. Tools like buf can enforce rules such as "field names must be snake_case" and "enums must have an UNSPECIFIED zero value," catching style violations before they reach code review.

Benchmark your serialization. Protobuf payloads are consistently smaller on the wire than JSON, and decoding large messages is often faster, but JavaScript's native JSON.parse is heavily optimized, so the speed advantage varies by runtime and message shape. For small messages (under 100 bytes), the difference is negligible. Let your actual message sizes and benchmark numbers guide the decision.
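A minimal harness makes that measurement concrete; bench is our own helper, not a library API. Run it against a JSON baseline, then swap in the generated protobuf codec on your real payloads:

```typescript
// Times `iterations` encode/decode round-trips and returns elapsed ms.
// performance.now() is available globally in Node 16+.
function bench<T>(
  codec: { encode(m: T): Uint8Array; decode(b: Uint8Array): T },
  sample: T,
  iterations = 10_000
): number {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    codec.decode(codec.encode(sample));
  }
  return performance.now() - start;
}

// Baseline: JSON over TextEncoder/TextDecoder, with a payload roughly
// shaped like the order event.
const enc = new TextEncoder();
const dec = new TextDecoder();
const jsonMs = bench(
  {
    encode: (m: object) => enc.encode(JSON.stringify(m)),
    decode: (b: Uint8Array) => JSON.parse(dec.decode(b)) as object,
  },
  { orderId: "ord_123", items: Array(50).fill({ productId: "p", quantity: 2 }) }
);
// Then compare: bench the protobuf codec with the same sample and
// contrast both the timings and the encoded byte lengths.
```

Byte length is often the more decisive number for messaging systems, since it drives network and JetStream storage costs as well as CPU.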

Conclusion

Combining Protocol Buffers with the node-nats client library gives you something that JSON-based messaging cannot: compile-time guarantees that every message published to NATS conforms to a defined schema, and every consumer decodes it correctly. The typed wrapper pattern catches wiring errors before they reach production. Schema evolution rules ensure that services can be deployed independently without breaking message compatibility. For teams building distributed systems where incorrect message handling has real consequences, the investment in protobuf-based type safety pays for itself with the first production incident it prevents.
