A transport is how postal crosses an execution boundary — same-origin tabs via BroadcastChannel, iframes and workers via MessagePort, Node.js worker threads via MessagePort. Without a transport, a postal bus is local to its JavaScript context. With one, publishes on one side automatically appear on the other.
Transports are symmetric: each side registers its own transport, and postal handles the rest — echo prevention, filtering, and local dispatch.
```ts
type Transport = {
  /** Send an outbound envelope to the remote side. */
  send: (envelope: Envelope) => void;

  /** Listen for inbound envelopes from the remote side. Returns an unsubscribe function. */
  subscribe: (callback: (envelope: Envelope) => void) => () => void;

  /** Optional cleanup when the transport is removed or the bus is reset. */
  dispose?: () => void;
};
```
You rarely implement this directly — the transport packages do it for you. The interface matters when you’re building a custom transport (see below).
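To make the interface concrete, here is a minimal in-memory transport pair: whatever one side sends, the other side's subscribers receive. The `Envelope` type and `createLoopbackPair` helper are illustrative stand-ins (the real envelope carries more fields, such as `channel`, `topic`, and `source`); a real transport does the same wiring over `postMessage`, IPC, or a socket.

```typescript
// Illustrative stand-in: the real Envelope has more fields than this.
type Envelope = { channel: string; topic: string; payload: unknown };

type Transport = {
  send: (envelope: Envelope) => void;
  subscribe: (callback: (envelope: Envelope) => void) => () => void;
  dispose?: () => void;
};

// Hypothetical helper: an in-memory pair where each side's send()
// delivers to the other side's subscribers.
function createLoopbackPair(): [Transport, Transport] {
  const listenersA = new Set<(e: Envelope) => void>();
  const listenersB = new Set<(e: Envelope) => void>();

  const makeSide = (
    peers: Set<(e: Envelope) => void>, // deliver here on send
    own: Set<(e: Envelope) => void>,   // register callbacks here
  ): Transport => ({
    send: envelope => peers.forEach(cb => cb(envelope)),
    subscribe: callback => {
      own.add(callback);
      return () => {
        own.delete(callback); // the returned unsubscribe function
      };
    },
    dispose: () => own.clear(),
  });

  return [makeSide(listenersB, listenersA), makeSide(listenersA, listenersB)];
}

const [left, right] = createLoopbackPair();
const received: Envelope[] = [];
const unsubscribe = right.subscribe(env => received.push(env));

left.send({ channel: "orders", topic: "item.placed", payload: { qty: 1 } });
unsubscribe();
left.send({ channel: "orders", topic: "item.cancelled", payload: { qty: 1 } });

console.log(received.length); // 1: only the envelope sent while subscribed
```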
You can restrict which envelopes a transport forwards using a TransportFilter. Unfiltered transports forward everything.
```ts
addTransport(transport, {
  filter: {
    // Only forward envelopes on these channels (exact match)
    channels: ["orders", "inventory"],
    // Only forward envelopes matching these topic patterns (AMQP wildcards)
    topics: ["item.*", "stock.#"],
  },
});
```
Both channels and topics are optional; when both are given, an envelope must match both to be forwarded. "reply" envelopes (RPC responses) always bypass the filter, because they need to complete a round-trip that already matched on the way out.
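The topic patterns follow AMQP wildcard semantics: `*` matches exactly one dot-delimited segment, `#` matches zero or more. Postal resolves these internally; the sketch below is a hypothetical matcher for illustration, not postal's actual implementation.

```typescript
// AMQP-style topic matching: "*" = exactly one segment, "#" = zero or more.
// Hypothetical helper for illustration only.
function topicMatches(pattern: string, topic: string): boolean {
  const match = (p: string[], t: string[]): boolean => {
    if (p.length === 0) return t.length === 0;
    const [head, ...rest] = p;
    if (head === "#") {
      // "#" consumes zero segments, or one segment and keeps trying
      return match(rest, t) || (t.length > 0 && match(p, t.slice(1)));
    }
    if (t.length === 0) return false;
    return (head === "*" || head === t[0]) && match(rest, t.slice(1));
  };
  return match(pattern.split("."), topic.split("."));
}

console.log(topicMatches("item.*", "item.placed"));    // true
console.log(topicMatches("item.*", "item.placed.eu")); // false: "*" is one segment
console.log(topicMatches("stock.#", "stock"));         // true: "#" matches zero segments
console.log(topicMatches("stock.#", "stock.low.eu"));  // true
```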
When postal forwards an outbound envelope, it stamps it with the local instanceId in the source field. On the receiving side, any envelope whose source matches the local instanceId is silently dropped before dispatch. This prevents a message from bouncing back and triggering a second dispatch in the originating context.
You don’t need to do anything — it’s automatic. It’s documented here so you know why envelope.source exists.
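The logic amounts to a stamp on the way out and a drop check on the way in. The names below (`localInstanceId`, `stampOutbound`, `shouldDispatch`) are illustrative, not postal's actual internals.

```typescript
// Hypothetical sketch of echo prevention; names are illustrative.
type Envelope = { topic: string; payload: unknown; source?: string };

const localInstanceId = "bus-a"; // each bus generates a unique id at startup

function stampOutbound(envelope: Envelope): Envelope {
  // Outbound: tag the envelope with the sending bus's instance id
  return { ...envelope, source: envelope.source ?? localInstanceId };
}

function shouldDispatch(envelope: Envelope): boolean {
  // Inbound: silently drop anything this bus originally sent
  return envelope.source !== localInstanceId;
}

const out = stampOutbound({ topic: "item.placed", payload: {} });
console.log(shouldDispatch(out)); // false: our own echo is dropped
console.log(shouldDispatch({ topic: "item.placed", payload: {}, source: "bus-b" })); // true
```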
Point-to-point transport for iframes, dedicated workers, and Node.js worker threads. Uses a MessageChannel with a handshake (SYN/ACK) to ensure both sides are ready before messages flow.
```sh
npm install postal-transport-messageport
```
Iframe — parent side:
```ts
import { addTransport, getChannel } from "postal";
import { connectToIframe } from "postal-transport-messageport";

const iframe = document.querySelector<HTMLIFrameElement>("#my-frame")!;
const transport = await connectToIframe(iframe);
addTransport(transport);

// Messages published here are now forwarded into the iframe
getChannel("orders").publish("item.placed", { sku: "WIDGET-42", qty: 1 });
```
Iframe — child side:
```ts
import { addTransport, getChannel } from "postal";
import { connectToParent } from "postal-transport-messageport";

const transport = await connectToParent();
addTransport(transport);

// Inbound messages from the parent are dispatched locally
getChannel("orders").subscribe("item.*", env => {
  console.log("Got from parent:", env.payload);
});
```
Worker — main thread:
```ts
import { addTransport } from "postal";
import { connectToWorker } from "postal-transport-messageport";

const worker = new Worker(new URL("./my-worker.js", import.meta.url));
const transport = await connectToWorker(worker);
addTransport(transport);
```
Worker — inside the worker:
```ts
import { addTransport } from "postal";
import { connectToHost } from "postal-transport-messageport";

const transport = await connectToHost();
addTransport(transport);
```
Node.js worker thread — main thread:
```ts
import { Worker } from "node:worker_threads";
import { addTransport } from "postal";
import { connectToWorkerThread } from "postal-transport-messageport/node";

const worker = new Worker(new URL("./my-worker.js", import.meta.url));
const transport = await connectToWorkerThread(worker);
addTransport(transport);
```
Node.js worker thread — inside the worker:
```ts
import { addTransport } from "postal";
import { connectFromWorkerThread } from "postal-transport-messageport/node";

const transport = await connectFromWorkerThread();
addTransport(transport);
```
Many-to-many transport for same-origin tabs and windows. Uses the browser’s BroadcastChannel API — no handshake, no configuration. Every tab that registers the transport is immediately in the mesh.
```sh
npm install postal-transport-broadcastchannel
```
```ts
import { addTransport, getChannel } from "postal";
import { createBroadcastChannelTransport } from "postal-transport-broadcastchannel";

// All tabs using "postal" as the channel name share messages automatically
addTransport(createBroadcastChannelTransport());

// Optional: use a custom name to isolate multiple apps on the same origin
addTransport(createBroadcastChannelTransport("my-app"));

// Now any publish is forwarded to all other tabs
getChannel("orders").publish("item.placed", { sku: "WIDGET-42", qty: 1 });
```
Transport that opens a dedicated MessagePort between each tab and the controlling ServiceWorker. The SW participates in postal as a peer — it can publish and subscribe like any other context. Tab-to-tab messaging is still handled by BroadcastChannel; this transport handles the tab-to-SW leg.
```sh
npm install postal postal-transport-serviceworker postal-transport-messageport
```
Tab (client) side:
```ts
import { getChannel } from "postal";
import { connectToServiceWorker } from "postal-transport-serviceworker";

await navigator.serviceWorker.register("/sw.js");
const registration = await navigator.serviceWorker.ready;

const removeTransport = await connectToServiceWorker(registration, {
  timeout: 5000,
  onDisconnect: () => {
    // SW was replaced (update or browser restart) — reconnect if needed
    console.log("SW disconnected");
  },
});

getChannel("notifications").publish("push.received", { title: "Hello" });
```
ServiceWorker (sw.js) side:
```ts
import { getChannel } from "postal";
import { listenForClients } from "postal-transport-serviceworker/sw";

// SW-side API is a separate entry point — keeps browser globals out of your client bundle
const { dispose } = listenForClients({
  filter: { channels: ["notifications"] }, // optional
});

getChannel("notifications").subscribe("push.received", data => {
  // Fan out to all connected tabs via postal's transport layer
  getChannel("notifications").publish("push.displayed", data);
});

self.addEventListener("activate", event => {
  event.waitUntil(self.clients.claim());
});
```
IPC transport for Node.js child_process.fork() and cluster — bridges postal pub/sub across processes. Uses the same SYN/ACK handshake as postal-transport-messageport over the Node.js IPC channel.
```sh
npm install postal postal-transport-childprocess
```
Parent (main.js):
```ts
import { fork } from "node:child_process";
import { addTransport, getChannel } from "postal";
import { connectToChild } from "postal-transport-childprocess";

const child = fork("./worker.js");
const transport = await connectToChild(child);
const remove = addTransport(transport);

// Clean up when the child exits
child.on("exit", () => remove());

getChannel("jobs").publish("task.start", { id: 1 });
```
Child (worker.js):
```ts
import { addTransport, getChannel } from "postal";
import { connectToParent } from "postal-transport-childprocess";

const transport = await connectToParent();
addTransport(transport);

getChannel("jobs").subscribe("task.start", data => {
  // do work, publish results back
  getChannel("jobs").publish("task.done", { id: data.id, result: "ok" });
});
```
Cluster — primary:
```ts
import cluster from "node:cluster";
import { addTransport } from "postal";
import { connectToClusterWorker } from "postal-transport-childprocess/cluster";

const worker = cluster.fork();
const transport = await connectToClusterWorker(worker);
const remove = addTransport(transport);

worker.on("exit", () => remove());
```
Cluster — worker:
```ts
import { addTransport } from "postal";
import { connectToClusterPrimary } from "postal-transport-childprocess/cluster";

const transport = await connectToClusterPrimary();
addTransport(transport);
```
Unix domain socket transport for independent Node.js processes. Unlike the childprocess transport, processes don’t need a parent/child relationship — any process can connect to a known socket path. The topology is hub-and-spoke: one server, N clients.
```sh
npm install postal postal-transport-uds
```
Server (hub):
```ts
import { getChannel } from "postal";
import { listenOnSocket } from "postal-transport-uds";

const { dispose } = await listenOnSocket("/tmp/postal.sock");

// All connected clients receive this
getChannel("jobs").publish("task.start", { id: 1 });

// Tear down: close server, disconnect all clients
dispose();
```
resetTransports() removes all registered transports and calls dispose() on each:
```ts
import { resetTransports } from "postal";

// Tear down transports only
resetTransports();
```
resetChannels() calls resetTransports() internally, so a full bus reset also cleans up transports. Use resetTransports() directly when you want transport-only teardown without clearing channels and subscribers.