Converters and encryption - TypeScript SDK
Payload Converter and Payload Codec Summary
This section summarizes the difference between a Payload Converter and Payload Codec.
Payload Converter
Payload Converters are responsible for serializing application objects into Payloads and deserializing them back into application objects. A Payload, in this context, is a binary form suitable for network transmission that may include some metadata. For example, an object might be serialized to JSON with UTF-8 byte encoding or to a Protobuf binary using a specific set of Protobuf message definitions.
Due to their operation within the Workflow context, Payload Converters run inside the Workflow sandbox. Consequently, Payload Converters cannot access external services or employ non-deterministic modules, which excludes most types of encryption due to their non-deterministic nature.
Payload Codec
Payload Codecs transform one Payload into another, converting binary data to a different binary format. Unlike Payload Converters, Payload Codecs do not operate within the Workflow sandbox. This allows them to execute operations that can include calls to remote services and the use of non-deterministic modules, which are critical for tasks such as encrypting Payloads, compressing data, or offloading large payloads to an object store. Payload Codecs can also be implemented as a Codec Server (which will be described later on).
Operational Chain
In practice, these two components operate in a chain to handle data securely. Data being sent to the Temporal Cluster first passes through the Payload Converter's toPayload method, which turns application objects into Payloads. These Payloads are then processed by the Payload Codec's encode method, which transforms them to meet your security or efficiency needs before they are sent to the Temporal Cluster.
The process is symmetric in the other direction. Payloads retrieved from the Temporal Cluster first pass through the Payload Codec's decode method, which reverses any transformations applied during encoding. Finally, the resulting Payloads are converted back into application objects by the Payload Converter's fromPayload method, making them ready for use within the application.
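The chain can be pictured with a short sketch. This is conceptual only, not the SDK's internal code; the roundTrip helper is a hypothetical function that simply shows the order in which the four methods are applied to a value on its way to and from the Cluster.
import type { PayloadCodec, PayloadConverter } from '@temporalio/common';

// Conceptual sketch: the order in which the conversion chain runs.
async function roundTrip(
  converter: PayloadConverter,
  codec: PayloadCodec,
  input: unknown,
): Promise<unknown> {
  const payload = converter.toPayload(input);      // application object -> Payload
  const [encoded] = await codec.encode([payload]); // Payload -> encoded Payload (what the Cluster stores)
  const [decoded] = await codec.decode([encoded]); // encoded Payload -> Payload
  return converter.fromPayload(decoded);           // Payload -> application object
}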
Payload Codec
API documentation: PayloadCodec
The default PayloadCodec does nothing. To create a custom one, you can implement the following interface:
interface PayloadCodec {
  /**
   * Encode an array of {@link Payload}s for sending over the wire.
   * @param payloads May have length 0.
   */
  encode(payloads: Payload[]): Promise<Payload[]>;

  /**
   * Decode an array of {@link Payload}s received from the wire.
   */
  decode(payloads: Payload[]): Promise<Payload[]>;
}
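As a concrete illustration, here is a minimal compression codec sketch. It assumes Node's built-in zlib module and the generated proto definitions from @temporalio/proto for serializing a whole Payload; the 'binary/gzip' encoding name is our own convention, not an official one, and a production codec would add error handling.
import { gzipSync, gunzipSync } from 'zlib';
import { METADATA_ENCODING_KEY, Payload, PayloadCodec } from '@temporalio/common';
import { temporal } from '@temporalio/proto';

// Our own marker for payloads produced by this codec (an assumption, not an official encoding).
const ENCODING = 'binary/gzip';

export class GzipPayloadCodec implements PayloadCodec {
  async encode(payloads: Payload[]): Promise<Payload[]> {
    return payloads.map((payload) => ({
      metadata: {
        // Tag the payload so decode() knows it must be decompressed.
        [METADATA_ENCODING_KEY]: new TextEncoder().encode(ENCODING),
      },
      // Serialize the entire original Payload (data and metadata) and compress it.
      data: gzipSync(temporal.api.common.v1.Payload.encode(payload).finish()),
    }));
  }

  async decode(payloads: Payload[]): Promise<Payload[]> {
    return payloads.map((payload) => {
      const encoding = payload.metadata?.[METADATA_ENCODING_KEY];
      // Pass through payloads that this codec did not produce.
      if (!payload.data || !encoding || new TextDecoder().decode(encoding) !== ENCODING) {
        return payload;
      }
      return temporal.api.common.v1.Payload.decode(gunzipSync(payload.data));
    });
  }
}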
Use custom payload conversion
Temporal SDKs provide a Payload Converter that can be customized to convert a custom data type to a Payload and back.
The order in which your encoding Payload Converters are applied depends on the order given to the Data Converter. You can set multiple encoding Payload Converters to run your conversions. When the Data Converter receives a value for conversion, the value passes through each Payload Converter in sequence until a converter that handles the data type performs the conversion. The following sections explore this in more detail.
Composite Data Converters
Use a Composite Data Converter to apply custom, type-specific Payload Converters in a specified order. Defining a new Composite Data Converter is not always necessary to implement custom data handling. You can override the default Converter with a custom Codec, but a Composite Data Converter may be necessary for complex Workflow logic.
A Composite Data Converter can include custom rules that you create, and it can also leverage the default Data Converters built into Temporal. In fact, the default Data Converter logic is implemented internally in the Temporal source as a Composite Data Converter, which defines its rules in this order:
export class DefaultPayloadConverter extends CompositePayloadConverter {
  constructor() {
    super(
      new UndefinedPayloadConverter(),
      new BinaryPayloadConverter(),
      new JsonPayloadConverter(),
    );
  }
}
The order of applying the Payload Converters is important. During serialization, the Data Converter tries the Payload Converters in that specific order until a Payload Converter returns a non-null Payload.
To replace the default Data Converter with a custom CompositePayloadConverter, use the following:
import { CompositePayloadConverter, UndefinedPayloadConverter } from '@temporalio/common';
// Adjust this path to wherever you define your custom EJSON converter.
import { EjsonPayloadConverter } from './ejson-payload-converter';

export const payloadConverter = new CompositePayloadConverter(
  new UndefinedPayloadConverter(),
  new EjsonPayloadConverter(),
);
You can do this in its own payload-converter.ts file, for example.
In the code snippet above, a converter is created that first attempts to handle undefined values. If the value isn't handled by UndefinedPayloadConverter, the EJSON serialization logic written in EjsonPayloadConverter is then used. The Payload Converter is then provided to the Worker and Client.
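Before wiring it into the Worker and Client, here is what the EjsonPayloadConverter itself might look like. This is a sketch assuming the ejson npm package; adapt the encoding metadata and error handling to your project.
import { METADATA_ENCODING_KEY, Payload, PayloadConverterWithEncoding } from '@temporalio/common';
import EJSON from 'ejson';

// Sketch of an EJSON-based converter (assumes the `ejson` npm package).
export class EjsonPayloadConverter implements PayloadConverterWithEncoding {
  // Encoding reported for payloads produced by this converter.
  public encodingType = 'json/plain' as const;

  public toPayload(value: unknown): Payload | undefined {
    // Let UndefinedPayloadConverter (earlier in the composite) handle `undefined`.
    if (value === undefined) return undefined;
    return {
      metadata: {
        [METADATA_ENCODING_KEY]: new TextEncoder().encode('json/plain'),
      },
      data: new TextEncoder().encode(EJSON.stringify(value)),
    };
  }

  public fromPayload<T>(payload: Payload): T {
    if (!payload.data) {
      throw new Error('Payload is missing data');
    }
    return EJSON.parse(new TextDecoder().decode(payload.data)) as T;
  }
}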
Here is the Worker code:
import { Worker } from '@temporalio/worker';

const worker = await Worker.create({
  workflowsPath: require.resolve('./workflows'),
  taskQueue: 'ejson',
  dataConverter: {
    payloadConverterPath: require.resolve('./payload-converter'),
  },
});
With this code, you now ensure that the Worker serializes and deserializes Workflow and Activity inputs and outputs using your EJSON-based logic, along with handling undefined values appropriately.
Here is the Client:
import { Client } from '@temporalio/client';

const client = new Client({
  dataConverter: {
    payloadConverterPath: require.resolve('./payload-converter'),
  },
});
You can now use a variety of data types in arguments.
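With the EJSON converter in place, Workflow arguments can include types that plain JSON cannot round-trip, such as Date and Uint8Array. The following sketch assumes a hypothetical example Workflow exported from ./workflows that accepts such an object; the task queue and Workflow Id are placeholders.
import { Client } from '@temporalio/client';
// Hypothetical Workflow export from your project's ./workflows file.
import { example } from './workflows';

const client = new Client({
  dataConverter: { payloadConverterPath: require.resolve('./payload-converter') },
});

// Date and Uint8Array survive the round trip because EJSON encodes them explicitly.
const handle = await client.workflow.start(example, {
  taskQueue: 'ejson',
  workflowId: 'ejson-demo',
  args: [{ createdAt: new Date(), blob: new Uint8Array([1, 2, 3]) }],
});
console.log(await handle.result());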