Code examples for utilizing NodeJS Streams can be found in my GitHub repository.

What are Transform Streams?

Transform streams, instances of the stream.Transform class, act as both Readable and Writable streams: they take input data, apply an operation or transformation to it, and emit the transformed output.

Transform streams inherit from the Duplex stream class, so they expose a writable side for input and a readable side for output: data written to the stream comes back out transformed.


Implementing a Transform Stream

To implement a custom Transform stream, you need to create a new class that extends the stream.Transform base class and implement the _transform(chunk, encoding, callback) method. This method receives the input chunk of data, along with its encoding, and a callback function to be called when the transformation is complete.

Here’s an example of a simple Transform stream that doubles the values of the incoming numbers:

const { Transform } = require('stream');

class DoubleTransform extends Transform {
  _transform(chunk, encoding, callback) {
    // Split the chunk into lines, dropping the trailing empty string
    // left by a final newline (parseInt('') would produce NaN).
    const numbers = chunk.toString().split('\n').filter((num) => num !== '');
    const doubledNumbers = numbers.map((num) => parseInt(num, 10) * 2);
    this.push(doubledNumbers.join('\n'));
    callback();
  }
}

const doubleStream = new DoubleTransform();

doubleStream.on('data', (chunk) => {
  console.log(chunk.toString());
});

doubleStream.write('1\n2\n3\n'); // Output: 2\n4\n6
doubleStream.end();

In this example, DoubleTransform splits each incoming chunk into separate numbers, doubles each one, and pushes the joined result to the readable side of the stream.


Common Transform Streams

Transform streams are widely used across the Node.js ecosystem and are integrated into various libraries, frameworks, and tools. Here are a few examples of their usage:

CSV Parsers (e.g., csv-parser): CSV parsing libraries often use Transform streams to process and convert CSV data into structured formats, such as JSON objects. For instance:

const fs = require('fs');
const csv = require('csv-parser');

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row) => {
    console.log(row);
  });

Compression Libraries (e.g., zlib): Libraries like zlib use Transform streams to compress or decompress data on the fly, enabling efficient data compression in various formats, such as gzip or deflate. For example:

const fs = require('fs');
const zlib = require('zlib');

const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt.gz');

const gzip = zlib.createGzip();
readStream.pipe(gzip).pipe(writeStream);

Image Manipulation Libraries (e.g., sharp): Image processing libraries leverage Transform streams to apply transformations like resizing, cropping, or watermarking to images, enabling efficient image manipulation. Here’s an example with the sharp library:

const sharp = require('sharp');

sharp('input.jpg')
  .resize(800, 600)
  .toFile('output.jpg', (err, info) => {
    // Handle the result or any errors
  });