Stream Fundamentals — Readable, Writable, Duplex, Transform
Streams are Node.js's abstraction for working with data incrementally, without loading it all into memory. Reading a 10GB file with fs.readFile fails outright — a single Buffer can't hold it — and even files that fit occupy their full size in memory; read the same file via a stream and memory stays flat at roughly one chunk (64KB by default for fs streams).
Four Stream Types
- **Readable**: Source of data. Examples: fs.createReadStream, req (HTTP request body), process.stdin. You read from it
- **Writable**: Sink for data. Examples: fs.createWriteStream, res (HTTP response), process.stdout. You write to it
- **Duplex**: Both Readable and Writable simultaneously. Example: TCP socket — you can both read from it and write to it
- **Transform**: A Duplex that transforms data as it passes through. Examples: zlib.createGzip() (compresses input), crypto.createCipheriv() (encrypts input). Input and output are related
Creating Transform Streams
```typescript
import { Transform, TransformCallback } from 'stream';
import { StringDecoder } from 'string_decoder';

// ━━ Custom Transform stream — uppercase all text ━━
class UpperCaseTransform extends Transform {
  // StringDecoder avoids corrupting multi-byte UTF-8 characters
  // that happen to be split across chunk boundaries
  private decoder = new StringDecoder('utf8');

  _transform(
    chunk: Buffer,
    encoding: BufferEncoding,
    callback: TransformCallback
  ): void {
    // chunk is a Buffer — decode to string, transform, push downstream
    this.push(this.decoder.write(chunk).toUpperCase());
    callback(); // signal: ready for the next chunk
  }

  // Optional: flush remaining state at the end of the stream
  _flush(callback: TransformCallback): void {
    this.push(this.decoder.end() + '--- END ---\n');
    callback();
  }
}
```
```typescript
// ━━ Usage ━━
import { createReadStream, createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';

await pipeline(
  createReadStream('./input.txt'),    // source
  new UpperCaseTransform(),           // transform
  createWriteStream('./output.txt'),  // destination
);
// stream.pipeline properly propagates errors and cleans up ALL streams
```
```typescript
// ━━ Compose with zlib ━━
import { createGzip } from 'zlib';

await pipeline(
  createReadStream('./data.csv'),
  new UpperCaseTransform(),
  createGzip(),                       // compress
  createWriteStream('./data.csv.gz'),
);
console.log('Compressed!');
```
Tip
Practice stream fundamentals — Readable, Writable, Duplex, and Transform — in small, isolated examples before integrating them into larger projects. Breaking concepts into small experiments builds genuine understanding faster than reading alone. Process data chunk by chunk instead of loading entire files into memory.
Practice Task
(1) Write a working example using the four stream types — Readable, Writable, Duplex, Transform — from scratch without looking at notes. (2) Modify it to handle an edge case (empty input, null value, or error state). (3) Share your solution in the Priygop community for feedback.
Common Mistake
A common mistake with streams is skipping edge-case testing — empty inputs, null values, and unexpected data types. Always validate boundary conditions to write robust, production-ready Node.js code.
Key Takeaways
- Streams are Node.js's abstraction for working with data incrementally, without loading it all into memory
- **Readable**: Source of data. Examples: fs.createReadStream, req (HTTP request body), process.stdin. You read from it
- **Writable**: Sink for data. Examples: fs.createWriteStream, res (HTTP response), process.stdout. You write to it
- **Duplex**: Both Readable and Writable simultaneously. Example: TCP socket — you can both read from it and write to it
- **Transform**: A Duplex that transforms data as it passes through. Examples: zlib.createGzip(), crypto.createCipheriv()