Thursday, April 16, 2020

Node.js Streams

Very much like strings or arrays, streams are collections of data, but unlike them the data may not be available all at once and does not have to fit in memory. Node.js provides a built-in stream module for working with streaming data. A stream is essentially an abstract interface, and the module exposes the API for interacting with streams of data.

Many objects in Node are streams, for example a request to an HTTP server and process.stdout.
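
The request object passed to an HTTP server callback is a readable stream, while the response object (like process.stdout) is a writable stream. Below is a minimal echo-server sketch (not from the original article; the port number 8000 is chosen only for illustration) that pipes the request body straight back to the client:

const http = require('http');

const server = http.createServer(function(req, res) {
   // req is a readable stream, res is a writable stream
   req.pipe(res);
});

// Port 8000 is an arbitrary choice for this sketch
server.listen(8000);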

Streams are the objects in Node that allow us to read data continuously from a source and write data to a destination. They let us work with large amounts of data that may arrive from an external source one chunk at a time, over some time interval. A stream can be readable, writable, or both.

We can include this module as:

var stream = require('stream');

Node.js stream types 

The basic stream types in Node.js are Readable, Writable, Duplex, and Transform.

  • A Readable stream is used for read operations. It is an abstraction for a source from which data can be consumed, for example an HTTP response on the client.
  • A Writable stream is used for write operations. It is an abstraction for a destination to which data is written, for example the stream returned by fs.createWriteStream().
  • A Duplex stream is both readable and writable, for example a TCP socket.
  • A Transform stream is a duplex stream whose output is computed from its input, for example the zlib.createGzip() stream, which compresses data with gzip. Transform streams are sometimes also called “through streams”; a small custom Transform sketch follows this list.
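
As a quick illustration of the Transform type, here is a minimal sketch (an illustrative assumption, not an example from the stream documentation) of a custom Transform stream that upper-cases whatever is piped through it:

const { Transform } = require('stream');

// Transform stream that upper-cases every chunk passing through it
const upperCase = new Transform({
   transform: function(chunk, encoding, callback) {
      callback(null, chunk.toString().toUpperCase());
   }
});

// Pipe standard input through the transform to standard output
process.stdin.pipe(upperCase).pipe(process.stdout);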

All streams are instances of EventEmitter, which means they emit events as data is read or written through the stream object. Some of the commonly used events are:

  • The “data” event is fired when a chunk of data is available to read.
  • The “end” event is fired when there is no more data to read.
  • The “error” event is fired when an error occurs while reading or writing data.
  • The “finish” event is fired on a writable stream when all data has been flushed to the underlying system.
  • The “drain” event is fired when a writable stream is ready to receive more data (see the backpressure sketch after the writing example below).

Streams, events and functions

Reading the data


In the example given below, first create a file "welcome.txt" and put some text into it; the example then reads the data from this file using a readable stream.

const fs = require("fs");
var data = '';

// readable stream
const reader = fs.createReadStream('welcome.txt');

// Set the encoding to utf8 so chunks arrive as strings
reader.setEncoding('utf8');

// Handle the data, end, and error events
reader.on('data', function(chunk) {
   data += chunk;
});

reader.on('end',function() {
   console.log(data);
});

reader.on('error', function(err) {
   console.log(err.stack);
});


console.log("Reading finished");
/*
Output:
Reading finished
Welcome to Node.js
This is an event-driven, non-blocking, single-threaded architecture.
This tutorial is based on stream module
*/
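
Notice that "Reading finished" is printed first because the reads happen asynchronously. On Node.js 10 and later, a readable stream is also an async iterable, so the same file can be consumed with for await...of. A minimal sketch, assuming the same welcome.txt file:

const fs = require("fs");

async function readFile() {
   let data = '';
   // A readable stream can be consumed as an async iterable
   for await (const chunk of fs.createReadStream('welcome.txt', 'utf8')) {
      data += chunk;
   }
   console.log(data);
}

readFile();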


Writing the data


In the example given below, an output file named "message.txt" is created and the output text is written to it using a writable stream.

const fs = require("fs");
var data = 'This tutorial is based on stream module';

// writer stream
const writer = fs.createWriteStream('message.txt');

// Write the data with utf8 encoding
writer.write(data, 'utf8');

// Mark the end of file
writer.end();

// Handle the finish and error
writer.on('finish', function() {
   console.log("Write completed.");
});

writer.on('error', function(err) {
   console.log(err.stack);
});

console.log("Data wrote successfully");

/*
Output:
Data written successfully
Write completed.
*/
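
The “drain” event mentioned earlier matters when writing large amounts of data: writer.write() returns false once the stream's internal buffer is full, and writing should pause until “drain” fires. A minimal backpressure sketch (the chunk count and the file name big.txt are only illustrative):

const fs = require("fs");
const writer = fs.createWriteStream('big.txt');

let i = 0;
const total = 1000000;

function writeChunks() {
   let ok = true;
   while (i < total && ok) {
      // write() returns false when the internal buffer is full
      ok = writer.write('chunk ' + i + '\n');
      i++;
   }
   if (i < total) {
      // Wait for the buffer to empty before writing more
      writer.once('drain', writeChunks);
   } else {
      writer.end();
   }
}

writeChunks();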

Piping streams


In a piping operation, the output of one stream is passed as the input to another stream, and any number of pipes can be chained together. For example, we can copy the data from welcome.txt into message.txt using piping.

const fs = require("fs");
//reader stream
var reader = fs.createReadStream('welcome.txt');
//writer stream
var writer = fs.createWriteStream('message.txt');
// Pipe the read stream into the write stream:
// welcome.txt is read and its data is written into message.txt
reader.pipe(writer);
console.log("Piping operation started");

/*
Output:
Piping operation started
*/

Now, if you open the file message.txt, it should contain the same text as welcome.txt.
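
One thing to keep in mind is that pipe() on its own does not forward errors from one stream to the next. The stream module also provides pipeline() (available since Node.js 10), which wires the streams together and reports any error through a single callback. A sketch of the same copy using pipeline():

const fs = require("fs");
const { pipeline } = require('stream');

pipeline(
   fs.createReadStream('welcome.txt'),
   fs.createWriteStream('message.txt'),
   function(err) {
      if (err) {
         console.error('Pipeline failed:', err);
      } else {
         console.log('Pipeline succeeded');
      }
   }
);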

Chaining streams

Chaining is an operation in which the output of one stream is connected to the next, several times over, to form a chain of stream operations. For example:


const fs = require("fs");
const zlib = require('zlib');

// Compress the file welcome.txt to welcome.txt.gz
fs.createReadStream('welcome.txt')
   .pipe(zlib.createGzip())
   .pipe(fs.createWriteStream('welcome.txt.gz'));
 
console.log("The File is Compressed successfully");

/*
Output:
File compression started
*/
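
The chain can also be run in the opposite direction. For example, here is a sketch that decompresses welcome.txt.gz back into plain text (the output file name welcome_copy.txt is only an illustrative choice):

const fs = require("fs");
const zlib = require('zlib');

// Decompress welcome.txt.gz back to a plain text file
fs.createReadStream('welcome.txt.gz')
   .pipe(zlib.createGunzip())
   .pipe(fs.createWriteStream('welcome_copy.txt'));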


Reference: https://nodejs.org/api/stream.html