Node.js Stream Pipeline Lab
In this lab, you will work with Node.js streams to process data using the stream.pipeline() method. This method chains multiple streams together, managing the flow of data and errors efficiently. Your task is to create and use different types of streams to transform and consume data in a stream pipeline.
The stream.pipeline() method takes a list of streams as arguments and creates a processing pipeline. As data passes through the pipeline, each stream processes it, while pipeline() handles any errors that occur along the way and ensures proper resource cleanup. Conveniently, the promise-based variant of stream.pipeline() (from node:stream/promises) returns a promise that fulfills once all streams have finished processing, or rejects when any error is encountered.
In this lab, you will work with three types of streams: Readable, Writable, and Transform. A Readable stream is used to read and consume data, a Writable stream is used to write data, and a Transform stream transforms data as it passes between the two.
Challenges
- Create a ReadableStream instance and export it.
- Create a WritableStream instance and export it.
- Create a TransformStream instance and export it.
- Use stream.pipeline() to connect the three exported streams.
- Export the stream.pipeline() function call as a promise.
- Write a .then() function call to log the result once the pipeline processing is complete.
- Write a .catch() function call to handle the errors that may occur during the pipeline processing.