Node.js Stream Pipeline Lab

Medium
0.0% Acceptance

In this lab, you will work with Node.js streams to process data using the stream.pipeline() method. This method chains multiple streams together, managing the flow of data and propagating errors efficiently. Your task will be to create and use different types of streams to transform and consume data in a stream pipeline.

The stream.pipeline() method takes a series of streams as arguments and connects them into a processing pipeline. As data passes through the pipeline, each stream processes it in turn; pipeline() also forwards errors from any stream in the chain and ensures proper resource cleanup. Conveniently, the promise-based variant of stream.pipeline() (exported from node:stream/promises) returns a promise that fulfills once all streams have finished processing, or rejects as soon as any stream in the chain emits an error.

In this lab, you will work with three types of streams: Readable, Writable, and Transform. A Readable stream is a source that produces data, a Writable stream is a destination that consumes data, and a Transform stream sits between the two, modifying data as it passes through.

Challenges

  1. Create a Readable stream instance and export it.
  2. Create a Writable stream instance and export it.
  3. Create a Transform stream instance and export it.
  4. Use stream.pipeline() to connect the three exported streams.
  5. Export the promise returned by the stream.pipeline() call.
  6. Write a .then() handler to log the result once the pipeline processing is complete.
  7. Write a .catch() handler to handle any errors that occur during pipeline processing.