{"id":34323,"date":"2023-03-25T04:30:50","date_gmt":"2023-03-25T04:30:50","guid":{"rendered":"https:\/\/www.bacancytechnology.com\/blog\/?p=34323"},"modified":"2024-12-27T08:34:43","modified_gmt":"2024-12-27T08:34:43","slug":"node-streams","status":"publish","type":"post","link":"https:\/\/www.bacancytechnology.com\/blog\/node-streams","title":{"rendered":"Node Streams: A Sneak-Peak"},"content":{"rendered":"<p style=\"color:#FFA500\"><strong><i>Quick Summary<\/i><\/strong><\/p>\n<p><i><strong>? Node Streams are an efficient way to channelize and process input and output data for Node.js application.<\/strong><\/i><\/p>\n<p><strong><i>? Using Node Js streaming, entrepreneurs can improve the performance, scalability, and maintainability of Node.js applications that function with huge amounts of data.<\/i><\/strong><\/p>\n<p><strong><i>? Find out about the types of streams in Node.js, along with their practical tutorial for better understanding. <\/i><\/strong><\/p>\n<p><strong><i>? Explore the chaining and piping of Node Streams. <\/i><\/strong><\/p>\n<h2>What are Streams in Node Js?<\/h2>\n<p>Streams are abstract interfaces for working with data that can be read or written sequentially. In Node.js, streams are a fundamental concept used to handle data flow between input and output sources.<\/p>\n<p>Streams are an important concept in Node.js because they allow for the efficient handling of large amounts of data. Instead of loading all the data into memory at once, streams process data in chunks as it becomes available. Data can be streamed from a source (like a file or a network socket) to a destination (like a response object or another file) in real-time, without buffering the whole data into memory at once. <\/p>\n<p>For instance, one may read a stream or write a stream from and to various data sources, sinks, files, network sockets, and stdin\/stdout. <\/p>\n<h3>? 
The Stream Module<\/h3>\n<p>The stream module in Node.js is a core module that provides a way to handle streaming data. It provides a set of APIs for creating, reading from, and writing to streams.<\/p>\n<h3>Node.js Streaming API<\/h3>\n<p>The Stream API in Node.js provides a set of classes and functions for creating, reading from, and writing to Node streams.<\/p>\n<p>Here are the main components of the Stream API in Node.js:<\/p>\n<ul class=\"bullets text-left\">\n<li><strong>Stream Classes:<\/strong> The Stream API provides several classes for working with Node.js streams, including the Readable, Writable, Duplex, and Transform classes. These classes provide different types of streams with varying functionality.<\/li>\n<li><strong>Stream Methods:<\/strong> The Stream API provides several methods for working with streams, including the pipe() method for connecting a readable stream to a writable one, and the on() method for attaching handlers to stream events such as &#8216;data&#8217;.<\/li>\n<li><strong>Events: <\/strong>The Stream API provides several events that can be emitted by streams, including &#8216;data&#8217;, &#8216;end&#8217;, &#8216;error&#8217;, and &#8216;finish&#8217;. These events can be used to handle different aspects of stream processing.<\/li>\n<li><strong>Stream Options:<\/strong> The Stream API provides options for configuring streams, such as setting the encoding for readable streams or setting the highWaterMark buffering threshold for writable ones.<\/li>\n<\/ul>\n<h2>Types of Node Streams<\/h2>\n<p>There are four different types of streams, each for a specific purpose, namely Readable, Writable, Duplex, and Transform streams. <\/p>\n<p>Let us look at each of them in turn.<\/p>\n<h3>Readable Stream<\/h3>\n<p>Readable streams are used to read data from a source, such as a file or a network socket. 
They emit a \u2018data\u2019 event whenever new data is available and an \u2018end\u2019 event when the stream has ended. Examples of <a href=\"https:\/\/github.com\/nodejs\/readable-stream\" target=\"_blank\" rel=\"noopener\">readable streams<\/a> in Node.js include <mark>\u2018fs.createReadStream()\u2019<\/mark> for reading files and <mark>\u2018http.IncomingMessage\u2019<\/mark> for reading HTTP requests.<\/p>\n<p>Let us understand the Readable Node.js stream with an example.<\/p>\n<pre>const fs = require('fs');\r\n\r\n\/\/ Create a readable stream from a file\r\nconst readStream = fs.createReadStream('example.txt', { encoding: 'utf8' });\r\n\r\n\/\/ Handle 'data' events emitted by the stream\r\nreadStream.on('data', (chunk) => {\r\n  console.log(`Received ${chunk.length} characters of data.`);\r\n});\r\n\r\n\/\/ Handle the 'end' event emitted by the stream\r\nreadStream.on('end', () => {\r\n  console.log('End of file reached.');\r\n});\r\n\r\n\/\/ Handle errors emitted by the stream\r\nreadStream.on('error', (err) => {\r\n  console.error(`Error: ${err}`);\r\n});<\/pre>\n<p>In this example, we use the fs module to create a readable stream from a file named &#8216;example.txt&#8217;. We set the encoding option to &#8216;utf8&#8217; to read data from the file as a string.<\/p>\n<p>We then handle the &#8216;data&#8217; event emitted by the stream, which is triggered every time a chunk of data is read from the file. In this case, we simply log the length of each chunk; since an encoding is set, chunks arrive as strings measured in characters, not Buffers of bytes.<\/p>\n<p>We also handle the &#8216;end&#8217; event emitted by the stream, which is triggered when the end of the file is reached. 
Finally, we log any errors emitted by the stream to the console.<\/p>\n<p class=\"boxed bg--secondary\" style=\"border: 1px solid #c7c7c7; box-shadow: 0 0 40px rgba(0, 0, 0, 0.2);\"><strong><i><span style=\"font-size:22px; color:#000;\">Ready to harness the power of Node.js Streams?<\/span><br \/>\nHire a <a href=\"https:\/\/www.bacancytechnology.com\/hire-node-developer\" target=\"_blank\" rel=\"noopener\">Node.js developer<\/a> or partner with a reputable Node.js development company today and take your projects to the next level!<\/strong><\/i><\/p>\n<h3>Writable Stream<\/h3>\n<p>Writable streams are used to write data to a destination, such as a file or a network socket. They have a <mark>\u2018write()\u2019<\/mark> method to write data and an <mark>\u2018end()\u2019<\/mark> method to signal the end of the stream. Examples of writable streams in Node.js include <mark>\u2018fs.createWriteStream()\u2019<\/mark> for writing files and <mark>\u2018http.ServerResponse\u2019<\/mark> for writing HTTP responses.<\/p>\n<p>Example of a Node.js Writable stream:<\/p>\n<pre>const fs = require('fs');\r\n\r\n\/\/ Create a writable stream to a file\r\nconst writeStream = fs.createWriteStream('output.txt', { encoding: 'utf8' });\r\n\r\n\/\/ Write data to the stream\r\nwriteStream.write('Hello, world!\\n');\r\nwriteStream.write('This is a test.\\n');\r\n\r\n\/\/ End the stream\r\nwriteStream.end();\r\n\r\n\/\/ Handle the 'finish' event emitted by the stream\r\nwriteStream.on('finish', () => {\r\n  console.log('Data written to file.');\r\n});\r\n\r\n\/\/ Handle errors emitted by the stream\r\nwriteStream.on('error', (err) => {\r\n  console.error(`Error: ${err}`);\r\n});<\/pre>\n<p>In this example, we use the fs module to create a writable stream to a file named &#8216;output.txt&#8217;. We set the encoding option to &#8216;utf8&#8217; so the strings we write are encoded as UTF-8 text.<\/p>\n<p>We then write data to the stream using the write() method, calling it twice to write two lines of text. 
We end the stream using the end() method.<\/p>\n<p>We also handle the &#8216;finish&#8217; event emitted by the stream, triggered when all data has been written to the file. Finally, we log any errors emitted by the stream to the console.<\/p>\n<h3>Duplex Stream<\/h3>\n<p>Duplex streams are bidirectional, meaning they can read and write data. They can be used for tasks such as proxying data from one network socket to another. Duplex streams inherit from both \u2018Readable\u2019 and \u2018Writable\u2019 streams, so they have all the methods of both.<\/p>\n<p>Duplex Stream example:<\/p>\n<pre>const { Duplex } = require('stream');\r\n\r\nconst myDuplex = new Duplex({\r\n  write(chunk, encoding, callback) {\r\n    console.log(chunk.toString());\r\n    callback();\r\n  },\r\n  read(size) {\r\n    if (this.currentCharCode > 90) {\r\n      this.push(null);\r\n      return;\r\n    }\r\n    this.push(String.fromCharCode(this.currentCharCode++));\r\n  }\r\n});\r\n\r\nmyDuplex.currentCharCode = 65;\r\n\r\nprocess.stdin.pipe(myDuplex).pipe(process.stdout);<\/pre>\n<p>In this example, we create a new Duplex stream using the Duplex class from the stream module. The write method is called whenever data is written to the stream, and simply logs the chunk of data to the console. The read method is called whenever the stream is read from; in this example, it pushes the uppercase ASCII letters (character codes 65 to 90, \u2018A\u2019 to \u2018Z\u2019) to the stream, and once the code passes 90 it pushes null to signal the end of the stream.<\/p>\n<p>We then pipe the standard input stream (process.stdin) to our Duplex stream, and then pipe the Duplex stream to the standard output stream (process.stdout). This allows us to type input into the console, which gets written to the Duplex stream, and then the output from the Duplex stream gets written to the console.<\/p>\n<h3>Transform Stream<\/h3>\n<p>Transform streams are a type of duplex stream that can modify data as it passes through them. 
They can be used for compression, encryption, or data validation tasks. Transform streams inherit from \u2018Duplex\u2019, so they have both a \u2018read()\u2019 and a \u2018write()\u2019 method. When you write data to a transform stream, it will be transformed by the transform function before being emitted as output.<\/p>\n<p>Let us see an example of the transform Node.js stream.<\/p>\n<pre>const { Transform } = require('stream');\r\n\r\nclass UpperCaseTransform extends Transform {\r\n  _transform(chunk, encoding, callback) {\r\n    const upperChunk = chunk.toString().toUpperCase();\r\n    this.push(upperChunk);\r\n    callback();\r\n  }\r\n}\r\n\r\nprocess.stdin.pipe(new UpperCaseTransform()).pipe(process.stdout);<\/pre>\n<p>In this example, we create a new class called \u2018UpperCaseTransform\u2019 that extends the built-in \u2018Transform\u2019 class from the \u2018stream\u2019 module. We override the \u2018_transform\u2019 method to convert each chunk of incoming data to uppercase using the \u2018toUpperCase\u2019 method of the string object. Then, we push the transformed chunk to the readable side using the \u2018push\u2019 method and call the \u2018callback\u2019 function to indicate that we&#8217;re done processing the chunk.<\/p>\n<p>Finally, we pipe the \u2018stdin\u2019 readable stream into an instance of our \u2018UpperCaseTransform\u2019 class, and pipe the resulting transformed data to the \u2018stdout\u2019 writable stream. This causes all data written to \u2018stdin\u2019 to be converted to uppercase and printed to the console.<\/p>\n<p>Now that we know about the types of Node.js streams, let us get to the business benefits of using them.<\/p>\n<h2>Advantages of Node.js Streaming<\/h2>\n<p>Popular companies with humongous amounts of data, such as Netflix, NASA, Uber, and Walmart, leverage Node.js streams to make their applications easier to manage, sustain, and keep performant. 
Here are the advantages of using Node Streams in your Node.js applications.<\/p>\n<ul class=\"bullets text-left\">\n<li><strong>Memory efficiency:<\/strong> Streams can process large amounts of data without loading everything into memory at once. This means streams can handle files and data too large to fit in memory.<\/li>\n<li><strong>Performance:<\/strong> Because streams can process data in chunks, they can be faster and more efficient than approaches that read or write the entire data set at once. This can be particularly useful for real-time applications that require low latency and high throughput.<\/li>\n<li><strong>Flexibility:<\/strong> Streams can be used to handle a wide range of data sources and destinations, including files, network sockets, and HTTP requests and responses. This makes streams a versatile tool for handling data in different contexts.<\/li>\n<li><strong>Modularity:<\/strong> Node Streams can be easily combined and piped, allowing for complex data processing tasks to be broken down into smaller, more manageable parts. This can make code easier to read and maintain.<\/li>\n<li><strong>Backpressure handling:<\/strong> Streams can handle backpressure by automatically slowing down the data source when the data destination cannot keep up. This can help prevent buffer overflows and other performance issues.<\/li>\n<\/ul>\n<p>Overall, the use of streams in Node.js can help improve the performance, scalability, and maintainability of applications that handle large amounts of data.<\/p>\n<p>It is now time to explore two key ways of composing Node.js streams, piping and chaining, along with their use cases. <\/p>\n<h2>Piping in Node Streams<\/h2>\n<p>In Node.js streaming, piping is a way to connect a readable stream with a writable one using the pipe() method. 
The pipe() method takes a writable stream as an argument and connects it to a readable stream.<\/p>\n<p>When pipe() is called, it sets up listeners on the readable stream&#8217;s &#8216;data&#8217; and &#8216;end&#8217; events, and automatically writes data from the readable stream to the writable stream until the end of the readable stream is reached. This makes it easy to chain together multiple streams and create a pipeline for processing data.<\/p>\n<p>Here&#8217;s an example of using the pipe() method:<\/p>\n<pre>const fs = require('fs');\r\n\r\n\/\/ Create a readable stream from a file\r\nconst readStream = fs.createReadStream('input.txt');\r\n\r\n\/\/ Create a writable stream to a file\r\nconst writeStream = fs.createWriteStream('output.txt');\r\n\r\n\/\/ Pipe the readable stream to the writable stream\r\nreadStream.pipe(writeStream);\r\n\r\n\/\/ Handle errors emitted by either stream\r\nreadStream.on('error', (err) => {\r\n  console.error(`Error reading file: ${err}`);\r\n});\r\n\r\nwriteStream.on('error', (err) => {\r\n  console.error(`Error writing file: ${err}`);\r\n});<\/pre>\n<p>In this example, we first create readable and writable streams using the fs module. We then use the pipe() method to connect the readable stream to the writable stream.<\/p>\n<p>We also handle any errors emitted by either stream using on(&#8216;error&#8217;) handlers; note that pipe() does not forward errors from one stream to the other, so each stream needs its own handler.<\/p>\n<p>Note that pipe() is a convenient way to handle stream data flow in Node.js, but it may not always be suitable for complex stream processing scenarios. 
Also, discover various debugging techniques and tools that can help you identify and fix issues quickly with <a href=\"https:\/\/www.bacancytechnology.com\/blog\/debug-node-js-application\" target=\"_blank\" rel=\"noopener\">Debug Node JS Application<\/a>.<\/p>\n<h3>Pros and Cons of Piping Node Streams<\/h3>\n<style>\n.post-content table td, .post-content table th, .wpb_text_column table td, .wpb_text_column table th { vertical-align: top; }\ntable tr td:first-child {\n    min-width: 170px;\n}\n<\/style>\n<div class=\"post-content table-orange\">\n<table style=\"width:100%\">\n<tr>\n<th><strong>Benefits<\/strong><\/th>\n<th><strong>Drawbacks<\/strong><\/th>\n<\/tr>\n<tr>\n<td>Efficient processing<\/td>\n<td>Steep learning curve<\/td>\n<\/tr>\n<tr>\n<td>Easy to use<\/td>\n<td>Limited built-in error propagation<\/td>\n<\/tr>\n<tr>\n<td>Modular code<\/td>\n<td>Debugging issues<\/td>\n<\/tr>\n<tr>\n<td>Backpressure handling<\/td>\n<td>Complex control flow<\/td>\n<\/tr>\n<\/table>\n<\/div>\n<p><br \/><\/p>\n<p>Let us now answer the question: what is Node stream chaining?<\/p>\n<h2>Node.js Stream Chaining<\/h2>\n<p>In Node Streams, chaining is a way to connect multiple stream operations together using method chaining. Chaining allows you to easily create a pipeline of stream operations that can be applied to a readable stream, transforming or processing the data as it flows through the pipeline.<\/p>\n<p>To chain stream operations together, you simply call methods on a readable stream, which return new stream objects that can be further manipulated or connected to other Node streams. 
The resulting stream operations are applied sequentially as the data flows through the pipeline.<\/p>\n<p>Here&#8217;s an example of using chaining to create a pipeline of stream operations:<\/p>\n<pre>const fs = require('fs');\r\n\r\n\/\/ Create a readable stream from a file\r\nconst readStream = fs.createReadStream('input.txt');\r\n\r\n\/\/ Create a writable stream to a file\r\nconst writeStream = fs.createWriteStream('output.txt');\r\n\r\n\/\/ Define transform stream operations (they must exist before being piped)\r\nconst transformStream1 = \/\/ ...\r\nconst transformStream2 = \/\/ ...\r\nconst transformStream3 = \/\/ ...\r\n\r\n\/\/ Chain stream operations to transform the data\r\nreadStream\r\n  .pipe(transformStream1)\r\n  .pipe(transformStream2)\r\n  .pipe(transformStream3)\r\n  .pipe(writeStream);<\/pre>\n<p>In this example of Node streams, we create a readable stream from a file using the fs module. We then combine several stream operations to transform the data, using the pipe() method to connect each operation to the next. Note that every stream is declared before the chain is built; const declarations are not hoisted, so referencing them earlier would throw.<\/p>\n<p>We define the individual transform stream operations separately and pass them as arguments to pipe(). The intermediate operations must be streams that are both readable and writable, such as Transform or Duplex streams.<\/p>\n<p>We also create a writable stream to a file and connect it to the end of the pipeline using pipe().<\/p>\n<p>Note that chaining is a powerful way to process stream data in Node.js, but it may not always be the most efficient or flexible approach. Also, learn how to leverage a new version of Node.js for your projects. 
Follow simple steps to download and install the <a href=\"https:\/\/www.bacancytechnology.com\/blog\/whats-new-in-node-19\" target=\"_blank\" rel=\"noopener\">latest version of Node 19<\/a> and updates.<\/p>\n<h3>Pros and Cons of Chaining in Node<\/h3>\n<style>\n.post-content table td, .post-content table th, .wpb_text_column table td, .wpb_text_column table th { vertical-align: top; }\ntable tr td:first-child {\n    min-width: 170px;\n}\n<\/style>\n<div class=\"post-content table-orange\">\n<table style=\"width:100%\">\n<tr>\n<th><strong>Benefits<\/strong><\/th>\n<th><strong>Drawbacks<\/strong><\/th>\n<\/tr>\n<tr>\n<td>Flexible processing<\/td>\n<td>Complex<\/td>\n<\/tr>\n<tr>\n<td>Reusability<\/td>\n<td>Steep learning curve<\/td>\n<\/tr>\n<tr>\n<td>Improved performance<\/td>\n<td>Limited compatibility<\/td>\n<\/tr>\n<tr>\n<td>Easy debugging<\/td>\n<td>Issues with control flow<\/td>\n<\/tr>\n<\/table>\n<\/div>\n<p><br \/><\/p>\n<h2>Key Takeaway<\/h2>\n<p>Data handling with Node.js streams enables Node developers to work smoothly with incoming and outgoing data. Entrepreneurs can better manage their Node applications and get excellent <a href=\"https:\/\/www.bacancytechnology.com\/blog\/node-js-performance\" target=\"_blank\" rel=\"noopener\">Node.js performance<\/a> out of them by using streams, especially thanks to better memory management. <\/p>\n<h2>Frequently Asked Questions (FAQs)<\/h2>\n<h3>What are the benefits of using Node.js Streams?<\/h3>\n<p>The benefits of using Node.js Streams include improved performance, lower memory usage, and better handling of large data sets. Streams allow you to process data in chunks, which can help avoid bottlenecks and reduce the memory needed to process data. 
Streams also allow you to process data as it is received or sent, which can help reduce latency and improve overall performance.<\/p>\n<h3>Are Node.js Streams compatible with other programming languages?<\/h3>\n<p>Node.js streams themselves are a Node.js API, but the data they carry travels over ordinary files, sockets, and HTTP connections, so it can be produced or consumed by applications written in any language. In that sense, streams are well suited to moving data between different systems.<\/p>\n<h3>When to use Node.js streams?<\/h3>\n<p>Node Streams can be a powerful tool for processing data efficiently in Node.js applications. Using streams turns out to be fruitful in the following use cases: processing large files, real-time data processing, handling HTTP requests and responses, and transforming data. <\/p>\n","protected":false},"excerpt":{"rendered":"<p>Quick Summary Node Streams are an efficient way to channel and process input and output data for Node.js applications. Using Node.js streaming, entrepreneurs can improve the performance, scalability, and maintainability of Node.js applications that work with huge amounts of data. 
Find out about the types of streams in Node.js, along with [&hellip;]<\/p>\n","protected":false},"author":34,"featured_media":34351,"comment_status":"open","ping_status":"open","sticky":false,"template":"blog-new-template.php","format":"standard","meta":{"_acf_changed":false,"inline_featured_image":false,"_lmt_disableupdate":"no","_lmt_disable":"","footnotes":""},"categories":[483],"tags":[],"coauthors":[1568,1585],"class_list":["post-34323","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-node-js"],"acf":[],"modified_by":"Binal Prajapati","_links":{"self":[{"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/posts\/34323","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/users\/34"}],"replies":[{"embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/comments?post=34323"}],"version-history":[{"count":0,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/posts\/34323\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/media\/34351"}],"wp:attachment":[{"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/media?parent=34323"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/categories?post=34323"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/tags?post=34323"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.bacancytechnology.com\/blog\/wp-json\/wp\/v2\/coauthors?post=34323"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}