
How does the stream module work in Node.js?


This article explains the stream module in Node.js in detail. I find it very practical, so I'm sharing it here as a reference; I hope you get something out of it after reading.

The stream module is a core module in Node. Other modules such as fs and http are built on top of stream instances.
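For example, both fs and http hand you stream objects directly. A minimal sketch, assuming a placeholder file path and port:

import fs from 'fs';
import http from 'http';

// fs.createReadStream() returns a Readable stream over the file's contents.
const fileStream = fs.createReadStream('./files/text.txt');
fileStream.on('data', (chunk) => console.log('got', chunk.length, 'bytes'));

// In http, the request is a Readable stream and the response is a Writable stream,
// so a file can be served simply by piping one into the other.
http.createServer((req, res) => {
  fs.createReadStream('./files/text.txt').pipe(res);
}).listen(3000);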

However, most front-end beginners who are just getting started with Node don't have a clear grasp of the concept and use of streams, because stream processing rarely comes up in day-to-day front-end work.

1. What is a stream?

The word "stream" alone easily brings to mind images of flowing water or a current.

Official definition: a stream is an abstract interface for working with streaming data in Node.js.

From the official definition, we can see:

A stream is a tool Node provides for processing data.

A stream is an abstract interface in Node.

More precisely, a stream can be understood as a flow of data: it is a means of transmitting data, and within an application it is an ordered flow of data with a starting point and an end point.

The main reason streams are hard to grasp is that they are an abstract concept.

2. Usage scenarios for streams

To get a clear understanding of the stream module, let's first look at its practical applications through some concrete scenarios.

In Node, streams are mainly used where large amounts of data need to be processed, for example reading and writing large files with fs, handling http requests and responses, file compression, and data encryption / decryption.

Think of a bucket connected to a pool by a pipe: the bucket is the data source, the pool is the data target, and the pipe that connects them is the data stream. Through this pipe, data flows from the data source to the data target.
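As a concrete illustration of the large-file and compression scenarios above, here is a small sketch, assuming a placeholder large.log file:

import fs from 'fs';
import zlib from 'zlib';

fs.createReadStream('./files/large.log')              // data source (the "bucket")
  .pipe(zlib.createGzip())                            // transform stream: compresses data as it flows through
  .pipe(fs.createWriteStream('./files/large.log.gz')) // data target (the "pool")
  .on('finish', () => console.log('compression finished'));

// Because the file moves through the pipeline chunk by chunk,
// memory usage stays low even for very large files.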

3. Classification of streams

In Node, streams are divided into four categories: readable streams (Readable), writable streams (Writable), duplex streams (Duplex), and transform streams (Transform).

Writable: a stream to which data can be written

Readable: a stream from which data can be read

Duplex: a stream that is both Readable and Writable

Transform: a Duplex stream that can modify or transform data as it is written and read

All streams are instances of EventEmitter. That is, we can listen for changes in the data flow through the event mechanism.
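A minimal sketch of a Transform stream (the four classes are all exported by the built-in stream module), which also shows the event mechanism at work:

import { Transform } from 'stream';

// A Transform stream is a Duplex stream that modifies data as it passes through.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Every stream is an EventEmitter, so its activity can be observed through events.
upperCase.on('data', (chunk) => console.log('out:', chunk.toString()));

upperCase.write('hello stream\n');
upperCase.end();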

4. Data modes and the internal buffer

Before delving into how each of the four stream types is used, we need to understand two concepts, data modes and the internal buffer, which will help us better understand streams in the sections that follow.

4.1 Data modes

All streams created by the Node.js APIs operate exclusively on strings and Buffer (or Uint8Array) objects. Stream implementations can also work in "object mode" and pass arbitrary JavaScript values instead.
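A small sketch contrasting the default (string/Buffer) mode with object mode:

import { Readable } from 'stream';

// Default mode: pushed data must be a string, Buffer, or Uint8Array;
// consumers receive Buffers unless an encoding has been set.
const bytes = new Readable({
  read() {
    this.push('abc');  // stored internally as a Buffer
    this.push(null);   // signals the end of the data
  }
});
bytes.on('data', (chunk) => console.log(Buffer.isBuffer(chunk))); // true

// Object mode: arbitrary JavaScript values can flow through the stream.
const objects = new Readable({
  objectMode: true,
  read() {
    this.push({ id: 1 });
    this.push(null);
  }
});
objects.on('data', (obj) => console.log(obj)); // { id: 1 }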

4.2 The internal buffer

Both Writable and Readable streams store data in an internal buffer.

The amount of data that can be buffered depends on the highWaterMark option passed to the stream's constructor: for normal streams, highWaterMark specifies a total number of bytes; for streams operating in object mode, it specifies a total number of objects.

The highWaterMark option is a threshold, not a limit: it specifies the amount of data that the stream buffers before it stops requesting more data.
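A sketch of setting highWaterMark, with placeholder file paths; the values here are just examples:

import fs from 'fs';
import { Readable } from 'stream';

// For a normal (byte) stream, highWaterMark counts bytes: buffer up to 64 KiB here.
const fileStream = fs.createReadStream('./files/text.txt', { highWaterMark: 64 * 1024 });

// For an object-mode stream, highWaterMark counts objects instead of bytes.
const objectStream = new Readable({
  objectMode: true,
  highWaterMark: 16, // at most 16 objects are buffered before push() returns false
  read() {}
});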

When the implementation calls stream.push(chunk), the data is buffered in the Readable stream. If the consumer of the stream does not call stream.read(), the data sits in the internal queue until it is consumed.

Once the total size of the internal read buffer reaches the threshold specified by highWaterMark, the stream temporarily stops reading data from the underlying resource until the currently buffered data has been consumed.
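A tiny sketch of the Readable internal queue described above:

import { Readable } from 'stream';

const readable = new Readable({ read() {} });

// push() stores chunks in the Readable's internal buffer...
readable.push('hello ');
readable.push('world');

// ...where they sit until a consumer calls read() (or a 'data' listener drains them).
console.log(readable.read().toString()); // 'hello world'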

When the writable.write(chunk) method is called repeatedly, the data is buffered in the Writable stream.

5. Readable stream

5.1 Flowing and paused reading modes

Readable streams effectively operate in one of two modes: flowing and paused.

Flowing mode: data is read from the underlying system and pushed into the buffer with push(); when highWaterMark is reached, push() returns false and the source stops feeding the buffer. Data is delivered to consumers automatically through 'data' events.

Paused mode: all Readable streams start in paused mode, and the stream.read() method must be called explicitly to read data from the stream. Each time data arrives in the buffer, a 'readable' event is emitted; in other words, each push() triggers 'readable'. (A short sketch of both modes follows the two lists below.)

How to switch from paused mode to flowing mode:

Add a 'data' event handler

Call the stream.resume() method

Call the stream.pipe() method to send the data to a Writable

How to switch from flowing mode to paused mode:

If there is no pipe target, call the stream.pause() method.

If there are pipe targets, remove them all; multiple pipe targets can be removed by calling the stream.unpipe() method.
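A short sketch of the two modes, using a placeholder file path:

import fs from 'fs';

const readable = fs.createReadStream('./files/text.txt');

// Paused mode: the stream starts here. read() must be called explicitly,
// and 'readable' fires whenever data becomes available in the internal buffer.
readable.on('readable', () => {
  let chunk;
  while ((chunk = readable.read()) !== null) {
    console.log('read', chunk.length, 'bytes');
  }
});

// Attaching a 'data' handler (or calling resume() / pipe()) would instead
// switch the stream into flowing mode:
// readable.on('data', (chunk) => console.log('flowing', chunk.length));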

5.2 Common examples of readable streams

import path from 'path';
import fs from 'fs';

const filePath = path.join(path.resolve(), 'files', 'text.txt');
const readable = fs.createReadStream(filePath);

// If a default encoding is set with readable.setEncoding(), the listener callback
// receives each chunk as a string; otherwise chunks arrive as Buffers.
readable.setEncoding('utf8');

let str = '';

readable.on('open', (fd) => {
  console.log('start reading file');
});

// The 'data' event is emitted whenever the stream hands ownership of a chunk to the consumer.
readable.on('data', (data) => {
  str += data;
  console.log('read data');
});

// readable.pause() makes a flowing stream stop emitting 'data' events and switch to
// paused mode; any available data stays in the internal buffer.
// readable.resume() makes an explicitly paused stream emit 'data' events again,
// switching it back to flowing mode.

// The 'pause' event is emitted when stream.pause() is called and readableFlowing is not false.
readable.on('pause', () => {
  console.log('reading paused');
});

// The 'resume' event is emitted when stream.resume() is called and readableFlowing is not true.
readable.on('resume', () => {
  console.log('reading resumed');
});

// The 'end' event is emitted when there is no more data in the stream to consume.
readable.on('end', () => {
  console.log('file read finished');
});

// The 'close' event is emitted when the stream and any of its underlying resources
// (such as file descriptors) have been closed.
readable.on('close', () => {
  console.log('file closed');
});

// readable.pipe(destWriteable) binds the destWriteable Writable stream to readable,
// automatically switches it to flowing mode, and pushes all of its data to the bound
// Writable; the data flow is managed automatically.
// readable.pipe(destWriteable);

// The 'error' event may be emitted if the underlying stream cannot generate data because
// of an internal failure, or when the implementation tries to push an invalid chunk.
readable.on('error', (err) => {
  console.log(err);
  console.log('file read error');
});

6. Writable streams

6.1 Flow and pause in writable streams

Writable streams are similar to readable streams: when data arrives, it is written straight through, and when writing is slow or paused, the incoming data is held in the internal buffer.

When the producer writes too fast and fills up the buffer, "back pressure" occurs, and the producer needs to be told to pause production. Once the queue has been drained, the writable stream emits a 'drain' event to tell the producer it can resume production.
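A sketch of cooperating with back pressure, using a placeholder output file:

import fs from 'fs';

const writable = fs.createWriteStream('./files/output.txt');

function writeMany(count) {
  let i = 0;
  function writeChunk() {
    while (i < count) {
      const ok = writable.write(`line ${i++}\n`);
      if (!ok) {
        // The internal buffer is full: stop producing and wait for 'drain'.
        writable.once('drain', writeChunk);
        return;
      }
    }
    writable.end();
  }
  writeChunk();
}

writeMany(100000);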

6.2 A writable stream example

import path from 'path';
import fs from 'fs';

const filePath = path.join(path.resolve(), 'files', 'text.txt');
const copyFile = path.join(path.resolve(), 'files', 'copy.txt');

let str = '';

// Create a readable stream
const readable = fs.createReadStream(filePath);
// A default encoding is set with readable.setEncoding(), so chunks arrive as strings
readable.setEncoding('utf8');

// Create a writable stream
const writeable = fs.createWriteStream(copyFile);
// Set the default encoding for writes
writeable.setDefaultEncoding('utf8');

readable.on('open', (fd) => {
  console.log('start reading file');
});

// The 'data' event is emitted whenever the stream hands ownership of a chunk to the consumer.
readable.on('data', (data) => {
  str += data;
  console.log('read data');
  // Write the chunk into the writable stream
  writeable.write(data, 'utf8');
});

writeable.on('open', () => {
  console.log('start writing data');
});

// If a call to stream.write(chunk) returns false, the 'drain' event is emitted when it is
// appropriate to resume writing data to the stream. In other words, data is being produced
// faster than it can be written; once the full buffer has been released, 'drain' tells the
// producer it can continue.
writeable.on('drain', () => {
  console.log('continue writing');
});

// The 'finish' event is emitted after stream.end() has been called and all data has been
// flushed to the underlying system.
writeable.on('finish', () => {
  console.log('data written');
});

readable.on('end', () => {
  // Notify the writable stream that no more data will be written
  writeable.end();
});

// The 'pipe' event is emitted when stream.pipe() is called on a readable stream,
// adding this writable stream to its set of destinations.
// readable.pipe(destWriteable)
writeable.on('pipe', () => {
  console.log('pipe created');
});

writeable.on('error', () => {
  console.log('data write error');
});

This is the end of this article on how the stream module works in Node.js. I hope the content above is helpful; if you think the article is good, please share it so more people can see it.
