File Stream Example Analysis in node.js

This article mainly introduces file stream example analysis in node.js. In daily development, many people have doubts about how file streams in node.js work, so the editor has consulted various materials and sorted out simple and easy-to-use methods, in the hope of answering those doubts. Next, please follow the editor and study!
File stream
Because the various media in a computer read and write at different speeds and have different capacities, one side of an operation may end up waiting on the other for a long time. Streams transfer data piece by piece so that processing does not have to wait for everything at once.
There are mainly three kinds of file streams: the readable stream (Readable), the writable stream (Writable) and the duplex stream (Duplex). There is a fourth kind that is used less often: the transform stream (Transform).
Node provides the stream module, which exposes the Readable and Writable classes. The file streams in fs inherit from them, so they share many common methods and events.
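As a minimal sketch of this inheritance (the file paths are just the placeholders used later in this article), the streams created by fs are instances of the stream module's classes:

const fs = require('fs');
const stream = require('stream');

const rs = fs.createReadStream('./file/write.txt');
const ws = fs.createWriteStream('./file/writecopy.txt');

console.log(rs instanceof stream.Readable); // true
console.log(ws instanceof stream.Writable); // true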
Readable stream (Readable)
Readable stream: data flows from the source into memory, for example transferring data from the disk into memory.
createReadStream
fs.createReadStream(path, configuration)
The configuration includes, among others: encoding (the character encoding), start (the byte at which to start reading), end (the byte at which to stop reading) and highWaterMark (the amount read each time).
highWaterMark: if encoding has a value, this number represents a number of characters; if encoding is null, it represents a number of bytes.
It returns a ReadStream, a subclass of Readable.
const fs = require('fs');
const readable = fs.createReadStream(filename, {
  encoding: 'utf-8',
  start: 1,
  end: 2,
  // highWaterMark:
});
Registering events
readable.on(eventName, handler)
readable.on('open', (fd) => {
  // console.log(fd);
  console.log('file opened');
});
readable.on('error', (err) => {
  console.log(err);
  console.log('error reading file');
});
readable.on('close', () => {
  console.log('file closed');
});
// 'close' is triggered manually through readable.close(), or automatically after
// the file has been read, because the autoClose configuration item defaults to true
readable.close();
readable.on('data', (data) => {
  console.log(data);
  console.log('file is being read');
});
readable.on('end', () => {
  console.log('file read complete');
});
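As a small usage sketch building on the listeners above (assuming readable was created with an encoding, as in the earlier example), the 'data' and 'end' events can be combined to reassemble the whole file in memory:

let content = '';
readable.on('data', (chunk) => {
  content += chunk; // each chunk is a string because an encoding was set
});
readable.on('end', () => {
  console.log(content); // the complete contents that were read
});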
Pausing reading
readable.pause() pauses reading and triggers the pause event.
Resuming reading
readable.resume() resumes reading and triggers the resume event.
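A minimal sketch of how pause() and resume() work together (the one-second delay is only for illustration):

readable.on('data', (chunk) => {
  console.log('read one chunk');
  readable.pause();                          // stop emitting 'data' for now
  setTimeout(() => readable.resume(), 1000); // carry on reading one second later
});
readable.on('pause', () => console.log('pause event triggered'));
readable.on('resume', () => console.log('resume event triggered'));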
Writable stream
const ws = fs.createWriteStream(filename[, configuration])
ws.write(data)
Writes a piece of data; the data can be a string or a Buffer, and the call returns a Boolean value.
If true is returned, the write channel is not full and the next piece of data can be written directly; the size of the write channel is given by highWaterMark in the configuration.
If false is returned, the write channel is full, the remaining data has to wait, and back pressure arises.
const ws = fs.createWriteStream(filename, { encoding: 'utf-8', highWaterMark: 2 });
const flag = ws.write('Liu');
console.log(flag); // false
Although this runs only once, the buffered data is still written out later when the channel has free space, and that later flush does not return anything; ws.write() returns its value only once, at the moment of the call.
const flag = ws.write('a');
console.log(flag);
const flag1 = ws.write('a');
console.log(flag1);
const flag2 = ws.write('a');
console.log(flag2);
const flag3 = ws.write('a');
console.log(flag3);
Output order: true, false, false, false. The channel's two bytes are already occupied by the second write and it stays full from then on, so from the second write onward write() returns false.
Using streams to copy a file and resolve the back pressure problem
const fs = require('fs');
const path = require('path');
const filename = path.resolve(__dirname, './file/write.txt');
const wsfilename = path.resolve(__dirname, './file/writecopy.txt');
const ws = fs.createWriteStream(wsfilename);
const rs = fs.createReadStream(filename);
rs.on('data', chunk => {
  const flag = ws.write(chunk);
  if (!flag) {
    rs.pause(); // the write channel is full, stop reading
  }
});
ws.on('drain', () => {
  rs.resume(); // the channel has been drained, resume reading
});
rs.on('close', () => {
  ws.end();
  console.log('copy end');
});
Pipe
Using pipe, you can connect a readable stream directly to a writable stream, and it also solves the back pressure problem.
rs.pipe(ws);
rs.on('close', () => {
  ws.end();
  console.log('copy end');
});

At this point, the study of "file stream example analysis in node.js" is over. Hopefully it has answered your doubts; combining theory with practice will help you learn better, so go and try it!