Shulou(Shulou.com)06/02 Report--
This article explains how Node.js handles data I/O in JavaScript. The approach described here is simple, fast, and practical, so let's walk through it together.
JavaScript has always made string manipulation convenient, but the language historically had no binary data type; simple string and DOM operations were enough for front-end work. Node.js, however, provides file and I/O handling, which means dealing with a lot of binary data. In Node.js, the Buffer class and Stream file streams provide good support for processing binary data.
Buffer cache area
The Buffer class is part of the Node.js core library and supports the handling of data as it moves during I/O operations. It gives Node.js a way to store raw data by creating a cache area in memory for binary data: a certain amount of storage space is reserved in memory to temporarily hold input or output data, so that Node.js can work with binary data directly.
First of all, binary is the number system used throughout computer technology: data is represented by the digits 0 and 1. Computers can only recognize binary data, but it is hard for people to understand what a combination of 0s and 1s represents, so binary data must be converted into characters people can read. This is the job of a character encoding, which maps byte values to a character set. The earliest widely used character encoding is ASCII.
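As a small illustration of this mapping, a buffer of raw bytes can be decoded into text through an encoding; the byte values below are the ASCII codes for "Hello":

```javascript
// Raw bytes in memory; each value is an ASCII code.
const bytes = Buffer.from([0x48, 0x65, 0x6c, 0x6c, 0x6f]);

// Decoding the same bytes through the ASCII character set yields text.
console.log(bytes.toString('ascii')); // prints "Hello"
```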
Next, the constructors of Buffer:

new Buffer(array)

new Buffer(size)

new Buffer(str[, encoding])

new Buffer(buffer)

new Buffer(arrayBuffer)
As you can see, the parameter can be an array of bytes, a size, a string, another buffer, and so on.
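A quick sketch of these constructions. Note that `new Buffer(...)` has been deprecated in modern Node.js; the equivalent factory methods are used below:

```javascript
const fromArray = Buffer.from([1, 2, 3]);   // was: new Buffer(array)
const sized = Buffer.alloc(8);              // was: new Buffer(size), zero-filled
const fromStr = Buffer.from('hi', 'utf8');  // was: new Buffer(str, encoding)
const copy = Buffer.from(fromStr);          // was: new Buffer(buffer)

console.log(sized.length);   // prints 8
console.log(fromStr.length); // prints 2
```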
Similarly, the syntax for writing, reading, and concatenating:

buf.write(string[, offset[, length]][, encoding])

buf.toString([encoding[, start[, end]]])

Buffer.concat(list[, totalLength])
Stream file stream
Because a Buffer is limited to about 1 GB, files beyond that size cannot be read or written directly through a single Buffer. Moreover, when a large file is read or written in one continuous operation, Node.js cannot get on with other work until it finishes. For these reasons, Node.js provides the Stream module.
The ideal approach is to read a part and write a part: regardless of the file's size, finishing is then only a matter of time. This is the idea behind streams.
File A flows through a data-stream pipeline into file B using this "read a part, write a part" approach. The advantage of a stream is that the receiver can start processing early, which shortens the total time and improves speed, much like watching a video online: it is not fully cached before playing; you watch one part while the next part is buffered.
Stream offers four types of streams:

Readable (streams you read from)

Writable (streams you write to)

Duplex (both readable and writable)

Transform (a Duplex that can modify data as it passes through)
When copying a large file, the data is passed through the readable stream's chunk parameter; each chunk is like a basin that carries one piece of the data. The readable stream also provides a function called pipe(). It is a very efficient way to process files and greatly simplifies the copy operation, so using pipe() to read and write data is important.
At this point, you should have a deeper understanding of how Node.js handles data I/O in JavaScript. Try it out in practice, and keep learning!