This article introduces how to transfer large files over HTTP in Node.js. It walks through the approach with working examples; the steps are simple, fast, and practical. I hope this article helps you solve your problem.
HTTP file transfer based on Node.js plays an important role in today's front-end and back-end full-stack development. This article covers several schemes for transferring large files over HTTP. Before implementing them, we first generate a large local file in the project with the Node.js fs module, and prepare two promise-based helpers for reading and compressing files:
const fs = require('fs');
const zlib = require('zlib');
const http = require('http'); // used by the servers in the following sections

// Generate a large local file in the project.
// NOTE: the loop bound and line content are illustrative assumptions;
// the original values were lost when the article was extracted.
const writeStream = fs.createWriteStream(__dirname + "/file.txt");
for (let i = 0; i < 100000; i++) {
  writeStream.write(`line ${i}\n`);
}
writeStream.end();

// Promise wrapper around fs.readFile
const readFile = (paramsData) => {
  return new Promise((resolve, reject) => {
    fs.readFile(paramsData, (err, data) => {
      if (err) {
        reject('file read error');
      } else {
        resolve(data);
      }
    });
  });
};

// Promise wrapper around zlib.gzip for file compression
const gzip = (paramsData) => {
  return new Promise((resolve, reject) => {
    zlib.gzip(paramsData, (err, result) => {
      if (err) {
        reject('file compression error');
      } else {
        resolve(result);
      }
    });
  });
};
1. Transfer after data compression
When sending a request, the browser carries the accept and accept-* request headers to tell the server which file types, compressed formats, and languages it supports. The Accept-Encoding request header tells the server which content encodings (usually compression algorithms) the client can understand. The server selects one of the methods supported by the client and notifies the client of its choice through the Content-Encoding response header. In the example below, the response header tells the browser that the returned JavaScript file has been compressed with the gzip algorithm.
// request header
accept-encoding: gzip, deflate, br

// response header
cache-control: max-age=2592000
content-encoding: gzip
content-type: application/x-javascript
With the Accept-Encoding and Content-Encoding fields understood, let's compare the effect with gzip disabled and with gzip enabled.
// implement a simple file-serving server (gzip not enabled)
const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
  });
  const buffer = await readFile(__dirname + '/file.txt');
  res.write(buffer);
  res.end();
});

server.listen(3000, () => {
  console.log(`server started successfully`);
});
// implement a simple file-serving server (gzip enabled)
const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip",
  });
  const buffer = await readFile(__dirname + '/file.txt');
  const gzipData = await gzip(buffer);
  res.write(gzipData);
  res.end();
});

server.listen(3000, () => {
  console.log(`server started successfully`);
});
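To check the effect from the client side, a small Node.js script can request the file while advertising gzip support, then compare the compressed size on the wire with the decompressed size. This is a minimal sketch, assuming the gzip-enabled server above is listening on localhost:3000:

const http = require('http');
const zlib = require('zlib');

// Request the file, telling the server we accept gzip
http.get({ host: 'localhost', port: 3000, headers: { 'Accept-Encoding': 'gzip' } }, (res) => {
  console.log('Content-Encoding:', res.headers['content-encoding']); // expected: gzip
  const pieces = [];
  res.on('data', (chunk) => pieces.push(chunk));
  res.on('end', () => {
    const compressed = Buffer.concat(pieces);
    // Decompress to recover the original size for comparison
    zlib.gunzip(compressed, (err, raw) => {
      if (err) throw err;
      console.log(`on the wire: ${compressed.length} bytes, decompressed: ${raw.length} bytes`);
    });
  });
});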
2. Transmission in data chunks
In scenarios such as generating a large HTML table from data queried out of a database, or transferring a large number of images, chunked transmission can be used.
Transfer-Encoding: chunked
Transfer-Encoding: gzip, chunked
When the value of the Transfer-Encoding response header is chunked, the data is sent in a series of chunks. Note that Transfer-Encoding and Content-Length are mutually exclusive: the two fields cannot appear in the same response message.
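For reference, each chunk on the wire consists of the chunk size in hexadecimal, a CRLF, the chunk data, and a trailing CRLF; a final zero-size chunk terminates the body. As an illustration, sending "Hello" (5 bytes) and " world" (6 bytes) as two chunks looks like this:

5\r\n
Hello\r\n
6\r\n
 world\r\n
0\r\n
\r\n

The server below splits the file into groups of ten lines and writes each group as one chunk, emitting the hex size line in this format.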
// transmit the data in multiple chunks
const splitChunks = async () => {
  const buffer = await readFile(__dirname + '/file.txt');
  const lines = buffer.toString('utf-8').split('\n');
  let [chunks, i, n] = [[], 0, lines.length];
  while (i < n) {
    chunks.push(lines.slice(i, i += 10)); // group every 10 lines into one chunk
  }
  return chunks;
};

const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Transfer-Encoding": "chunked",
    "Access-Control-Allow-Origin": "*",
  });
  const chunks = await splitChunks();
  for (let i = 0; i < chunks.length; i++) {
    // emit one chunk per second; the hex size line mirrors the chunked wire format
    setTimeout(() => {
      const content = chunks[i].join("&");
      res.write(`${content.length.toString(16)}\r\n${content}\r\n`);
    }, i * 1000);
  }
  setTimeout(() => {
    res.end();
  }, chunks.length * 1000);
});

server.listen(3000, () => {
  console.log(`server started successfully`);
});
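To observe the chunks arriving incrementally rather than as one buffered response, a minimal Node.js client can log each data event as it is received. This sketch assumes the server above is running on localhost:3000:

const http = require('http');

// Request the chunked endpoint and log each piece of data as it arrives
http.get('http://localhost:3000', (res) => {
  console.log('Transfer-Encoding:', res.headers['transfer-encoding']); // expected: chunked
  res.on('data', (chunk) => {
    console.log(`received ${chunk.length} bytes at ${new Date().toISOString()}`);
  });
  res.on('end', () => console.log('response complete'));
});

Because the server writes one chunk per second, the log lines should appear roughly one second apart, confirming that the client does not have to wait for the full body.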
3. Transmission as a data stream
When using Node.js to return a large file to the client, returning it as a stream avoids loading the whole file into memory at once. The specific implementation is shown below. When file data is returned as a stream, the value of the Transfer-Encoding response header is chunked, indicating that the data is sent in a series of chunks.
const server = http.createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip",
    "Transfer-Encoding": "chunked",
  });
  // Pipe the file through a gzip stream straight into the response
  fs.createReadStream(__dirname + "/file.txt")
    .setEncoding("utf-8")
    .pipe(zlib.createGzip())
    .pipe(res);
});

server.listen(3000, () => {
  console.log(`server started successfully`);
});
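One caveat with plain .pipe() chains is that an error on the read stream or the gzip stream is not forwarded to the response. A common hardening, shown here as a sketch rather than as part of the original example, is stream.pipeline, which wires error handling and cleanup across the whole chain:

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

const server = http.createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip",
  });
  // pipeline destroys every stream in the chain and reports the first error
  pipeline(
    fs.createReadStream(__dirname + "/file.txt"),
    zlib.createGzip(),
    res,
    (err) => {
      if (err) {
        console.error('stream failed:', err);
      }
    }
  );
});

server.listen(3000, () => console.log('server started successfully'));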
This is the end of the introduction to "how to transfer large files over HTTP in Node.js". Thank you for reading.