How to Implement Content Compression with Node.js

2025-03-18 · SLTechnology News & Howtos


This article introduces how to implement content compression with Node.js, from the relevant HTTP headers to the zlib module's stream- and buffer-based APIs, ending with a small HTTP server example.

While looking at my application's logs, I noticed that the log page always took a few seconds to load after opening (the interface is not paginated), so I opened the network panel to investigate.

Only then did I find that the data returned by the interface was not compressed. I had assumed that, since the interface sits behind an Nginx reverse proxy, Nginx would handle that layer automatically (feasible in theory; I will explore it separately).
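For reference, enabling that layer on the Nginx side usually comes down to a handful of directives. A minimal sketch, not tied to this article's actual setup, with illustrative values:

```nginx
gzip on;                                  # enable gzip for responses
gzip_types application/json text/plain;   # MIME types worth compressing
gzip_min_length 1024;                     # skip very small bodies
```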

The back end here is a Node service.

Pre-knowledge

In what follows, "client" always refers to the browser.

Accept-Encoding

When the client initiates a request, it adds an Accept-Encoding field to the request header whose value lists the content encodings (compression formats) the client supports, e.g. Accept-Encoding: gzip, deflate, br.

Content-Encoding

After compressing the response body, the server tells the browser which encoding was actually used by adding a Content-Encoding field to the response header, e.g. Content-Encoding: gzip.
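To make the negotiation concrete, here is a sketch of picking an encoding from an Accept-Encoding value. The helper name and header strings are made up for illustration; this is not a library API:

```javascript
// Pick the first server-supported encoding offered by the client.
// `pickEncoding` is a hypothetical helper, not part of any library.
function pickEncoding (acceptEncoding, supported = ['br', 'gzip', 'deflate']) {
  const offered = (acceptEncoding || '')
    .split(',')
    .map((s) => s.trim().split(';')[0]) // drop quality values like ";q=0.8"
  return supported.find((enc) => offered.includes(enc)) || null
}

console.log(pickEncoding('gzip, deflate, br')) // → 'br'
console.log(pickEncoding('identity'))          // → null
```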

Deflate / gzip / br

DEFLATE is a lossless data-compression algorithm that combines the LZ77 algorithm with Huffman coding.

gzip is a format based on DEFLATE.

br refers to Brotli, a data format designed to further improve compression ratios. For text it can increase compression density by about 20% compared with DEFLATE, at roughly the same compression and decompression speed.

The zlib module

Node.js ships with a zlib module that provides compression via gzip, Deflate/Inflate, and Brotli.

gzip is used as the example below, covering usage in different scenarios; Deflate/Inflate and Brotli are used the same way, only the API names differ.

Stream-based operation

Buffer-based operation

First, require the modules needed below:

```javascript
const zlib = require('zlib')
const fs = require('fs')
const stream = require('stream')

const testFile = 'tests/origin.log'
const targetFile = `${testFile}.gz`
const decodeFile = `${testFile}.un.gz`
```

File decompression / compression

To check the results, use the du command to report the sizes before and after:

```shell
$ du -ah tests
108K    tests/origin.log.gz
2.2M    tests/origin.log
2.2M    tests/origin.log.un.gz
4.6M    tests
```

Stream-based operation

Using createGzip and createUnzip

Note: all zlib APIs, except those that are explicitly synchronous, run on the Node.js internal thread pool and can be considered asynchronous.

Therefore, the compression and decompression code in the examples below should be executed separately; otherwise the decompression step may read a file that is still being written and report an error.

Method 1: pipe the streams directly using the instance's pipe method

```javascript
// compress
const readStream = fs.createReadStream(testFile)
const writeStream = fs.createWriteStream(targetFile)
readStream.pipe(zlib.createGzip()).pipe(writeStream)

// decompress
const readStream = fs.createReadStream(targetFile)
const writeStream = fs.createWriteStream(decodeFile)
readStream.pipe(zlib.createUnzip()).pipe(writeStream)
```

Method 2: use pipeline from the stream module, whose callback allows extra error handling

```javascript
// compress
const readStream = fs.createReadStream(testFile)
const writeStream = fs.createWriteStream(targetFile)
stream.pipeline(readStream, zlib.createGzip(), writeStream, (err) => {
  if (err) {
    console.error(err)
  }
})

// decompress
const readStream = fs.createReadStream(targetFile)
const writeStream = fs.createWriteStream(decodeFile)
stream.pipeline(readStream, zlib.createUnzip(), writeStream, (err) => {
  if (err) {
    console.error(err)
  }
})
```

Method 3: the promisified pipeline

```javascript
const { promisify } = require('util')
const pipeline = promisify(stream.pipeline)

// compress
const readStream = fs.createReadStream(testFile)
const writeStream = fs.createWriteStream(targetFile)
pipeline(readStream, zlib.createGzip(), writeStream)
  .catch((err) => {
    console.error(err)
  })

// decompress
const readStream = fs.createReadStream(targetFile)
const writeStream = fs.createWriteStream(decodeFile)
pipeline(readStream, zlib.createUnzip(), writeStream)
  .catch((err) => {
    console.error(err)
  })
```

Buffer-based operation

Use the gzip and unzip APIs, each of which comes in asynchronous and synchronous variants.

Compression

gzip

gzipSync

Decompression

unzip

unzipSync

Method 1: collect the readStream into a Buffer, then process it further

gzip: asynchronous

```javascript
// compress
const buff = []
readStream.on('data', (chunk) => {
  buff.push(chunk)
})
readStream.on('end', () => {
  zlib.gzip(Buffer.concat(buff), (err, resBuff) => {
    if (err) {
      console.error(err)
      process.exit(1)
    }
    fs.writeFileSync(targetFile, resBuff)
  })
})
```

gzipSync: synchronous

```javascript
// compress
const buff = []
readStream.on('data', (chunk) => {
  buff.push(chunk)
})
readStream.on('end', () => {
  fs.writeFileSync(targetFile, zlib.gzipSync(Buffer.concat(buff)))
})
```

Method 2: read the file directly with readFileSync

```javascript
// compress
const readBuffer = fs.readFileSync(testFile)
const gzipBuffer = zlib.gzipSync(readBuffer)
fs.writeFileSync(targetFile, gzipBuffer)

// decompress
const readBuffer = fs.readFileSync(targetFile)
const decodeBuffer = zlib.unzipSync(readBuffer)
fs.writeFileSync(decodeFile, decodeBuffer)
```

Text content decompression / compression

In addition to compressing files, sometimes the transferred content itself needs to be compressed or decompressed directly.

Here, compressing text content is used as the example:

```javascript
// test data
const testData = fs.readFileSync(testFile, { encoding: 'utf-8' })
```

Stream-based operation

Only the conversion string => Buffer => stream needs to be considered.

string => Buffer

```javascript
const buffer = Buffer.from(testData)
```

Buffer => stream

```javascript
const transformStream = new stream.PassThrough()
transformStream.write(buffer)
transformStream.end()

// or
const transformStream = new stream.Duplex({ read () {} })
transformStream.push(Buffer.from(testData))
transformStream.push(null)
```

Below it is written to a file; it could equally be piped to any other writable stream, such as an HTTP Response (covered separately below).

```javascript
transformStream
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream(targetFile))
```

Buffer-based operation

Again, use Buffer.from to convert the string to a Buffer:

```javascript
const buffer = Buffer.from(testData)
```

Then convert it directly with the synchronous API, where result is the compressed content:

```javascript
const result = zlib.gzipSync(buffer)
```

You can write result to a file, or return it directly from an HTTP server:

```javascript
fs.writeFileSync(targetFile, result)
```

Practice in a Node Server

Here the http module built into Node is used to create a simple server for demonstration.

The approach is the same in other Node web frameworks, which generally also offer ready-made compression plugins that can be enabled in one step.

```javascript
const http = require('http')
const { PassThrough, pipeline } = require('stream')
const zlib = require('zlib')

// test data
const testTxt = 'Test data 123'.repeat(1000)

const app = http.createServer((req, res) => {
  const { url } = req
  // read the compression algorithms supported by the client
  // (guard against a missing header, which would make match() return null)
  const acceptEncoding = (req.headers['accept-encoding'] || '').match(/(br|deflate|gzip)/g) || []
  // default response content type
  res.setHeader('Content-Type', 'application/json; charset=utf-8')
  // routes for the examples
  const routes = [
    ['/gzip', () => {
      if (acceptEncoding.includes('gzip')) {
        res.setHeader('Content-Encoding', 'gzip')
        // compress the text content directly with the synchronous API
        res.end(zlib.gzipSync(Buffer.from(testTxt)))
        return
      }
      res.end(testTxt)
    }],
    ['/deflate', () => {
      if (acceptEncoding.includes('deflate')) {
        res.setHeader('Content-Encoding', 'deflate')
        // stream-based, single write
        const originStream = new PassThrough()
        originStream.write(Buffer.from(testTxt))
        originStream.pipe(zlib.createDeflate()).pipe(res)
        originStream.end()
        return
      }
      res.end(testTxt)
    }],
    ['/br', () => {
      if (acceptEncoding.includes('br')) {
        res.setHeader('Content-Encoding', 'br')
        res.setHeader('Content-Type', 'text/html; charset=utf-8')
        // stream-based, multiple writes
        const originStream = new PassThrough()
        pipeline(originStream, zlib.createBrotliCompress(), res, (err) => {
          if (err) {
            console.error(err)
          }
        })
        originStream.write(Buffer.from('BrotliCompress'))
        originStream.write(Buffer.from('test data'))
        originStream.write(Buffer.from(testTxt))
        originStream.end()
        return
      }
      res.end(testTxt)
    }]
  ]

  const route = routes.find((v) => url.startsWith(v[0]))
  if (route) {
    route[1]()
    return
  }

  // 404 with the list of registered routes
  res.setHeader('Content-Type', 'text/html; charset=utf-8')
  res.end(`404: ${url}\nRegistered routes:\n${routes.map((r) => r[0]).join('\n')}`)
})

app.listen(3000)
```
