This article introduces how to use parallel to make use of all your CPU resources. I hope you find it a useful reference and learn something from it.
Bash commands usually run in a single thread, which means all of the processing happens on a single CPU core. As CPUs grow larger and gain more cores, only a small fraction of the available CPU resources ends up doing your work.
Those unused CPU resources can make a real difference when the work is limited by how fast the CPU can process data, a situation commonly encountered in multimedia conversion (such as image and video conversion) and data compression.
In this article we will use the parallel program. parallel takes a list as input and runs commands in parallel on all CPU cores to process that list. parallel even writes its results to standard output in order, so its output can be piped on as standard input to other commands.
How to use parallel
parallel reads a list from standard input and then starts multiple processes running the specified command to work through the list. The general form is:
list | parallel command
The list can be produced by any common bash command, such as cat, grep, or find, and the results are piped from their standard output into parallel's standard input, like this:
find . -type f -name "*.log" | parallel
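If you just want to see how parallel hands each item of a list to a command, a harmless illustration (not from the original example, using seq to generate the list) is to echo each item:
seq 1 8 | parallel echo "processing item {}"
Each of the eight numbers is substituted for {} and handled by its own job.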
Similar to -exec in find, parallel uses {} to stand for each element of the input list. In the following example, parallel runs gzip to compress every file output by the find command:
find . -type f -name "*.log" | parallel gzip {}
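By default parallel starts roughly one job per CPU core. If you want to cap (or raise) the number of simultaneous jobs, GNU parallel's -j option takes the desired job count; for example, to limit the same compression run to four jobs:
find . -type f -name "*.log" | parallel -j 4 gzip {}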
The following practical examples may make parallel easier to understand.
Using parallel for JPEG Compression
In this example, I collected some large .jpg files (about 10 MB each) to be processed with MozJPEG, Mozilla's JPEG image compression tool. The tool reduces the size of JPEG image files while trying to preserve image quality, which is important for reducing the loading time of web pages.
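One practical detail: the commands below write their output into a LoRes/ directory, and cjpeg does not create missing directories for you, so if you are following along you will probably need to create it first:
mkdir -p LoRes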
Here is an ordinary find command that locates all the .jpg files in the current directory and runs them through cjpeg, the image compression tool provided by the MozJPEG package:
find . -type f -name "*.jpg" -exec cjpeg -outfile LoRes/{} {} ';'
Total time: 0m44.114s.
Although eight cores are available, the work runs in a single thread and uses only one core.
Use parallel to run the same command:
find . -type f -name "*.jpg" | parallel cjpeg -outfile LoRes/{} {}
This time, compressing all the images took only 0m10.814s. The difference is clear in the output of top:
All CPU cores are running at full capacity, with 8 threads using 8 CPU cores.
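If you want to reproduce timings like these yourself, bash's built-in time keyword can time a whole pipeline; the line below simply re-runs the parallel example above under time:
time find . -type f -name "*.jpg" | parallel cjpeg -outfile LoRes/{} {}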
Using parallel with gzip
If you need to compress many files rather than one large file, parallel can speed up the processing. If you need to compress a single file but want to take advantage of all the CPU cores, then you should use pigz, a multithreaded alternative to gzip.
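As a quick aside, pigz is invoked much like gzip; its -p option sets the number of processors to use, and the file name below is only a placeholder:
pigz -p 8 big-archive.tar
This compresses the single file on eight cores and produces big-archive.tar.gz.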
First, I created 100 files of random data, about 10 MB each (roughly 1 GB in total):
for i in {1..100}; do dd if=/dev/urandom of=file-$i bs=1MB count=10; done
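Each dd call writes 10 MB, so if you want to sanity-check the total amount of test data, du can add it up (an optional step, not part of the original walk-through):
du -ch file-* | tail -n 1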
As before, I first compress the files using find with -exec:
find . -type f -name "file*" -exec gzip {} ';'
It takes 0m28.028s in total, and uses only a single core.
Change to the parallel version:
find . -type f -name "file*" | parallel gzip {}
The time drops to 0m5.774s.
Thank you for reading this article carefully. I hope this piece on how to use parallel to make use of all your CPU resources has been helpful to you.