How to Install and Use http_load for Stress Testing on a Linux Server

Many newcomers are unclear about how to install and use http_load for stress testing on a Linux server. To help, this article explains the process in detail; readers who need it can follow along, and hopefully everyone will get something out of it.
http_load is a performance testing tool for the Linux platform. It issues requests in parallel to measure the throughput and load capacity of a web server and to test the performance of web pages.
1. Download
Official website: http://acme.com/software/http_load/
The code is as follows:
cd /root
wget http://acme.com/software/http_load/http_load-12mar2006.tar.gz
tar xzf http_load-12mar2006.tar.gz
2. Installation
The code is as follows:
cd http_load-12mar2006
make
After make finishes, an http_load binary is generated in the current directory.
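Optionally, you can copy the binary to a directory on your PATH so it can be run from anywhere without the ./ prefix. This step is not part of the build itself, and the destination /usr/local/bin is just a common choice; any directory on your PATH works.
The code is as follows:
cp http_load /usr/local/bin/
http_load
Running http_load with no arguments should print the usage summary shown in the next section, which is a quick way to confirm the build worked.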
3. Usage
The code is as follows:
root@www:~/http_load-12mar2006# ./http_load --help
Usage: ./http_load [-checksum] [-throttle] [-proxy host:port] [-verbose] [-timeout secs] [-sip sip_file]
    -parallel N | -rate N [-jitter]
    -fetches N | -seconds N
    url_file
One start specifier, either -parallel or -rate, is required.
One end specifier, either -fetches or -seconds, is required.
Description of the main parameters:
-parallel (abbreviated -p): the number of concurrent user processes.
-rate (abbreviated -r): the number of requests issued per second.
-fetches (abbreviated -f): the total number of requests to make.
-seconds (abbreviated -s): the total duration of the test, in seconds.
When choosing parameters, use exactly one of -parallel and -rate, and exactly one of -fetches and -seconds. A sample url_file is shown below.
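All of the examples that follow read their targets from the url_file named on the command line. http_load expects one URL per line and fetches randomly from the list. The hostname and paths here are placeholders, so substitute the pages you actually want to test.
The code is as follows:
http://www.example.com/
http://www.example.com/index.html
http://www.example.com/images/logo.png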
4. Example:
The code is as follows:
http_load -parallel 50 -s 10 urls.txt
This command line uses 50 processes at the same time and randomly accesses the list of URLs in urls.txt for a total of 10 seconds.
The code is as follows:
http_load -rate 50 -f 5000 urls.txt
This command issues 50 requests per second, stopping after a total of 5000 fetches.
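The usage summary also lists a -jitter flag that can accompany -rate; it tells http_load to vary the request rate randomly around the target instead of firing at exact intervals, which better resembles real traffic. A minimal sketch (the 60-second duration is an arbitrary choice):
The code is as follows:
http_load -rate 50 -jitter -seconds 60 urls.txt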
Test the average number of visits a site can take per second:
The code is as follows:
http_load -parallel 5 -fetches 1000 urls.txt
This command line runs 5 concurrent processes, randomly fetching URLs from urls.txt until 1000 fetches have completed. The output after running:
1000 fetches, 5 max parallel, 6e+06 bytes, in 58.1026 seconds
6000 mean bytes/connection
17.2109 fetches/sec, 103266 bytes/sec
msecs/connect: 0.403263 mean, 68.603 max, 0.194 min
msecs/first-response: 284.133 mean, 5410.13 max, 55.735 min
HTTP response codes:
  code 200 -- 1000
Judging from these results, 1000 fetches completed in 58.1 seconds, which works out to roughly 17.2 requests per second (the fetches/sec figure above). In other words, the target site can only sustain about 17 requests per second, which is not particularly strong.
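If you want to experiment with http_load without pointing it at a production site, a throwaway local web server is enough. This sketch assumes python3 is available and uses its built-in http.server module; it is just a safe sandbox, not part of http_load itself.
The code is as follows:
# serve the current directory on port 8000 in the background
python3 -m http.server 8000 &
# build a one-line url_file pointing at the local server
echo "http://127.0.0.1:8000/" > urls.txt
# run a short, low-volume test against it
./http_load -parallel 5 -fetches 100 urls.txt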
Did you find the above content helpful? If you want to learn more about this topic or read more related articles, please follow the industry information channel. Thank you for your support.