2025-02-25 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article describes how to use a Shell script to monitor whether a website URL is responding normally. The content is quite detailed; we hope interested readers find it helpful.

The most common tools for checking whether a website URL is up are the wget and curl commands. Both are so feature-rich that it is hard to pick the right options just by reading their help output, so today the oldboy instructor records a practical lesson in Shell programming covering the options that matter.
The wget command

The wget command has dozens of parameters, but only a handful are commonly used in operations work. The most useful ones are listed below.

wget downloads web pages or files; the options relevant to monitoring are:
--spider                 simulate a crawler's visit to the site without downloading the page
-q, --quiet              quiet mode, suppress output (similar in effect to -o /dev/null)
-o, --output-file=FILE   log messages to FILE
-T, --timeout=SECONDS    timeout for accessing the site
-t, --tries=NUMBER       number of retries when the site does not respond
The actual monitoring method is as follows: use the return value of the wget command to determine whether the website is up.

[root@oldboy ~]# wget --spider -T 5 -q -t 2 www.oldboyedu.com
[root@oldboy ~]# echo $?    # <== use the return value to judge whether the site is normal
0
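The return-value check above can be wrapped in a small reusable function so the exit status drives an OK/DOWN message. This is a minimal sketch; the function name and the OK/DOWN messages are our own choices, not from the article.

```shell
#!/bin/sh
# Minimal sketch: wrap the wget return-value check in a function.
# The function name and the OK/DOWN messages are our own choices.
check_url() {
    # --spider: do not download; -T 5: 5s timeout; -q: quiet; -t 2: 2 tries
    if wget --spider -T 5 -q -t 2 "$1" >/dev/null 2>&1; then
        echo "OK"
    else
        echo "DOWN"
    fi
}
```

For example, `check_url www.oldboyedu.com` prints OK when the site responds within the timeout, and DOWN otherwise.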
The curl command

curl has even more parameters than wget, but again only a few are commonly used in operations work:

-I, --head               fetch only the response headers
-m, --max-time SECONDS   maximum time allowed for the request
-o, --output FILE        write the retrieved content to a file
-s, --silent             silent mode, i.e. do not output progress or error messages
-w, --write-out FORMAT   print information in the given format, e.g. %{http_code} prints the status code
Actual monitoring methods:

1. Use the return value of the curl command to determine whether the website is up.

[root@oldboy ~]# curl -s -o /dev/null www.oldboyedu.com
[root@oldboy ~]# echo $?
0

2. Get the HTTP status code after the command executes (200 indicates normal).

[root@oldboy ~]# curl -I -m 5 -s -w "%{http_code}\n" -o /dev/null www.baidu.com
200
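The status-code check can likewise be wrapped in a function that compares %{http_code} against 200. This is a sketch under the assumption that only HTTP 200 counts as healthy; the function name is ours.

```shell
#!/bin/sh
# Sketch: report "up" only when curl sees HTTP 200. Anything else,
# including the 000 curl prints on a connection failure, counts as "down".
check_http_code() {
    code=$(curl -I -m 5 -s -w "%{http_code}" -o /dev/null "$1" || true)
    if [ "$code" = "200" ]; then
        echo "up"
    else
        echo "down"
    fi
}
```

For example, `check_http_code www.baidu.com` prints "up" when the server returns 200 within the 5-second limit.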
3. Develop a Shell script to monitor whether a specified URL is normal.

Answer, method 1:

#!/bin/sh
function usage() {       # <== help function
    echo $"usage: $0 url"
    exit 1
}
function check_url() {   # <== check function; the source text is truncated here, so
                         # the body is reconstructed from the wget method shown above
    wget --spider -T 5 -q -t 2 $1 >/dev/null 2>&1
    if [ $? -eq 0 ]; then
        echo "$1 is normal"
    else
        echo "$1 is abnormal"
    fi
}
[ $# -ne 1 ] && usage
check_url $1
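The printed answer shows a wget-based method 1. As a complementary sketch, here is a curl-based variant of the same check; it is our own illustration, not the author's method 2, and the function names and messages are assumptions.

```shell
#!/bin/sh
# Sketch of a curl-based URL check, mirroring the wget-based method 1.
# Names (usage, check_url, main) and the messages are our own.
usage() {
    echo "usage: $0 url"
    exit 1
}

check_url() {
    # -s: silent; -o /dev/null: discard body; -m 5: 5-second limit
    if curl -s -o /dev/null -m 5 "$1"; then
        echo "$1 is normal"
    else
        echo "$1 is abnormal"
        return 1
    fi
}

main() {
    [ $# -ne 1 ] && usage
    check_url "$1"
}

# Runs only when a URL argument is supplied, e.g.: sh web_check.sh www.oldboyedu.com
if [ $# -gt 0 ]; then
    main "$@"
fi
```

The nonzero return from check_url makes the script usable from cron or another monitor that acts on exit status.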
© 2024 shulou.com SLNews company. All rights reserved.