2025-01-16 Update | From: SLTechnology News & Howtos
Shulou (Shulou.com) 06/03 Report --
This article explains how to use a shell script to find the most frequently visited and most time-consuming pages in an nginx log. This situation comes up often in practice, so let's work through a concrete case together. I hope you read it carefully and come away with something useful!
When a server is under heavy load and struggling, we usually turn to page optimization, and the first step is finding out which pages are requested most often and which take the longest to serve. Optimizing those high-traffic, high-latency URLs gives immediate results. Below is a shell script I often use for this kind of tuning. You can think of it as a slow-page report for web pages, analogous to MySQL's slow query log.
Here is my nginx configuration:
The code is as follows:
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                '$status $body_bytes_sent "$http_referer" '
                '"$http_user_agent" "$http_x_forwarded_for" $request_time';

access_log /var/log/nginx/access.log main buffer=32k;
From the configuration above, you can see that the client IP is the first column and the request time is the last column, with columns separated by spaces. So in awk you can read them as $1 and $NF respectively, where NF is a built-in variable holding the number of fields on the current line.
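As a quick sanity check, the sketch below shows how awk picks out those two fields from a single log line in the format above (the IP, URL, and timing values here are made up for illustration):

```shell
# A made-up log line in the "main" format above (all values hypothetical).
line='1.2.3.4 - - [16/Jan/2025:10:00:00 +0800] "GET /pages/a.php?x=1 HTTP/1.1" 200 512 "-" "curl" "-" 0.123'

# $1 is the first field (client IP); $NF is the last field (request time).
echo "$line" | awk '{print $1, $NF}'
# prints: 1.2.3.4 0.123
```

Note that because the quoted request line and the timestamp each contain spaces, the request URL itself lands in field 7 — which is why the script below uses $7.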
Below is the analysis script, which can be saved as slow.sh. The code is as follows:
#!/bin/bash
export PATH=/usr/bin:/bin:/usr/local/bin:/usr/X11R6/bin
export LANG=zh_CN.GB2312

function usage()
{
    echo "$0 filelog options"
    exit 1
}

function slowlog()
{
    #set -x
    field=$2
    files=$1
    end=2
    msg=""
    [[ $2 == '1' ]] && field=1 && end=2 && msg="Total visits statistics"
    [[ $2 == '2' ]] && field=3 && end=4 && msg="Average access time statistics"
    echo -e "\r\n\r\n"
    echo -n "$msg"
    # Print a separator line of '#' characters.
    seq -s '#' 30 | sed -e 's/[0-9]*//g'
    # Field 7 is the request URL; strip the query string, then accumulate
    # total request time (arr) and hit count (arr2) per page, and print
    # page:hits:total_time:average_time for each page.
    awk '{split($7, bbb, "?"); arr[bbb[1]] = arr[bbb[1]] + $NF; arr2[bbb[1]] = arr2[bbb[1]] + 1;}
         END {for (i in arr) {print i ":" arr2[i] ":" arr[i] ":" arr[i]/arr2[i]}}' $files |
        sort -t: -k$((field + 1)),$end -rn | grep "pages" | head -30 | sed 's/:/\t/g'
}

[[ $# -lt 2 ]] && usage

slowlog $1 $2
Just execute: ./slow.sh logfile 1 (or 2)
1: the 30 most frequently visited pages
2: the 30 pages with the highest average access time
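If you only need the averages, the core pipeline can also be run as a standalone sketch (assuming the log format above; the log path is a placeholder to adjust for your setup, and the modern `sort -k` syntax replaces the obsolete `+pos` form):

```shell
LOG=/var/log/nginx/access.log   # placeholder path, adjust to your setup

# Per page (field 7, query string stripped): print page, hit count,
# total time, and average time, sorted by average request time, highest first.
awk '{split($7, u, "?"); cnt[u[1]]++; t[u[1]] += $NF}
     END {for (p in cnt) printf "%s %d %.3f %.5f\n", p, cnt[p], t[p], t[p]/cnt[p]}' \
    "$LOG" | sort -k4,4 -rn | head -30
```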
The execution results are as follows:

chmod +x ./slow.sh
./slow.sh /var/log/nginx/access.log 2
Average access time statistics #############################
/pages/#1.php	4	120.456	30.114
/pages/#2.php	1	16.161	16.161
/pages/#3.php	212	1122.49	5.29475
/pages/#4.php	6	28.645	4.77417

The columns are: page URL, number of hits, total request time, and average time per request.
That's the end of "how to find the most visited and most time-consuming pages in an nginx log with a shell script". Thank you for reading!