

Nginx log analysis script

2025-04-09 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

Operations and maintenance work is complex: faced with tens of thousands of log entries, how do you analyze them? Surely not one by one?

Smart people choose scripts, which is why automated operations is so widely advocated today. Without further ado, here is the script.

```bash
#!/bin/bash
# Desc:   nginx log analysis script
# Author: Bertram
# Date:   2019-12-21
# Copyright: personal

public() {
    echo ""
    read -p "Please enter the access log to be analyzed: " log_file
    echo ""
    if [ ! -f "$log_file" ]; then
        echo "not found: ${log_file}"
        exit 1
    fi
    if [ ! -s "$log_file" ]; then
        echo "${log_file} is an empty file"
        exit 1
    fi

    # Number of top entries to output; customizable.
    top_num=5
    input_file=$(echo $log_file | awk -F'/' '{print $(NF)}')
    analyze_dir=/home/Bertram/$(date +%F)
    top_ip_file=$analyze_dir/ngx_log_top_ip_${input_file}.txt
    top_src_url_file=$analyze_dir/ngx_log_top_src_url_${input_file}.txt
    top_dest_url_file=$analyze_dir/ngx_log_top_dest_url_${input_file}.txt
    top_code_file=$analyze_dir/ngx_log_top_code_${input_file}.txt
    top_terminal_file=$analyze_dir/ngx_log_top_terminal_${input_file}.txt
    mkdir -p $analyze_dir

    start_time=$(head -1 $log_file | awk '{print $4}' | cut -d "[" -f2)
    end_time=$(tail -1 $log_file | awk '{print $4}' | cut -d "[" -f2)
    total_nums=$(wc -l $log_file | awk '{print $1}')
    size=$(du -sh $log_file | awk '{print $1}')

    # Start and end time of the log.
    echo "access start time: $start_time; end time: $end_time"
    # Total number of requests and log size.
    echo "total visits: $total_nums; log size: $size"

    # Most active IPs.
    # cat $log_file | awk '{print $1}' | sort | uniq -c | sort -rn | head -${top_num} > $top_ip_file
    awk '{ips[$1]++} END{for (i in ips) print ips[i], i}' $log_file | sort -k1 -nr | head -${top_num} > $top_ip_file
    # Most frequent source (referer) urls.
    cat $log_file | awk '{print $13}' | sort | uniq -c | sort -rn | head -${top_num} > $top_src_url_file
    # Most requested urls.
    cat $log_file | awk '{print $8}' | sort | uniq -c | sort -rn | head -${top_num} > $top_dest_url_file
    # Most returned status codes.
    cat $log_file | awk '{print $11}' | sort | uniq -c | sort -rn | head -${top_num} > $top_code_file
    # Most common terminal (user-agent) types.
    cat $log_file | awk '{print $14}' | sort | uniq -c | sort -rn | head -${top_num} > $top_terminal_file
}

simple() {
    echo "+-------------------- analysis results --------------------+"
    printf "Top ${top_num} visiting IPs:\n"
    cat $top_ip_file
    echo ""
    printf "Top ${top_num} source urls:\n"
    cat $top_src_url_file
    echo ""
    printf "Top ${top_num} requested urls:\n"
    cat $top_dest_url_file
    echo ""
    printf "Top ${top_num} status codes:\n"
    cat $top_code_file
    echo ""
    printf "Top ${top_num} terminal types:\n"
    cat $top_terminal_file
    echo ""
    printf "Top ${top_num} IP locations (the queries are a bit slow, please wait):\n"
    printf "%-15s %-15s %-30s\n" "count" "IP address" "location"
    echo '------------------------------------------------------------'
    cat $top_ip_file | while read line
    do
        count=$(echo $line | cut -d' ' -f1)
        ip=$(echo $line | cut -d' ' -f2)
        printf "%-10s %-15s %-30s\n" $count $ip \
            "$(curl -s "http://freeapi.ipip.net/${ip}" | awk -F'"' '{print $2"-"$4"-"$6}')"
        echo '------------------------------------------------------------'
    done
    echo ""
}

case $1 in
    help)
        echo ""
        echo -e $"Usage: $0 enter a log file\n"
        ;;
    *)
        public
        simple
        ;;
esac
exit 0
```
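The location lookup in `simple()` relies on freeapi.ipip.net returning a JSON array of quoted strings, which `awk -F'"'` splits on the quote characters. A sketch with a canned stand-in response (the response shape and values here are assumed for illustration, not fetched from the API):

```shell
#!/bin/sh
# Canned stand-in for: curl -s "http://freeapi.ipip.net/$ip"
# (assumed shape: a JSON array of quoted strings).
resp='["China","Beijing","Beijing","","ChinaUnicom"]'

# Splitting on '"' puts the quoted strings at the even-numbered fields:
# $2=country, $4=province, $6=city.
echo "$resp" | awk -F'"' '{print $2"-"$4"-"$6}'
```

This quote-splitting trick avoids needing a JSON parser, at the cost of breaking if the API ever changes its field order.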

Implemented functions:

1. Analyze the top N visiting IP addresses.

2. Analyze the top N source (referer) urls.

3. Analyze the top N requested (target) urls.

4. Analyze the top N terminal types.

5. Automatically look up the locations of the top N IPs.
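All of the top-N statistics above boil down to the same counting pipeline: pick a field with awk, then `sort | uniq -c | sort -rn | head`. A minimal sketch on a synthetic three-line log (the field positions follow the combined log format the script assumes; the /tmp path is only for the demo):

```shell
#!/bin/sh
# Synthetic access log: the client IP is field 1 of each line.
cat > /tmp/access_demo.log <<'EOF'
1.1.1.1 - - [21/Dec/2019:10:00:00 +0800] "GET / HTTP/1.1" 200 612
2.2.2.2 - - [21/Dec/2019:10:00:01 +0800] "GET /a HTTP/1.1" 404 169
1.1.1.1 - - [21/Dec/2019:10:00:02 +0800] "GET /b HTTP/1.1" 200 612
EOF

# Top-N IPs: count each distinct value of field 1, sort by count descending.
top_num=5
awk '{print $1}' /tmp/access_demo.log | sort | uniq -c | sort -rn | head -n ${top_num}
```

Swapping `$1` for `$8`, `$11`, `$13` or `$14` yields the requested-url, status-code, referer and user-agent reports respectively.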

Note: the log file and the analysis script can be placed in the same directory; enter the log file as an absolute path.

Usage:
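The original usage text was lost in extraction; judging from the script's `case` statement there are only two branches: `help` prints a hint, and anything else runs the analysis. A minimal reproduction of that dispatch (the function name `dispatch` is hypothetical, standing in for the script body):

```shell
#!/bin/sh
# Mirrors the script's argument handling: `help` prints usage,
# any other invocation runs the analysis (public + simple).
dispatch() {
    case $1 in
        help)
            echo "Usage: enter a log file"
            ;;
        *)
            echo "running public + simple"
            ;;
    esac
}

dispatch help    # prints the usage hint
dispatch         # default branch: run the analysis
```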
