
How to Analyze Nginx Logs with a Shell Script

2025-01-28 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

This article shares a method of analyzing Nginx access logs with a shell script, a skill many people have not yet picked up. The full log analysis script is shown below.

vim /data/scripts/log_analysis.sh

#!/bin/bash
### Desc: nginx log analysis script
### Author: Bertram
### Date: 2019-12-21
### Copyright: Personal

public() {
    echo ""
    read -p "Please enter the access log to analyze: " log_file
    echo ""
    if [ ! -f "$log_file" ]; then
        echo "not found: ${log_file}"
        exit 1
    fi
    if [ ! -s "$log_file" ]; then
        echo "${log_file} is an empty file"
        exit 1
    fi

    # Output the top top_num entries of the log; customizable.
    top_num=5
    input_file=`echo $log_file | awk -F '/' '{print $(NF)}'`
    analyze_dir=/home/Bertram/`date +%F`
    top_ip_file=$analyze_dir/ngx_log_top_ip_${input_file}.txt
    top_src_url_file=$analyze_dir/ngx_log_top_src_url_${input_file}.txt
    top_dest_url_file=$analyze_dir/ngx_log_top_dest_url_${input_file}.txt
    top_code_file=$analyze_dir/ngx_log_top_code_${input_file}.txt
    top_terminal_file=$analyze_dir/ngx_log_top_terminal_${input_file}.txt
    mkdir -p $analyze_dir

    start_time=`head -1 $log_file | awk '{print $4}' | cut -d "[" -f2`
    end_time=`tail -1 $log_file | awk '{print $4}' | cut -d "[" -f2`
    total_nums=`wc -l $log_file | awk '{print $1}'`
    size=`du -sh $log_file | awk '{print $1}'`

    # Print start/end time, total number of lines and log size.
    echo "access start time: $start_time; end time: $end_time"
    echo "total visits: $total_nums; log size: $size"

    # Get the most active IPs.
    ##cat $log_file | awk '{print $1}' | sort | uniq -c | sort -rn | head -${top_num} > $top_ip_file
    awk '{ips[$1]++} END{for (i in ips) print ips[i], i}' $log_file | sort -k1 -nr | head -${top_num} > $top_ip_file

    # Get the source (referer) URLs with the most visits.
    cat $log_file | awk '{print $13}' | sort | uniq -c | sort -rn | head -${top_num} > $top_src_url_file

    # Get the most requested URLs.
    cat $log_file | awk '{print $8}' | sort | uniq -c | sort -rn | head -${top_num} > $top_dest_url_file

    # Get the most returned status codes.
    cat $log_file | awk '{print $11}' | sort | uniq -c | sort -rn | head -${top_num} > $top_code_file

    # Get the most common terminal (user agent) types.
    cat $log_file | awk '{print $14}' | sort | uniq -c | sort -rn | head -${top_num} > $top_terminal_file
}

simple() {
    echo "+---------------- The following is the analysis content ----------------+"
    # The most active IPs.
    printf "Top ${top_num} visiting IPs:\n"
    cat $top_ip_file
    echo ""
    # The most common source (referer) URLs.
    printf "Top ${top_num} source URLs:\n"
    cat $top_src_url_file
    echo ""
    # The most requested URLs.
    printf "Top ${top_num} requested URLs:\n"
    cat $top_dest_url_file
    echo ""
    # The most returned status codes.
    printf "Top ${top_num} status codes:\n"
    cat $top_code_file
    echo ""
    # The most common terminal types.
    printf "Top ${top_num} terminal types:\n"
    cat $top_terminal_file
    echo ""
    printf "Top ${top_num} IP locations (the lookup is a bit slow, please wait):\n"
    printf "%-15s %-15s %-30s\n" "visits" "IP address" "location"
    echo '------------------------------------------------------------'
    a=0
    cat $analyze_dir/ngx_log_top_ip_${input_file}.txt | while read line
    do
        count=$(echo $line | cut -d ' ' -f1)
        ip=$(echo $line | cut -d ' ' -f2)
        printf "%-10s %-15s %-30s\n" $count $ip \
            "$(curl -s "http://freeapi.ipip.net/${ip}" | awk -F '"' '{print $2"-"$4"-"$6}')"
        echo '------------------------------------------------------------'
        let a=a+1
    done
    echo ""
    printf "+---------------- End of analysis ----------------+\n"
}

case $1 in
    help)
        echo ""
        echo -e $"Usage: $0, then enter a log file at the prompt\n"
        ;;
    *)
        public
        simple
        ;;
esac
exit 0
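Every "top N" report in the script is the same counting pattern: extract one field with awk, count duplicates, sort by count descending, and keep the first N lines. A minimal self-contained sketch of that pipeline, fed with a few hypothetical IPs instead of a real log:

```shell
# In the script the input comes from awk '{print $1}' on the access log;
# here we pipe in sample data so the snippet runs on its own.
top_num=3
result=$(printf '%s\n' 10.0.0.1 10.0.0.2 10.0.0.1 10.0.0.3 10.0.0.1 10.0.0.2 |
    sort | uniq -c | sort -rn | head -${top_num})
echo "$result"
```

`sort` groups identical lines so `uniq -c` can prefix each with its count; the second `sort -rn` orders by that count, largest first.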

The script implements the following functions:

1. Report the top N most active client IP addresses
2. Report the top N source (referer) URLs
3. Report the top N requested URLs
4. Report the top N terminal (user agent) types
5. Automatically look up the geographic location of the top N IPs
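Function 5 queries the freeapi.ipip.net service with curl and splits the quoted JSON response on double quotes. A sketch of just the parsing step, using a hypothetical response string so no network call is needed (the real service replies in Chinese):

```shell
# A hypothetical response in the same shape freeapi.ipip.net returns:
resp='["China","Beijing","Beijing","","ChinaUnicom"]'
# Splitting on '"' puts the country in $2, province in $4, city in $6,
# which the script joins with dashes:
place=$(echo "$resp" | awk -F '"' '{print $2"-"$4"-"$6}')
echo "$place"
```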

Note: the log file and the analysis script can be placed in the same directory; otherwise, enter the absolute path of the log file at the prompt.
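One more caveat: the awk field numbers the script uses ($8 for the requested URL, $11 for the status code, $13 for the referer, $14 for the user agent) depend on the log_format configured in nginx.conf, because awk splits on whitespace and a quoted field spans several columns. Before running the script, you can print a numbered field list for one line of your log to confirm the positions; the sample line below is hypothetical:

```shell
# With a real log, replace the echo with: head -1 /var/log/nginx/access.log
line='203.0.113.7 - - [21/Dec/2019:10:00:01 +0800] "GET /index.html HTTP/1.1" 200 612 "-" "curl/7.68.0"'
echo "$line" | awk '{for (i = 1; i <= NF; i++) printf "$%d = %s\n", i, $i}'
```

If, say, the URL lands in $7 rather than $8 for your format, adjust the field numbers in the script accordingly.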

Usage: run the script with no arguments and enter the path of the access log at the prompt; `./log_analysis.sh help` prints a usage message.

That is how Nginx logs can be analyzed with a shell script. The code is simple and straightforward; I hope it gives you something to use the next time this task comes up in your daily work.
