How do you analyze nginx access logs with ELK? Many inexperienced people are not sure where to start, so this article summarizes the steps and the configuration involved; I hope it helps you solve the problem.
Note: it is recommended to recreate the index after modifying the configuration
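As the two configuration files below show, the pipeline has two Logstash stages with a Redis list acting as a buffer in between:

nginx access log (elk format) -> Logstash agent.conf (split fields, geoip) -> Redis list on 10.10.45.200 -> Logstash server.conf -> Elasticsearch -> Kibana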
1. Nginx log file format
log_format elk "$http_clientip | $http_x_forwarded_for | $time_local | $request | $status | $body_bytes_sent | $request_body | $content_length | $http_referer | $http_user_agent | $http_cookie | $remote_addr | $hostname | $upstream_addr | $upstream_response_time | $request_time";
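Defining the format is only half of the nginx side: it still has to be referenced by an access_log directive so that the file Logstash tails is actually written in this format. A minimal sketch, assuming the log_format above sits in the http block of nginx.conf; the server_name is a made-up placeholder, while the log path matches the file input in agent.conf below:

http {
    # log_format elk "...";   (the directive shown above)

    server {
        listen      80;
        server_name flight1.example.com;   # hypothetical site name

        # write access logs in the elk format to the file that agent.conf reads
        access_log  /data/logs/flight1-access_log  elk;
    }
}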
2. Logstash configuration file agent.conf on the nginx server
input {
    file {
        type => "elk_frontend_access"
        path => ["/data/logs/flight1-access_log"]
    }
}
filter {
    ruby {
        init => "@kname = ['http_clientip','http_x_forwarded_for','time_local','request','status','body_bytes_sent','request_body','content_length','http_referer','http_user_agent','http_cookie','remote_addr','hostname','upstream_addr','upstream_response_time','request_time']"
        code => "
            new_event = LogStash::Event.new(Hash[@kname.zip(event.get('message').split('|'))])
            new_event.remove('@timestamp')
            event.append(new_event)
        "
    }
    if [request] {
        ruby {
            init => "@kname = ['method','uri','verb']"
            code => "
                new_event = LogStash::Event.new(Hash[@kname.zip(event.get('request').split(' '))])
                new_event.remove('@timestamp')
                event.append(new_event)
            "
        }
        if [uri] {
            ruby {
                init => "@kname = ['url_path','url_args']"
                code => "
                    new_event = LogStash::Event.new(Hash[@kname.zip(event.get('uri').split('?'))])
                    new_event.remove('@timestamp')
                    event.append(new_event)
                "
            }
            kv {
                prefix => "url_"
                source => "url_args"
                field_split => "&"
                remove_field => ["url_args", "uri", "request"]
            }
        }
    }
    mutate {
        convert => [
            "body_bytes_sent", "integer",
            "content_length", "integer",
            "upstream_response_time", "float",
            "request_time", "float"
        ]
    }
    date {
        match => ["time_local", "dd/MMM/yyyy:HH:mm:ss Z"]
        locale => "en"
    }
    grok {
        match => { "message" => "%{IP:clientip}" }
    }
    geoip {
        source => "clientip"
    }
}
output {
    redis {
        host => "10.10.45.200"
        data_type => "list"
        key => "elk_frontend_access:redis"
        port => "5379"
    }
}
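To make the ruby filters easier to follow, here is a made-up log line in the elk format (every value is invented for illustration):

1.2.3.4 | - | 01/Jun/2016:10:15:30 +0800 | GET /search?q=elk HTTP/1.1 | 200 | 512 | - | 0 | - | Mozilla/5.0 | - | 10.0.0.8 | web01 | 10.0.0.9:8080 | 0.012 | 0.015

The first ruby filter splits the message on "|" and zips the pieces with @kname, producing http_clientip, time_local, request, status and so on (because the format puts a space on each side of "|", the split values keep those spaces unless the format is written without them or they are stripped afterwards). The second ruby filter splits request on whitespace into method = GET, uri = /search?q=elk and verb = HTTP/1.1; the third splits uri on "?" into url_path = /search and url_args = q=elk; and the kv filter turns q=elk into url_q = elk while removing url_args, uri and request. Finally, mutate casts the numeric fields, date parses time_local into @timestamp, and grok plus geoip extract and geolocate the client IP.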
3. Logstash configuration file server.conf on the ELK server
input {
    redis {
        host => "10.10.45.200"
        data_type => "list"
        key => "elk_frontend_access:redis"
        port => "5379"
    }
}
output {
    elasticsearch {
        hosts => "10.10.45.200:8200"
        index => "logstash-zjzc-frontend-%{+YYYY.MM.dd}"
    }
    stdout {
        codec => rubydebug
    }
}
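The %{+YYYY.MM.dd} suffix in the index name is expanded from each event's @timestamp, so, to take a made-up date, an event from 1 June 2016 would be written to an index named logstash-zjzc-frontend-2016.06.01, and a Kibana index pattern such as logstash-zjzc-frontend-* covers all of the daily indices. The stdout output with codec => rubydebug simply prints every event to the console, which is useful for verifying that the fields produced by agent.conf arrive on the ELK server as expected.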
Note: if the changes do not take effect, rebuild the index (index pattern) in Kibana.
After reading the above, have you mastered how to analyze nginx access logs with ELK? If you want to learn more skills or find out more about related topics, you are welcome to follow the industry information channel. Thank you for reading!