

Analyzing nginx Logs with ELK

2025-04-05 | SLTechnology News&Howtos

The open source real-time log analysis platform ELK solves the problems of centralized log collection, search, and analysis. ELK is composed of three open source tools: Elasticsearch, Logstash and Kibana. Official website: https://www.elastic.co/products

- Elasticsearch is an open source distributed search engine. Its characteristics include: distributed operation, zero configuration, automatic discovery, automatic index sharding, an index replication mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.

- Logstash is a completely open source tool that collects, parses, and stores your logs for later use (for example, searching).

- Kibana is also an open source and free tool. It provides a friendly web interface for log analysis on top of Logstash and Elasticsearch, helping you aggregate, analyze, and search important log data.

How it works: Logstash collects and parses the logs, Elasticsearch stores and indexes them, and Kibana visualizes them.

The deployment process of the open source real-time log analysis ELK platform is as follows:

(1) Install the JDK, a Logstash dependency

Logstash depends on a Java runtime environment: Logstash 1.5 and later require Java 7 or newer, and the latest Java version is recommended. Since we only need to run Java programs rather than develop them, the JRE is sufficient. First, download the current JRE from Oracle at http://www.oracle.com/technetwork/java/javase/downloads/jre8-downloads-2133155.html

# wget http://download.oracle.com/otn-pub/java/jdk/8u45-b14/jdk-8u45-linux-x64.tar.gz
# mkdir /usr/local/java
# tar -zxf jdk-8u45-linux-x64.tar.gz -C /usr/local/java/
# tail -3 ~/.bash_profile
export JAVA_HOME=/usr/local/java/jdk1.8.0_45
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
# java -version
java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

(2) Install Logstash

Download and install Logstash. Installing Logstash only requires extracting it into a directory, for example /usr/local:

# wget https://download.elastic.co/logstash/logstash/logstash-1.5.2.tar.gz
# tar -zxf logstash-1.5.2.tar.gz -C /usr/local/
# /usr/local/logstash-1.5.2/bin/logstash -e 'input { stdin { } } output { stdout { } }'
Logstash startup completed
Hello World!
2015-07-15T03:28:56.938Z noc.vfast.com Hello World!
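Conceptually, this smoke-test pipeline just wraps each line typed on stdin in an event carrying a timestamp and hostname, then echoes it. A minimal Python sketch of that behavior (the field names mirror Logstash's defaults; the function itself is illustrative, not part of Logstash):

```python
import datetime
import socket

def to_event(message):
    """Wrap a raw log line in a Logstash-style event, as the
    stdin -> stdout pipeline above does."""
    return {
        "message": message,
        "@version": "1",
        "@timestamp": datetime.datetime.now(
            datetime.timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ"),
        "host": socket.gethostname(),
    }

print(to_event("Hello World!"))
```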

(3) Install Elasticsearch

After downloading Elasticsearch, extract it to the corresponding directory to complete the installation of Elasticsearch.

# tar -zxf elasticsearch-1.6.0.tar.gz -C /usr/local/

Start Elasticsearch

# /usr/local/elasticsearch-1.6.0/bin/elasticsearch

If you are connected to the Linux host remotely and want Elasticsearch to keep running in the background, execute the following command:

# nohup /usr/local/elasticsearch-1.6.0/bin/elasticsearch > nohup.out &

Confirm that Elasticsearch is listening on port 9200, which indicates it is running successfully:
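Besides netstat, the same check can be done programmatically by attempting a TCP connection. A small sketch (the helper name `is_listening` is our own, not an Elasticsearch API):

```python
import socket

def is_listening(host, port, timeout=1.0):
    """Return True if a TCP service accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Elasticsearch listens on 9200 by default; prints False if it is not up yet.
print(is_listening("127.0.0.1", 9200))
```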

# netstat -anp | grep :9200
tcp        0      0 :::9200        :::*        LISTEN      3362/java

(4) Connect Logstash to Elasticsearch

Next, write a configuration file that ships Logstash events into Elasticsearch:

# cat logstash-es-simple.conf
input { stdin { } }
output {
    elasticsearch { host => "localhost" }
    stdout { codec => rubydebug }
}

Execute the following command:

# /usr/local/logstash-1.5.2/bin/logstash agent -f logstash-es-simple.conf
... ...
Logstash startup completed
hello logstash
{
       "message" => "hello logstash",
      "@version" => "1",
    "@timestamp" => "2015-07-15T18:12:00.450Z",
          "host" => "noc.vfast.com"
}
# curl 'http://localhost:9200/_search?pretty'
{
  "took" : 58,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "logstash-2015.07.15",
      "_type" : "logs",
      "_id" : "AU6TWiixxDXYhySMyTkP",
      "_score" : 1.0,
      "_source" : {
        "message" : "hello logstash",
        "@version" : "1",
        "@timestamp" : "2015-07-15T20:13:55.199Z",
        "host" : "noc.vfast.com"
      }
    } ]
  }
}
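The `_search` response above is plain JSON, so any client can consume it. A sketch that extracts the indexed messages from a response shaped like the curl output (the sample document here is embedded for illustration):

```python
import json

# Sample response shaped like the `curl .../_search?pretty` output above.
response = json.loads("""
{
  "took": 58,
  "timed_out": false,
  "_shards": {"total": 5, "successful": 5, "failed": 0},
  "hits": {
    "total": 1,
    "max_score": 1.0,
    "hits": [{
      "_index": "logstash-2015.07.15",
      "_type": "logs",
      "_id": "AU6TWiixxDXYhySMyTkP",
      "_score": 1.0,
      "_source": {
        "message": "hello logstash",
        "@version": "1",
        "@timestamp": "2015-07-15T20:13:55.199Z",
        "host": "noc.vfast.com"
      }
    }]
  }
}
""")

# Pull every original log line out of the hit list.
messages = [hit["_source"]["message"] for hit in response["hits"]["hits"]]
print(response["hits"]["total"], messages)
```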

(5) Install Kibana

After downloading kibana, extract it to the corresponding directory to complete the installation of kibana.

# tar -zxf kibana-4.1.1-linux-x64.tar.gz -C /usr/local/

Start kibana

# /usr/local/kibana-4.1.1-linux-x64/bin/kibana

Use http://kibanaServerIP:5601 to access Kibana. After logging in, first configure an index pattern. By default, Kibana points at Elasticsearch using the index pattern logstash-*, with time-based events. Click "Create".

At this point, the deployment of the ELK environment is complete

The following is the configuration for analyzing nginx logs:

Define the nginx log format:

[root@vm10-100-0-5 logstash-1.5.2]# cat /etc/nginx/nginx.conf
user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;
events {
    worker_connections 1024;
}
http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;
    log_format logstashlog '$http_host $remote_addr - $remote_user [$time_local] '
                           '"$request" $status $body_bytes_sent "$request_body" '
                           '"$http_referer" "$http_user_agent" "$http_x_forwarded_for" '
                           '$request_time';
    access_log /var/log/nginx/access.log logstashlog;
    sendfile on;
    #tcp_nopush on;
    keepalive_timeout 65;
    #gzip on;
    include conf.d/*.conf;
}
[root@vm10-100-0-5 logstash-1.5.2]# cat logstash-nginx_log.conf
input {
    file {
        path => ["/var/log/nginx/access.log"]
        start_position => "beginning"
    }
}
filter {
    grok {
        patterns_dir => ['/opt/logstash/patterns/']
        match => { "message" => "%{NGINXACCESS}" }
    }
    geoip {
        source => "http_x_forwarded_for"
        target => "geoip"
        database => "/etc/logstash/GeoLiteCity.dat"
        add_field => ["[geoip][coordinates]", "%{[geoip][longitude]}"]
        add_field => ["[geoip][coordinates]", "%{[geoip][latitude]}"]
    }
    mutate {
        convert => ["[geoip][coordinates]", "float"]
        convert => ["response", "integer"]
        convert => ["bytes", "integer"]
        replace => { "type" => "nginx_access" }
        remove_field => "message"
    }
    date {
        match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
    }
    mutate {
        remove_field => "timestamp"
    }
}
output {
    elasticsearch {
        host => "localhost"
        index => "logstash-nginx-access-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}
[root@vm10-100-0-5 logstash-1.5.2]# cat /opt/logstash/patterns/nginx
URIPARAM1 \?[A-Za-z0-9$.+!*'|(){}~@#%&/=:]*
URIPARAM (?:%{URIPARAM1})?
NGINXACCESS %{IPORHOST:http_host} %{IPORHOST:remote_addr} - %{USERNAME:remote_user} \[%{HTTPDATE:time_local}\] "%{WORD:method} %{URIPATH:request}%{URIPARAM:requestparam} HTTP/%{NUMBER:http_version}" %{INT:status} %{INT:body_bytes_sent} %{QS:request_body} %{QS:http_referer} %{QS:http_user_agent} %{QS:http_x_forwarded_for} %{NUMBER:request_time:float}
# bin/logstash -f logstash-nginx_log.conf
# bin/kibana
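The NGINXACCESS grok pattern above is ultimately a named regular expression over the logstashlog line format. A simplified Python stand-in (not the grok engine itself; the regex and sample line are illustrative) shows how one access-log line decomposes into those fields:

```python
import re

# Field names mirror the NGINXACCESS grok pattern above; the regex is a
# simplified approximation for illustration.
NGINXACCESS = re.compile(
    r'(?P<http_host>\S+) (?P<remote_addr>\S+) - (?P<remote_user>\S+) '
    r'\[(?P<time_local>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<request>\S+) HTTP/(?P<http_version>[\d.]+)" '
    r'(?P<status>\d+) (?P<body_bytes_sent>\d+) '
    r'"(?P<request_body>[^"]*)" "(?P<http_referer>[^"]*)" '
    r'"(?P<http_user_agent>[^"]*)" "(?P<http_x_forwarded_for>[^"]*)" '
    r'(?P<request_time>[\d.]+)'
)

# A sample line in the logstashlog format defined in nginx.conf above.
line = ('www.example.com 10.0.0.1 - - [15/Jul/2015:18:12:00 +0800] '
        '"GET /index.html?x=1 HTTP/1.1" 200 612 "-" "-" '
        '"Mozilla/5.0" "203.0.113.9" 0.005')

fields = NGINXACCESS.match(line).groupdict()
print(fields["status"], fields["request"], fields["request_time"])
```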

The effect is as shown in the figure:
