
How to realize log integration in EFK practice


This article explains how to achieve log integration in EFK practice. The method introduced here is simple, fast, and practical, so interested friends may wish to follow along.

Preface

In the EFK infrastructure, we deploy Filebeat on the client side; Filebeat collects the logs and forwards them to Logstash. Logstash parses the logs and sends them on to Elasticsearch, and finally we view the logs through Kibana.
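In short, the data flow is:

application log file -> Filebeat -> Logstash (parse/filter) -> Elasticsearch -> Kibana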

The basic EFK environment was set up in the previous article. In this article we use a real case to wire the three components together and solve some common problems encountered when using EFK.

First, take a look at the actual business log.

2020-01-09 10:03:15 INFO =GetCostCenter Start=
2020-01-09 10:03:15 WARN cost center code less than 10 digits! {"deptId": "D000004345", "companyCode": "01"}
2020-01-09 10:22:15 ERROR java.lang.IllegalStateException: SessionImpl[abcpI7fK-WYnW4nzXrv7w]: can't call getAttribute() when session is no longer valid.
    at com.caucho.server.session.SessionImpl.getAttribute(SessionImpl.java:283)
    at weaver.filter.PFixFilter.doFilter(PFixFilter.java:73)
    at com.caucho.server.dispatch.FilterFilterChain.doFilter(FilterFilterChain.java:87)
    at weaver.filter.MonitorXFixIPFilter.doFilter(MonitorXFixIPFilter.java:30)
    at weaver.filter.MonitorForbiddenUrlFilter.doFilter(MonitorForbiddenUrlFilter.java:133)

"the format of the log composition is:"

Time log level log details
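Taking the WARN line from the sample above as an example: the time is 2020-01-09 10:03:15, the log level is WARN, and the log details are everything that follows (the message plus the JSON payload).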

So our main task is to get these logs into EFK cleanly.

Filebeat installation configuration

Download Filebeat 7.5.1.

Upload the downloaded file to the server and extract it:

tar -zxvf filebeat-7.5.1-linux-x86_64.tar.gz

Modify filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log

This section configures log input and specifies the log storage path

output.logstash:
  # The Logstash hosts
  hosts: ["172.31.0.207:5044"]

This section configures the log output and specifies the Logstash address.

Start filebeat

./filebeat -e -c filebeat.yml

If you need to start it in the background, use nohup ./filebeat -e -c filebeat.yml & instead.

Logstash configuration

The Logstash configuration is divided into three sections: input, filter, and output. input specifies the input, here opening a port for Filebeat to deliver logs. filter parses and filters the log contents. output specifies the output, where you can configure the Elasticsearch address directly.

input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://172.31.0.127:9200"]
    index => "myindex-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "xxxxxx"
  }
}

After updating the configuration, restart Logstash:

docker-compose -f elk.yml restart logstash

After these two configuration steps, the application writes a log entry to the log file and Filebeat ships it to Logstash. Viewing the results in Kibana shows the following:

The displayed logs reveal two problems:

Because an error stack trace spans multiple lines, it is displayed as multiple rows in Kibana and the data is messy to read. A stack trace needs to be displayed as a single row (one event).

The log needs to be parsed and split into the display format of "time log level log details".

Optimization and upgrade

Configure line merging in Filebeat

By default Filebeat ships one line per event, but one of our log entries can span multiple lines. To merge multiple lines into one event, we have to find a pattern in the log: for example, every entry begins with a timestamp. So we add the following lines to the filebeat.inputs section of filebeat.yml:

# lines prefixed with a date start a new entry
multiline.pattern: '^\d{4}-\d{1,2}-\d{1,2}'
# enable multi-line merging
multiline.negate: true
# append non-matching lines to the previous line
multiline.match: after
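Putting the input settings and the multiline settings together, the filebeat.inputs section would look roughly like this (a sketch assembled from the snippets above, not a verbatim copy of the author's file):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log
  # merge continuation lines (stack traces) into the entry that starts with a date
  multiline.pattern: '^\d{4}-\d{1,2}-\d{1,2}'
  multiline.negate: true
  multiline.match: after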

Set up log parsing in Logstash

To parse the log into the display format of "time log level log details", we need to add a filter section to the Logstash configuration file:

filter {
  grok {
    match => { "message" => "..." }
  }
}

The main purpose here is to parse the log with Grok syntax, filtering the message through a regular expression. You can debug patterns with the Grok debugging tool in Kibana.
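The grok pattern itself is elided above. As a minimal sketch matching the "time log level log details" layout described earlier (the field names time, level, and details are illustrative, not taken from the original):

filter {
  grok {
    # capture the timestamp, a standard log level, and everything after it
    # [\s\S]* is used so details can span the newlines of a merged stack trace
    match => { "message" => "(?<time>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})\s%{LOGLEVEL:level}\s(?<details>[\s\S]*)" }
  }
}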

After the configuration is complete, reopen the Kibana Discover interface to view the logs. They now display as expected. Perfect!

Frequently asked questions

Kibana shows garbled characters

The main cause is the encoding of the client log file. You can check it with file xxx.log; if the encoding is ISO-8859, the text will basically show up garbled. The fix is to specify the log encoding via the encoding option in the Filebeat configuration file.
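For example, checking the file might look like this (the reported encoding is illustrative):

file xxx.log
# xxx.log: ISO-8859 text, with very long lines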

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /app/weaver/Resin/log/xxx.log
  encoding: GB2312

Error in Kibana field extraction

When this exception occurs on opening the Kibana Discover panel, all you have to do is delete the .kibana_1 index in Elasticsearch and revisit Kibana.
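A minimal sketch of the deletion, using the Elasticsearch address and credentials from the Logstash configuration above (adjust to your environment):

# delete the .kibana_1 index via the standard Elasticsearch delete-index API
curl -u elastic:xxxxxx -X DELETE "http://172.31.0.127:9200/.kibana_1"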

View surrounding documents

When we find a keyword in a log on the terminal, we usually check the surrounding context to troubleshoot the problem, for example with the common command cat xxx.log | grep -C50 keyword. So how do we achieve this in Kibana?

Search for the keyword in Kibana, locate the specific log record, click the down arrow on the left, and then click "View surrounding documents".

Dynamic index

Our log platform may need to integrate with multiple business systems and create a different index for each business system.

Mark the log in filebeat

- type: log
  ...
  fields:
    logType: oabusiness

Generate indexes based on tags in logstash

input {
  beats {
    port => 5044
  }
}
filter {
  if [fields][logType] == "oabusiness" {
    ...
  }
}

With this condition in place, logs from each business system can be handled separately and routed to their own index, as sketched below.
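A minimal sketch of the complete pipeline under the assumptions above (the grok pattern and the %{[fields][logType]} index naming scheme are illustrative, not taken from the original):

input {
  beats {
    port => 5044
  }
}
filter {
  # only parse logs that Filebeat tagged as coming from the OA business system
  if [fields][logType] == "oabusiness" {
    grok {
      match => { "message" => "(?<time>\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2})\s%{LOGLEVEL:level}\s(?<details>[\s\S]*)" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://172.31.0.127:9200"]
    # derive the index name from the tag set in Filebeat, one index per business system
    index => "%{[fields][logType]}-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "xxxxxx"
  }
}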

At this point, I believe you have a deeper understanding of "how to achieve log integration in EFK practice". You might as well try it out in practice.
