This article explains in detail how to deploy and use ELK: how to import logs into ELK and display them graphically. It is shared here for reference, and I hope it gives you a working understanding of the topic.
Almost all software and applications have their own log files, and containers are no exception. We already know that Docker writes a container's log to /var/lib/docker/containers/<container ID>/<container ID>-json.log, so log management can be implemented as long as we can send this file to ELK.
This is not difficult to achieve, because ELK provides a companion tool, Filebeat, which forwards log files under specified paths to ELK. Filebeat is also smart enough to monitor the log files: when a log is updated, Filebeat sends the new content to ELK.
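As a quick aside (the container name or ID below is just a placeholder), you can confirm where Docker keeps a specific container's log with docker inspect:

docker inspect --format '{{.LogPath}}' <container name or ID>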
Install Filebeat
Let's install and configure Filebeat on the Docker Host.
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.4.0-amd64.deb
sudo dpkg -i filebeat-5.4.0-amd64.deb
By the time you read this article, a newer version of Filebeat may be available; please refer to the latest installation documentation: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-installation.html
Configure Filebeat
Filebeat's configuration file is /etc/filebeat/filebeat.yml, and we need to tell Filebeat two things:
Which log files to monitor?
Where to send the logs?
Let's answer the first question first.
We configure two paths under paths (see the sketch after this list):
/var/lib/docker/containers/*/*.log is the log file of every container.
/var/log/syslog is the syslog of the Host operating system.
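As a reference, a minimal sketch of the corresponding prospector section in /etc/filebeat/filebeat.yml might look like this (Filebeat 5.x syntax; the paths are the two just listed):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/lib/docker/containers/*/*.log
    - /var/log/syslog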
Next, tell Filebeat to send these logs to ELK.
Filebeat can send logs to Elasticsearch to be indexed and stored, or it can send them to Logstash for analysis and filtering, after which Logstash forwards them to Elasticsearch.
To avoid introducing too much complexity, here we send the logs directly to Elasticsearch.
If you want to send them to Logstash instead, refer to the comments in the second half of filebeat.yml.
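As an illustration only (assuming Elasticsearch is reachable at localhost:9200, and localhost:5044 for a hypothetical Logstash endpoint), the output section of filebeat.yml could be sketched as:

output.elasticsearch:
  hosts: ["localhost:9200"]

# To route logs through Logstash instead, disable the section above and enable:
#output.logstash:
#  hosts: ["localhost:5044"]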
The current log processing flow is: Filebeat reads the monitored log files and sends new entries directly to Elasticsearch, which indexes and stores them.
Start Filebeat
Filebeat is registered as a systemd service during installation, so the service can be started directly:
systemctl start filebeat.service
Manage Logs
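Optionally, a quick way to confirm that the service is running (assuming systemd, as above) is to check its status:

systemctl status filebeat.service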
After Filebeat starts, the monitored logs should be sent to Elasticsearch. Refresh Elasticsearch's JSON interface http://[Host IP]:9200/_search?pretty to confirm.
This time we can see the filebeat-* index and the logs from the two paths monitored by Filebeat.
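For example, assuming Elasticsearch listens on localhost:9200, the cat indices API is a quick way to verify that the filebeat-* index has been created:

curl 'http://localhost:9200/_cat/indices?v'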
Elasticsearch has now indexed and stored the logs; the next step is to display them in Kibana.
First you need to configure an index pattern, which tells Kibana which Elasticsearch indices to query and analyze.
Specify filebeat-* as the index pattern, consistent with the index name in Elasticsearch.
Select @timestamp as the Time-field name.
Click Create to create the index pattern.
Click the Discover menu on the left side of Kibana to see the container and syslog log information.
Let's start a new container that prints a message to the console periodically to simulate log output.
docker run busybox sh -c 'while true; do echo "This is a log message from container busybox!"; sleep 10; done;'
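As an optional local check (the container name log-demo is made up for illustration), you could run the same command detached with a name and confirm the output with docker logs before looking at Kibana:

docker run -d --name log-demo busybox sh -c 'while true; do echo "This is a log message from container busybox!"; sleep 10; done;'
docker logs log-demo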
Refresh the Kibana page (or click the refresh icon in the upper right corner), and you will see the busybox log right away.
Kibana also provides powerful query capabilities; for example, typing the keyword busybox searches for all matching log entries.
Here we have simply imported logs into ELK and displayed them in a basic way. In fact, ELK can also classify and summarize logs, perform aggregation and analysis, build impressive Dashboards, and much more; there is a lot left to explore. Since the focus of this tutorial is containers, I won't expand on that here.
That's all on how to complete the deployment and use of ELK. I hope the above content is helpful and that you have learned something new. If you found this article useful, feel free to share it with others.