
How to implement a kafka+ELK log system under Windows

2025-01-18 Update From: SLTechnology News&Howtos

Shulou(Shulou.com)06/02 Report--

This article explains how to implement a kafka+ELK log system under Windows. It is quite detailed and should be a useful reference; if you are interested, please read it through.

Software used: zookeeper, kafka, logstash (version 6.3.2), ES (version 6.3.2), Kibana (version 6.3.2). Installation steps are not described in detail here: basically, download and unpack each package, adjust its configuration file, and it is ready to use. (Everything below is done under Windows.)

1. zookeeper:

Kafka comes with zookeeper, so a separate zookeeper installation is not required. If you want to install it yourself, configure the environment variables as follows:

ZOOKEEPER_HOME => D:\nomalAPP\zookeeper-3.4.13

Add %ZOOKEEPER_HOME%\bin to Path

If running zkserver in cmd after configuration reports that java cannot be found, the cause may be the position of the java entries in the environment variables: move the java entry in Path toward the front.

2. kafka:

If kafka fails to start with an error that java cannot be found, the cause is line 179 of kafka-run-class.bat: find the %CLASSPATH% there and enclose it in double quotes, changing the line to:

set COMMAND=%JAVA% %KAFKA_HEAP_OPTS% %KAFKA_JVM_PERFORMANCE_OPTS% %KAFKA_JMX_OPTS% %KAFKA_LOG4J_OPTS% -cp "%CLASSPATH%" %KAFKA_OPTS% %*

What needs to be modified in kafka's configuration file server.properties:

The directory where logs are placed (the default can be kept): log.dirs=D:/nomalAPP/kafka_2.12-2.0.0/kafka-logs

The ip and port of the zookeeper to connect to: zookeeper.connect=localhost:2181

Everything else can be left at its default.

Startup command: open cmd in the installation directory, then run .\bin\windows\kafka-server-start.bat .\config\server.properties

3. logstash:

In logstash.yml in the config directory, if you want to start more than one instance, you can configure http.port=9600-9700; if it is not configured, the default port 9600 is used.

Under the config directory, add a logstash.conf file to configure the data source, data filtering, and the data output destination, as follows:

input {
  # use kafka topics as the data source
  kafka {
    # IP and port of kafka
    bootstrap_servers => "10.8.22.15:9092"
    # needs to be set when there are multiple data sources of the same type
    client_id => "test1"
    # consumer group, specified via group_id; consumption in different groups
    # is independent and isolated from each other
    group_id => "test1"
    auto_offset_reset => "latest"
    # number of consumer threads
    consumer_threads => 5
    # when set, output events carry their own consumption metadata:
    # message size, source topic and consumer group
    decorate_events => true
    # topic to subscribe to
    topics => ["bas-binding-topic"]
    # used for the ES index
    type => "bas-binding-topic"
    # data format
    codec => "json"
    tags => ["bas-binding-topic"]
  }
  kafka {
    bootstrap_servers => "10.8.22.15:9092"
    client_id => "test2"
    group_id => "test2"
    auto_offset_reset => "latest"
    consumer_threads => 5
    # decorate_events => true
    topics => ["bas-cus-request-topic"]
    type => "bas-cus-request-topic"
    codec => "json"
    tags => ["bas-cus-request-topic"]
  }
}
filter {
  # the filter section filters and parses the data read by input; for example,
  # the grok plugin parses data with regular expressions, the date plugin
  # parses dates, the json plugin parses json, and so on
}
output {
  # output to ES
  if "bas-binding-topic" in [tags] {
    elasticsearch {
      # ES IP and port
      hosts => ["localhost:9201"]
      # index name: topic + time
      index => "bas-binding-topic-%{+YYYY.MM.dd+HH:mm:ss}"
      timeout => 300
    }
    stdout { codec => rubydebug }
  }
  if "bas-cus-request-topic" in [tags] {
    elasticsearch {
      hosts => ["localhost:9201"]
      index => "bas-cus-request-topic-%{+YYYY.MM.dd+HH:mm:ss}"
      timeout => 300
    }
    stdout { codec => rubydebug }
  }
}

Command to start logstash: .\bin\logstash.bat -f .\config\logstash.conf

4. ES:

In elasticsearch.yml in the config directory: the default port of ES is 9200, which can be changed with http.port=9201.

Start command: .\bin\elasticsearch.bat

After startup, visit http://localhost:9201 in the browser. If a JSON message describing the node appears, the startup was successful.
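The same check can be done programmatically rather than in the browser. A minimal sketch, assuming ES listens on the port configured above (9201); the helper only looks for two fields that the ES root endpoint is known to return ("cluster_name" and the "You Know, for Search" tagline):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class EsHealthCheck {
    // The ES root endpoint returns JSON containing "cluster_name" and the
    // tagline "You Know, for Search"; seeing both suggests ES answered.
    static boolean looksLikeEsRoot(String body) {
        return body.contains("\"cluster_name\"") && body.contains("You Know, for Search");
    }

    public static void main(String[] args) {
        try {
            // port configured via http.port in elasticsearch.yml
            URL url = new URL("http://localhost:9201");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(3000);
            StringBuilder sb = new StringBuilder();
            try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) sb.append(line);
            }
            System.out.println(looksLikeEsRoot(sb.toString()) ? "ES is up" : "Unexpected response");
        } catch (Exception e) {
            System.out.println("Could not connect to ES: " + e.getMessage());
        }
    }
}
```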

5. Kibana:

In kibana.yml in the config directory, point Kibana at ES with elasticsearch.url: "http://localhost:9201"

Start command: .\bin\kibana.bat

Visit http://localhost:5601 (Kibana's default port). If an interface asking for a user name and password appears, ES failed to start; if the Kibana interface is displayed directly, the startup was successful.

When logstash outputs data to ES, ES's default mapping is used to parse the data. If the default mapping does not meet your needs, you can define a custom template:

Create an ES template with the postman tool, as follows:

A response like this indicates that creation succeeded. You can then query the template you just created with a GET request, and delete it with a DELETE request.
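The article's template body itself is only shown in a screenshot. As an illustration only, a hypothetical ES 6.x index template for the bas-* indices could look like the JSON built below; the field names and the template name bas-template are invented for this sketch, not taken from the article. You would PUT this body to http://localhost:9201/_template/bas-template (e.g. from postman):

```java
public class EsTemplateBody {
    // Hypothetical index template for indices named bas-*; the "doc" mapping
    // type matches the ES 6.x single-type convention.
    static String templateJson() {
        return "{\n"
            + "  \"index_patterns\": [\"bas-*\"],\n"
            + "  \"settings\": { \"number_of_shards\": 1 },\n"
            + "  \"mappings\": {\n"
            + "    \"doc\": {\n"
            + "      \"properties\": {\n"
            + "        \"message\": { \"type\": \"text\" },\n"
            + "        \"@timestamp\": { \"type\": \"date\" }\n"
            + "      }\n"
            + "    }\n"
            + "  }\n"
            + "}";
    }

    public static void main(String[] args) {
        // Print the body so it can be pasted into postman as the PUT payload.
        System.out.println(templateJson());
    }
}
```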

Send log messages to the kafka topic; the sent messages are displayed in the cmd window, as follows:
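For a quick manual test, one option is Kafka's bundled console producer. Since logstash.conf above uses codec => "json", each line sent to the topic should be a JSON object; a small dependency-free sketch that builds such a payload (the level/message fields are illustrative, not mandated by the article):

```java
public class LogMessage {
    // Build a one-line JSON payload of the shape logstash's json codec expects.
    // Assumes level and message contain no characters needing JSON escaping.
    static String jsonLog(String level, String message) {
        return "{\"level\":\"" + level + "\",\"message\":\"" + message + "\"}";
    }

    public static void main(String[] args) {
        // Paste the printed line into the console producer, e.g.:
        //   .\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic bas-binding-topic
        System.out.println(jsonLog("INFO", "hello from kafka"));
    }
}
```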

At the same time, the corresponding index appears in the Kibana interface; the index name is the one configured in the output section of logstash.conf, as follows:

Visualize: choose the index you want for graphical analysis; the result looks as follows:

Dashboard: integrate the charts you have made into a dashboard. The result looks as follows:

In addition, you can enable auto-refresh: when new data is sent to kafka, the charts for that topic refresh automatically.

In a project, you can output logs to kafka through logback. The configuration is as follows:

(Only fragments of the logback configuration survived extraction: a pattern of %msg%n, the topic bas-binding-topic, and bootstrap.servers=localhost:9092,10.8.22.13:9092.)
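The logback snippet here is flattened, but the surviving pieces (a %msg%n pattern, the topic bas-binding-topic, and the bootstrap.servers list) match the shape of the commonly used logback-kafka-appender library. A hedged reconstruction under that assumption; the appender class names come from that library, and only the pattern, topic and server list are taken from the article:

```xml
<configuration>
  <appender name="kafkaAppender"
            class="com.github.danielwegener.logback.kafka.KafkaAppender">
    <encoder>
      <pattern>%msg%n</pattern>
    </encoder>
    <topic>bas-binding-topic</topic>
    <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.NoKeyKeyingStrategy"/>
    <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy"/>
    <producerConfig>bootstrap.servers=localhost:9092,10.8.22.13:9092</producerConfig>
  </appender>
  <root level="INFO">
    <appender-ref ref="kafkaAppender"/>
  </root>
</configuration>
```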

Then, when writing a log entry, log to the corresponding topic and the entry is written to the corresponding location in kafka. You can consume the topic with a consumer to check whether the log was written successfully, as follows:

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

@SuppressWarnings("resource")
public static void main(String[] args) {
    Properties properties = new Properties();
    properties.put("bootstrap.servers", "127.0.0.1:9092");
    properties.put("group.id", "group-1");
    properties.put("enable.auto.commit", "false");
    properties.put("auto.commit.interval.ms", "1000");
    properties.put("auto.offset.reset", "earliest");
    properties.put("session.timeout.ms", "30000");
    properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(properties);
    kafkaConsumer.subscribe(Arrays.asList("bas-cus-request-topic", "bas-binding-topic"));
    while (true) {
        ConsumerRecords<String, String> records = kafkaConsumer.poll(Long.MAX_VALUE);
        System.err.println("+");
        for (ConsumerRecord<String, String> record : records) {
            System.err.println(record.offset() + " > " + record.value());
        }
        System.err.println("+");
        break;
    }
}

These are all the contents of "How to implement a kafka+ELK log system under Windows". Thank you for reading, and I hope it helps you.
