Example Analysis of Kafka Connect and FileConnector

2025-04-06 Update From: SLTechnology News&Howtos

Shulou (Shulou.com) 06/01 report

This article explains Kafka Connect and the FileConnector through a worked example. It is shared as a reference, and I hope it leaves you with a solid understanding of the topic.

1. Introduction to Kafka Connect

Kafka is an increasingly widely used messaging system, especially in big data scenarios (real-time data processing and analysis). To integrate with other systems and decouple applications, developers typically use a Producer to send messages to the Broker and a Consumer to consume messages from the Broker. Kafka Connect, introduced in version 0.9, greatly simplifies integrating other systems with Kafka: it lets users quickly define and run a variety of Connectors (File, JDBC, HDFS, etc.), making it easy to import data into and export data out of Kafka in bulk.
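To make the Source → Broker → Sink flow concrete before looking at the real connectors, here is a toy model in plain Python. It involves no Kafka at all: the deque merely stands in for a topic, and the two loops stand in for the FileStreamSource and FileStreamSink roles described below.

```python
# Toy model of the Kafka Connect data flow (no Kafka required):
# a "source" imports records into a buffer that stands in for a topic,
# and a "sink" exports them to its own output, mirroring
# FileStreamSource -> Broker -> FileStreamSink.
from collections import deque

def run_pipeline(source_lines):
    topic = deque()                 # stand-in for the Kafka topic
    for line in source_lines:       # source connector: import data
        topic.append(line)
    sink_output = []
    while topic:                    # sink connector: export data
        sink_output.append(topic.popleft())
    return sink_output

print(run_pipeline(["firest line", "second line"]))
```

The real connectors differ, of course, in that the source and sink run as independent tasks decoupled by the Broker, rather than in one process.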

As shown in the figure, the Sources on the left read data from heterogeneous systems and import it into Kafka; the Sinks on the right write data from Kafka out to other systems.

2. The Various Kafka Connectors

There are many Kafka Connectors, both open source and commercial. The table below lists commonly used open-source Connectors:

Connector        References
Jdbc             Source, Sink
Elastic Search   Sink1, Sink2, Sink3
Cassandra        Source1, Source2, Sink1, Sink2
MongoDB          Source
HBase            Sink
Syslog           Source
MQTT             Source
Twitter          Source, Sink
S3               Sink1, Sink2

The commercial version is available through Confluent.io

3. Example

3.1 FileConnector Demo

This example demonstrates how to use Kafka Connect to stream data from a Source (test.txt) and write it to a Destination (test.sink.txt), as shown in the following figure:

This example uses two Connectors:

FileStreamSource: reads from test.txt and publishes to the Broker

FileStreamSink: reads data from the Broker and writes it to the test.sink.txt file

The configuration file used by the Source is ${KAFKA_HOME}/config/connect-file-source.properties

name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test

The configuration file used by the Sink is ${KAFKA_HOME}/config/connect-file-sink.properties

name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=test.sink.txt
topics=connect-test
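The connector files above use the simple key=value `.properties` format. For illustration only, here is a minimal stdlib parser for that format (Kafka Connect itself parses these files with Java's Properties class, which this sketch does not fully replicate):

```python
# Minimal parser for the key=value .properties format used by the
# connector configs above. Illustrative sketch only -- real Java
# Properties files also support ':' separators and line continuations.
def parse_properties(text):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # skip blanks and comments
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sink_cfg = parse_properties("""\
name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=test.sink.txt
topics=connect-test
""")
print(sink_cfg["topics"])
```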

The configuration file used by the standalone Connect worker is ${KAFKA_HOME}/config/connect-standalone.properties

bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000

3.2 Run the Demo

You need to be familiar with some of Kafka's command-line tools; refer to the earlier article in this series, Apache Kafka Series (2): Command Line Tools (CLI).

3.2.1 Launch Kafka Broker

[root@localhost bin]# cd /opt/kafka_2.11-0.11.0.0/
[root@localhost kafka_2.11-0.11.0.0]# ls
bin  config  libs  LICENSE  logs  NOTICE  site-docs
[root@localhost kafka_2.11-0.11.0.0]# ./bin/zookeeper-server-start.sh ./config/zookeeper.properties &
[root@localhost kafka_2.11-0.11.0.0]# ./bin/kafka-server-start.sh ./config/server.properties &

3.2.2 Launch the Source Connector and Sink Connector

[root@localhost kafka_2.11-0.11.0.0]# ./bin/connect-standalone.sh config/connect-standalone.properties config/connect-file-source.properties config/connect-file-sink.properties

3.3.3 Open a console consumer

./kafka-console-consumer.sh --zookeeper localhost:2181 --from-beginning --topic connect-test

3.3.4 Write to the test.txt file and observe the changes in the window opened in 3.3.3

[root@Server4 kafka_2.12-0.11.0.0]# echo 'firest line' >> test.txt
[root@Server4 kafka_2.12-0.11.0.0]# echo 'second line' >> test.txt

The window opened in 3.3.3 outputs the following:

{"schema": {"type": "string", "optional": false}, "payload": "firest line"}
{"schema": {"type": "string", "optional": false}, "payload": "second line"}

3.3.5 View test.sink.txt

[root@Server4 kafka_2.12-0.11.0.0]# cat test.sink.txt
firest line
second line

This concludes the example analysis of Kafka Connect and the FileConnector. I hope the content above has been helpful.
