Shulou (Shulou.com), SLTechnology News & Howtos — 06/03 report, updated 2025-01-29
As shown in the figure above, the data produced into Kafka is generally provided by a Flume sink. Here we use Flume as the log collection system: a Flume agent on each Web node collects that node's logs, the agents aggregate into a Flume cluster, and the cluster's sinks distribute the data to Kafka (to be consumed and computed in real time by Storm) and to HDFS (for offline computation). Delivering the logs to the Kafka cluster through the Flume sink completes the data production side of the pipeline.

Integration case
First create a topic:
[uplooking@uplooking01]$ kafka-topics.sh --create --topic flume-kafka --partitions 3 --replication-factor 3 --zookeeper uplooking01:2181,uplooking02:2181,uplooking03:2181
Created topic "flume-kafka".
[uplooking@uplooking01]$ kafka-topics.sh --describe --topic flume-kafka --zookeeper uplooking01:2181,uplooking02:2181,uplooking03:2181
Topic:flume-kafka    PartitionCount:3    ReplicationFactor:3    Configs:
    Topic: flume-kafka    Partition: 0    Leader: 101    Replicas: 101,102,103    Isr: 101,102,103
    Topic: flume-kafka    Partition: 1    Leader: 102    Replicas: 102,103,101    Isr: 102,103,101
    Topic: flume-kafka    Partition: 2    Leader: 103    Replicas: 103,101,102    Isr: 103,101,102
Start a Kafka consumer:
[uplooking@uplooking01]$ kafka-console-consumer.sh --topic flume-kafka --zookeeper uplooking01:2181,uplooking02:2181,uplooking03:2181
The Flume configuration file, which listens for new files appearing in a directory:
# Main function: listen for new files in a directory, collect the data, and output it to Kafka.
# Note: running a Flume agent is mainly a matter of configuring source, channel and sink.
# a1 is the name of the agent; the source is called r1, the channel c1, and the sink k1.
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Source configuration: watch the directory for new files
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /home/uplooking/data/flume/source
a1.sources.r1.fileHeader = true
a1.sources.r1.fileHeaderKey = filepath
a1.sources.r1.fileSuffix = .OK
a1.sources.r1.deletePolicy = immediate

# Sink configuration: deliver the data to Kafka
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic = flume-kafka
a1.sinks.k1.brokerList = uplooking01:9092,uplooking02:9092,uplooking03:9092
a1.sinks.k1.requiredAcks = 1
a1.sinks.k1.batchSize = 2

# Channel configuration: buffer the data temporarily in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 10

# Associate source r1 and sink k1 through channel c1
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
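The spooling-directory source requires the watched directory to exist before the agent starts. A minimal preparation step might look like the sketch below; note the /tmp path is illustrative only, while the configuration above actually points at /home/uplooking/data/flume/source.

```shell
# Illustrative sketch: create the spool directory the spooldir source will watch.
# The article's config uses /home/uplooking/data/flume/source; a /tmp path is
# used here so the sketch runs anywhere.
SPOOL_DIR=/tmp/flume-spool-demo
mkdir -p "$SPOOL_DIR"
ls -ld "$SPOOL_DIR"
```

If the directory is missing when the agent starts, the spooldir source fails to initialize, so it is worth creating it up front.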
Start Flume:
flume-ng agent --conf conf --name a1 --conf-file conf/flume-kafka.conf
Add a hello file to the listening directory with the following contents:
hello he
hello me
hello you
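One way to drop the file in safely is to write it somewhere else first and then move it into the spool directory, so the source never picks up a half-written file. The sketch below uses illustrative /tmp paths; in the article's setup the agent watches /home/uplooking/data/flume/source.

```shell
# Illustrative sketch: stage the hello file, then move it into the watched
# directory in one step. Paths are demo assumptions, not the article's real
# spool directory (/home/uplooking/data/flume/source).
SPOOL_DIR=/tmp/flume-spool-demo
mkdir -p "$SPOOL_DIR"
printf 'hello he\nhello me\nhello you\n' > /tmp/hello.staging
mv /tmp/hello.staging "$SPOOL_DIR/hello"
cat "$SPOOL_DIR/hello"
```

A rename within the same filesystem is atomic, which is why staging then moving is preferable to writing the file directly inside the watched directory.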
After adding the file, check the output on the Kafka consumer side:
[uplooking@uplooking01]$ kafka-console-consumer.sh --topic flume-kafka --zookeeper uplooking01:2181,uplooking02:2181,uplooking03:2181
hello he
hello me
hello you
This completes the integration of Kafka and Flume.