2025-02-27 Update From: SLTechnology News&Howtos
Flume architecture and core components
(1) Source: collects the data (defines where data comes from)
(2) Channel: buffers the collected events in transit
(3) Sink: writes the data out to its destination

Official documentation:
http://flume.apache.org/FlumeUserGuide.html
The idea of using Flume

The key to using Flume is writing the configuration file:
(1) configure the Source
(2) configure the Channel
(3) configure the Sink
(4) wire the three components together
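Since an agent is wired purely through properties, step (4) can be sanity-checked mechanically. The sketch below is only an illustration, not a Flume tool: it writes a minimal agent definition to a hypothetical file name and greps out the declared channel versus the channels the source and sink are bound to, which must match.

```shell
# Minimal agent definition (hypothetical file name, for illustration only).
cat > example.conf <<'EOF'
a1.sources = r1
a1.sinks = k1
a1.channels = c1
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF

# Channel the agent declares:
declared=$(sed -n 's/^a1\.channels = //p' example.conf)
# Channel(s) the source and sink are actually bound to:
bound=$(sed -nE 's/^a1\.(sources|sinks)\.[a-z0-9]+\.channels? = //p' example.conf | sort -u)
echo "declared=$declared bound=$bound"
```

If the two sets differ, the agent will start but events have nowhere to flow.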
Example 1: collect data from a specified network port and output it to the console

Code:

# example.conf: a single-node Flume configuration

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Start the agent
http://flume.apache.org/FlumeUserGuide.html#starting-an-agent
$ bin/flume-ng agent -n $agent_name -c conf -f conf/flume-conf.properties.template -Dflume.root.logger=INFO,console

-n is short for --name: the name of the agent to run
-c is short for --conf: the configuration directory
-f is short for --conf-file: the configuration file to use
-Dflume.root.logger=INFO,console prints execution information to the console
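Instantiated for Example 1, the generic command above becomes the following (this is the invocation shown in the official user guide; it assumes a Flume installation with the config saved as example.conf):

```shell
# Launch agent a1 defined in example.conf; the logger sink's
# output goes to the console because of the -D property.
bin/flume-ng agent \
  --name a1 \
  --conf conf \
  --conf-file example.conf \
  -Dflume.root.logger=INFO,console
```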
Use telnet to test:

telnet localhost 44444

Typing "hello" produces console output such as:

Event: { headers:{} body: 68 65 6c 6c 6f 0d  hello }

Output analysis:
An Event is the basic unit of Flume data transmission.
Event = optional headers + byte-array body
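The body shown in the log is just the raw bytes of what was typed; the trailing 0d is the carriage return that telnet appends. The hex dump can be reproduced locally without Flume:

```shell
# Dump "hello" plus a carriage return as hex, mirroring the
# event body printed by the logger sink.
body_hex=$(printf 'hello\r' | od -An -tx1 | tr -s ' ' | sed 's/^ *//; s/ *$//')
echo "$body_hex"
```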
Example 2: monitor a file, collect newly appended data in real time, and output it to the console

Agent selection: exec source + memory channel + logger sink

Exec source documentation:
http://flume.apache.org/FlumeUserGuide.html#exec-source
Code implementation:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/test.log
a1.sources.r1.shell = /bin/sh -c

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Example 3: collect server A's logs in real time and ship them to server B

Technology selection:
A side: exec source + memory channel + avro sink
B side: avro source + memory channel + logger sink

Code implementation, A-side server (agent exec-memory-avro):

exec-memory-avro.sources = exec-source
exec-memory-avro.sinks = avro-sink
exec-memory-avro.channels = memory-channel

exec-memory-avro.sources.exec-source.type = exec
exec-memory-avro.sources.exec-source.command = tail -F /home/hadoop/data/data.log
exec-memory-avro.sources.exec-source.shell = /bin/sh -c

exec-memory-avro.sinks.avro-sink.type = avro
exec-memory-avro.sinks.avro-sink.hostname = hadoop000
exec-memory-avro.sinks.avro-sink.port = 44444

exec-memory-avro.channels.memory-channel.type = memory

exec-memory-avro.sources.exec-source.channels = memory-channel
exec-memory-avro.sinks.avro-sink.channel = memory-channel

B-side server (agent avro-memory-logger):

avro-memory-logger.sources = avro-source
avro-memory-logger.sinks = logger-sink
avro-memory-logger.channels = memory-channel

avro-memory-logger.sources.avro-source.type = avro
avro-memory-logger.sources.avro-source.bind = hadoop000
avro-memory-logger.sources.avro-source.port = 44444

avro-memory-logger.sinks.logger-sink.type = logger

avro-memory-logger.channels.memory-channel.type = memory

avro-memory-logger.sources.avro-source.channels = memory-channel
avro-memory-logger.sinks.logger-sink.channel = memory-channel

Note: start the B-side agent first, so that its avro source is already listening when the A-side avro sink connects.
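To generate traffic for Example 2 (or for the A side of Example 3), simply append lines to the tailed file: tail -F emits each new line, and the exec source turns each one into an event. A minimal sketch, using a local stand-in path rather than the one in the config:

```shell
# Append two lines to a local stand-in for the tailed log file;
# each line would become one Flume event via the exec source.
log=./data.log            # stand-in for /home/hadoop/data/data.log
echo "first event"  >> "$log"
echo "second event" >> "$log"
wc -l < "$log"
```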