2025-03-04 Update From: SLTechnology News&Howtos
1. Log collection
Data collected by a Flume agent generally flows along two lines: one goes into a Kafka cluster, from which stream processors (Spark Streaming, Storm, etc.) consume it; the other goes into HDFS, where it can later be processed in batch with Hive.
In the industry this is known as the lambda architecture (most corporate recommendation systems use it).
After the flume-ng agents collect the logs, they are usually aggregated on a dedicated node (though aggregation is optional).
Why aggregate? Why not write directly to the Kafka cluster?
In a large company there can be a great many Flume nodes, and connecting all of them directly to Kafka adds complexity. Instead, an aggregation tier sits in between (built from several nodes so that no single node's failure takes it down); it can also normalize the log format and filter out unwanted data.
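The two-tier layout described above can be sketched as a pair of Flume configuration files. This is a minimal, illustrative sketch, not the article's actual configuration: the host names, ports, topic, and paths (`collector.example.com`, `access_logs`, the nginx log path) are assumptions.

```properties
# --- edge agent (one per web server): tail the nginx log, forward via Avro ---
edge.sources = r1
edge.channels = c1
edge.sinks = k1
edge.sources.r1.type = TAILDIR
edge.sources.r1.filegroups = f1
edge.sources.r1.filegroups.f1 = /var/log/nginx/access.log
edge.sources.r1.channels = c1
edge.channels.c1.type = memory
edge.sinks.k1.type = avro
edge.sinks.k1.hostname = collector.example.com   # the aggregation node
edge.sinks.k1.port = 4545
edge.sinks.k1.channel = c1

# --- aggregation agent: fan the same event out to Kafka and HDFS ---
agg.sources = r1
agg.channels = c1 c2
agg.sinks = kafka hdfs
agg.sources.r1.type = avro
agg.sources.r1.bind = 0.0.0.0
agg.sources.r1.port = 4545
agg.sources.r1.selector.type = replicating       # duplicate each event to both channels
agg.sources.r1.channels = c1 c2
agg.channels.c1.type = memory
agg.channels.c2.type = memory
agg.sinks.kafka.type = org.apache.flume.sink.kafka.KafkaSink
agg.sinks.kafka.kafka.bootstrap.servers = kafka1:9092,kafka2:9092
agg.sinks.kafka.kafka.topic = access_logs
agg.sinks.kafka.channel = c1
agg.sinks.hdfs.type = hdfs
agg.sinks.hdfs.hdfs.path = hdfs://namenode/logs/%Y-%m-%d
agg.sinks.hdfs.channel = c2
```

Filtering or reformatting at the aggregation tier would be done with Flume interceptors on `agg.sources.r1`, which is omitted here for brevity.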
HDFS can retain data permanently, and MapReduce can process it at whatever pace batch jobs require.
A Kafka cluster, by contrast, only retains data for a limited period, and Spark Streaming can only process data that arrives within that window.
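Kafka's limited retention mentioned above is governed by broker (or per-topic) retention settings. A sketch with illustrative values, not taken from the source:

```properties
# server.properties (broker-wide defaults; values are illustrative)
log.retention.hours = 168        # keep log segments for 7 days
log.retention.bytes = -1         # no size-based cap
log.segment.bytes = 1073741824   # roll segments at 1 GiB
```

A single topic (here the hypothetical `access_logs`) can override the broker default, e.g. `kafka-configs.sh --bootstrap-server kafka1:9092 --entity-type topics --entity-name access_logs --alter --add-config retention.ms=604800000`.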
Streaming flow:
Data sources (nginx logs, MySQL logs, Tomcat logs, etc.) ->
Flume ->
Kafka (messages are cached here for a period of time) ->
Spark Streaming + Spark SQL on a YARN cluster (real-time computation) -> storage:
1 -> Redis, with AdminLTE + Flask front-end components + ECharts 3 integrated into a monitoring system
2 -> InfluxDB (a distributed time-series database) plus the Grafana visualization component (the two work well together)
(or the ELK stack with Kibana)
Storage + visual analysis
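The dual-line (lambda) design that the pipeline above implements can be illustrated with a small, self-contained Python sketch: every event is fanned out to a batch layer that keeps full history (standing in for HDFS/Hive) and a speed layer that only holds recent events (standing in for Kafka/Spark Streaming). The class and names are illustrative, not from the source.

```python
from collections import Counter, deque

class LambdaPipeline:
    """Toy model of the lambda architecture: each log line is fanned
    out to a batch layer (complete history) and a speed layer
    (only the most recent `window_size` events)."""

    def __init__(self, window_size=3):
        self.batch_store = []                           # stands in for HDFS: keeps everything
        self.speed_window = deque(maxlen=window_size)   # stands in for Kafka retention window

    def ingest(self, log_line):
        # Flume-style fan-out: the same record goes down both lines
        self.batch_store.append(log_line)
        self.speed_window.append(log_line)

    def batch_view(self):
        # slow but complete computation (think Hive/MapReduce over HDFS)
        return Counter(self.batch_store)

    def realtime_view(self):
        # fast but partial computation (think Spark Streaming over Kafka)
        return Counter(self.speed_window)

pipe = LambdaPipeline(window_size=2)
for line in ["GET /a", "GET /b", "GET /a"]:
    pipe.ingest(line)

print(pipe.batch_view()["GET /a"])     # 2: the batch layer saw every event
print(pipe.realtime_view()["GET /a"])  # 1: the speed layer only kept the last 2
```

In the real pipeline the "views" would be written to Redis or InfluxDB for the dashboards mentioned above; here they are just returned in memory.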
Drawing tool: http://www.processon.com/