
Counting Website Visitor IPs with Spark + Kafka + Redis

2025-01-17 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/03 Report--

The purpose is anti-scraping: the website's access logs need real-time monitoring of visiting IPs so that addresses hitting the site abnormally often can be flagged.

1. Kafka version: 0.10.0.0 (the latest at the time of writing).

2. Spark version: 1.6.1.

3. Download the matching spark-streaming-kafka-assembly_2.10-1.6.1.jar and put it in the lib directory under the Spark installation directory.

Download address: https://repo1.maven.org/maven2/org/apache/spark/

4. Use Flume to write the nginx log into Kafka (to be covered in a follow-up).
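The post defers the Flume setup, but a minimal sketch of an agent that tails the nginx access log into the Kafka topic might look like the following. This is an assumption for illustration: the agent name, channel sizing, and log path are invented, and the sink shown is the Flume 1.6 KafkaSink.

```properties
# Hypothetical Flume agent: tail the nginx log into the Kafka topic.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/nginx/access.log
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic = statis-detailinfo-pageevent
a1.sinks.k1.brokerList = kafka-IP:9092
a1.sinks.k1.channel = c1
```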

5. Write a Python script named test_spark_collect_ip.py:

```python
# coding:utf-8
__author__ = 'chenhuachao'
'''
Use pyspark to connect to Kafka and count visitor IPs in real time,
as a simple anti-scraping measure.
'''
import sys
reload(sys)
sys.setdefaultencoding('utf-8')

import redis
import datetime
from pyspark.streaming.kafka import KafkaUtils
from pyspark.streaming import StreamingContext
from pyspark import SparkConf, SparkContext


def parse(logstring):
    '''Extract the IP from one log line (a dict literal); keep only onload events.'''
    try:
        infodict = eval(logstring.encode('utf-8'))
        ip = infodict.get('ip')
        assert infodict['tj-event'] == 'onload'
        assert ip
        return (ip)
    except:
        return ()


def insert_redis(rdd):
    '''Write the qualifying results (count >= 3 in the window) to Redis.'''
    conn = redis.Redis(host='redis-IP', port=6380)  # replace redis-IP with your host
    for i, j in rdd.collect():
        print i, j
        if j >= 3 and i != "":
            key = 'cheating_ip_set_{0}'.format(datetime.datetime.now().strftime("%Y%m%d"))
            conn.sadd(key, i)
            conn.expire(key, 86400)


if __name__ == "__main__":
    topic = 'statis-detailinfo-pageevent'
    sc = SparkContext(appName="pyspark_kafka_streaming_chc")
    ssc = StreamingContext(sc, 10)  # 10-second batch interval
    checkpointDirectory = '/tmp/checkpoint/cp3'
    ssc.checkpoint(checkpointDirectory)
    kvs = KafkaUtils.createDirectStream(
        ssc, [topic],
        kafkaParams={"auto.offset.reset": "largest",
                     "metadata.broker.list": "kafka-IP:9092,kafka-IP:9092"})
    # kvs.map(lambda line: line[1]).map(lambda x: parse(x)).pprint()
    # A sliding window is used here; for more information see
    # http://www.kancloud.cn/kancloud/spark-programming-guide/51567
    # Plain per-batch version:
    # ipcount = kvs.map(lambda line: line[1]).map(lambda x: parse(x)) \
    #              .map(lambda ip: (ip, 1)).reduceByKey(lambda ips, num: ips + num)
    ipcount = kvs.map(lambda line: line[1]) \
                 .map(lambda x: parse(x)) \
                 .map(lambda ip: (ip, 1)) \
                 .reduceByKeyAndWindow(lambda ips, num: ips + num, 30, 10)
    # If the stream feeds multiple computations, cache() it first.
    # foreachRDD passes each whole RDD to insert_redis.
    ipcount.foreachRDD(insert_redis)
    # wordCounts.foreachRDD(lambda rdd: rdd.foreach(sendRecord))
    ssc.start()
    ssc.awaitTermination()
```
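The sliding-window aggregation in the script can be emulated on plain Python data to see what reduceByKeyAndWindow(lambda ips, num: ips + num, 30, 10) computes: with a 10-second batch interval, a 30-second window covers the last 3 batches, and each slide re-counts every IP seen in those batches. A minimal sketch (the sample batch data is invented for illustration):

```python
from collections import Counter

def window_counts(batches, window_batches=3):
    """Emulate reduceByKeyAndWindow over per-batch lists of IPs.

    With a 10 s batch interval, a 30 s window spans the last 3 batches;
    each slide produces the total count per IP inside that window.
    """
    results = []
    for end in range(1, len(batches) + 1):
        window = batches[max(0, end - window_batches):end]
        counts = Counter(ip for batch in window for ip in batch)
        results.append(dict(counts))
    return results

# Three 10-second batches of parsed visitor IPs (made-up sample data).
batches = [
    ["1.2.3.4", "1.2.3.4", "5.6.7.8"],
    ["1.2.3.4"],
    ["9.9.9.9"],
]
print(window_counts(batches)[-1])
# The last window spans all three batches, so 1.2.3.4 is counted 3 times.
```

In the real job, any IP whose windowed count reaches 3 is the signal that insert_redis() persists.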

6. Run the job:

```shell
bin/spark-submit --jars lib/spark-streaming-kafka-assembly_2.10-1.6.1.jar test_spark_collect_ip.py
```
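Once the job is running, the flagged IPs accumulate in a date-stamped Redis set. A small sketch of how the script's key name is built, so the set can be located from another tool (the helper function name is hypothetical; the Redis connection itself is omitted):

```python
import datetime

def cheating_ip_key(day=None):
    """Build the date-stamped set key that insert_redis() writes to."""
    day = day or datetime.date.today()
    return 'cheating_ip_set_{0}'.format(day.strftime('%Y%m%d'))

print(cheating_ip_key(datetime.date(2016, 6, 3)))  # cheating_ip_set_20160603
```

With redis-py, conn.smembers(cheating_ip_key()) would then list today's suspect IPs; the key expires after 86400 seconds, matching the script.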

7. Output (the original post showed a screenshot of the console output here).

8. For more information, see the Spark documentation: http://spark.apache.org/docs/latest/api/python/pyspark.streaming.html#module-pyspark.streaming.kafka
