There is a functional requirement: every day, crawl yesterday's whole-day traffic graph from Cacti, build a table of the minimum, maximum and average traffic shown on the graph, and send it by e-mail. (Python 2.7.5)
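Since the report has to run every day, the script would typically be scheduled with cron; a hypothetical crontab entry (the script path and run time are assumptions, not from the original) could look like this:

# run the report script shortly after midnight every day (path and time are assumptions)
30 0 * * * /usr/bin/python /myftpdir/cacti_report.py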
1. Modules to be used
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
import time, datetime, cookielib, requests, sys, re, urllib2, urllib, socket, csv, heapq
import smtplib
import mimetypes
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
from email import encoders
from email.MIMEImage import MIMEImage

# force utf-8 as the default encoding under Python 2
default_encoding = 'utf-8'
if sys.getdefaultencoding() != default_encoding:
    reload(sys)
    sys.setdefaultencoding(default_encoding)
2. Cacti login function
def Login1():
    socket.setdefaulttimeout(10)
    global headers
    headers = {}
    cj = cookielib.CookieJar()
    global opener
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
    # cacti account and password
    data = urllib.urlencode({'action': 'login', 'login_username': 'admin', 'login_password': '123456'})
    # open the cacti home page
    page = urllib2.Request("http://100.0.102.3/index.php", data, headers)
    # crawl the page data
    html = opener.open(page).read()
    # if the page data contains a graph_settings.php link, the login
    # succeeded (return 1); otherwise it failed (return 0)
    if re.findall("graph_settings.php", html):
        return 1
    else:
        return 0
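As a quick, illustrative check (not part of the original script), Login1() can be called on its own to confirm that the opener now carries a valid Cacti session cookie before any graphs are fetched:

# minimal usage sketch: verify the login before fetching any graphs
if Login1() == 1:
    print "login ok, the opener carries the cacti session cookie"
else:
    print "login failed, check the account/password in Login1()"
    sys.exit(1)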
3. Define a few more functions that need to be used
# convert a date string to a timestamp (the cacti image URL needs explicit start and end times)
def datetime_timestamp(dt):
    s = time.mktime(time.strptime(dt, '%Y-%m-%d %H:%M:%S'))
    return int(s)

# convert bit to Gbit/Mbit/Kbit (each level differs by a factor of 1000), keeping 2 decimal places
def tobit(x):
    if x >= 1000000000:
        return str(round(x / 1000000000.0, 2)) + 'G'
    elif x >= 1000000:
        return str(round(x / 1000000.0, 2)) + 'M'
    elif x >= 1000:
        return str(round(x / 1000.0, 2)) + 'K'
    else:
        return str(round(x, 2))
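For illustration only (the sample values are made up): datetime_timestamp() turns yesterday's midnight into the epoch seconds the graph URL expects, and tobit() renders a raw bit-per-second value in a human-readable unit.

# illustrative calls to the two helpers (sample values are made up)
yesterday = (datetime.datetime.today() - datetime.timedelta(days=1)).strftime('%Y-%m-%d 00:00:00')
print datetime_timestamp(yesterday)   # epoch seconds for yesterday 00:00:00
print tobit(1536000000)               # '1.54G'
print tobit(2048000)                  # '2.05M'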
4. Start grabbing the image and the CSV (in the Cacti UI, the blue download arrow to the right of the graph exports the CSV).
try:
    # if the login is successful
    if Login1() == 1:
        # start time of the traffic graph: yesterday 00:00:00
        start_time = str(datetime_timestamp((datetime.datetime.today() - datetime.timedelta(days=1)).strftime('%Y-%m-%d 00:00:00')))
        # end time of the traffic graph: today 00:00:00
        end_time = str(datetime_timestamp(datetime.datetime.today().strftime('%Y-%m-%d 00:00:00')))
        # URL of the traffic graph for the whole of yesterday (2687 is the graph id)
        url1 = "http://100.0.102.3/graph_image.php?action=zoom&local_graph_id=2687&rra_id=0&view_type=&graph_start=" + start_time + "&graph_end=" + end_time
        # download the picture and save it locally
        request = urllib2.Request(url1, None, headers)
        res = opener.open(request).read()
        f = open("/myftpdir/2687.png", "wb")
        f.write(res)
        f.close()
        # download the csv that corresponds to the picture (used to read the max, min and other values)
        url2 = "http://100.0.102.3/graph_xport.php?local_graph_id=2687&rra_id=0&view_type=&graph_start=" + start_time + "&graph_end=" + end_time
        request = urllib2.Request(url2, None, headers)
        res = opener.open(request).read()
        f = open("/myftpdir/2687.csv", "wb")
        f.write(res)
        f.close()
        # read the csv file
        f = open('/myftpdir/2687.csv', 'rb')
        reader = csv.reader(f)
        # the csv stores the average upload (outbound) and download (inbound) rate for every
        # 5 minutes (288 data rows); build 2 lists to hold all upload and download rates
        inbounds = []
        outbounds = []
        n = 0
        for row in reader:
            # rows 11 to 298 of the table are the data rows
            if n >= 10 and n
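The snippet above ends mid-condition and the rest of the original script is not shown. Below is a rough sketch of how the loop, the statistics and the mail step might continue, using only the helpers and imports defined earlier; the CSV column order, the exact row range, the SMTP server and the addresses are all assumptions rather than the author's code.

            # sketch only: assumed csv layout per row is (timestamp, inbound, outbound)
            if n >= 10 and n <= 297:             # data rows 11 to 298
                inbounds.append(float(row[1]))   # assumed inbound column
                outbounds.append(float(row[2]))  # assumed outbound column
            n = n + 1
        f.close()

        # min / max / average for each direction, formatted with tobit()
        body = "Inbound:  min %s  max %s  avg %s\n" % (
            tobit(min(inbounds)), tobit(max(inbounds)), tobit(sum(inbounds) / len(inbounds)))
        body += "Outbound: min %s  max %s  avg %s\n" % (
            tobit(min(outbounds)), tobit(max(outbounds)), tobit(sum(outbounds) / len(outbounds)))

        # build the mail: the text table plus the downloaded png as an attachment
        msg = MIMEMultipart()
        msg['Subject'] = 'cacti traffic report'
        msg['From'] = 'report@example.com'       # assumed sender
        msg['To'] = 'noc@example.com'            # assumed recipient
        msg.attach(MIMEText(body, 'plain', 'utf-8'))
        img = MIMEImage(open('/myftpdir/2687.png', 'rb').read())
        img.add_header('Content-Disposition', 'attachment', filename='2687.png')
        msg.attach(img)

        smtp = smtplib.SMTP('smtp.example.com')  # assumed mail server
        smtp.sendmail(msg['From'], [msg['To']], msg.as_string())
        smtp.quit()
except Exception, e:
    print e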