How to Crawl Weather Data in Python

This article shows how to crawl weather data in Python. It works through the material step by step from a practical point of view; I hope you get something useful out of it.
To draw charts with pygal, first install the module with pip install pygal, then bring it in with import pygal.
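Before the crawler itself, here is a minimal, self-contained sketch of the pygal workflow with made-up data (the file name and values are only for illustration), so that the excerpt below, which relies on variables built by the crawler, is easier to follow:

import pygal

chart = pygal.Line()  # a line chart, the same chart type used below
chart.title = 'Sample temperature trend'
chart.x_labels = ['Mon', 'Tue', 'Wed', 'Thu']
chart.add('low', [2, 3, 1, 4])      # one data series per line
chart.add('high', [10, 12, 9, 11])
chart.render_to_file('sample.svg')  # open the SVG in a browser to view it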
The drawing part of this article's crawler looks like this:

bar = pygal.Line()  # create a line chart
bar.add('minimum temperature', lows)   # add a data series (one of the two lines)
bar.add('maximum temperature', highs)  # note: lows and highs are lists of int
bar.x_labels = daytimes
bar.x_labels_major = daytimes[::30]
bar.x_label_rotation = 45
bar.title = cityname + ' temperature trend for the next seven days'  # set the chart title
bar.x_title = 'date'  # x-axis title
bar.y_title = 'temperature (degrees Celsius)'  # y-axis title
bar.legend_at_bottom = True
bar.show_x_guides = False
bar.show_y_guides = True
bar.render_to_file('temperate1.svg')  # save the chart as an SVG file, which can be opened in a browser

The complete code:

import csv
import sys
import urllib.request
from bs4 import BeautifulSoup  # module for parsing the page
import pygal
import cityinfo

cityname = input("Please enter the city whose weather you want to query: ")
if cityname in cityinfo.city:
    citycode = cityinfo.city[cityname]
else:
    sys.exit('Very sorry, the web page cannot be accessed')  # unknown city: bail out
url = 'http://www.weather.com.cn/weather/' + citycode + '.shtml'
header = ('User-Agent', 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36')  # set header information
http_handler = urllib.request.HTTPHandler()
opener = urllib.request.build_opener(http_handler)
opener.addheaders = [header]  # modify the header information
request = urllib.request.Request(url)  # build the request
response = opener.open(request)  # get the response
html = response.read()  # read the response body
html = html.decode('utf-8')  # set the encoding, otherwise the text is garbled

# preliminary filtering based on the page structure
final = []  # initialize a list to save the data
bs = BeautifulSoup(html, "html.parser")  # create a BeautifulSoup object
body = bs.body
data = body.find('div', {'id': '7d'})
print(type(data))
ul = data.find('ul')
li = ul.find_all('li')

# crawl the data we need
i = 0          # controls how many days are crawled
lows = []      # save the low temperatures
highs = []     # save the high temperatures
daytimes = []  # save the dates
weathers = []  # save the weather descriptions
for day in li:  # iterate over each li
    if i < 7:
        temp = []  # temporarily stores one day's data
        date = day.find('h2').string  # get the date
        temp.append(date)
        daytimes.append(date)
        inf = day.find_all('p')  # there are several p tags under each li, so use find_all instead of find
        temp.append(inf[0].string)  # the first p tag holds the weather description
        weathers.append(inf[0].string)
        temlow = inf[1].find('i').string  # minimum temperature
        if inf[1].find('span') is None:  # the forecast may not list a maximum temperature
            temhigh = None
            temperate = temlow
        else:
            temhigh = inf[1].find('span').string  # maximum temperature
            temhigh = temhigh.replace('℃', '')
            temperate = temhigh + '/' + temlow
        lowStr = ''
        lowStr = lowStr.join(temlow.string)
        lows.append(int(lowStr[:-1]))  # convert the low-temperature NavigableString to int and store it in the low list
        if temhigh is None:
            highs.append(int(lowStr[:-1]))
        else:
            highStr = ''
            highStr = highStr.join(temhigh)
            highs.append(int(highStr))  # convert the high-temperature NavigableString to int and store it in the high list
        temp.append(temperate)
        final.append(temp)
        i = i + 1

# write the collected weather data to a csv file
with open('weather.csv', 'a', errors='ignore', newline='') as f:
    f_csv = csv.writer(f)
    f_csv.writerow([cityname])  # one header row with the city name
    f_csv.writerows(final)

# drawing
bar = pygal.Line()  # create a line chart
bar.add('minimum temperature', lows)
bar.add('maximum temperature', highs)
bar.x_labels = daytimes
bar.x_labels_major = daytimes[::30]
# bar.show_minor_x_labels = False  # hide the minor x-axis labels
bar.x_label_rotation = 45
bar.title = cityname + ' temperature trend for the next seven days'
bar.x_title = 'date'
bar.y_title = 'temperature (degrees Celsius)'
bar.legend_at_bottom = True
bar.show_x_guides = False
bar.show_y_guides = True
bar.render_to_file('temperate.svg')
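The program imports a helper module named cityinfo that the article does not show. A minimal sketch of what it presumably contains, assuming it maps city names to weather.com.cn station codes (the two codes below are examples; a real file would list every supported city):

# cityinfo.py -- hypothetical helper module, not shown in the original
city = {
    "北京": "101010100",  # Beijing
    "上海": "101020100",  # Shanghai
    # ... one entry per city the script should support
}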
An extended example of crawling weather data with Python:

import requests
from bs4 import BeautifulSoup
from pyecharts import Bar

ALL_DATA = []

def send_parse_urls(start_urls):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.122 Safari/537.36"
    }
    for start_url in start_urls:
        response = requests.get(start_url, headers=headers)
        # work around the page's encoding problem
        response = response.text.encode("raw_unicode_escape").decode("utf-8")
        # lxml parser: good performance; html5lib: suited to messy page structure
        soup = BeautifulSoup(response, "html5lib")
        div_tatall = soup.find("div", class_="conMidtab")  # find() returns the first matching element
        tables = div_tatall.find_all("table")  # find_all() returns a list of all matching elements
        for table in tables:
            trs = table.find_all("tr")
            info_trs = trs[2:]
            for index, info_tr in enumerate(info_trs):
                # the first row of each table is a special case, detected via the index
                city_td = info_tr.find_all("td")[0]
                temp_td = info_tr.find_all("td")[6]
                # the special case must come after the general one so that it overwrites the previous data
                if index == 0:
                    city_td = info_tr.find_all("td")[1]
                    temp_td = info_tr.find_all("td")[7]
                city = list(city_td.stripped_strings)[0]
                temp = list(temp_td.stripped_strings)[0]
                ALL_DATA.append({"city": city, "temp": temp})
    return ALL_DATA

def get_start_urls():
    start_urls = [
        "http://www.weather.com.cn/textFC/hb.shtml",
        "http://www.weather.com.cn/textFC/db.shtml",
        "http://www.weather.com.cn/textFC/hd.shtml",
        "http://www.weather.com.cn/textFC/hz.shtml",
        "http://www.weather.com.cn/textFC/hn.shtml",
        "http://www.weather.com.cn/textFC/xb.shtml",
        "http://www.weather.com.cn/textFC/xn.shtml",
        "http://www.weather.com.cn/textFC/gat.shtml",
    ]
    return start_urls

def main():
    """Main program logic: show a bar chart of the ten cities with the lowest real-time temperatures in China."""
    # 1 get all the initial urls
    start_urls = get_start_urls()
    # 2 send the requests, get the responses, and parse the pages
    data = send_parse_urls(start_urls)
    # print(data)
    # 3 data visualization
    #   3.1 sort
    data.sort(key=lambda item: int(item["temp"]))
    #   3.2 slice: keep the ten cities with the lowest temperatures
    show_data = data[:10]
    #   3.3 separate the cities from the temperatures
    city = list(map(lambda item: item["city"], show_data))
    temp = list(map(lambda item: int(item["temp"]), show_data))
    #   3.4 create the bar chart and render the target file
    chart = Bar("China minimum temperature list")  # requires the pyecharts module
    chart.add("", city, temp)
    chart.render("tempture.html")

if __name__ == '__main__':
    main()
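Note that "from pyecharts import Bar" is the old pyecharts 0.x API. On pyecharts 1.x and later the import path and call style changed; a rough equivalent of the charting step, assuming city and temp are the lists built in main(), would be:

from pyecharts import options as opts
from pyecharts.charts import Bar

chart = (
    Bar()
    .add_xaxis(city)      # city names on the x-axis
    .add_yaxis("", temp)  # temperatures as the bar series
    .set_global_opts(title_opts=opts.TitleOpts(title="China minimum temperature list"))
)
chart.render("tempture.html")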
That is how to crawl weather data in Python. If you have run into similar questions, feel free to use the analysis above as a reference, and if you want to learn more, you are welcome to follow the industry information channel.