Shulou
2025-04-04 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
This article introduces how to use Python to extract the air quality data of specified monitoring sites from a large collection of CSV files. Many people run into this problem in daily work, so the editor has consulted various materials and put together a simple, easy-to-follow method. I hope it helps; please read on.
Preface
For most data sets we download, we usually only need a specific subset. Take this air quality data set: it covers sites all over the country, but I only want the sites in my study area. When I opened the folder, I was disappointed to find the data scattered across many individual CSV files. One option is to merge them into a single CSV with an Excel script, but the merged data can easily exceed Excel's row limit (1,048,576 rows per sheet). So instead, we extract the data for the specified sites a different way.
The data used in this experiment are national air quality data from 2014 to 2020, at hourly resolution. Here is what the files look like:

[Screenshot: contents of the data folder]
Opening each file by hand to pull out what you need would drive you crazy. So we use the right tool for the job: Python.

`targets` is the list of site codes you want to extract; put in whichever site codes you need.
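To make the extraction logic easier to follow, here is a hypothetical miniature of one hourly file, matching the layout the article's screenshot suggests: `date`, `hour`, and `type` columns, then one column per site code. Selecting target sites is then just picking columns. (The values below are made up for illustration.)

```python
import pandas as pd

# Hypothetical miniature of one hourly data file: wide format with
# one column per site code and a 'type' column naming the pollutant.
data = pd.DataFrame({
    'date': [20140513, 20140513],
    'hour': [0, 0],
    'type': ['PM2.5', 'PM10'],
    '1001A': [138, 181],
    '1002A': [124, 190],
})

targets = ['1001A', '1002A']          # the site codes you care about
subset = data[['date', 'hour', 'type'] + targets]
```

Each hour of observations spans several rows, one per pollutant metric, which is why the full script below walks the rows in fixed-size blocks.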
The overall code is as follows:
```python
import os
import pandas as pd

# define the relevant parameters
dataPath = './data'                              # data directory
targets = ['1001A', '1002A', '1003A', '1004A',
           '1005A', '1006A', '1007A', '1008A']   # target sites
result = [[] for i in range(len(targets))]       # one result list per site

# start traversing
for filepath in os.listdir(dataPath):            # traverse each folder
    for filename in os.listdir('%s/%s' % (dataPath, filepath)):
        if not filename.endswith('.csv'):        # skip non-CSV files
            continue
        data = pd.read_csv('%s/%s/%s' % (dataPath, filepath, filename))
        for i in range(0, len(data), 15):        # 15 pollutant rows per hour
            for k in range(len(targets)):
                try:
                    item = {'date': data['date'][i],   # date
                            'hour': data['hour'][i]}   # hour
                    for j in range(i, i + 15):
                        item[data['type'][j]] = data[targets[k]][j]
                    result[k].append(item)
                except:
                    pass
        print('%s processed' % filename)

# save the results, one CSV per site
for i in range(len(targets)):
    pd.DataFrame(result[i]).to_csv('%s.csv' % targets[i], index=False)
```
Run the script, and the result is one CSV file per target site, each containing every recorded pollutant (PM10, PM2.5, and so on) over the full time range.
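Once the per-site CSVs exist, they are easy to analyze further. A small sketch, assuming the per-site output layout described above (`date` and `hour` columns plus one column per pollutant; the helper name `daily_mean` is mine): aggregate the hourly values into daily averages with a `groupby`.

```python
import pandas as pd

def daily_mean(site_df, pollutant):
    """Average one pollutant's hourly values per day.

    site_df is expected to look like the per-site CSVs produced above:
    'date', 'hour', plus one column per pollutant (PM10, PM2.5, ...).
    """
    return site_df.groupby('date')[pollutant].mean()
```

For example, `daily_mean(pd.read_csv('1001A.csv'), 'PM10')` would give the daily PM10 series for site 1001A.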
This concludes the study of how to use Python to extract the air quality data of a designated site. I hope it has answered your questions. Theory works best when paired with practice, so go and try it! If you would like to keep learning, please continue to follow this site for more practical articles.