This article mainly introduces how to crawl QFang (qfang.com) new-house data with Python. It has some reference value, and interested friends can refer to it. I hope you gain a lot after reading this article.
Preface
The text and pictures in this article come from the Internet and are for study and communication only; they have no commercial use. Copyright belongs to the original author. If you have any questions, please contact us promptly.
The goal
Crawl QFang new-house data from:
https://shenzhen.qfang.com/newhouse
Target data to crawl:
Community name
Sales status
Housing area
Household type
Opening time
Delivery time
Real estate address
Selling price
Estimated total price
Emmmm, let me have a look. I can't afford it.
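Before writing the full crawler, it can help to confirm the listing page is reachable at all. Here is a minimal sketch (not from the original post; the exact User-Agent string is an assumption, used because the site may reject bare requests):

import requests

# Minimal reachability check: fetch the first listing page.
url = 'https://shenzhen.qfang.com/newhouse/list/n1'
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'}
response = requests.get(url, headers=headers)
print(response.status_code)  # expect 200 if the listing page is reachable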
Development tools
Python 3.6.5
PyCharm
Crawler code
Import modules
import requests
import parsel
import csv
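If requests or parsel is not installed, both are available from PyPI (for example, pip install requests parsel); csv is part of the Python standard library.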
Parsing web pages and crawling data
for page in range(1, 84):
    print('=== Crawling data on page {} ==='.format(page))
    url = 'https://shenzhen.qfang.com/newhouse/list/n{}'.format(page)
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 '
                      '(KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36'
    }
    response = requests.get(url=url, headers=headers)
    selector = parsel.Selector(response.text)
    lis = selector.css('.list-result li')
    for li in lis:
        dit = {}
        # Community name
        dit['title'] = li.css('.list-main-header a em::text').get()
        # Sales status
        dit['sales status'] = li.css('.list-main-header i::text').get()
        # Housing area
        dit['housing area'] = li.css('.list-main div:nth-child(1) .space span::text').get()
        # Household type: join the variants with '|' and strip whitespace
        type_list = li.css('.list-main .fl p:nth-child(3) span a::text').getall()
        dit['household type'] = '|'.join(type_list).strip().replace('\r\n', '').replace(' ', '')
        # Opening time
        dit['opening time'] = li.css('.new-house-info > div:nth-child(2) > p.space.fl.clearfix > span::text').get()
        # Delivery time
        dit['delivery time'] = li.css('.new-house-info > div:nth-child(2) > p:nth-child(3) > span::text').get()
        # Real estate address
        address = li.css('.list-main a:nth-child(3)::text').get()
        dit['address'] = address.strip() if address is not None else None
        # Selling price
        dit['selling price'] = li.css('.list-price .bigger .amount::text').get()
        # Estimated total price
        dit['estimated total price'] = li.css('.list-price .smaller::text').get()
        # Write one record; csv_writer is created in the "Save data" step
        # below, which must run before this loop.
        csv_writer.writerow(dit)
        print(dit)
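The CSS selectors above are tied to the page markup as it was when the post was written and may need updating. To see how parsel's .get() and .getall() behave before pointing them at the live page, here is a standalone sketch; the HTML fragment is invented purely for illustration:

import parsel

# Invented HTML fragment, just to demonstrate .get() vs .getall().
html = '<ul class="list-result"><li><a><em>Example Garden</em></a><span>80 m2</span><span>120 m2</span></li></ul>'
li = parsel.Selector(text=html).css('.list-result li')
print(li.css('a em::text').get())     # first match: 'Example Garden'
print(li.css('span::text').getall())  # every match: ['80 m2', '120 m2']

When running against the live site, adding a short time.sleep between pages is a polite precaution.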
Save data
These lines set up the CSV file and must run before the crawl loop above, so that csv_writer exists when csv_writer.writerow(dit) is called inside it:

f = open('real estate data.csv', mode='a', encoding='utf-8-sig', newline='')
csv_writer = csv.DictWriter(f, fieldnames=['title', 'sales status', 'housing area',
                                           'household type', 'opening time', 'delivery time',
                                           'address', 'selling price', 'estimated total price'])
csv_writer.writeheader()
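The utf-8-sig encoding writes a byte-order mark so that Excel opens the Chinese text correctly. Note that mode='a' appends: delete any existing 'real estate data.csv' before re-running, or the header row and old records will be duplicated.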
Run the code; each record is printed as it is scraped and appended to the CSV file.
Thank you for reading this article carefully. I hope this walkthrough of crawling QFang data with Python is helpful to everyone.