
How to use Python to crawl dynamic data in one step

2025-02-25 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

Today I will talk to you about how to use Python to crawl dynamic data in one pass. Many people may not know much about this, so to help you understand better, I have summarized the following content. I hope you get something out of this article.

Preface

It's graduation-thesis season again, and many followers have been messaging me privately asking whether I could help scrape some data for their theses. One of them was also planning to buy a kitten, so I searched the web, found the site below, and decided to put my own hands to this arduous task. Students who like crawlers, or who need data scraped, are welcome to message me privately.

Page analysis

Let's visit the address: http://www.maomijiaoyi.com/index.php?/chanpinliebiao_pinzhong_38.html

Here we can see a list of cats. Inspecting with F12 shows that the server actually returns a rendered HTML page rather than the JSON we usually expect, so we also need to open each returned detail page to get the specific information about a cat: price, phone number, age, breed, number of visits, and so on.

What we need to do at this point:

1. Parse the returned list
2. Parse the regional data
3. Request the specific information about each cat
4. Parse the returned page
5. Save the data to a CSV file

CSV file

Running the program saves the scraped records to a CSV file.
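As a preview of the saving step, here is a minimal, self-contained sketch of the `csv.DictWriter` pattern that step 4 of the code below relies on. The field names and the sample row are illustrative only, not real scraped data:

```python
import csv
import io

# Write into an in-memory buffer instead of a file, just for illustration.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=['region', 'price', 'breed'])
writer.writeheader()  # header row, written once
writer.writerow({'region': 'Beijing', 'price': '1300', 'breed': 'Ragdoll'})
print(buf.getvalue())
```

In the real script the same pattern is used with `open('cat.csv', ...)` in place of the buffer, and `writerow` is called once per cat.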

Code implementation

1. Import dependent environment

```python
import requests  # send HTTP requests: pip install requests
import parsel    # HTML page parser: pip install parsel
import csv       # save text as CSV (standard library)
```

2. Get a list of cats

```python
url = "http://www.maomijiaoyi.com/index.php?/chanpinliebiao_pinzhong_37_" + str(i) + "--24.html"
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.72 Safari/537.36'
}
data = requests.get(url=url, headers=headers).text
selector = parsel.Selector(data)
urls = selector.css('div.content:nth-child(1) a::attr(href)').getall()
```
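Step 3 below iterates over a `regionAndURL` sequence of (href, region) pairs, but the original post never shows how it is built. A minimal sketch of how such pairs could be assembled, assuming the list page yields matching lists of detail-page links and region names (the sample values here are hypothetical, not real site data):

```python
# Hypothetical inputs: in the real script these would come from the list
# page, e.g. two selector.css(...).getall() calls like the one above.
hrefs = ['/ershoumao/250.html', '/ershoumao/251.html']
regions = ['Beijing', 'Shanghai']

# Pair each detail-page href with its region, matching the s[0]/s[1]
# indexing used in the next step.
regionAndURL = list(zip(hrefs, regions))
print(regionAndURL[0])  # ('/ershoumao/250.html', 'Beijing')
```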

3. Request the specific data of each cat

```python
for s in regionAndURL:
    url = "http://www.maomijiaoyi.com" + s[0]
    address = s[1]
    data = requests.get(url=url, headers=headers).text
    selector = parsel.Selector(data)
    title = selector.css('.detail_text .title::text').get().strip()              # tag
    price = selector.css('.info1 span:nth-child(2)::text').get().strip()         # price
    viewsNum = selector.css('.info1 span:nth-child(4)::text').get()              # number of visits
    commitment = selector.css('.info1 div:nth-child(2) span::text').get().replace("seller promises:", "")  # seller promise
    onlineOnly = selector.css('.info2 div:nth-child(1) .red::text').get()        # number on sale
    variety = selector.css('.info2 div:nth-child(3) .red::text').get()           # breed
    prevention = selector.css('.info2 div:nth-child(4) .red::text').get()        # prevention
    contactPerson = selector.css('.user_info div:nth-child(1) .c333::text').get()  # contact name
    phone = selector.css('.user_info div:nth-child(2) .c333::text').get()        # phone
    shipping = selector.css('.user_info div:nth-child(3) .c333::text').get().strip()  # freight
    purebred = selector.css('.item_neirong div:nth-child(1) .c333::text').get().strip()  # whether purebred
    quantityForSale = selector.css('.item_neirong div:nth-child(3) .c333::text').get().strip()  # quantity for sale
    catSex = selector.css('.item_neirong div:nth-child(4) .c333::text').get().strip()  # cat gender
    catAge = selector.css('div.xinxi_neirong .item:nth-child(2) div:nth-child(2) .c333::text').get().strip()  # cat age
    dewormingSituation = selector.css('div.xinxi_neirong .item:nth-child(2) div:nth-child(3) .c333::text').get().strip()  # deworming situation
    canWatchCatsInVideo = selector.css('div.xinxi_neirong .item:nth-child(2) div:nth-child(4) .c333::text').get().strip()  # can watch the cat by video
```
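The repeated `.strip()` and `.replace()` calls above exist because `::text` selectors return the raw text node, which usually carries surrounding whitespace and label prefixes. A small illustration of the same cleanup on a made-up sample string (the real page text is Chinese):

```python
# Raw text node as a scraper might receive it: padding plus a label prefix.
raw = "  seller promises: healthy and vaccinated  "

# Drop the label, then trim the leftover whitespace.
cleaned = raw.replace("seller promises:", "").strip()
print(cleaned)  # healthy and vaccinated
```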

4. Save the data as a csv file

```python
f = open('cat.csv', mode='a', encoding='utf-8', newline='')
csvHeader = csv.DictWriter(f, fieldnames=[
    'region', 'tag', 'price', 'number of visits', 'seller promise',
    'number on sale', 'breed', 'prevention', 'contact name', 'phone',
    'freight', 'whether purebred', 'quantity for sale', 'cat gender',
    'cat age', 'deworming situation', 'can watch cat in video',
    'detailed address'])
csvHeader.writeheader()  # write the header row (once)
dis = {
    'region': address,
    'tag': title,
    'price': price,
    'number of visits': viewsNum,
    'seller promise': commitment,
    'number on sale': onlineOnly,
    'breed': variety,
    'prevention': prevention,
    'contact name': contactPerson,
    'phone': phone,
    'freight': shipping,
    'whether purebred': purebred,
    'quantity for sale': quantityForSale,
    'cat gender': catSex,
    'cat age': catAge,
    'deworming situation': dewormingSituation,
    'can watch cat in video': canWatchCatsInVideo,
    'detailed address': url
}
csvHeader.writerow(dis)
```

After reading the above, do you have a better understanding of how to use Python to crawl dynamic data in one pass? If you want to learn more, please follow the industry information channel. Thank you for your support.



