
How to crawl vegetable price data from the food business network (21food.cn) with Python

2025-03-26 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

Many beginners are unclear about how to crawl vegetable price data from the food business network (21food.cn) with Python. To help solve this problem, the following walks through the process in detail; readers who need it are welcome to follow along and will hopefully get something out of it.

Preface

Vegetables are plants or fungi that can be cooked into food, and they are an indispensable part of people's daily diet.

Recently, vegetable prices have risen, drawing widespread attention. With the "double festival" holidays approaching, what is the price trend?

On September 16, the National Development and Reform Commission (NDRC) held its regular September press conference. Responding to questions about the rise in vegetable prices, NDRC spokesperson Meng Wei said that vegetables have a relatively short growth cycle; as extreme weather subsides and autumn vegetables come onto the market one after another, market supply is expected to recover within a relatively short period, and fresh vegetable prices will fall accordingly.

Project goal

Crawl vegetable price data from the food business network (21food.cn).

Target website

https://price.21food.cn/

Crawler code

Import the required modules

import requests
import parsel
import csv
import time

Analyze the website and crawl the data

# csv_write is the DictWriter created in the "Save data" section below;
# that code has to run before this loop so the file and header already exist.
for page in range(1, 19):
    time.sleep(1)  # pause one second between requests
    url = 'https://price.21food.cn/guoshu-p{}.html'.format(page)
    headers = {
        'user-agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 '
                      '(KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36'
    }
    response = requests.get(url=url, headers=headers)
    selector = parsel.Selector(response.text)
    lis = selector.css('.gs_top_t2_left div:nth-child(1) .sjs_top_cent_erv ul li')
    dit = {}
    for li in lis:
        # the selector for the name column was garbled in the source; 'td a::text' is a guess
        name = li.css('td a::text').get()
        dit['variety'] = name
        price = li.css('td span::text').get()
        dit['average price'] = price
        qushi = li.css('td .sc_up::text').get()
        if qushi is None:
            dit['trend'] = 'falling'
        else:
            dit['trend'] = qushi
        csv_write.writerow(dit)
        print(dit)
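A quick aside on how the extraction works: parsel matches elements with CSS selectors, and .get() returns the text of the first match, or None when nothing matches, which is why the trend falls back to 'falling' above. The snippet below is a minimal, self-contained illustration of that behaviour on invented HTML; the markup is purely hypothetical and is not the real structure of the 21food.cn page.

import parsel

html = '''
<ul>
  <li><a>cabbage</a><span>3.2</span><em class="sc_up">up</em></li>
  <li><a>potato</a><span>2.1</span></li>
</ul>
'''

selector = parsel.Selector(text=html)
for li in selector.css('ul li'):
    name = li.css('a::text').get()            # text of the first matching element
    price = li.css('span::text').get()
    trend = li.css('.sc_up::text').get()      # None when the class is absent
    print(name, price, trend if trend is not None else 'falling')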

Save data

f = open('data.csv', mode='a', encoding='utf-8-sig', newline='')
csv_write = csv.DictWriter(f, fieldnames=['variety', 'average price', 'trend'])
csv_write.writeheader()
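Because csv_write has to exist before the crawl loop writes any rows, this block actually runs first when everything is put together. Below is a rough sketch of the full script assembled in that order, using a with block so the file is closed automatically; the URL pattern and CSS selectors are copied from the code above and may well need adjusting if the site's markup has changed since the article was written.

import csv
import time

import requests
import parsel

headers = {
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 '
                  '(KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36'
}

with open('data.csv', mode='a', encoding='utf-8-sig', newline='') as f:
    csv_write = csv.DictWriter(f, fieldnames=['variety', 'average price', 'trend'])
    csv_write.writeheader()
    for page in range(1, 19):
        time.sleep(1)  # be polite between requests
        url = 'https://price.21food.cn/guoshu-p{}.html'.format(page)
        response = requests.get(url=url, headers=headers)
        selector = parsel.Selector(response.text)
        for li in selector.css('.gs_top_t2_left div:nth-child(1) .sjs_top_cent_erv ul li'):
            row = {
                'variety': li.css('td a::text').get(),  # name selector is a guess, see above
                'average price': li.css('td span::text').get(),
                'trend': li.css('td .sc_up::text').get() or 'falling',
            }
            csv_write.writerow(row)
            print(row)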
