2025-03-28 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/02 Report--
This article introduces how to crawl stock trading data with Python and display it visually. Many people run into trouble with cases like this in practice, so let the editor walk you through how to handle these situations. I hope you read carefully and get something out of it!
Development environment
Interpreter version: Python 3.8
Code editor: PyCharm 2021.2
Third-party module
requests: pip install requests
csv (part of the Python standard library, no installation needed)
Steps of a crawler case
1. Determine the url address (link address)
2. Send a network request
3. Parse the data (filter out the fields you want)
4. Save the data (to a database such as mysql, mongodb, or redis, or to a local file)
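The four steps above can be sketched in miniature without touching the live site. This is a hedged, self-contained example: the `raw` string is a canned, made-up payload standing in for what `requests.get(url).json()` would return (steps 1 and 2), so only the parsing and saving steps actually run.

```python
import csv
import json

# Steps 1-2 placeholder: a canned (fabricated) JSON string standing in for
# the body that requests.get(url).json() would return from the real server
raw = '{"data": {"list": [{"symbol": "SH600000", "name": "SPDB", "current": 8.6}]}}'

# Step 3. Parse the data: keep only the fields we care about
json_data = json.loads(raw)
rows = [{'stock symbol': d['symbol'],
         'stock name': d['name'],
         'current price': d['current']}
        for d in json_data['data']['list']]

# Step 4. Save the data: write the filtered rows to a local CSV file
with open('demo.csv', mode='w', encoding='utf-8', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=['stock symbol', 'stock name', 'current price'])
    writer.writeheader()
    writer.writerows(rows)
```

The real crawler below follows exactly this shape, just with a live request and more fields.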
Crawler full code
Analyze the web page
Open the developer tools, search for a keyword, and find the correct url.
Import modules, request the data, and parse it:

import requests  # send network requests
import csv

# request data
url = 'https://xueqiu.com/service/v5/stock/screener/quote/list?page=1&size=30&order=desc&order_by=amount&exchange=CN&market=CN&type=sha&_=1637908787379'
# disguise as a browser
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.45 Safari/537.36'
}
response = requests.get(url, headers=headers)
json_data = response.json()

# parse data
data_list = json_data['data']['list']
for data in data_list:
    data1 = data['symbol']
    data2 = data['name']
    data3 = data['current']
    data4 = data['chg']
    data5 = data['percent']
    data6 = data['current_year_percent']
    data7 = data['volume']
    data8 = data['amount']
    data9 = data['turnover_rate']
    data10 = data['pe_ttm']
    data11 = data['dividend_yield']
    data12 = data['market_capital']
    print(data1, data2, data3, data4, data5, data6, data7, data8, data9, data10, data11, data12)
    data_dict = {
        'stock symbol': data1,
        'stock name': data2,
        'current price': data3,
        'change': data4,
        'percent change': data5,
        'year-to-date change': data6,
        'volume': data7,
        'turnover': data8,
        'turnover rate': data9,
        'PE (TTM)': data10,
        'dividend yield': data11,
        'market capitalization': data12,
    }
    csv_write.writerow(data_dict)
Compare the urls of pages 1, 2, and 3 to find the pattern: only the page parameter changes.
Loop over all pages:

for page in range(1, 56):
    url = f'https://xueqiu.com/service/v5/stock/screener/quote/list?page={page}&size=30&order=desc&order_by=amount&exchange=CN&market=CN&type=sha&_=1637908787379'

Save the data (create the writer and write the header once, before the request loop):

file = open('data2.csv', mode='a', encoding='utf-8', newline='')
csv_write = csv.DictWriter(file, fieldnames=[
    'stock symbol', 'stock name', 'current price', 'change', 'percent change',
    'year-to-date change', 'volume', 'turnover', 'turnover rate', 'PE (TTM)',
    'dividend yield', 'market capitalization'])
csv_write.writeheader()

When every page has been written, close the file:

file.close()

Results
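To see the page pattern concretely, here is a small offline sketch (no network access needed) that expands the url template into all 55 page addresses exactly as the loop does:

```python
# Template for the paginated endpoint; only the page parameter changes
base = ('https://xueqiu.com/service/v5/stock/screener/quote/list'
        '?page={page}&size=30&order=desc&order_by=amount'
        '&exchange=CN&market=CN&type=sha&_=1637908787379')

# Pages 1 through 55, matching range(1, 56) in the crawler loop
urls = [base.format(page=page) for page in range(1, 56)]
```

Each of these urls is then fetched with the same headers as page 1.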
Data visualization full code
Import modules and read the data:

import pandas as pd
from pyecharts import options as opts
from pyecharts.charts import Bar

# read data
data_df = pd.read_csv('data2.csv')
df = data_df.dropna()
df1 = df[['stock name', 'volume']]
df2 = df1.iloc[:20]
print(df2['stock name'].values)
print(df2['volume'].values)

# visual chart
c = (
    Bar()
    .add_xaxis(list(df2['stock name']))
    .add_yaxis("stock trading volume", list(df2['volume']))
    .set_global_opts(
        title_opts=opts.TitleOpts(title="Volume chart"),
        datazoom_opts=opts.DataZoomOpts(),
    )
    .render("data.html")
)
print('Data visualization finished. Open the data.html file in the current directory!')

Effect display
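The data-preparation half of the visualization can be tried on its own with a tiny made-up table standing in for data2.csv (the stock names and volumes here are invented for illustration). The two lists produced at the end are exactly what would feed add_xaxis and add_yaxis:

```python
import pandas as pd

# A tiny hypothetical table standing in for the crawled data2.csv
df = pd.DataFrame({
    'stock name': ['Stock A', 'Stock B', 'Stock C'],
    'volume': [300, 100, 200],
})

# Same preparation as the full code: drop missing rows, keep the first N rows
top = df.dropna().iloc[:2]

# These lists are what the chart would receive via add_xaxis / add_yaxis
names = list(top['stock name'])
volumes = list(top['volume'])
```

Note that iloc[:20] simply keeps the first 20 rows in file order; since the url requests data sorted by amount descending, those are already the most heavily traded stocks.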
That is the end of "how Python crawls stock trading data and visualizes it". Thank you for reading. If you want to learn more about the industry, follow this website; the editor will keep publishing practical, high-quality articles for you!