2025-02-25 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report--
Many newcomers are unclear about how to use Python to write a static crawler and convert addresses to longitude and latitude. To help solve this problem, this article explains the process in detail; readers who need it can follow along, and hopefully you will gain something from it.
The following demonstrates static crawling with Python through a small example: a list of cities is scraped from a web page, and each city is then converted to its longitude and latitude by calling the Baidu Map API.
Requests and bs4
1. requests.get(): fetches all the data on a web page.
2. BeautifulSoup(): after calling requests.get(), the returned HTML can be parsed by calling BeautifulSoup(), which produces a BeautifulSoup object. On that object, the find_all method finds all matching tags, and the get_text method returns a tag's text content.
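The two-step pattern above can be sketched as follows. A hard-coded HTML snippet is used here so the example runs offline; in a real run the HTML would come from requests.get(url).text.

```python
from bs4 import BeautifulSoup

# Stand-in for the HTML that requests.get(url).text would return.
html = "<html><body><p>Beijing</p><p>Shanghai</p></body></html>"

soup = BeautifulSoup(html, "html.parser")   # build the parse tree
paragraphs = soup.find_all("p")             # locate every <p> tag
texts = [p.get_text() for p in paragraphs]  # pull out each tag's text
print(texts)
```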
Baidu Map api Application
Calling the Baidu Map API interface with an address as input returns the corresponding longitude and latitude coordinates. Applying for an API key (AK) works as follows:
1. Go to the Baidu Map Open Platform official website and register: http://lbsyun.baidu.com/
2. Click "apply for key", fill in your personal information and register your email address.
3. In the console, click "create application", give the application a name, and select the appropriate "application type". In this example the API is called from the browser, so select "browser side". In the IP whitelist, if you do not want to restrict by IP, enter 0.0.0.0/0.
4. After submitting, the "AK" column in the application list shows the application's key.
Code example
Python 2 syntax differs slightly from Python 3; the code in this example runs under Python 3.
1. Load the corresponding module
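The original module-loading code was not preserved; a minimal sketch, assuming the walkthrough needs only requests and bs4 (both are third-party packages: pip install requests beautifulsoup4):

```python
import requests                # fetch web pages and call the Baidu Map API
from bs4 import BeautifulSoup  # parse the returned HTML
```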
2. Define the crawler function
Open the web page we want to crawl in a browser.
Right-click the page and view its source; the city list is recorded inside p tags, so we can locate the p tags to get the corresponding text.
In this example we scrape the first- and second-tier cities; the crawler function code is as follows:
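The original crawler code was an image and is lost; a sketch under the assumptions stated above (city names sit in p tags, whitespace-separated; the listing URL and page layout must be adapted to the actual target page):

```python
import requests
from bs4 import BeautifulSoup

def parse_city_list(html):
    """Extract city names from the <p> tags of the listing page.

    Assumes each <p> holds whitespace-separated city names, as observed
    when viewing the page source.
    """
    soup = BeautifulSoup(html, "html.parser")
    cities = []
    for p in soup.find_all("p"):
        cities.extend(p.get_text().split())
    return cities

def get_city_list(url):
    """Fetch the listing page and return the city names it contains."""
    resp = requests.get(url, timeout=10)
    resp.encoding = resp.apparent_encoding  # cope with GBK-encoded pages
    return parse_city_list(resp.text)
```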
3. Define the address-to-longitude-and-latitude function
First, observe the format in which the API call returns the longitude and latitude for an address:
It can be seen that the longitude and latitude information is stored in the p tag. Next, define the address-conversion function by parsing the result of the API call:
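The original conversion code is lost. The article scrapes the browser-rendered API response; the sketch below instead requests JSON output directly, which avoids the HTML-parsing step. The endpoint and response layout follow the Baidu v3 geocoding API as an assumption; verify both against the current Baidu documentation before relying on them.

```python
import requests

def parse_location(data):
    """Pull (lng, lat) out of a decoded geocoding response."""
    if data.get("status") != 0:          # non-zero status = lookup failed
        return None
    loc = data["result"]["location"]
    return loc["lng"], loc["lat"]

def addr_to_location(address, ak):
    """Query the Baidu geocoding service for `address` using key `ak`."""
    url = "https://api.map.baidu.com/geocoding/v3/"
    params = {"address": address, "output": "json", "ak": ak}
    resp = requests.get(url, params=params, timeout=10)
    return parse_location(resp.json())
```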
4. Define the program entry
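The entry point simply wires the crawler and converter together. A sketch, with the two functions from the earlier steps stubbed out so the snippet stands on its own; the URL and AK are placeholders:

```python
def get_city_list(url):
    """Stub for the crawler function defined in step 2."""
    return ["Beijing", "Shanghai"]

def addr_to_location(address, ak):
    """Stub for the address-to-coordinates function from step 3."""
    return (116.40, 39.90)

if __name__ == "__main__":
    ak = "your-ak-here"  # the AK applied for above
    for city in get_city_list("https://example.com/city-list"):
        print(city, addr_to_location(city, ak))
```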
Finally, some of the returned results are as follows:
Did you find the content above helpful? If you want to learn more or read more related articles, please follow the industry information channel. Thank you for your support.