2025-03-01 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report
Today I'd like to share with you how to write simple simulated-login code with Python Scrapy. The content is detailed and the logic is clear. I hope you gain something from reading this article; let's take a look.
1. The requests module: carry cookies in the request directly. Find the login URL and send a POST request; the session stores the returned cookies.
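A minimal sketch of the first approach. The URLs, form-field names, and the `cookies_from_header` helper are illustrative assumptions, not any specific site's API; a `requests.Session` sends the login POST and then carries the stored cookies on later requests automatically:

```python
import requests


def login_and_fetch(login_url, profile_url, credentials):
    """POST the credentials; the Session stores the returned cookies
    and sends them automatically on the follow-up request."""
    session = requests.Session()
    session.post(login_url, data=credentials)
    return session.get(profile_url)


def cookies_from_header(cookie_header):
    """Alternative: reuse cookies copied from the browser's own
    'Cookie' request header by splitting them into a dict."""
    return dict(pair.split('=', 1) for pair in cookie_header.split('; '))
```

With cookies copied from a logged-in browser, `requests.get(url, cookies=cookies_from_header('sessionid=abc; token=xyz'))` fetches the page as that user.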
2. Selenium: drive a real browser, which handles cookies automatically. Locate the appropriate input tags, enter the credentials, and click "Log in".
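A minimal sketch of the Selenium approach, assuming selenium 4.x with a Chrome driver on PATH; the field names `login` and `password` are hypothetical. The browser keeps the cookies itself, and a small helper flattens them into the dict shape that requests or Scrapy expect:

```python
def login_with_browser(login_url, username, password):
    """Drive a real browser; it handles cookies automatically.
    (Assumes selenium 4.x and a Chrome driver on PATH.)"""
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get(login_url)
    driver.find_element(By.NAME, 'login').send_keys(username)
    driver.find_element(By.NAME, 'password').send_keys(password)
    driver.find_element(By.CSS_SELECTOR, 'input[type=submit]').click()
    return driver  # driver.get_cookies() now holds the login cookies


def cookies_for_scrapy(selenium_cookies):
    """Flatten selenium's list of cookie dicts into a name -> value dict."""
    return {c['name']: c['value'] for c in selenium_cookies}
```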
3. Scrapy: carry cookies directly, or find the login URL and send a POST request that stores the cookies.
```python
# -*- coding: utf-8 -*-
import re

import scrapy


class GithubLoginSpider(scrapy.Spider):
    name = 'github_login'
    allowed_domains = ['github.com']
    start_urls = ['https://github.com/login']

    def parse(self, response):
        # Read the hidden form fields, then POST the login form to get cookies
        authenticity_token = response.xpath('//input[@name="authenticity_token"]/@value').extract_first()
        utf8 = response.xpath('//input[@name="utf8"]/@value').extract_first()
        commit = response.xpath('//input[@name="commit"]/@value').extract_first()
        form_data = {
            'login': 'pengjunlee@163.com',
            'password': '123456',
            'webauthn-support': 'supported',
            'authenticity_token': authenticity_token,
            'utf8': utf8,
            'commit': commit,
        }
        yield scrapy.FormRequest('https://github.com/session', formdata=form_data,
                                 callback=self.after_login)

    def after_login(self, response):
        # Verify the login: this text only appears on the logged-out page
        print(re.findall('Learn Git and GitHub without any code!', response.body.decode()))
```
Further notes:
In the second example below, parse_login is the callback that runs after the form is submitted; it verifies that the login succeeded. Here we simply search the response for the words "Welcome Liu" to prove we are logged in.

The key point is yield from super().start_requests(): once the login succeeds, the spider carries the logged-in cookies and requests the addresses in start_urls.

This way, the response received after a successful login can be handled directly in parse.
```python
# -*- coding: utf-8 -*-
import scrapy
from scrapy import FormRequest


class ExampleLoginSpider(scrapy.Spider):
    name = 'login_'
    allowed_domains = ['example.webscraping.com']
    start_urls = ['http://example.webscraping.com/user/profile']
    login_url = 'http://example.webscraping.com/places/default/user/login'

    def parse(self, response):
        # Runs on the logged-in profile page
        print(response.text)

    def start_requests(self):
        # Go to the login page first instead of start_urls
        yield scrapy.Request(self.login_url, callback=self.login)

    def login(self, response):
        formdata = {'email': 'liushuo@webscraping.com', 'password': '12345678'}
        # from_response picks up the form's hidden fields automatically
        yield FormRequest.from_response(response, formdata=formdata,
                                        callback=self.parse_login)

    def parse_login(self, response):
        if 'Welcome Liu' in response.text:
            # Logged in: request the addresses in start_urls with the
            # session cookies already set
            yield from super().start_requests()
```

That is all the content of this article on how to write simple simulated-login code for Python Scrapy. Thank you for reading! I hope you have gained something from it.