2025-01-18 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)06/01 Report--
Preparatory work
Install scrapyd: pip install scrapyd
Install scrapyd-client: pip install scrapyd-client
Install curl: download curl.exe from http://ono60m7tl.bkt.clouddn.com/curl.exe, then add its directory to the PATH environment variable.
Start deployment
Modify the scrapy.cfg file in the scrapy project directory as follows:
[deploy:JD_Spider]            # add a target name after "deploy:"
url = http://localhost:6800/  # uncomment this line by removing the leading "#"
project = JD                  # the project name; the default works, or you can change it
Open a terminal in any directory, run scrapyd, and check that it starts successfully. If it does, open http://localhost:6800 and confirm the page displays correctly. The JD project only appears there after deployment, so don't worry if it isn't listed yet.
Run the following command from the root directory of the project: python E:\python2.7\Scripts\scrapyd-deploy target -p project, where E:\python2.7\Scripts\ is the Scripts folder under your Python installation directory (be sure to include the python prefix). Here target is the name set after deploy: in scrapy.cfg (JD_Spider) and project is the project name (JD), so the complete command is python E:\python2.7\Scripts\scrapyd-deploy JD_Spider -p JD. Once deployed, JD will appear on the web page.
To verify success, check whether your project name is displayed on the web page, or run python E:\python2.7\Scripts\scrapyd-deploy -l in the project root to list all your deploy targets.
Start the crawler: curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider_name, where project is the project name and spider_name is the name defined in your crawler. The complete command for my example is: curl http://localhost:6800/schedule.json -d project=JD -d spider=spider, which returns something like:
{"status": "ok", "jobid": "3013f9d1283611e79a63acb57dec5d04", "node_name": "DESKTOP-L78TJQ7"}
The jobid here is important: it is needed later to cancel the crawler.
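Since the jobid is needed for cancellation, it is worth extracting it from the JSON response programmatically rather than copying it by hand. A minimal Python sketch using only the standard library (the response string is the example shown above):

```python
import json

# Example response body returned by schedule.json (taken from the article)
response_text = '{"status": "ok", "jobid": "3013f9d1283611e79a63acb57dec5d04", "node_name": "DESKTOP-L78TJQ7"}'

response = json.loads(response_text)
if response["status"] == "ok":
    jobid = response["jobid"]  # save this: cancel.json requires it
    print(jobid)
```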
Cancel the crawler: curl http://localhost:6800/cancel.json -d project=myproject -d job=jobid, using the jobid mentioned above. For my example: curl http://localhost:6800/cancel.json -d project=JD -d job=3013f9d1283611e79a63acb57dec5d04, which returns a status like:
{"status": "ok", "prevstate": "running", "node_name": "DESKTOP-L78TJQ7"}
List projects: curl http://localhost:6800/listprojects.json returns all the projects you have deployed.
Delete a project: curl http://localhost:6800/delproject.json -d project=myproject
List versions: curl http://localhost:6800/listversions.json?project=myproject, where project is the project name set in scrapy.cfg.
List crawlers: curl http://localhost:6800/listspiders.json?project=myproject, where project is the project name set in scrapy.cfg.
List jobs: curl http://localhost:6800/listjobs.json?project=myproject, where project is the project name set in scrapy.cfg.
Delete a version: curl http://localhost:6800/delversion.json -d project=myproject -d version=r99, where version is your project's version number; check it (for example with listversions.json) before deleting.
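All of the curl commands above call the same scrapyd JSON API, so they can be wrapped in a small helper. A minimal sketch using only the Python standard library, assuming a scrapyd instance at http://localhost:6800; the class and method names are my own for illustration, not part of scrapyd:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

class ScrapydClient:
    """Thin wrapper over the scrapyd JSON endpoints shown above (hypothetical helper)."""

    def __init__(self, base_url="http://localhost:6800"):
        self.base_url = base_url.rstrip("/")

    def _get(self, endpoint, **params):
        # GET endpoints: listprojects.json, listversions.json, listspiders.json, listjobs.json
        url = f"{self.base_url}/{endpoint}"
        if params:
            url += "?" + urlencode(params)
        with urlopen(url) as resp:
            return json.load(resp)

    def _post(self, endpoint, **params):
        # POST endpoints: schedule.json, cancel.json, delproject.json, delversion.json
        data = urlencode(params).encode()
        with urlopen(f"{self.base_url}/{endpoint}", data=data) as resp:
            return json.load(resp)

    def schedule(self, project, spider):
        return self._post("schedule.json", project=project, spider=spider)

    def cancel(self, project, job):
        return self._post("cancel.json", project=project, job=job)

    def list_jobs(self, project):
        return self._get("listjobs.json", project=project)
```

For example, client.schedule("JD", "spider") is equivalent to the curl schedule command above and returns the same JSON, including the jobid.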