This article introduces how to keep search-engine spiders from repeatedly crawling the same .NET pages. Many people run into this problem in real projects, so let the editor walk you through how to deal with it. I hope you read it carefully and get something out of it!
Reason:
In the early days, search-engine spiders were not very sophisticated: poorly structured site programs and dynamic URLs made it easy for a spider to get lost or crawl the same content repeatedly.
To avoid that, spiders tend not to read dynamic URLs, especially URLs that carry a "?" query string.
Solution:
1): configure the route
The code is as follows:
routes.MapRoute("RentofficeList",
    "rentofficelist/{AredId}-{PriceId}-{AcreageId}-{SortId}-{SortNum}.html",
    new { controller = "Home", action = "RentOfficeList" },
    new[] { "Mobile.Controllers" });
The first parameter is the route name.
The second parameter is the URL pattern of the route; the segments are separated as {}-{}.
The third parameter is an object that contains the default route values.
The fourth parameter is the set of namespaces for the application (see the registration sketch below).
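For context, here is a minimal sketch of where this call would typically live. The RouteConfig class and the catch-all default route are assumptions based on a standard ASP.NET MVC project, not taken from the article; the custom route must be registered before the default route so it is matched first.
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Custom static-looking route, registered ahead of the default route
        routes.MapRoute("RentofficeList",
            "rentofficelist/{AredId}-{PriceId}-{AcreageId}-{SortId}-{SortNum}.html",
            new { controller = "Home", action = "RentOfficeList" },
            new[] { "Mobile.Controllers" });

        // Standard catch-all default route
        routes.MapRoute("Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}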
2): set up the links
For the default sort, write out the parameter assignments in turn, following the URL pattern above (see the sketch below).
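As a small illustration of this step (the sample values are assumptions, not from the article), a link matching the route can be built from a controller or Razor view with UrlHelper.RouteUrl, filling every route value in the order of the pattern and using 0 for "no filter" / default sort:
// Sketch: generate a link that matches the RentofficeList route;
// 0 means "no filter" / default sort for that segment.
string listUrl = Url.RouteUrl("RentofficeList",
    new { AredId = 3, PriceId = 0, AcreageId = 0, SortId = 0, SortNum = 0 });
// listUrl is "/rentofficelist/3-0-0-0-0.html" (assuming the app is hosted at the site root)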
3): get parameters
The code is as follows:
int areaId = GetRouteInt("AredId"); // get the parameter from the route

/// <summary>
/// Get an integer value from the route data
/// </summary>
/// <param name="key">route key</param>
/// <param name="defaultValue">default value</param>
/// <returns></returns>
protected int GetRouteInt(string key, int defaultValue)
{
    // Fall back to defaultValue when the route value is missing or not a valid integer
    object value = RouteData.Values[key];
    int result;
    if (value != null && int.TryParse(value.ToString(), out result))
    {
        return result;
    }
    return defaultValue;
}

/// <summary>
/// Get an integer value from the route data
/// </summary>
/// <param name="key">route key</param>
/// <returns></returns>
protected int GetRouteInt(string key)
{
    return GetRouteInt(key, 0);
}
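To show how the helpers above fit together, here is a sketch of the RentOfficeList action reading every segment of the route. The controller body is an assumption; only GetRouteInt, the route keys, and the controller/action names come from the article.
using System.Web.Mvc;

namespace Mobile.Controllers
{
    public class HomeController : Controller
    {
        // Assumes the GetRouteInt(string) / GetRouteInt(string, int) helpers shown
        // above are defined on this controller or a shared base controller.
        public ActionResult RentOfficeList()
        {
            int areaId    = GetRouteInt("AredId");
            int priceId   = GetRouteInt("PriceId");
            int acreageId = GetRouteInt("AcreageId");
            int sortId    = GetRouteInt("SortId");
            int sortNum   = GetRouteInt("SortNum");

            // Query the office list with these filter values (data access omitted)
            return View();
        }
    }
}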
Following the three steps above, the URL that is displayed is:
http://localhost:3841/rentofficelist/3-0-0-0-0.html
In this way you avoid exposing dynamic query-string parameters, and every page the site presents looks like a static page.
That is the end of "how to keep search-engine spiders from repeatedly crawling the same .NET pages". Thank you for reading. If you want to learn more about the industry, keep following the site; the editor will keep putting out practical, high-quality articles for you!