How to optimize the robots settings in robots.txt
This article shows you how to optimize the settings in robots.txt. It is concise and easy to follow, and I hope the detailed walkthrough gives you something useful to take away.
1. Do you know what robots.txt actually means?
2. Did you know that 404 pages should be blocked?
3. Does every page of your company site link to an "About Us" page, and did you know such pages should be blocked?
4. Did you know that a well-set robots.txt helps improve the overall weight of a website?
5. Do you want to know the basic way to write robots.txt?
6. How should robots.txt be written for the best optimization?
1. Robots.txt is the file that carries the robots protocol, roughly "the public agreement for search engines" in Chinese. When a search engine visits your website, it reads robots.txt first and then crawls your pages according to the rules written in it.
2. Every site should have a 404 page. Without one, you 1) reduce the site's PV, 2) lose traffic, and 3) hurt how search engines view the site, because the user experience is poor.
Every site accumulates many error URLs, and each of them should return the same 404 page. If you let search engines index all of these identical 404 pages, the site ends up assigning weight to them. Is that worth it? No, so we should block 404 pages.
3. The navigation bar of a company site often links to pages such as the blog introduction, "Contact September", and "About Us". These pages appear site-wide, so almost every page on the site passes weight to them, but are they worth that weight? These "unimportant content" pages should also be blocked so that search engines do not crawl them, as in the example below.
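For instance, a minimal sketch, assuming the pages live at /about-us/ and /contact/ (adjust the paths to whatever your own site actually uses):
User-agent: *
Disallow: /about-us/
Disallow: /contact/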
4. From the previous two points we can draw a conclusion: if a website has too many irrelevant pages indexed, pages you do not actually want users to land on from a search engine, the site ends up assigning weight to these "meaningless" pages, and that lowers the weight of the site as a whole.
In a nutshell, the website is dragged down by these junk pages.
5. The basic way to write robots.txt.
Create a plain-text file named robots.txt (on the desktop, for example).
Write the following, in order:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /feed
Disallow: /comments/feed
Disallow: /trackback
Disallow: /guestbook
User-agent: # specifies which search engine the rules apply to. * means all of them; to target only Baidu's spider, write baiduspider instead.
Disallow: # marks a path as forbidden to index.
Disallow: / # forbids indexing of the entire site: write Disallow: followed directly by a slash.
Disallow: /your-backend-directory/ # the format for blocking a directory, here the site's admin/backend directory.
Disallow: /404.html # blocks the 404 page.
In robots.txt, extra spaces and blank lines are ignored by crawlers; what matters is the directives themselves, written one per line. The spacing is only there for our own readability.
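If you want to confirm that your rules behave as intended, one quick way (not part of the original article, just a testing aid) is Python's built-in urllib.robotparser:
from urllib.robotparser import RobotFileParser

# The rules from the example above, given as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /404.html",
]

rp = RobotFileParser()
rp.parse(rules)

# A blocked path returns False, an ordinary page returns True.
print(rp.can_fetch("*", "http://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "http://example.com/hello-world/"))          # True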
6. How should robots.txt be written for the best optimization?
After reading the first five points, you should already have a good idea.
1) Block junk pages that are useless to the site, such as 404 pages.
2) Think about which pages must exist on the site but do not need to be reached from a search engine, such as Fuzhou SEO's "About Us" page; those do not need to be indexed. A combined example is sketched below.
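Putting the points above together, a sketch of a complete robots.txt for a WordPress-style site might look like this (the /about-us/ and /404.html paths are assumptions; substitute the paths your own site uses):
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /feed
Disallow: /comments/feed
Disallow: /trackback
Disallow: /guestbook
Disallow: /404.html
Disallow: /about-us/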
That is how to optimize the settings in robots.txt. I hope you have picked up some useful knowledge or skills; if you want to learn more, you are welcome to follow the industry information channel.