

What are the specific reasons why site aggregation pages and on-site search pages have become targets of Baidu search penalties?

2025-01-28 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/01 Report --

This article mainly presents the specific reasons why site aggregation pages and on-site search pages have become targets of Baidu search penalties. The content is easy to understand and clearly organized, and we hope it helps resolve your doubts. Now let the editor lead you through the details.

The Baidu web search anti-cheating team recently found that some websites traverse popular keywords to mass-produce on-site search result pages in order to capture search engine traffic. Much of this content is irrelevant, which seriously damages the search user experience and encroaches on the traffic of high-quality websites in the corresponding fields. Baidu stated that it will deal with such websites severely, and that webmasters with this problem should rectify and adjust in time.

After LEE's announcement, an SEO veteran reportedly had a brief exchange with Baidu's LEE. In that exchange, LEE said the crackdown targets obviously junk pages; the on-site search pages of some vertical or industry sites are actually of good quality and are not within its scope.

The practice of using thesaurus-driven TAG aggregation pages or on-site search pages to grab search traffic has existed for a long time, and Baidu officials have stated on various occasions that it is obvious cheating and will be cracked down on.

But there is nothing inherently wrong with aggregation pages and on-site search pages; they exist to serve user needs. So why have the aggregation or on-site search pages produced by so many websites become targets of Baidu penalties? Let's look at the specific reasons:

I. Normalization of the keywords themselves

Some websites do not distinguish between types of keywords; everything is imported directly into the site and pages are generated from a single template. For example, a site may rigidly import "coffee machine", "coffee machine picture", "coffee machine brand", and so on, all generating the same kind of search page. But those pages contain only content titles and publication dates — a plain search listing — and show users nothing that actually answers "coffee machine picture" or "coffee machine brand".

When the volume of such mismatched keywords on a site reaches a certain level, the whole class of pages becomes, viewed at the macro level, junk pages: they exist only to defraud search traffic and provide no corresponding content for searchers, which seriously degrades the quality of search results. Behavior with this kind of impact is bound to be severely cracked down on by search engines.
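As a rough illustration of the keyword triage this section implies, here is a minimal sketch that classifies each imported keyword by intent and routes it to a template capable of serving that intent. The suffix rules and template names are hypothetical assumptions, not part of any Baidu guideline or real CMS:

```python
# Minimal sketch: classify imported keywords by intent before generating
# pages, instead of pushing everything through one generic search template.
# The suffix rules and template names below are illustrative assumptions.

INTENT_RULES = {
    "image": ("picture", "photo", "image"),  # e.g. "coffee machine picture"
    "brand": ("brand", "ranking", "top"),    # e.g. "coffee machine brand"
}

def classify_keyword(keyword: str) -> str:
    """Return a coarse intent label for a raw imported keyword."""
    for intent, suffixes in INTENT_RULES.items():
        if any(suffix in keyword for suffix in suffixes):
            return intent
    return "generic"  # plain informational query

def pick_template(keyword: str) -> str:
    """Route each keyword type to a template that can actually serve it."""
    templates = {
        "image":   "image_gallery.html",   # shows pictures, not text titles
        "brand":   "brand_list.html",      # shows brand comparisons
        "generic": "search_results.html",  # ordinary title/date listing
    }
    return templates[classify_keyword(keyword)]

for kw in ["coffee machine", "coffee machine picture", "coffee machine brand"]:
    print(kw, "->", pick_template(kw))
```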

II. Relevance between the keywords and the website

Some websites do not analyze the relevance between keywords and the site at all, and will funnel any word onto it. For example, a mobile-phone website may import real-estate and tourism keywords, a travel website may introduce keywords about beauty, cars, or other industries, and even some otherwise legitimate content sites have imported completely unrelated keywords such as "Thai ladyboy" and "Vietnamese bride trading".

When the positioning of the website itself has nothing to do with these keywords, it is even less possible for the site to hold relevant content. No matter what means are used to build such pages, their quality cannot improve, and they are bound to be rejected and severely cracked down on by search engines.
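To screen for this, one could run a cheap relevance check before admitting a keyword. The sketch below is an assumption-laden illustration: it builds the site's topic vocabulary from existing page titles (one possible source among many) and rejects keywords whose tokens barely overlap it; the 0.5 threshold is arbitrary:

```python
# Minimal sketch: screen imported keywords for topical relevance before
# letting them generate pages. The vocabulary source (existing page titles)
# and the acceptance threshold are assumptions for illustration.

def site_vocabulary(page_titles):
    """Build the site's topic vocabulary from its existing content titles."""
    vocab = set()
    for title in page_titles:
        vocab.update(title.lower().split())
    return vocab

def keyword_relevance(keyword, vocab):
    """Fraction of the keyword's tokens that appear in the site vocabulary."""
    tokens = keyword.lower().split()
    if not tokens:
        return 0.0
    return sum(t in vocab for t in tokens) / len(tokens)

titles = ["Beijing travel guide", "Best beaches for a summer travel trip"]
vocab = site_vocabulary(titles)

for kw in ["summer travel guide", "Vietnamese bride trading"]:
    score = keyword_relevance(kw, vocab)
    verdict = "accept" if score >= 0.5 else "reject"  # assumed threshold
    print(f"{kw!r}: {score:.2f} -> {verdict}")
```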

III. Imbalance between keyword volume and site content

Some websites do a good job on keyword types and on keyword-site relevance, but never check in advance whether the site's content capacity can support the keywords. For example, a site with only 60,000 content pages may import 1.2 million keywords. No matter how good the keyword quality is, and no matter how closely the keywords relate to the site's theme, it is impossible to fill 1.2 million pages from 60,000 pieces of content. If the generated pages are not simply empty ("no results found"), then at the macro level there must be a large number of duplicates, turning the whole class of pages into junk pages; it is only natural for them to be hit by search engines.
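The arithmetic behind this can be turned into a pre-flight check. The sketch below is a deliberate simplification (it compares averages and ignores that one content item can legitimately appear on several pages), and the 10-results-per-page floor is an assumed figure:

```python
# Minimal sketch of the pre-flight capacity check described above:
# before importing a keyword batch, compare the keyword count against
# the site's content pool. The minimum-results figure is an assumption.

def capacity_check(num_content_pages: int, num_keywords: int,
                   min_results_per_page: int = 10) -> bool:
    """Can the content pool plausibly fill this many keyword pages?

    With 60,000 pages and 1,200,000 keywords there are only
    60,000 / 1,200,000 = 0.05 items per keyword on average, so almost
    every generated page would be empty or a near-duplicate.
    """
    items_per_keyword = num_content_pages / num_keywords
    print(f"{items_per_keyword:.2f} content items per keyword on average")
    return items_per_keyword >= min_results_per_page

print(capacity_check(60_000, 1_200_000))  # False: far too many keywords
```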

IV. The quality of on-site search

Some websites solve all of the problems above and even verify, before releasing a keyword, that relevant content can be found for it — yet the quality of the on-site search itself is so poor that the content on the aggregated TAG pages or search pages ends up irrelevant to the keyword. This is not a rare problem; most websites have it. "Search" is genuinely hard: Baidu has been doing search for many years and its results still draw complaints, let alone the "search" built by companies that do not specialize in it.

But if you want to use "search" to accomplish anything, you must guarantee a minimum search quality: a search for "accounting training" should not return a pile of "snack training", "English training", "Yang Zi SEO training", and the like.

Most websites have neither a strong technical team nor solid third-party search technology, so the quality of their on-site search results will not be high, and TAG pages and search pages built on such low-quality on-site search will be nothing to praise. As a result, those classes of pages slide into the spam-page category, and the on-site search pages or TAG pages become targets for search engines.
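One pragmatic mitigation, if on-site search quality cannot be raised quickly, is to gate which generated pages search engines are allowed to index. The sketch below is an assumed quality gate, not an established API: it checks whether the query phrase actually appears in enough of a page's results before allowing the page to be indexed, and both thresholds are illustrative:

```python
# Minimal sketch of a quality gate for generated search/TAG pages: only
# expose a page to search engines if enough of its results actually match
# the query. The scoring rule and thresholds are illustrative assumptions.

def result_matches(query: str, title: str) -> bool:
    """Crude relevance test: does the whole query phrase appear in the title?"""
    return query.lower() in title.lower()

def should_index(query: str, result_titles: list[str],
                 min_results: int = 5, min_match_ratio: float = 0.6) -> bool:
    """Index the generated page only if it has enough relevant results."""
    if len(result_titles) < min_results:
        return False  # thin page: better left noindexed
    matches = sum(result_matches(query, t) for t in result_titles)
    return matches / len(result_titles) >= min_match_ratio

results = ["Snack training courses", "English training center",
           "Accounting training for beginners", "SEO training camp",
           "Accounting training exam tips"]
print(should_index("accounting training", results))  # False: only 2 of 5 match
```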

The above is the full content of "What are the specific reasons why site aggregation pages and on-site search pages have become targets of Baidu search penalties?" Thank you for reading! We hope this article helps you; if you want to learn more, welcome to follow the industry information channel!



