
What are the common misunderstandings of SEO

2025-01-15 Update From: SLTechnology News&Howtos


Shulou (Shulou.com) 06/02 Report --

The editor would like to share the common misunderstandings of SEO with you. Most people don't know much about this topic, so this article is shared for your reference. I hope you will gain a lot from reading it; let's dive in!

I. Content must be original

Regarding original content: one cannot call it bad, but it should not be pursued blindly either. While emphasizing fresh content, you should also learn to optimize the SEO quality of the page itself (reasonable page layout, combining images with text, and so on). Many SEO practitioners who insist on originality may have noticed this already: after producing so much original content, no inner page participates in the rankings. On the contrary, no matter how much original content gets indexed, it only lifts the ranking of the home page.

If yours is a small corporate website, originality alone won't increase the site's value, because in most cases it only optimizes your home page. For those who really run keyword optimization, originality and indexing matter, but basic SEO optimization matters more. You may insist on originality every day yet ignore the SEO core of page quality: image specifications, standardized TDK (title, description, keywords), correct use of H tags, appropriately bolded text, keyword frequency on the page, and so on. Even for large-site optimization, original content is a key point, but notice that no well-ranked site with millions of pages publishes a million pieces of original high-quality content; instead, it aggregates pages around keyword demand points, forming a huge network of internal pages. If you are still grinding out original articles, you might as well try content aggregation; the effect may well be better.
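
To make those on-page basics concrete, here is a minimal audit sketch in Python. The sample page and the specific checks are my own illustrative assumptions, not an official checklist: it flags a missing title or meta description, counts H1 tags, and reports images without alt text.

```python
# Rough on-page audit sketch covering the basics named above:
# TDK (title, description, keywords), H1 usage, and image alt text.
# The sample HTML and the checks are illustrative assumptions.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.has_description = False
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description" and a.get("content"):
            self.has_description = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not a.get("alt"):
            print("image missing alt text:", a.get("src"))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = ('<title>Widgets | Acme</title>'
        '<meta name="description" content="Acme widgets explained.">'
        '<h1>Widgets</h1><img src="w.png">')

audit = OnPageAudit()
audit.feed(page)
print("title set:", bool(audit.title.strip()))
print("meta description set:", audit.has_description)
print("exactly one h1:", audit.h1_count == 1)
```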

II. This backlink is not that backlink

Whether they have been on the job for a while or have just started, almost every SEO relies on practical platforms such as Aizhan ("love station") and Chinaz Webmaster Tools. Yet it is precisely these must-have tools that leave many friends with an ambiguous answer to the difference between external links and backlinks. As shown in the figure below:

As of submission time, the backlink count for Lu Songsong's blog is 26,700, and the search results matched by Webmaster Tools are exactly the same. Many SEO personnel, however, mistake this so-called backlink count for their own website's external links. To understand where this data comes from, you must first understand an advanced search operator: domain.

Simply understood, domain queries a site's backlink domains (also called the site's backlink domain names), and the search result count is the number of matching pages that mention the domain name (the same site can be counted multiple times).

If you take the backlink domains returned by a domain query as your own website's external links, your grasp of SEO basics is too one-sided: if the links I build are anchor-text links, a domain query cannot retrieve them at all.
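
A small sketch of the distinction (the sample page and the use of the domain name here are made up for illustration): a text-match query such as domain only finds the domain written out in visible text, whereas an anchor-text link keeps the URL inside an href attribute that a text match never sees.

```python
# Sketch: why a text-match query such as "domain:" cannot see
# anchor-text links. The sample page below is a made-up illustration.
import re

TARGET = "lusongsong.com"
page = ('<p>Read <a href="https://lusongsong.com/">Lu Songsong\'s blog</a> '
        'or type lusongsong.com into your browser.</p>')

# Real links pointing at the target, regardless of their visible anchor text:
hrefs = re.findall(r'href="([^"]*%s[^"]*)"' % re.escape(TARGET), page)

# What a plain text match sees: the domain written out in the visible text.
visible = re.sub(r"<[^>]+>", "", page)
mentions = visible.count(TARGET)

print("actual links to the site:", len(hrefs))  # 1: the anchor-text link
print("text-match 'backlinks':", mentions)      # 1: only the typed-out URL
```

The anchor-text link contributes to the first count but not the second; the plain-text mention does the opposite. That gap is exactly why domain results and real external links diverge.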

III. Rankings are unstable

Generally speaking, only two kinds of friends ask about this. One has never brushed clicks and therefore assumes the instability comes from black-hat behavior; the other stopped paying after brushing rankings, the rankings dropped, and so they look unstable. In fact, I contributed an article on Lu Songsong's blog about fast-ranking algorithms that explains this in great detail; you may not understand the technology, but you must understand the thinking. As for click-brushed rankings: if you can simulate genuinely realistic clicks, or set the software parameters properly, the rankings will be very stable. Of course, there is one more situation that makes rankings unstable: the site's foundation is weak and needs appropriate content updates and incoming external links to strengthen the pages' keyword weight. Once the site's weight stabilizes, your keyword rankings will become very stable.

IV. This weight is not that weight

Even after many years of doing SEO, you will still glance at a site's "weight" from time to time. In fact, every search engine has its own page-rating algorithm, covering a combination of SEO factors, and the final comprehensive score is the page-quality weight.

Despite this, many friends who exchange links still look at the so-called weight value to decide whether to make the exchange.

First of all, understand that the "weight" shown by platforms such as Aizhan and Webmaster Tools is only a so-called weight value from 1 to 10, generated by a series of traffic estimates based on indexed keyword rankings. This kind of weight is meaningless, whether for your own website or for judging a link exchange: a brand word with no real search volume can be brushed to a Baidu index of 10,000 at any time, instantly producing a "weight 5", while some heavily searched words have huge search volume without appearing in the Baidu index at all. As shown in the figure below:
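
As a rough sketch of how such a tool-side weight arises (the click-through rates and tier thresholds below are assumptions for illustration, not any platform's real tables), estimated traffic from indexed keyword rankings is simply bucketed into a tier:

```python
# Sketch of how third-party "weight" is produced: traffic estimated from
# indexed keyword rankings is bucketed into a 1-10 tier. The click-through
# rates and thresholds are illustrative assumptions, not real tool data.
CTR_BY_RANK = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}  # assumed CTRs

def estimated_traffic(keywords):
    """keywords: list of (baidu_index, rank) pairs for indexed keywords."""
    return sum(index * CTR_BY_RANK.get(rank, 0.02) for index, rank in keywords)

def weight_tier(traffic):
    thresholds = [1, 100, 500, 1000, 5000, 10000, 50000, 100000, 500000, 1000000]
    return sum(1 for floor in thresholds if traffic >= floor)  # 0..10

# A brand word brushed to a 10,000 index, ranking #1:
print(weight_tier(estimated_traffic([(10000, 1)])))          # jumps several tiers
# A modest real site with two genuinely ranked words:
print(weight_tier(estimated_traffic([(300, 2), (500, 4)])))  # stays low
```

One brushed brand word with a faked index is enough to jump several tiers, which is exactly why the number is so easy to game.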

As a result, many friends give priority to high-index keywords when optimizing, while ignoring words that are genuinely valuable, convertible, and heavily searched. Whether a website really has high weight comes down to at least the following points:

1. Age of domain name

It is fair to say that a site's domain age accounts for a core share of SEO ranking results (it is worth more than any external link or content, because site weight really means domain weight). The advantage of an old site is self-evident: even if you have never done SEO, a site that has been optimized and online for several years beats any new site, because the search engine no longer needs to put it through the evaluation it applies to new sites.

2. Page update frequency

Page update frequency does not determine a site's weight, but it is a very good factor when assessing a link exchange. What is the purpose of page updates? Nothing more than attracting spiders to crawl frequently; a frequently updated site gets crawled frequently, and exchanging links with such a site in turn helps spiders crawl your own site.

3. Inner page ranking

Whether a site's weight is high can be seen from its inner-page rankings. For large websites in particular, the domain score is extremely high: even if an inner page has no external links at all, it will quickly be crawled, indexed, and begin to participate in the rankings.

V. Keyword density must stay between 2% and 8%

I don't know who first posted this misunderstanding; at least I have never seen any search engine say so. Although 2%-8% is a reference value, it hands many SEO novices a very direct big stick, warning them to chase that density ratio deliberately whenever they optimize. At the same 5% keyword density, some sites may be cheating and some are not; at the same 15%, likewise. The core measure of whether a page is stuffing keywords is definitely not the keyword density but how the keyword frequency is distributed across the page's own structure.

Many friends know how to query keyword density on platforms such as Aizhan or Webmaster Tools, but if you look carefully, the densities the platforms report differ considerably; dig into it and the cause lies in how each platform processes the page data. To understand keyword density, first learn how it is calculated for a web page. Below, Aizhan and Webmaster Tools serve as examples. As shown in the figure below:

From the figure above, the keyword "Lu Songsong" on Lu Songsong's blog has a density of 0.83% and 1.2% respectively. Although the two platforms compute different values, the biggest difference lies in how each platform handles the keyword data. Neither Aizhan nor Webmaster Tools calculates the page's keyword density directly; both first strip all HTML element code and then count the total characters of the remaining page text.

Whether it is Aizhan, Webmaster Tools, or a similar platform, the calculation formula itself is the same; the difference in the resulting density comes from differences in the keyword data after crawling. The formula is as follows:

page keyword density (percentage) = total keyword character length (keyword string length × occurrence count) / total page text length

Aizhan's data: ≈ 0.0083 (about 0.83%) = 96 characters (3 characters × 32 occurrences) / 11,584 characters

Webmaster Tools' data: ≈ 0.0117 (about 1.2%) = 165 characters (3 characters × 55 occurrences) / 14,070 characters
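
A minimal sketch of that formula in Python (the tag stripping is deliberately crude, and the toy page is made up; the second print reuses the Aizhan figures quoted above):

```python
# Sketch of the density formula above: strip the HTML element code first,
# then density = (keyword length * occurrences) / total page text length.
import re

def keyword_density(html: str, keyword: str) -> float:
    text = re.sub(r"<[^>]+>", "", html)   # remove all HTML element code
    occurrences = text.count(keyword)
    return len(keyword) * occurrences / len(text) if text else 0.0

page = "<h1>SEO basics</h1><p>Good SEO starts with page-level SEO.</p>"
print(f"{keyword_density(page, 'SEO'):.2%}")  # toy page, so the value is high

# The Aizhan figures quoted above: 3 characters * 32 occurrences / 11,584
print(f"{3 * 32 / 11584:.2%}")  # -> 0.83%
```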

A web page itself is composed of multiple DIV blocks, commonly three: header, middle, and footer. The header can hold a top bar, navigation, and so on; the middle contains multiple DIV layers depending on the site type; the footer mostly consists of bottom navigation and bottom links. Suppose your page, like Lu Songsong's blog, contains the keyword 32 times: if you cram all 32 into one block and bold them, not just a search engine but even an ordinary user can tell the site is deliberately stuffing keywords. So the density itself does not matter; what matters is distributing the keyword frequency effectively and naturally, even if you exceed 8%. Such a page will not be flagged by search engines for keyword stuffing. Yet most SEO practitioners never think about this deliberately, because the tools hand them a ready-made density calculation.

VI. This algorithm is not that algorithm

When it comes to algorithms, many friends fixate on the ones search engines publish, yet very few of those are algorithms you truly understand. Baidu, for example, has the so-called Blue Sky algorithm, Green Radish algorithm, Ice Bucket algorithm, and so on. Whatever the algorithm, the announcement always amounts to a wall of words that leaves all SEO practitioners daydreaming. The algorithms that really matter are nearly universal across search engines: HITS and HillTop for links, TF-IDF for keywords, document retrieval models for web pages, and so on. If you can wield these skillfully and freely, it is not hard to see which way search engine rankings point; if you blindly chase the publicized, reverie-inducing ones instead, the mystery behind the results pages stays out of reach.
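
For a taste of those universal algorithms, here is a bare-bones TF-IDF sketch over a made-up three-document corpus. Note how a term that appears in most documents scores near zero no matter how often one page repeats it, while a rarer term scores higher:

```python
# Minimal TF-IDF sketch over a toy corpus (illustrative documents only).
import math

docs = [
    "seo ranking seo content",
    "seo links",
    "recipes and cooking",
]

def tf_idf(term: str, doc: str, corpus: list) -> float:
    words = doc.split()
    tf = words.count(term) / len(words)               # term frequency in this doc
    df = sum(1 for d in corpus if term in d.split())  # docs containing the term
    idf = math.log(len(corpus) / (1 + df))            # +1 smoothing
    return tf * idf

print(tf_idf("seo", docs[0], docs))      # 0.0: common across docs, idf collapses
print(tf_idf("ranking", docs[0], docs))  # ~0.10: rarer term weighs more
```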

VII. The index determines keyword difficulty

This misunderstanding should arguably come first, but I have saved it for last. Whether at Party-B SEO marketing agencies or among individual SEO personnel, the vast majority of practitioners treat a keyword's Baidu Index as the core standard of keyword optimization difficulty. Real keyword difficulty involves at least four categories, and the index is the least valuable of them: in essence, a keyword's search index only indicates the term's periodic popularity and cannot reflect its difficulty. The four categories, from most important to least:

1. Keyword length

Keyword length actually reflects optimization difficulty best and should come first. The shorter the keyword, the harder it is to optimize: compare optimizing a ten-character keyword with a one-character one, and the latter is obviously much harder, even with no index and little competition among ranking pages, because the shorter the keyword, the more user needs it has to cover. Searching for "iphone" may have to satisfy price, models, vulnerabilities, release dates, the latest news, and so on; searching for "iphone7 price" only requires the page to state the iPhone 7's price as clearly as possible.

2. Number of search results

The difficulty of optimizing a keyword can often be judged from its search result count, as shown in the figure below:

As the image above shows, "iphone6s" returns about 24 million related results, meaning that of all the pages crawled by the search engine, 24 million in the collection contain the word iphone6s, while "iphone7" returns 15.7 million. From this pair of numbers, iphone6s is clearly harder to optimize than iphone7: one word must climb into the top 100 out of a collection of 24 million documents, while the other competes against only 15.7 million.

3. Number of competing home pages

Without doubt, the number of competing home pages is the key measure for valuable, convertible buzzwords. The more home pages ranking for a word (independent root-domain home pages), the greater the competition; even a word with no index to speak of can still be very difficult. If 8 of the 10 first-page positions for a machinery-industry keyword are root-domain home pages, then ranking your own root-domain home page will undoubtedly cost far more time and energy than an ordinary unpopular word would.

4. Keyword index

The keyword index is the weakest measure of optimization difficulty and carries little reference value. At most, it reflects a keyword's periodic popularity (including conversion value, trending searches, and so on).

The above is the full content of "What are the common misunderstandings of SEO". Thank you for reading! I hope what has been shared here helps you; if you want to learn more, welcome to follow the industry information channel!
