This article mainly shows "how to analyze a website from the perspective of SEO". The content is simple and clear, and I hope it helps resolve your doubts as we study the topic together.
I. Essential basic optimization
From an SEO point of view, the most basic optimizations are the following.
1. Canonicalization check
301 canonicalization unifies all of a site's URL variants under one address.
For example:
www.xxx.com and www.xxx.com/index.html
Both URLs reach the same page, but keeping both accessible splits the page's weight between them, so unifying them is a necessary optimization.
To set this up, go to the control panel of your hosting space, set index.html as the default document, and 301-redirect the duplicate URL to the canonical one; a quick check is sketched below.
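The following is a minimal sketch of that check, assuming the third-party requests library is installed and using www.xxx.com as a placeholder domain taken from the example above:

```python
# Sketch: verify duplicate URLs 301-redirect to the canonical address.
# www.xxx.com is a placeholder; substitute your own domain.
import requests

CANONICAL = "https://www.xxx.com/"
VARIANTS = [
    "https://www.xxx.com/index.html",  # duplicate entry page
    "http://www.xxx.com/",             # non-HTTPS variant
]

for url in VARIANTS:
    # Do not follow redirects: we want to see the raw status code.
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location.rstrip("/") == CANONICAL.rstrip("/")
    print(url, "->", r.status_code, location, "OK" if ok else "NOT CANONICALIZED")
```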
2. Robots settings
robots.txt tells spiders which directories may be crawled and which may not; the finer-grained the rules, the more closely spiders will follow your instructions. Here's an example: if the site publishes article information, member pages are of no use to search engines, so robots.txt can block spiders from fetching the members' PHP files, saving crawler resources for the pages that matter.
How a site's robots.txt is written also reflects the SEO skill of whoever built it.
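A quick way to audit this is Python's standard-library robots.txt parser; the sketch below assumes www.xxx.com and the two paths are placeholders for your own article and member URLs:

```python
# Sketch: confirm member pages are blocked while articles stay crawlable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.xxx.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in ("/article/123.html", "/member/login.php"):
    allowed = rp.can_fetch("Baiduspider", "https://www.xxx.com" + path)
    print(path, "crawlable" if allowed else "blocked")
```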
3. 404 page check
A 404 page is required. It can generally be tested by taking a correct URL and appending some random characters:
www.abc.com is the correct URL
www.abc.com/fer is a random URL.
Then see whether a 404 page has been set up. It's that simple; a scripted version of the test follows.
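Here is a minimal sketch of that test, assuming requests is installed and using www.abc.com as the placeholder URL from the text; it also catches the "soft 404" case where a missing page wrongly answers 200:

```python
# Sketch: append random characters to a working URL and verify the
# server answers with a real 404 status, not a "soft 404" 200 page.
import random
import string

import requests

base = "https://www.abc.com/"
junk = "".join(random.choices(string.ascii_lowercase, k=8))
r = requests.get(base + junk, timeout=10)

if r.status_code == 404:
    print("OK: the server returns a real 404 for missing pages")
else:
    print(f"Problem: got {r.status_code} instead of 404 (possible soft 404)")
```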
4. Space stability
The next thing to test is the stability of the server space. Imagine a URL that takes half a day to open: no visitor can bear that, and spiders are the same. You can test this by visiting the site yourself and by running speed tests with webmaster tools.
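Webmaster tools give fuller data, but a rough latency probe is easy to script; this sketch assumes requests is installed and uses www.xxx.com as a placeholder:

```python
# Sketch: fetch the homepage a few times and report response latency.
import requests

URL = "https://www.xxx.com/"  # placeholder domain
timings = []
for _ in range(5):
    r = requests.get(URL, timeout=30)
    timings.append(r.elapsed.total_seconds())

print(f"min {min(timings):.2f}s  max {max(timings):.2f}s  "
      f"avg {sum(timings) / len(timings):.2f}s")
```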
II. Diagnosis of TDK
The so-called TDK consists of:
1. Title (page title)
Whether a website ranks well is closely tied to its page title. Look at how the title's keywords are combined, whether there is any keyword stuffing, and whether the keyword layout is reasonable.
This step is crucial.
There are in fact many ways to lay out keywords; due to limited space, they will be introduced in detail in later articles.
As for the number of keywords in the title, 1-3 is enough.
2. Keywords (page keywords)
A reasonable set of website keywords is 3-5; more than that does no good.
3. Description (page description)
This mainly depends on whether the description reads smoothly, whether it expresses the core keywords, and whether it spells out the site's services well enough to be competitive on the search results page. A small diagnostic sketch covering all three TDK elements follows this list.
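The sketch below pulls the title, keywords, and description of a page and applies the rough limits given above; it assumes the third-party requests and beautifulsoup4 packages are installed, with www.xxx.com as a placeholder:

```python
# Sketch: extract TDK from a page and check it against rough limits.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.xxx.com/", timeout=10).text,
                     "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = {m.get("name", "").lower(): m.get("content", "")
        for m in soup.find_all("meta")}
keywords = [k.strip() for k in meta.get("keywords", "").split(",") if k.strip()]
description = meta.get("description", "")

print("title:", title)
print("keyword count:", len(keywords), "(3-5 is reasonable)")
print("description length:", len(description))
```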
III. URL path optimization
Path optimization involves several elements.
1. Chinese paths
In general, a Chinese path in the URL may help the user experience, but it can hurt search crawling, so try to use letters or pinyin instead; the encoding problem is sketched below.
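This short sketch shows why: browsers and spiders percent-encode Chinese path segments, so the URL a crawler sees is much longer and harder to read than a pinyin equivalent (the paths here are illustrative):

```python
# Sketch: compare an encoded Chinese path with its pinyin alternative.
from urllib.parse import quote

chinese_path = "/新闻/2023"
print(quote(chinese_path))   # /%E6%96%B0%E9%97%BB/2023 after encoding
print("/xinwen/2023")        # the pinyin form stays short and readable
```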
2. Path trimming
For example, given a URL like www.xxx.com/a/b/c, remove the trailing c segment and visit www.xxx.com/a/b to see whether it still resolves normally. A site where every parent path is accessible has a reasonable URL structure and matches users' browsing habits; a check is sketched below.
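The following sketch automates that trimming test, assuming requests is installed and using www.xxx.com/a/b/c as the placeholder URL from the example:

```python
# Sketch: strip one trailing segment at a time and confirm every
# parent level of the path still resolves.
import requests

url = "https://www.xxx.com/a/b/c"
parts = url.rstrip("/").split("/")

while len(parts) > 3:          # stop once only https://www.xxx.com is left
    parts.pop()                # drop the last path segment
    parent = "/".join(parts)
    r = requests.get(parent, timeout=10)
    print(parent, r.status_code)
```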
3. Dynamic parameters in the path
Too many dynamic parameters in a URL make it hard to crawl, and such URLs should be revised. One caveat: if the URL contains only a few dynamic symbols, such as ? or #, it matters little. A counting sketch follows.
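This sketch counts the dynamic parameters in a URL with the standard library; the URL shown is a made-up example:

```python
# Sketch: count query parameters; one or two are harmless, but many
# parameters suggest the URL deserves a static-looking rewrite.
from urllib.parse import urlparse, parse_qs

url = "https://www.xxx.com/list.php?cat=3&page=2&sort=date&sid=abc123"
params = parse_qs(urlparse(url).query)

print(f"{len(params)} dynamic parameters: {sorted(params)}")
if len(params) > 2:
    print("Consider rewriting to a static-looking path")
```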
4. Path depth
Generally speaking, more than three directory levels makes a deep path. Sometimes a website grows into a four-level path, which is too long; at that point, consider using a secondary domain (subdomain) to shorten the URL depth.
This can be analyzed in detail from the website logs.
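A depth check is a one-liner once the path is split; this sketch applies the rule of thumb above to two illustrative URLs:

```python
# Sketch: measure directory depth and flag anything deeper than 3 levels.
from urllib.parse import urlparse

for url in ("https://www.xxx.com/a/b/c.html",
            "https://www.xxx.com/a/b/c/d/page.html"):
    # Count directory segments, excluding the final file name.
    segments = [s for s in urlparse(url).path.split("/") if s][:-1]
    depth = len(segments)
    verdict = "too deep, consider a subdomain" if depth > 3 else "ok"
    print(url, "depth", depth, verdict)
```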
5. Paths that have existed for a long time but are still not indexed
There are many reasons a path is not included: perhaps spiders have never found it, or perhaps the page quality is too low to be put into the index database.
If analysis shows the spiders have never found these pages, adding internal link entries to them will solve the problem.
To detect such paths, combine the search engine's site: command with site log analysis; a log-scanning sketch follows.
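The log-analysis half can be scripted; this sketch assumes a common/combined-format access log at the placeholder path access.log, and lists every URL any spider has ever requested, so pages missing from the output stand out as candidates for new internal links:

```python
# Sketch: scan an access log for spider user agents and collect the
# URLs they have requested.
SPIDERS = ("Baiduspider", "Googlebot", "Sogou web spider")

crawled = set()
with open("access.log", encoding="utf-8", errors="ignore") as f:
    for line in f:
        if any(s in line for s in SPIDERS):
            # The request field looks like: "GET /a/b/c.html HTTP/1.1"
            try:
                crawled.add(line.split('"')[1].split()[1])
            except IndexError:
                pass  # skip malformed lines

print(f"{len(crawled)} distinct URLs crawled by spiders")
# Any page missing from this set likely needs an internal link entry.
```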
The above is the full content of "how to analyze a website from the perspective of SEO". Thank you for reading! I hope the content shared here helps you; if you want to learn more, welcome to follow the industry information channel!