Beginners reading SEO tutorials often come across the claim that optimization requires static website URLs, with some tutorials even ranking this as a very important factor. I personally do not agree with this view: whether to use dynamic URLs or static pages should simply be decided by the needs of the website.
Why do many SEOers emphasize the use of static pages?
In the early days, immature search engine spiders, poorly designed website programs, and deliberately built spider traps could send a spider into an infinite loop while crawling. To avoid such loops, search engines cut back on reading dynamic URLs, especially URLs containing the "?" symbol.
As search engines have improved, this problem has largely been solved: spiders can now read dynamic URLs, including those containing the "?" symbol, normally and smoothly. However, URLs with too many query parameters (such as ?a=1&b=2&c=3&d=4...) are still not ideal; according to various reports, spiders generally do not read URLs carrying more than three parameters.
If there is a difference between dynamic URLs and static pages, what is it?
Lesshu believes that, as far as current search engine technology is concerned, any difference between dynamic and static URLs shows up mainly in the following two aspects:
1. Spider crawling efficiency: As mentioned above, URLs with many query parameters, page content that changes with cookie data, and Session IDs handed out to different visitors all reduce a spider's crawling efficiency, which in turn limits how dynamic URLs perform in search engines.
2. Page trust: No search engine has publicly stated that it assigns a trust value to web pages. However, based on my own experiments and information from other sources, static pages generally do better than dynamic URLs in search engines under otherwise identical conditions.
How big is the advantage of static pages in search engines?
Static pages still hold an edge over dynamic URLs in search engines, but how big is it? Lesshu believes the gap is very small, and with a sound in-site link strategy and properly optimized URLs it becomes almost negligible.
My SEO forum has always used dynamic URLs, and its URLs have not been heavily optimized; since it opened in 2005, it has basically stayed in the first or second position for the term "SEO".
Making dynamic URLs look static: the pseudo-static approach
A dynamic URL does not correspond to an actual file in the website directory; the page is generated from the database according to the user's request. We can make this process look like a static page or directory, which is the familiar "pseudo-static" technique.
Many mainstream programs already support pseudo-static URLs, and we can also adapt a program ourselves and then configure the server environment accordingly. Under Apache, the mod_rewrite module makes this easy to implement, and rewrite components are also available for Windows environments. There are plenty of tutorials on this topic online; search for them if you need the implementation details.
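As an illustration, here is a minimal .htaccess sketch using Apache's mod_rewrite. The URL pattern (article-<id>.html) and the script name (article.php) are only assumptions for this example; adapt them to your own program:

    RewriteEngine On
    # Map the pseudo-static address /article-123.html
    # to the real dynamic script /article.php?id=123
    RewriteRule ^article-([0-9]+)\.html$ article.php?id=$1 [L]

With a rule like this, internal links can point at the static-looking address while the dynamic program keeps serving the content.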
In-site optimization strategy for dynamic URLs
How can dynamic URLs be made to perform better? Lesshu offers the following three key points from an in-site perspective:
1. Navigation is very important: good site navigation helps the spider crawl efficiently. Across the whole site, aim for the following: every important page should be reachable from the home page in no more than 3 clicks; the most important pages should be linked directly from the home page; and, if necessary, a separate static navigation page can be built.
2. Make a site map: First, the website should have a site map page that links to the commonly visited sections, so that users can grasp the site structure as quickly as possible and the search engine spider can crawl the site quickly and efficiently. Second, XML site maps are useful for more than just Google, and the XML map should be kept as up to date as possible (a minimal example is sketched after this list). Finally, it is also recommended to provide a plain URL list, which can only benefit the website.
3. URL consistency and standardization: multiple versions of the same URL (such as ?id=1 and ?id=1&page=1), different URL forms for the same location (such as ?companyname=xxx and ?companyid=123), and inconsistent letter case can all cause near-duplicate pages to appear in search engines, hurting the site's performance there; each location should therefore be linked with a single, consistent URL form.
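To illustrate point 2, here is a minimal XML site map sketch following the sitemaps.org protocol. The domain and the URL are placeholders only; list your own pages, and remember that any "&" in a dynamic URL must be written as "&amp;" inside the XML:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per page; <loc> is required, the other tags are optional -->
      <url>
        <loc>http://www.example.com/article.php?id=123</loc>
        <lastmod>2008-01-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

A file like this can be regenerated whenever content changes and submitted to the search engines that support the protocol.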
In short, I personally believe that dynamic URLs are not necessarily worse than static pages; as long as they are optimized properly, they can perform just as well in search engines. Moreover, in the many cases where a dynamic program is the only or the best choice (forums, order systems, and other highly interactive sites), there is no need to insist on generating HTML files or building static pages.