Compressing a site's page code helps satisfy the Baidu SEO white paper's recommendation that crawl-diagnosis download time stay under 1 s. I initially thought the crawl-diagnosis tool was useless: as long as the crawl succeeded, everything seemed fine. Later I noticed by accident that the diagnosis also reports a download duration. My blog's was 5 to 8 s, while the white paper suggests keeping it under 1 s. Over the past three days I tested many approaches: page caching, a CDN, DNS prefetching, code compression, and so on.
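To see why code compression cuts download time so sharply, here is a minimal sketch (the HTML string is illustrative, not this blog's actual page) showing how much gzip shrinks repetitive HTML text, using Python's standard-library `gzip` module:

```python
import gzip

# Illustrative payload: HTML markup is highly repetitive, which is
# exactly the kind of text gzip compresses well.
html = ("<div class='post'><p>Hello, spider!</p></div>\n" * 200).encode("utf-8")

# compresslevel 6 is a common balance of size vs. CPU cost.
compressed = gzip.compress(html, compresslevel=6)

print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
print(f"compressed size is {len(compressed) / len(html):.1%} of the original")
```

Since the bytes sent over the wire shrink by an order of magnitude or more for text like this, the crawler's download duration drops proportionally.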
Baidu's official guidance: https://ziyuan.baidu.com/college/articleinfo?id=868
In the end, when I tested code compression, the download duration dropped from 5.5 s to 0.2 s. I then stopped updating the site and watched it for three days; today I checked the results, and spiders were indeed crawling.
The number of spider crawls has climbed quickly from single digits to dozens. This blog had almost nothing indexed, and everyone said getting it included was hopeless, but I think improving slowly is fine. If your site is in the same state, with daily crawls and some spider visits but no internal pages (or only a few) indexed, read the article below and set up code compression; I hope it helps you!
Enable gzip compression in the Pagoda panel to quickly improve website access speed: https://www.lkba.cn/jiaocheng/32.html
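For reference, here is a rough sketch of what the Pagoda panel's gzip switch configures under the hood, assuming an nginx server (the directive values below are illustrative defaults, not tuned for any particular site):

```nginx
# Minimal nginx gzip sketch; nginx always compresses text/html by default,
# so gzip_types only needs to list additional MIME types.
gzip on;
gzip_comp_level 5;          # 1-9: higher = smaller output, more CPU
gzip_min_length 1024;       # skip tiny responses where gzip overhead dominates
gzip_types text/css application/javascript application/json text/xml image/svg+xml;
gzip_vary on;               # emit "Vary: Accept-Encoding" for caches/CDNs
```

You can confirm compression is active by checking for a `Content-Encoding: gzip` response header, for example with `curl -sI -H "Accept-Encoding: gzip" https://your-site.example/ | grep -i content-encoding`.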
Tags: SEO optimization, gzip compression, website spiders, web crawlers, website indexing
Little Jack (https://lkba.cn)
The blogger focuses on Z-Blog site building, website optimization, computer troubleshooting, system reinstallation, optimization, and maintenance. QQ/WX: 2126245521 (please state your purpose when contacting).
Copyright © 2020-2023 Jack. All rights reserved.