
How to write robots.txt for zblog? Sample robots.txt file for zblog

Tianxing Studio · 2017-08-11 21:37 · Tutorials · 7,048 views · 5 comments


robots.txt matters a great deal to webmasters who care about SEO, and friends often ask how to write robots.txt for zblog. This article explains it in detail.

First, let's explain what robots.txt is and what it does:

robots.txt is the first file a search engine looks at when it visits a website. When a search spider visits a site, it first checks whether robots.txt exists in the site's root directory. If it does, the spider determines its crawl scope according to the file's contents; if the file does not exist, the spider will crawl every page on the site that is not password-protected.

The above content comes from Baidu Encyclopedia.

From the above we can see that robots.txt tells spiders which pages may be crawled, so we need to know which folders in the zblog program should be open to spiders and which should not be crawled. For details, see https://www.txcstx.com/post/1085.html .

Here is a sample robots.txt, which is the robots.txt file used on this site:

    User-agent: *
    Disallow: /zb_users/
    Disallow: /zb_system/
    Sitemap: http://www.txcstx.cn/sitemap.xml

The robots.txt above tells spiders not to crawl the zb_users and zb_system folders. The sitemap file on this site is generated by the "SEO Tools Collection" plug-in; you can also use other plug-ins ( https://app.zblogcn.com/search.php?q=Sitemap ) to generate a sitemap file.
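If you want to verify that a rules file like the one above behaves as intended before deploying it, Python's standard-library `urllib.robotparser` can parse the rules and answer "may this path be crawled?" queries. This is a quick sketch using the exact rules shown above; the test paths (`/zb_users/theme/style.css`, `/post/1085.html`) are just illustrative examples, not paths guaranteed to exist on any real site.

```python
from urllib import robotparser

# Parse the sample rules directly instead of fetching them over the network.
rules = """\
User-agent: *
Disallow: /zb_users/
Disallow: /zb_system/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under the blocked directories are disallowed for all user agents.
print(rp.can_fetch("*", "/zb_users/theme/style.css"))  # False
# Dynamic article paths outside those directories remain crawlable.
print(rp.can_fetch("*", "/post/1085.html"))            # True
```

Because zblog article URLs are generated dynamically and do not live under `/zb_users/` or `/zb_system/`, they remain crawlable under these rules.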


Can't find a tutorial that can solve your problem?

You can try searching or asking a question directly online. We also provide paid technical support; if you need it, contact us online.


There are 5 comments from netizens:

  •  King of Struggle

    King of Struggle · five years ago (April 28, 2019)

    Does it matter whether there is a space after "Sitemap:"?

  •  Sesame Whale Selection

    Sesame Whale Selection · six years ago (April 20, 2019)

    This is exactly the method I was looking for. The one I saw on the home page solved my problem. Thank you.

  •  North seo

    North seo · six years ago (March 29, 2019)

    I want to ask: after blocking these two directories, which pages will Baiduspider crawl when it visits? Aren't the article pages, list pages, and so on all in the template folder? Will blocking /zb_users/ affect crawling?

    •  Tianxing Studio

      Tianxing Studio · six years ago (March 29, 2019)

      zblog is a dynamic program, and article paths are generated dynamically. Blocking these two directories will not cause any problems.

  •  I love Bei

    I love Bei · seven years ago (August 15, 2017)

    I had never studied any of this and just let the spiders crawl everything. The tutorial is very good and easy to follow.
