B Store Robots.txt File Settings (In Admin)

Navigate to: Stores & Reps > B Stores > Manage > Robots.txt

The Robots.txt section allows administrators to instruct web robots (typically search engine crawlers) which pages of the website to crawl. The Znode Admin Console provides two fields for every B Store to manage the robots.txt settings:

  1. Robots.txt
    1. Administrators can add information about the web pages they wish the web robots to crawl or not crawl.
  2. Default Page Level Robot Tag
    1. There are five options available under this field:
      1. None - This is the default option. When this option is selected, the behavior mentioned in the Robots.txt field (above) will be followed.
      2. INDEX, FOLLOW - This option allows search engines to index a web page that includes this instruction and allows all the links on the web page to be crawled.
      3. NOINDEX, NOFOLLOW - This option disallows search engines from indexing a web page that includes this instruction and disallows all the links on the web page from being crawled.
      4. NOINDEX, FOLLOW - This option disallows search engines from indexing a web page that includes this instruction and allows all the links on the web page to be crawled.
      5. INDEX, NOFOLLOW - This option allows search engines to index a web page that includes this instruction and disallows all the links on the web page from being crawled.
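
For reference, the Robots.txt field accepts standard robots.txt directives. A minimal sketch is shown below; the paths and sitemap URL are hypothetical examples, not values required by Znode:

```
# Apply these rules to all crawlers
User-agent: *
# Block crawling of example private sections
Disallow: /checkout/
Disallow: /admin/
# Explicitly permit an example public section
Allow: /products/
# Point crawlers at an example sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt controls which URLs are crawled, while the page level robot tag options above control indexing and link following on pages the crawler has already reached.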

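When one of the page level options is selected, the instruction is typically emitted as a robots meta tag in the page's HTML head. The exact markup Znode renders may differ; this is an illustrative example:

```
<head>
  <!-- Tells search engines: do not index this page, but do follow its links -->
  <meta name="robots" content="NOINDEX, FOLLOW" />
</head>
```
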