Navigate to: Stores & Reps > B Stores > Manage > Robots.txt
The Robots.txt section allows administrators to instruct web robots (typically search engine robots) which pages of the website to crawl and which to skip. The Znode Admin Console provides two fields for every B Store to manage the robots.txt settings:
- Robots.txt
- Administrators can enter the directives that tell web robots which pages they should or should not crawl (see the sample robots.txt content after this list).
- Default Page Level Robot Tag
- There are five options available under this field (an example of the resulting meta robots tag is shown after this list):
- None - This is the default option. When this option is selected, the directives defined in the Robots.txt field (above) are followed.
- INDEX, FOLLOW - Allows search engines to index a web page that includes this instruction and to crawl all the links on that page.
- NOINDEX, NOFOLLOW - Prevents search engines from indexing a web page that includes this instruction and from crawling any of the links on that page.
- NOINDEX, FOLLOW - Prevents search engines from indexing a web page that includes this instruction but allows all the links on that page to be crawled.
- INDEX, NOFOLLOW - Allows search engines to index a web page that includes this instruction but prevents the links on that page from being crawled.
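For reference, the Robots.txt field accepts standard robots.txt directives. The snippet below is a minimal, illustrative example only; the user-agent rules, paths, and sitemap URL are placeholders and not Znode defaults.

```
# Illustrative robots.txt content for the Robots.txt field
# (example paths and URL only; replace with your store's own rules)
User-agent: *
Disallow: /checkout/
Disallow: /cart/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```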
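Page-level robot tags are commonly emitted as a meta robots element in the page's HTML head. The markup below is a sketch of what a storefront page might contain when NOINDEX, FOLLOW is selected; it is illustrative and the exact output depends on the storefront implementation.

```html
<!-- Illustrative output when "NOINDEX, FOLLOW" is selected
     (values are case-insensitive; actual markup may vary by theme) -->
<head>
  <meta name="robots" content="NOINDEX, FOLLOW" />
</head>
```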