The robots.txt file contains directives for search engine crawlers such as Google and Yahoo, telling them which pages of your website they may crawl and which they should skip.
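For example, a robots.txt file with directives might look like the sketch below. The user-agent names are real crawler identifiers, but the paths are placeholders chosen for illustration:

```
# Block all crawlers from a hypothetical private folder
User-agent: *
Disallow: /private/

# Allow Google's crawler to access the whole site
User-agent: Googlebot
Allow: /
```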

If you want better control of crawl requests to your site, you can edit your robots.txt file.

By default, the robots.txt file allows bots to crawl all of your site's pages.
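An allow-all robots.txt file typically looks like this; an empty Disallow directive permits crawling of the entire site:

```
User-agent: *
Disallow:
```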

Please Note: Pages hidden from search engines in the Site Editor will remain hidden regardless of the contents of your robots.txt file.

Editing Your robots.txt File

To edit your robots.txt file:

  • Hover over the left sidebar of the Site Editor and select Settings:

  • Navigate to Search Engines:

  • Hover over Robots.txt:

  • Click the button to disable auto-generation, which makes the robots.txt file editable:

  • Confirm your decision by clicking Yes on the pop-up:

  • You can now edit the file by clicking the Edit icon on the right:

  • Click Save:

Resetting the robots.txt File to Default

  • To reset the robots.txt file to its default settings, click the Reset to Default button in the robots.txt editing mode:
  • Click the Enable auto-generation button:

  • Confirm by clicking Yes on the pop-up:

Please Note: 

  • All changes to the robots.txt file are erased automatically when you turn auto-generation back on.
  • While auto-generation of the robots.txt file is enabled, the file can't be edited.