Important: This article only applies to customers who have purchased DIY-Website Builder after January 2020.


Overview

A Robots.txt file tells search engine crawlers which pages or files they can or cannot request from a website. It is automatically generated once a website is published. An XML Sitemap is a list of a website's URLs. It acts as a roadmap, telling search engines what content is available and how to reach it.
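To illustrate, a typical Robots.txt file looks something like the sketch below. The paths and domain are placeholders, not the actual output of the Site Editor's auto-generation:

```text
# Applies to all crawlers
User-agent: *
# Block crawlers from an example private area
Disallow: /private/
# Allow everything else
Allow: /
# Point crawlers to the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group lists rules for the named crawler; `*` matches all crawlers that don't have a more specific group.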
 
With this update, you can view and edit the Robots.txt file and enable or disable its auto-generation in the Site Editor, as well as hide or unhide pages in search results. You can also access the Sitemap.xml file, which provides your website's URL structure to search engines for crawling. (Crawlability refers to a search engine's ability to crawl a website's text content to determine what the site is about.)
 
To manage Sitemap.xml or Robots.txt files, click on the new Search Engines section under Settings in the Site Editor: 
Then, select Sitemap.xml or Robots.txt, depending on which one you want to update.
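For reference, a minimal Sitemap.xml follows the standard sitemap protocol and looks something like this sketch (the URLs and dates are placeholders, not values generated by the Site Editor):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the site -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>
```

Each `<loc>` element holds a page's full URL, and the optional `<lastmod>` element tells crawlers when that page last changed.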


For more information about Sitemap.xml, click here.
For more information about Robots.txt, click here.