Change Log

  • 25-Apr-2022 | v6.5.0 | All new article

Contributors:

Adam Wilson - Logo Pogo

Sitemap.xml / Robots.txt

The Sitemap feature allows you to manage the sitemap.xml content, which lists your site’s important pages/items, their priority and last modified date, in XML format, for SEO and site indexing purposes.

Sitemap.xml

3rd party services, such as search engines, can reference the sitemap.xml contents to crawl your site's pages and get a better understanding of those pages' priority and relevancy.
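For reference, a minimal sitemap.xml listing a single page might look like the following (the domain, date and priority are placeholder values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.com/</loc>
    <lastmod>2022-04-25</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```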

XML content can be added here manually (typed in or pasted from an external sitemap generator), or it can be generated by WebinOne using the “Generate Sitemap” button. Generated content is based on the SEO settings applied to your site’s individual pages and module items, and/or the overall SEO settings found under ‘Settings’ > ‘SEO’, where you can define excluded items, bulk assign items and set automatic sitemap generation. For more details on these settings see here.
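As a rough illustration of what sitemap generation involves, the sketch below builds sitemap.xml content from a list of page entries using Python's standard library. It is a hypothetical example only; WebinOne's actual generator is internal, and the `build_sitemap` function and page fields shown here are assumptions for illustration.

```python
# Hypothetical sketch of generating sitemap.xml content from per-page
# SEO data (not WebinOne's actual implementation).
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: list of dicts with 'loc' and optional 'lastmod'/'priority'."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:
            ET.SubElement(url, "lastmod").text = page["lastmod"]
        if "priority" in page:
            ET.SubElement(url, "priority").text = str(page["priority"])
    return ET.tostring(urlset, encoding="unicode")

# Placeholder page data standing in for a site's SEO settings.
xml_out = build_sitemap([
    {"loc": "https://www.yourdomain.com/", "lastmod": "2022-04-25",
     "priority": 1.0},
])
```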

When content is saved here, or generated automatically, a sitemap.xml file will be created in the root directory of your site which will also be accessible via the File Manager and FTP.

Any changes saved here, or generated automatically, will override any existing sitemap.xml file in the root directory.

Robots.txt

WebinOne does not generate a robots.txt file or provide a specific tool for managing one. However, it is worth mentioning here, as the file relates to the crawling of your site’s pages and files by 3rd party services, including search engines, and is often used to point to your sitemap.xml file.

While there is no tool specifically for robots.txt management, you can use the File Manager to create and manage text-based files.

The robots.txt file would be created in the root directory of your site with the file name robots.txt.

A minimal example of a robots.txt file, which allows all crawling agents with no restrictions and includes a sitemap reference, is as follows:

User-agent: *
Disallow:
Sitemap: https://www.yourdomain.com/sitemap.xml
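To see how a crawler would interpret rules like these, you can feed them to Python's standard-library robots.txt parser. This is a quick sketch using placeholder values; a real crawler would fetch the rules from your domain rather than supplying them inline, and "ExampleBot" is a hypothetical agent name.

```python
# Check how a crawler would read a robots.txt using Python's
# standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow:
Sitemap: https://www.yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow line means every path is allowed for every agent.
allowed = parser.can_fetch("ExampleBot", "https://www.yourdomain.com/any-page")
```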

For more information on the robots.txt file uses, options and limitations check the ‘External Resources’ section below.