sitemap.xml support

Sitemaps allow you to inform search engines about the URLs that are available for crawling. This makes your content more discoverable and improves your Search Engine Optimization (SEO).

How it works

Search engines read the sitemap.xml file in order to index your documentation. For each URL, it contains information such as the following (see the sketch after this list):

  • When the URL was last updated.

  • How often the URL changes.

  • How important the URL is in relation to other URLs on the site.

  • Which translations are available for the page.
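
As a minimal sketch, a single entry carrying these fields might look like the following. The URL, timestamp, and values here are hypothetical, and the xhtml:link element is the standard way a sitemap advertises a translation:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- The page's canonical location -->
    <loc>https://docs.example.com/en/stable/</loc>
    <!-- When the URL was last updated -->
    <lastmod>2024-01-15T10:00:00+00:00</lastmod>
    <!-- How often the URL changes -->
    <changefreq>weekly</changefreq>
    <!-- Importance relative to other URLs on the site -->
    <priority>1.0</priority>
    <!-- A translation available for this page -->
    <xhtml:link rel="alternate" hreflang="es"
                href="https://docs.example.com/es/stable/"/>
  </url>
</urlset>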

Read the Docs automatically generates a sitemap.xml for your project. By default, the sitemap includes:

  • Each version of your documentation and when it was last updated, sorted by version number.

This allows search engines to prioritize results by version, with versions ordered according to semantic versioning.
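
For example, a generated sitemap for a project with stable, latest, and an older release might order its entries like this; the URLs and priority values are hypothetical and only sketch the idea:

<!-- Newer, higher-priority versions come first -->
<url><loc>https://docs.example.com/en/stable/</loc><priority>1.0</priority></url>
<url><loc>https://docs.example.com/en/latest/</loc><priority>0.9</priority></url>
<url><loc>https://docs.example.com/en/v1.0/</loc><priority>0.8</priority></url>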

Custom sitemap.xml

You can control which sitemap search engines use via the robots.txt file. Our robots.txt support allows you to host a custom version of this file.

An example robots.txt pointing to a custom sitemap would look like this:

User-agent: *
Allow: /

Sitemap: https://docs.example.com/en/stable/sitemap.xml
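
If your project builds with Sphinx, one way to ship such a file is the html_extra_path option in conf.py, which copies extra files into the root of the built HTML output. The directory name below is an assumption for this sketch:

# conf.py
# Copy everything in the "extra" directory (a hypothetical name) into the
# root of the HTML output, e.g. a custom robots.txt or sitemap.xml.
html_extra_path = ["extra"]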