Documentation Hosting Features
The main way that users interact with your documentation is via the hosted HTML that we serve. We support a number of important features that you would expect for a documentation host.
A CDN is used for making documentation pages faster for your users. This is done by caching the documentation page content in multiple data centers around the world, and then serving docs from the data center closest to the user.
We support CDNs on both of our sites, as described below.
Sitemaps allow us to inform search engines about the URLs that are available for crawling, and to communicate additional information about each URL of the project:
- when it was last updated,
- how often it changes,
- how important it is in relation to other URLs in the site, and
- what translations are available for a page.
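As an illustration, a single sitemap entry can carry all four of these fields. The URLs below are made up for the example:

```xml
<url>
  <loc>https://example-project.readthedocs.io/en/stable/</loc>
  <lastmod>2024-01-15T10:00:00+00:00</lastmod>
  <changefreq>weekly</changefreq>
  <priority>1.0</priority>
  <!-- one alternate link per available translation -->
  <xhtml:link rel="alternate" hreflang="es"
              href="https://example-project.readthedocs.io/es/stable/"/>
</url>
```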
Read the Docs automatically generates a sitemap for each project that it hosts, to improve results when performing a search on these search engines. This allows us to prioritize results based on the version number: for example, stable as the top result, followed by latest, and then all the project’s versions sorted following semantic versioning.
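The ordering described above can be sketched in a few lines of Python. This is only an illustration of the ranking rule (stable first, then latest, then the remaining versions in descending semantic-version order), not Read the Docs’ actual implementation:

```python
def semver_key(version):
    """Parse a version string like '1.10.1' into a tuple of ints for comparison."""
    return tuple(int(part) for part in version.split("."))

def rank_versions(versions):
    # "stable" and "latest" are pinned to the top, in that order.
    special = [v for v in ("stable", "latest") if v in versions]
    # Everything else is sorted in descending semantic-version order.
    numbered = sorted(
        (v for v in versions if v not in ("stable", "latest")),
        key=semver_key,
        reverse=True,
    )
    return special + numbered

print(rank_versions(["1.2.0", "latest", "1.10.1", "stable", "0.9.0"]))
# → ['stable', 'latest', '1.10.1', '1.2.0', '0.9.0']
```

Note that a tuple comparison, not a string comparison, is what places 1.10.1 above 1.2.0.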
If you want your project to use a custom 404 page instead of the default “Maze Found” page, you can put a 404.html file at the top level of your project’s HTML output. When a 404 is returned, Read the Docs checks for a 404.html in the root of your project’s output and serves it if it exists.
We recommend the sphinx-notfound-page extension, which Read the Docs maintains. It automatically creates a 404.html page for your documentation, matching the theme of your project. See its documentation for how to install and customize it.
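Enabling the extension is a one-line change to your Sphinx configuration. A minimal sketch, assuming the extension is already installed with pip; see its documentation for the available options:

```python
# conf.py
extensions = [
    "notfound.extension",  # sphinx-notfound-page: generates a themed 404.html
]
```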
A robots.txt file will be served from the default version of your project. This is because the robots.txt file is served at the top level of your domain, so we must choose a single version in which to find the file; the default version is the natural place to look.
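For example, a robots.txt that hides a hypothetical outdated version from crawlers could look like this (the version path below is an assumption for illustration):

```text
User-agent: *
# Keep crawlers away from a hypothetical old version
Disallow: /en/0.1/
```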
Sphinx and MkDocs each have their own way of including static files in the build output:
Sphinx uses the html_extra_path option to add static files to the output.
You need to create a robots.txt file and put it under the path defined in html_extra_path.
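A minimal sketch of that configuration, assuming your robots.txt lives in an _extra/ directory next to conf.py (the directory name is just a convention, not a requirement):

```python
# conf.py
# Files under the listed paths are copied verbatim to the root of the
# HTML output, so _extra/robots.txt ends up at the top of the site.
html_extra_path = ["_extra"]
```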