# Build customization
Read the Docs has a well-defined build process that works for many projects, but we offer additional customization to support more uses of our platform. This page explains how to extend the build process, using user-defined build jobs to execute custom commands.
In the normal build process, the pre-defined jobs `checkout`, `system_dependencies`, `create_environment`, `install`, `build`, and `upload` are executed.
However, Read the Docs exposes extra jobs to users so they can customize the build process by running shell commands.
These extra jobs are:
| Step | Customizable jobs |
|---|---|
| Checkout | `post_checkout` |
| System dependencies | `pre_system_dependencies`, `post_system_dependencies` |
| Create environment | `pre_create_environment`, `post_create_environment` |
| Install | `pre_install`, `post_install` |
| Build | `pre_build`, `post_build` |
| Upload | There are no customizable jobs currently |
Note

Currently, the pre-defined jobs (`checkout`, `system_dependencies`, etc.) executed by Read the Docs cannot be overridden or skipped.
These jobs can be declared by using a Configuration File with the `build.jobs` key. Let's say the project requires commands to be executed before installing any dependency into the Python environment and after the build has finished. In that case, a config file similar to this one can be used:
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_install:
      - bash ./scripts/pre_install.sh
    post_build:
      - curl -X POST \
        -F "project=${READTHEDOCS_PROJECT}" \
        -F "version=${READTHEDOCS_VERSION}" https://example.com/webhooks/readthedocs/
```
There are some caveats to know when using user-defined jobs:

- The current working directory is at the root of your project's cloned repository.
- Environment variables are expanded in the commands (see Environment Variables).
- Each command is executed in a new shell process, so modifications made to the shell environment do not persist between commands (see the sketch after this list).
- Any command returning a non-zero exit code will cause the build to fail immediately.
- `build.os` and `build.tools` are required when using `build.jobs`.
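As an illustration of the shell process caveat, an environment variable exported in one command is not visible in the next one. A minimal sketch (the `DOCS_ENV` variable name is a hypothetical example) chains the dependent commands with `&&` so they run in the same shell:

```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      # Each list item runs in its own shell process, so an `export`
      # in one command is lost before the next one starts. Chaining
      # with `&&` keeps both commands in the same shell.
      - export DOCS_ENV="production" && echo "Building with DOCS_ENV=$DOCS_ENV"
```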
## Examples

We've included some common examples where using `build.jobs` will be useful. These examples may require some adaptation for each project's use case; we recommend you use them as a starting point.
### Unshallow clone

Read the Docs does not perform a full clone in the `checkout` job, in order to reduce network data and speed up the build process. Because of this, extensions that depend on the full Git history will fail. To avoid this, it's possible to unshallow the clone done by Read the Docs:
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      - git fetch --unshallow
```
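Depending on your tooling, history alone may not be enough; for example, tools that derive a version number from Git metadata may also need tags. A variation of the job above (an assumption to adapt to your project) fetches those as well:

```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      # Convert the shallow clone into a full clone.
      - git fetch --unshallow
      # Also fetch all tags, e.g. for tools that read version numbers
      # from Git tags (an assumption; keep only if your project needs tags).
      - git fetch --all --tags
```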
### Generate documentation from annotated sources with Doxygen
It’s possible to run Doxygen as part of the build process to generate documentation from annotated sources:
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      # Note that this HTML won't be automatically uploaded,
      # unless your documentation build includes it somehow.
      - doxygen
```
### Use MkDocs extensions with extra required steps

There are some MkDocs extensions that require specific commands to be run to generate extra pages before performing the build. For example, pydoc-markdown, which generates Markdown API documentation from Python docstrings:
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      - pydoc-markdown --build --site-dir "$PWD/_build/html"
```
### Avoid having a dirty Git index

Read the Docs needs to modify some files before performing the build to be able to integrate with some of its features. Because of this, the Git index may become dirty (Git will detect modified files). If the project uses an extension that generates a version number based on Git metadata (like `setuptools_scm`), this could cause an invalid version to be generated. In that case, the Git index can be updated to ignore the files that Read the Docs has modified.
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_install:
      - git update-index --assume-unchanged environment.yml docs/conf.py
```
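If you're unsure which files Read the Docs modified in your build, a quick debugging sketch (not required for the fix itself) is to print the Git status from the same job before updating the index:

```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_install:
      # List the files Read the Docs has modified; `git status` exits
      # with code 0 even on a dirty tree, so it won't fail the build.
      - git status --short
```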
### Perform a check for broken links
Sphinx comes with a linkcheck builder that checks for broken external links included in the project’s documentation. This helps ensure that all external links are still valid and readers aren’t linked to non-existent pages.
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    pre_build:
      - python -m sphinx -b linkcheck docs/ _build/linkcheck
```
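Since `sphinx-build` returns a non-zero exit code when `linkcheck` finds broken links, the build will fail (see the caveats above), surfacing the broken links immediately.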
### Support Git LFS (Large File Storage)

In case the repository contains large files that are tracked with Git LFS, there are some extra steps required to be able to download their content. It's possible to use the `post_checkout` user-defined job for this:
```yaml
version: 2
build:
  os: "ubuntu-20.04"
  tools:
    python: "3.10"
  jobs:
    post_checkout:
      # Download and uncompress the binary
      # https://git-lfs.github.com/
      - wget https://github.com/git-lfs/git-lfs/releases/download/v3.1.4/git-lfs-linux-amd64-v3.1.4.tar.gz
      - tar xvfz git-lfs-linux-amd64-v3.1.4.tar.gz
      # Modify LFS config paths to point to where the git-lfs binary was downloaded
      - git config filter.lfs.process "`pwd`/git-lfs filter-process"
      - git config filter.lfs.smudge "`pwd`/git-lfs smudge -- %f"
      - git config filter.lfs.clean "`pwd`/git-lfs clean -- %f"
      # Make LFS available in the current repository
      - ./git-lfs install
      # Download content from the remote
      - ./git-lfs fetch
      # Replace the pointer files with the real content
      - ./git-lfs checkout
```
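Note that the `git-lfs` version is pinned in this example (v3.1.4); newer releases are published on the project's GitHub releases page, so you may want to update the URL accordingly.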