This is my summary of the Write the Docs meetup in Amsterdam at the Adyen office, November 2018.
Sven’s experience is mostly in open source projects (mainly Plone, a Python CMS). He’s also involved in https://testthedocs.org and https://rakpart.testthedocs.org, a collection of tools for documentation tests. He has some disclaimers beforehand:
- There is no perfect setup.
- Automated checks can only help up to a certain level.
- Getting a CI (continuous integration) setup working is often tricky.
Before you start testing your documentation, you’ll need some insight. Start by getting an overview of your documentation. Who is committing to it? Which parts are there? Which parts are updated most often? Are the committers native speakers? Which part of the documentation has the most bug reports? So: gather statistics.
Also: try to figure out who reads your documentation. Where do they come from? Which search terms do they use to find your documentation in Google? You can use these statistics to focus your development effort.
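As a starting point for such statistics, you can mine your version control history. A minimal sketch (the `count_doc_changes` helper and the `docs/` prefix are assumptions, not something from the talk) that counts how often each documentation file appears in `git log --name-only` output:

```python
from collections import Counter

def count_doc_changes(git_log_output, docs_prefix="docs/"):
    """Count how often each documentation file appears in
    `git log --name-only` output (hypothetical helper)."""
    counts = Counter()
    for line in git_log_output.splitlines():
        line = line.strip()
        if line.startswith(docs_prefix):
            counts[line] += 1
    return counts

# Sample output as produced by `git log --name-only`, trimmed down.
sample = """\
commit abc123
docs/install.rst
docs/api.rst

commit def456
docs/install.rst
src/main.py
"""
print(count_doc_changes(sample).most_common(1))  # → [('docs/install.rst', 2)]
```

In a real setup you would feed this the output of `git log --name-only --pretty=format:` and cross-reference it with your bug tracker.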
Important: planning. If your documentation is in English, decide beforehand whether you want it in UK or US English. Define style guides. If you have automatic checks, define standards beforehand: do you want a check to fail on line length? On spelling errors? Etc. How long is the test allowed to take?
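A "fail on line length?" decision, once made, is trivial to enforce. A minimal sketch, assuming a project-chosen limit of 79 characters (the limit and the function name are illustrative choices, not from the talk):

```python
def check_line_length(text, limit=79):
    """Return (line_number, length) for every line over the limit.
    The limit itself is a project decision -- define it beforehand."""
    return [(i, len(line))
            for i, line in enumerate(text.splitlines(), start=1)
            if len(line) > limit]

doc = "Short line.\n" + "x" * 100 + "\nAnother short line.\n"
print(check_line_length(doc))  # → [(2, 100)]
```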
A tip: start your checks small, build them up step by step. If possible, start from the first commit. And try to be both strict and friendly:
- Your checks should be strict.
- Your error messages should be clear and friendly.
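The strict-but-friendly combination can be sketched like this: the check fails on every violation, but each message says exactly where the problem is and how to fix it. (The trailing-whitespace check and its message format are my own illustration, not a tool Sven showed.)

```python
def check_trailing_whitespace(path, lines):
    """Strict check, friendly output: fail on any trailing whitespace,
    but tell the writer exactly where and how to fix it."""
    problems = [i for i, line in enumerate(lines, start=1)
                if line != line.rstrip()]
    for lineno in problems:
        # Short, actionable message -- no wall of text.
        print(f"{path}:{lineno}: trailing whitespace "
              f"(delete the spaces at the end of the line)")
    return len(problems) == 0

ok = check_trailing_whitespace("docs/index.rst", ["Clean line", "Oops  "])
print("PASS" if ok else "FAIL")  # → FAIL
```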
In companies it might be different, but in open source projects, you have to make sure developers are your friends. Adjust the documentation to their workflow: use a Makefile, for instance, and provide good templates (with Cookiecutter, say) and good examples. Especially for programmers, short and informative error messages are essential. Don’t make your output too chatty.
Checks for standards like maximum line length and paragraph length help a lot to keep your documentation readable and editable. Also a good idea: checks that weed out common English contractions ("we’re" instead of "we are"), which make the text more difficult to read for non-native speakers.
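A contraction check is little more than a lookup table and a regular expression. A sketch, assuming a small hand-picked table (a real check would load the full list from the project's style guide):

```python
import re

# A few common contractions and their expansions; assumed for
# illustration -- the real table belongs in the style guide.
CONTRACTIONS = {"we're": "we are", "don't": "do not", "it's": "it is"}

def find_contractions(text):
    """Return (contraction, suggested expansion) pairs found in the text."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(c) for c in CONTRACTIONS) + r")\b",
        re.IGNORECASE)
    return [(m.group(0), CONTRACTIONS[m.group(0).lower()])
            for m in pattern.finditer(text)]

print(find_contractions("We're happy, but don't overdo it."))
# → [("We're", 'we are'), ("don't", 'do not')]
```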
Programmers are used to keeping all the code’s tests passing all the time. Merging code with failing tests is a big no-no. The same should hold for the automatic documentation checks. They’re just as important.
Some further tips:
- Protect your branches, for instance, to prevent broken builds from being merged.
- Validate your test scripts, too.
- Keep your test scripts simple and adjustable. Make it clear how they work.
- Run your checks in a specific order, as later checks might depend on properties that earlier checks have already verified.
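The ordering tip above can be sketched as a tiny runner that stops at the first failure, since later checks may assume earlier ones passed (the runner and the check names are hypothetical):

```python
def run_checks(checks):
    """Run (name, check) pairs in order, stopping at the first failure:
    e.g. spell-checking only makes sense on files that parsed correctly."""
    for name, check in checks:
        if not check():
            print(f"FAIL: {name}")
            return False
        print(f"ok: {name}")
    return True

result = run_checks([
    ("syntax", lambda: True),
    ("spelling", lambda: False),  # hypothetical failing check
    ("links", lambda: True),      # never runs: spelling failed first
])
print(result)  # → False
```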
- You can also check the HTML output instead of "only" checking the source files. You can find broken external links this way, for instance.
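The first step of such a link check is extracting the links from the rendered HTML. A minimal sketch with the standard library's `html.parser` (a real check would then request each external URL and report the failures):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from rendered HTML output."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<p><a href="https://example.com">ok</a> '
               '<a href="broken.html">local</a></p>')
print(collector.links)  # → ['https://example.com', 'broken.html']
```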
Something to consider: look at containerization, with Docker and the like. A container can run both on your local OS and on the continuous integration server, so everyone can use the same codebase and the same test setup. Less duplication of effort and more consistent results. Most importantly, it is much easier to set up. Elaborate setups are now possible without scaring everybody away!
For screenshots, you could look at Puppeteer to generate them automatically.
A comment from the audience: if you have automatic screenshots, there are tools to compare them and warn you if the images changed a lot. This could be useful for detecting unexpected errors in CSS or HTML.
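The core of such a comparison is just measuring how large a fraction of the image changed. A simplified stand-in for what image-comparison tools compute, using flat lists of pixel values instead of real image files (the function and threshold idea are my own illustration):

```python
def changed_fraction(pixels_a, pixels_b):
    """Fraction of pixels that differ between two same-sized screenshots,
    given as flat lists of pixel values. A real tool would decode image
    files and likely tolerate small per-pixel differences."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("screenshots must have the same dimensions")
    diffs = sum(1 for a, b in zip(pixels_a, pixels_b) if a != b)
    return diffs / len(pixels_a)

old = [0, 0, 0, 255]
new = [0, 0, 255, 255]
frac = changed_fraction(old, new)
print(frac)  # → 0.25; warn if this exceeds some agreed threshold
```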
Another comment from the audience: you can make a standard like US-or-UK-English more acceptable by presenting it as a choice. It is not that one of them is bad: we "just" had to pick one of them. "Your own preference is not wrong, it is just not the standard we randomly picked". 🙂