Quality Assurance

Quality assurance (QA) ensures documentation meets standards consistently. Without systematic QA, documentation quality varies by author, deadline pressure, and chance. With QA, organizations can maintain reliable quality and improve over time.

Defining Quality

Quality Dimensions

Documentation quality has multiple dimensions:

Accuracy: Information is correct and up-to-date.

Completeness: All necessary information is present.

Clarity: Information is understandable by the target audience.

Findability: Users can locate information they need.

Usability: Documentation helps users accomplish tasks.

Consistency: Style, terminology, and formatting are uniform.

Accessibility: Content works for users with disabilities.

Quality Standards

Define what quality means for your documentation:

  • Accuracy: Procedures tested and verified
  • Completeness: All product features documented
  • Clarity: Readability score below grade 10
  • Style: Complies with style guide
  • Accessibility: Meets WCAG 2.1 AA

Explicit standards enable measurement and improvement.
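
Some targets can be checked mechanically. As a rough sketch for the readability target, assuming the style tool from the GNU diction package is installed (the path is a placeholder, and Markdown syntax will skew the scores slightly):

# Report readability grades (Kincaid and others) for a page
style docs/getting-started.md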

QA Processes

Review Workflow

Establish a review process for all documentation:

Draft → Self-edit → Technical review → Editorial review → Publish

Self-edit: Author reviews own work

  • Completeness check
  • Style guide compliance
  • Grammar and spelling

Technical review: SME verifies accuracy

  • Factual correctness
  • Procedure testing
  • Technical completeness

Editorial review: Editor checks quality

  • Clarity and readability
  • Style consistency
  • User focus
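
The workflow is easier to enforce when reviewers are encoded in the repository itself. A minimal sketch, assuming GitHub code owners and a docs/ directory (the team handle is a placeholder):

# .github/CODEOWNERS
# Any change under docs/ requires a docs-team review before merge
docs/ @example-org/docs-team

Combined with branch protection, this guarantees at least one human review; technical and editorial reviewers still need to be assigned per change.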

Review Checklists

Standardize reviews with checklists:

Technical Review Checklist

  • All facts are accurate
  • All procedures work as documented
  • Prerequisites are complete
  • Error handling is documented
  • Code samples are correct

Editorial Review Checklist

  • Follows style guide
  • Terminology is consistent
  • Structure is logical
  • Links work
  • Images have alt text

Publishing Checklist

  • Metadata complete
  • Navigation updated
  • Search indexing configured
  • Redirects set (if URLs changed)
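
Checklists work best where the work happens. One option is surfacing them in a pull request template (assuming GitHub; this is its standard path), so every docs change carries its own checklist:

<!-- .github/PULL_REQUEST_TEMPLATE.md -->
## Documentation QA

- [ ] Facts verified and procedures tested (technical review)
- [ ] Style guide followed; terminology consistent (editorial review)
- [ ] Links work; images have alt text
- [ ] Metadata complete; navigation and redirects updated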

Testing Procedures

Test documentation like software:

Procedure testing: Follow documented steps exactly (these tests can be scripted; see the sketch after this list)

  • Can someone complete the task using only the documentation?
  • Are any steps missing or unclear?
  • Do results match what is described?

Error case testing: Test documented troubleshooting

  • Do the documented solutions work?
  • Are common errors covered?

User testing: Observe real users

  • Can users find information?
  • Do users understand content?
  • Can users accomplish tasks?
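
Once scripted, procedure tests can run on every release. A minimal sketch, assuming a hypothetical example-cli; every command stands in for a step copied verbatim from your own quickstart:

# test-quickstart.sh — fails at the first step that no longer matches the docs
set -euo pipefail

# Step 1 of the documented quickstart (hypothetical CLI and token)
example-cli login --token "$TEST_TOKEN"

# Step 2: create a project exactly as documented
example-cli projects create demo

# Step 3: verify the documented result actually appears
example-cli projects list | grep -q demo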

Automated Quality Checks

Automation catches mechanical issues efficiently.

Linting

Text linters check style and grammar:

Vale (prose linter):

# .vale.ini
StylesPath = styles
MinAlertLevel = suggestion

[*.md]
BasedOnStyles = Google, Microsoft

Checks for:

  • Style guide violations
  • Passive voice
  • Complex sentences
  • Spelling errors
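
The same configuration works locally, so writers can lint before opening a pull request. For example, failing only on error-level alerts:

# Lint prose locally, failing only on error-level alerts
vale --minAlertLevel=error docs/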

Markdownlint (formatting linter):

# .markdownlint.yml
default: true
MD013: false  # Allow long lines
MD033: false  # Allow inline HTML

Checks for:

  • Heading hierarchy
  • List formatting
  • Code block syntax
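
The config applies locally too, assuming the Node-based markdownlint-cli is installed:

# Lint all Markdown files against the shared config
markdownlint --config .markdownlint.yml "docs/**/*.md"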

Link Checking

Broken links damage credibility:

# Check links in documentation
linkchecker https://docs.example.com

Run link checks:

  • On every build
  • On a schedule for external links (see the trigger sketch below)
  • Before major releases
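
External links rot independently of your commits, so a scheduled run catches breakage between releases. A minimal sketch of a weekly trigger for a GitHub Actions link-check workflow (the cron time is arbitrary):

# Trigger block for a scheduled link-check workflow
on:
  schedule:
    - cron: "0 6 * * 1"  # every Monday at 06:00 UTC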

Accessibility Checking

Automated accessibility testing:

# Check accessibility
pa11y https://docs.example.com

Catches:

  • Missing alt text
  • Color contrast issues
  • Heading hierarchy problems
  • Missing form labels
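
To cover the whole site rather than a single page, pa11y-ci can walk a sitemap (assuming your site publishes one):

# Test every page listed in the sitemap
pa11y-ci --sitemap https://docs.example.com/sitemap.xml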

Spell Checking

Integrated spell checking with custom dictionaries:

# cspell.json
{
  "words": [
    "webhook",
    "microservice",
    "kubernetes"
  ],
  "ignoreWords": ["yaml", "json"]
}

Add product-specific terms to avoid false positives.

CI/CD Integration

Run checks automatically:

# .github/workflows/docs-qa.yml
name: Documentation QA

on: [push, pull_request]

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Vale lint
        uses: errata-ai/vale-action@v2
      - name: Install checkers
        run: |
          pip install linkchecker
          npm install --global cspell
      - name: Link check
        run: linkchecker ./docs
      - name: Spell check
        run: cspell "docs/**/*.md"

Failed checks block merging, ensuring standards are met.

Metrics and Measurement

Quality Metrics

Track documentation quality:

Error rates

  • Errors found in review
  • Errors reported by users
  • Errors found in production

Coverage

  • Features with documentation
  • Procedures tested
  • Pages reviewed

User feedback

  • Page ratings
  • Search success rates
  • Support escalations

Usage Metrics

Track how documentation is used:

Traffic

  • Page views
  • Unique visitors
  • Time on page

Behavior

  • Search queries
  • Navigation paths
  • Exit pages

Outcomes

  • Task completion (if measurable)
  • Reduced support tickets
  • User satisfaction scores

Metric Dashboards

Visualize metrics for monitoring:

Documentation Quality Dashboard
================================

Error Rate: 2.1% (target: <5%)
Coverage: 94% (target: >90%)
User Rating: 4.2/5 (target: >4.0)

Top Issues:
1. Outdated screenshots (15)
2. Broken links (8)
3. Missing prerequisites (5)

Action Items:
- Update screenshots for v3.0 UI changes
- Fix broken links to deprecated pages

Trend Analysis

Track quality over time:

  • Is error rate decreasing?
  • Is coverage increasing?
  • Are user ratings improving?

Trends reveal whether QA efforts are working.

Continuous Improvement

Retrospectives

Regular review of documentation quality:

  • What went well?
  • What problems occurred?
  • What can we improve?

Document findings and track action items.

Root Cause Analysis

When quality issues occur:

  1. Identify the problem
  2. Determine root cause
  3. Implement fix
  4. Prevent recurrence

Example:

Problem: Users frequently report outdated screenshots

Root cause: No process to update screenshots when UI changes

Fix: Update the outdated screenshots

Prevention: Add screenshot audit to release checklist; create ticket when UI changes affect documentation

Process Improvement

Improve QA processes based on findings:

  • Add checks that catch common issues
  • Remove checks that do not provide value
  • Streamline review processes
  • Improve tools and automation

Training

Invest in writer quality:

  • Style guide training
  • Tool training
  • Feedback on common errors
  • Best practice sharing

Better writers produce better first drafts, reducing review burden.

Governance

Roles and Responsibilities

Define who is responsible for quality:

Authors: Initial quality

  • Self-edit before submission
  • Address review feedback
  • Maintain own content

Reviewers: Quality verification

  • Technical accuracy
  • Editorial quality
  • Standards compliance

Editors/leads: Quality oversight

  • Define standards
  • Monitor metrics
  • Drive improvement

Quality Gates

Require quality checks before publication:

Gate 1: Self-review complete
Gate 2: Technical review passed
Gate 3: Editorial review passed
Gate 4: Automated checks passed
Gate 5: Final approval

Content cannot progress without passing each gate.
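
For the automatable gates, CI can enforce the ordering directly. A sketch using GitHub Actions job dependencies (job names and step contents are illustrative):

# Publish cannot start unless automated checks succeed
jobs:
  automated-checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npx cspell "docs/**/*.md"
  publish:
    needs: automated-checks    # gate: blocked until checks pass
    runs-on: ubuntu-latest
    steps:
      - run: echo "Publish step goes here"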

Escalation

Define how to handle quality issues:

  • Minor issues: Author fixes
  • Significant issues: Team discussion
  • Critical issues: Escalate to leadership

Documentation of QA

Document your QA processes:

  • Standards and definitions
  • Review checklists
  • Tool configurations
  • Metrics and targets

Documented processes enable consistency and improvement.

Summary

Documentation quality assurance ensures consistent, reliable content:

  • Define quality standards for accuracy, clarity, consistency
  • Establish processes for review and testing
  • Automate checks for mechanical issues
  • Measure quality through metrics and user feedback
  • Improve continuously based on findings
  • Govern quality through clear roles and gates

Quality does not happen by chance. Systematic QA makes quality reliable and improvable.


This concludes the Fundamentals section. Explore Document Types for guidance on specific documentation formats, or Tools for the software that supports technical writing workflows.