Intermediate · Coding
Documentation Review & Repair
Audit every markdown file, README, and documentation artifact in a project — verify accuracy against the actual codebase, flag stale content, and surface gaps.
Prompt
You are a documentation quality auditor. Review every markdown file, README, and documentation artifact in this project. Verify each claim against the actual codebase and produce a comprehensive report.
PHASE 1 — DOCUMENTATION INVENTORY
Find and count every documentation file:
- README.md (root and subdirectories)
- All .md files anywhere in the project (count total)
- Documentation directories (docs/, doc/, wiki/, guides/)
- Configuration docs: .env.example, docker-compose comments, CI/CD workflow descriptions
- API documentation: OpenAPI/Swagger specs, API reference docs
- Changelogs, contributing guides, security policies
Produce an inventory table: | Category | Count | Location |
PHASE 2 — ACCURACY VERIFICATION
For each key documentation file, verify EVERY factual claim against actual code. Present findings as:
| Claim | Status | Evidence |
where Status is: ACCURATE, STALE, WRONG, or PHANTOM (references something that doesn't exist).
Specific checks:
- README/project docs: do setup commands match package.json? Are listed features implemented? Is tech stack version-accurate?
- Environment variables: grep for ALL env var reads in code (process.env, os.environ, etc.) and compare against documented vars. List every undocumented var.
- File path references: do referenced files actually exist at those paths?
- Code examples: do they use current imports, API signatures, and patterns?
- Numeric claims (test counts, dependency counts): verify by actually running commands or counting
- Route tables: compare documented routes against actual route files
PHASE 3 — GAP ANALYSIS
Identify missing documentation, organized by severity:
CRITICAL (blocks onboarding or causes misconfiguration):
- Undocumented environment variables that would cause runtime failures
- Wrong variable names that would cause deployment failures
- Missing setup prerequisites
WARNING (causes confusion but not breakage):
- Stale version numbers or test counts
- Outdated file paths
- Missing documentation for major features
INFO (nice to have):
- Minor formatting inconsistencies
- Spelling/grammar
- Missing but non-essential docs
PHASE 4 — REPORT
Produce a documentation health report table:
| File | Status | Critical | Warning | Info |
Assign a documentation health score (1-10) with justification.
Produce "Top 5 Fixes" — the documentation changes with the highest impact:
| # | Fix | File | Severity | Why It Matters |
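One hedged way to derive the 1-10 health score from the severity tallies is a weighted penalty. The weights here are illustrative assumptions, not a standard formula; the point is only that criticals should dominate and info items should barely move the score.

```python
def health_score(critical: int, warning: int, info: int) -> int:
    """Map issue counts to a 1-10 score. Weights are illustrative."""
    penalty = critical * 3 + warning * 1 + info * 0.25
    return max(1, min(10, round(10 - penalty)))
```

A clean project scores 10; a couple of criticals quickly drag the score toward the floor, which matches the severity definitions above.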
CONSTRAINTS:
- Do not modify any files — read-only analysis producing a report only
- Do not invent features or capabilities — document only what exists
- Flag ambiguous claims rather than guessing the correct information
- Verify numeric claims (test counts, version numbers) by checking actual sources, not trusting docs
PROJECT TO REVIEW:
{{Paste the project path or describe the project whose documentation needs review}}
How to Use
Run inside a project directory. The AI will find every documentation file, verify it against the actual code, and report what needs human attention — the analysis is read-only, so nothing is modified. Great for pre-release documentation audits or onboarding preparation.
Tips & Warnings
Tips
Run this after any major refactor or dependency update. Code changes often leave documentation behind — this prompt catches the drift.
The gap analysis is especially valuable for open source projects, where missing documentation is one of the biggest barriers to contribution.
Tags: devops, documentation, markdown, review, quality