Engineering

Building a Requirements Coverage Matrix That Satisfies Auditors

A coverage matrix is the single artifact auditors scrutinize most during a DO-178C or ISO 26262 audit. This guide explains what it is, how coverage is correctly calculated, what 94%+ coverage looks like in practice, and how to stop building it by hand.

March 24, 2026 · 8 min read

What a Requirements Coverage Matrix Is

A requirements coverage matrix (RCM) is a structured document — or a live view generated by a tool — that maps every requirement to its corresponding verification evidence. At minimum, it shows:

  • Requirement identifier and title
  • Requirement type or level (system, HLR, LLR)
  • Verification method (test, analysis, inspection, demonstration)
  • The identifier of the test case, analysis report, or inspection record that provides the evidence
  • Verification status (passed, failed, not run, not applicable)

In a well-maintained system, every row in the matrix corresponds to a leaf-level requirement — a requirement with no children. Parent requirements (requirements that exist solely to structure child requirements) are not directly verified; they are considered covered when all their children are covered.
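As a sketch, a single leaf-level row of the matrix could be modeled as a small record. The field names below are illustrative, not a standard schema:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative RCM row; field names are assumptions, not from any standard.
@dataclass
class RcmRow:
    req_id: str                  # e.g. "SYS-001.1.1"
    title: str
    level: str                   # "system", "HLR", or "LLR"
    method: str                  # "test", "analysis", "inspection", "demonstration"
    evidence_id: Optional[str]   # test case / report identifier; None if no evidence yet
    status: str                  # "passed", "failed", "not run", "not applicable"

row = RcmRow("SYS-001.1.1", "±0.5° roll accuracy in level flight",
             "LLR", "test", "T-042", "passed")
```

Any row whose `evidence_id` is `None` is, by definition, a coverage gap.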

Leaf Nodes vs. Parent Nodes: The Most Common Miscount

The single most frequent mistake engineers make when computing coverage is counting parent nodes. If your requirement tree looks like this:

  • SYS-001 Autopilot System (parent)
  • SYS-001.1 Attitude Hold (parent)
  • SYS-001.1.1 ±0.5° roll accuracy in level flight (leaf)
  • SYS-001.1.2 ±0.5° pitch accuracy in level flight (leaf)

Then your denominator for coverage is 2, not 4. SYS-001 and SYS-001.1 are structural nodes — they exist to organize, not to be directly verified. Including them in coverage calculations artificially inflates the denominator, making your coverage percentage lower than it actually is, which can cause alarm during audits for the wrong reason.
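The leaf-only rule can be sketched in a few lines. The tree below mirrors the hypothetical SYS-001 example above; a node with no children is a leaf:

```python
# Hypothetical requirement tree: requirement -> list of child requirements.
tree = {
    "SYS-001":     ["SYS-001.1"],
    "SYS-001.1":   ["SYS-001.1.1", "SYS-001.1.2"],
    "SYS-001.1.1": [],
    "SYS-001.1.2": [],
}
verified = {"SYS-001.1.1"}  # requirements with passed verification evidence

# Only leaves enter the calculation; parents are structural.
leaves = [req for req, children in tree.items() if not children]
coverage = sum(1 for req in leaves if req in verified) / len(leaves)
print(f"{coverage:.0%}")  # denominator is 2 (the leaves), not 4
```

Counting all four nodes here would report 25% instead of the correct 50%, exactly the kind of artificial deflation described above.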

Conversely, some teams exclude "informational" requirements from coverage counts because they feel those requirements don't need test cases. This is a risk unless those exclusions are explicitly documented with a rationale that your DER agrees with. Undocumented exclusions are findings.

What 94%+ Coverage Looks Like

Ninety-four percent is not a magic number — certification programs do not have a minimum threshold written into the standard. What matters is that your coverage evidence is complete, consistent, and defensible. However, in practice, experienced DERs expect to see coverage in the high 90s for DAL A and DAL B software, with every gap (every unverified requirement) having an explicit, documented rationale.

A realistic coverage breakdown for a mature DAL B avionics program might look like:

  • 97% of HLRs: covered by verified test cases
  • 2% of HLRs: covered by accepted analysis (e.g., timing analysis, fault tree analysis)
  • 1% of HLRs: not yet verified — these are in the audit finding list with a planned verification date

The 1% that is not covered is not automatically a problem — as long as it is tracked, assigned, and has a closure plan. What is a problem is discovering untracked gaps one week before a milestone review.

Typical Auditor Questions

Understanding what a DER or certification authority reviewer will ask helps you design your matrix to answer those questions without additional back-and-forth:

"Show me all requirements with no verification evidence." This is the primary coverage gap query. Your tool or matrix should produce this list in under 30 seconds. If the answer requires a manual spreadsheet scan, you are losing time and credibility.

"For requirement X, what test cases verify it?" Forward trace query. Every leaf requirement should map to one or more test case identifiers. If a requirement links only to other requirements (no terminal test), it is not verified.

"For test case T-042, which requirement does it verify?" Backward trace query. Orphaned test cases — tests with no requirement link — are a red flag. They suggest that verification effort is happening outside the controlled requirements baseline.

"What changed between Baseline 3 and Baseline 4?" Diff query. Auditors want to understand what changed between review milestones. Snapshot comparison features that highlight added, removed, and modified requirements — along with any coverage changes — are essential for answering this cleanly.

"Were any requirements modified after their initial verification? Were the tests re-run?" Suspect link query. If a requirement's acceptance criteria changed after its test case was approved, the test case may no longer be valid. Suspect link tracking answers this automatically.

Why Manual Matrices Collapse

The fundamental problem with Excel-based coverage matrices is that they are static snapshots disconnected from the live requirement state. The moment a requirement changes — its title, its description, its parent — the matrix is stale. Engineers manually updating cross-references in a 300-row spreadsheet will inevitably miss something. And that something will be exactly what an auditor asks about.

Beyond staleness, manual matrices have no enforcement mechanism. You can write "verified" in column F without actually having a test case. There is no link, no reference, no evidence chain — just a word in a cell. Experienced DERs know this and will ask to see the underlying test records. If those records cannot be produced or traced back to the specific requirement, the "verified" status is meaningless.

Automating the Matrix with a Dedicated Tool

A purpose-built requirements traceability tool changes the coverage matrix from a document you maintain to a view you query. Key capabilities that eliminate manual work:

Live coverage computation. The tool counts leaf nodes, counts those with at least one verification link in "passed" status, and computes the percentage. This number updates the moment any link or verification status changes — no batch refresh required.

Gap list on demand. A single query surfaces every leaf requirement with zero verification coverage. During a certification run, this list should be empty. During development, it is your daily work queue.

Snapshots for milestone evidence. Before every major design review or milestone submission, take a snapshot. The snapshot freezes the coverage state — who verified what, when. If an auditor asks about coverage as of a specific date, you return the snapshot rather than reconstructing state from change logs.
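Conceptually, a snapshot is just a deep, immutable copy of the live matrix keyed by a milestone name, so later edits can never mutate the frozen record. A minimal sketch, with an invented matrix shape:

```python
import copy
from datetime import date

live_matrix = {"SYS-001.1.1": {"status": "passed", "evidence_id": "T-042"}}
snapshots = {}

def take_snapshot(name):
    # Deep-copy so subsequent edits to the live matrix never touch the snapshot.
    snapshots[name] = copy.deepcopy(live_matrix)

take_snapshot(f"CDR Baseline {date(2026, 4, 1).isoformat()}")
live_matrix["SYS-001.1.1"]["status"] = "failed"  # later change
print(snapshots["CDR Baseline 2026-04-01"]["SYS-001.1.1"]["status"])  # 'passed'
```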

Audit log for every change. Every modification to a requirement, every link creation or deletion, every verification status change is logged with a timestamp and the identity of the engineer who made it. The audit log is the legal record that your process was followed.

PDF compliance report export. For submission artifacts, the tool should be able to generate a formatted coverage matrix as a PDF — with your organization name, project name, baseline version, and date. This becomes part of the Software Accomplishment Summary (SAS) evidence package.

Building the Matrix Right the First Time

If you are starting a new program, the practical setup sequence is:

  1. Import or enter all requirements into the tree before the first design review. Do not start creating test cases until the requirement baseline is stable — test cases written against unstable requirements create suspect links immediately.
  2. Define your verification method for each leaf requirement when the requirement is written, not after. This forces the engineer writing the requirement to consider how it will be verified — which often surfaces untestable requirements early.
  3. Link test cases to requirements as the test cases are created, not in a batch at the end of a test phase. Batch linkage at the end of a phase is where errors concentrate.
  4. Run the coverage report weekly during active development. Treat any requirement with 0% coverage as a P1 issue if you are within 60 days of a review milestone.
  5. Take a named snapshot at every major milestone. Name it something unambiguous: "CDR Baseline 2026-04-01" — not "Baseline v3 Final Final."

A coverage matrix built this way — incrementally, linked in real time, audited continuously — is not the deliverable you scramble to assemble before a review. It is the living evidence of your engineering process. That is exactly what certification authorities want to see.


Ready to modernize your requirements process?

Reqlume gives aerospace engineering teams bidirectional traceability, live coverage dashboards, and compliance-ready exports — without the complexity of legacy tools.