
KSI-CMT-VTD: Validating Throughout Deployment

Impact levels: LOW, MODERATE

Formerly KSI-CMT-03

>Control Description

Automate persistent testing and validation of changes throughout deployment.
Defined terms: Persistent Validation, Persistently

>NIST 800-53 Controls

>Trust Center Components (3)

Ways to express your implementation of this indicator — approaches vary by organization size, complexity, and data sensitivity.

From the field: Mature implementations express testing rigor through pipeline metrics — unit, integration, security, and acceptance test pass rates published as dashboard indicators. Test gates are enforced in CI/CD pipelines, with coverage thresholds blocking deployment when testing falls below acceptable levels.
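A minimal sketch of such a coverage gate, assuming a Cobertura-style coverage.xml and a hypothetical 80% threshold; the parsing step varies by test framework:

#!/usr/bin/env bash
# Coverage gate: block the deploy step when line coverage falls below a threshold.
# coverage.xml and the 80% threshold are illustrative assumptions.
set -euo pipefail

THRESHOLD=80  # hypothetical minimum acceptable line coverage, in percent
# Cobertura reports line-rate as a 0..1 fraction on the root element.
coverage=$(python3 -c "import xml.etree.ElementTree as ET; print(round(float(ET.parse('coverage.xml').getroot().get('line-rate')) * 100))")

echo "Line coverage: ${coverage}% (threshold: ${THRESHOLD}%)"
if [ "$coverage" -lt "$THRESHOLD" ]; then
  echo "Coverage below threshold; blocking deployment." >&2
  exit 1
fi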

Test Coverage Reports (Evidence Artifacts)

Automated testing coverage reports expressing validation rigor, generated from CI/CD pipelines with pass rates and coverage metrics.

Automated: CI/CD APIs verify test suites ran and passed before deployment.
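One way a deploy job can make that verification itself, sketched with the gh CLI; the workflow name tests.yml and the branch are placeholders:

#!/usr/bin/env bash
# Pre-deploy check: confirm the latest run of the test workflow on main succeeded.
# "tests.yml" is an assumed workflow name; substitute your own.
set -euo pipefail

conclusion=$(gh run list --workflow tests.yml --branch main --limit 1 \
  --json conclusion --jq '.[0].conclusion')

if [ "$conclusion" != "success" ]; then
  echo "Latest test run concluded '${conclusion}'; refusing to deploy." >&2
  exit 1
fi
echo "Test suite passed; proceeding with deployment."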

Pre-Production Environment Architecture (Architecture & Diagrams)

Architecture expressing the staging/pre-production environments used for change validation; shows environment parity with production.

Testing and Validation Framework (Processes & Procedures)

How changes are validated before production deployment: testing pyramid, security scan requirements, and acceptance criteria.
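A skeleton of such a framework as an ordered pipeline script; the make targets are illustrative stand-ins for whatever test and scan tooling you actually run:

#!/usr/bin/env bash
# Ordered validation gates; any failure stops the pipeline before deployment.
set -euo pipefail

make test-unit           # fast, broad base of the testing pyramid
make test-integration    # service-to-service contracts
make scan-security       # SAST, DAST, dependency, and secret scanning
make test-acceptance     # acceptance criteria against staging

echo "All validation gates passed; change is eligible for deployment."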

>Programmatic Queries (Beta)

CLI Commands (CI/CD)

List workflow runs with status
gh run list --json name,status,conclusion,createdAt --limit 20
Check required status checks for a commit
gh api repos/{owner}/{repo}/commits/<sha>/check-suites --jq '.check_suites[] | {app: .app.name, status: .status, conclusion: .conclusion}'
View a specific workflow run
gh run view <run-id> --json jobs --jq '.jobs[] | {name,status,conclusion,startedAt,completedAt}'
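These queries can enforce as well as report. For example, a deploy step can block on a validation run and inherit its result (the run ID stays a placeholder):

# Wait for the validation run to finish; exit non-zero if it failed,
# which in turn fails the calling deploy job.
gh run watch <run-id> --exit-status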

>20x Assessment Focus Areas

Aligned with FedRAMP 20x Phase Two assessment methodology

Completeness & Coverage:

  • Does automated testing cover all deployment stages — build, integration, staging, canary, and production — or are some stages validated only manually?
  • How do you ensure test coverage includes security validation (SAST, DAST, dependency scanning) in addition to functional and performance testing?
  • Are infrastructure changes (Terraform, CloudFormation) subject to the same automated validation pipeline as application code changes? (A required-checks sketch follows this list.)
  • When a new type of deployment artifact is introduced (e.g., a new microservice, a new cloud resource type), how do you ensure validation tests are created before the first deployment?
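
One hedged way to evidence the infrastructure question above: list the status checks required by branch protection and confirm infrastructure checks appear alongside application checks. The context names below are hypothetical:

# Status checks a commit must pass before merging to main.
# Seeing e.g. "terraform-validate" next to "unit-tests" would show infra
# changes gated by the same pipeline; both names are illustrative.
gh api repos/{owner}/{repo}/branches/main/protection/required_status_checks \
  --jq '.contexts[]'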

Automation & Validation:

  • What happens when an automated validation gate fails — is the deployment blocked, rolled back, or only flagged, and what evidence shows the gate is enforced?
  • How do you detect if automated tests themselves are broken, flaky, or silently passing when they should fail?
  • What automated rollback or circuit-breaker mechanism activates when post-deployment validation detects a regression in production? (A circuit-breaker sketch follows this list.)
  • How do you validate that security-specific tests (e.g., scanning for exposed secrets, misconfigured permissions) actually catch real issues — do you run red-team or mutation testing against the pipeline?
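
A minimal circuit-breaker sketch for the rollback question above, assuming a hypothetical health endpoint and a rollback workflow named rollback.yml:

#!/usr/bin/env bash
# Post-deployment validation: probe a health endpoint, roll back on failure.
# The URL, retry budget, and rollback workflow name are all assumptions.
set -euo pipefail

for attempt in 1 2 3 4 5; do
  if curl --fail --silent --max-time 5 https://example.com/healthz > /dev/null; then
    echo "Post-deploy validation passed."
    exit 0
  fi
  sleep 10
done

echo "Health checks still failing after deploy; triggering rollback." >&2
gh workflow run rollback.yml --ref main
exit 1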

Inventory & Integration:

  • What CI/CD platforms (GitHub Actions, GitLab CI, Jenkins, ArgoCD) run your validation pipeline, and how do you ensure all deployable artifacts pass through them?
  • How do test results from different stages and tools aggregate into a single pass/fail decision for each deployment? (An aggregation sketch follows this list.)
  • Are deployment validation results integrated with your change management records so every change ticket links to its test results?
  • How do you track test coverage metrics across the entire deployment pipeline, and are there components with no automated tests?
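
A sketch of the aggregation question above, folding every check run on a commit into a single pass/fail value with the gh CLI:

# Aggregate all check runs on a commit into one boolean gate decision.
# Treats "skipped" as passing; tighten that rule to suit your policy.
gh api repos/{owner}/{repo}/commits/<sha>/check-runs \
  --jq '[.check_runs[].conclusion] | all(. == "success" or . == "skipped")'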

Continuous Evidence & Schedules:

  • How do you demonstrate that every production deployment in the past 90 days passed through the full automated validation pipeline? (A query sketch follows this list.)
  • Is deployment validation history (test results, gate decisions, rollback events) available via API or structured logs?
  • How do you measure and demonstrate that test coverage and validation rigor are improving over time rather than degrading?
  • What evidence shows that failed validation gates actually prevented problematic changes from reaching production?
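
A sketch of the 90-day question above, filtering recent runs client-side; "deploy.yml" and the cutoff date are placeholders for your own pipeline:

# Count recent deploy-workflow runs that did not conclude successfully.
gh run list --workflow deploy.yml --limit 100 --json conclusion,createdAt \
  --jq '[.[] | select(.createdAt > "2026-01-01") | select(.conclusion != "success")] | length'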

Update History

2026-02-04: Removed italics and changed the ID as part of new standardization in v0.9.0-beta; no material changes.
