Under active development: content is continuously updated and improved · Last updated Feb 18, 2026, 2:55 AM UTC

KSI-AFR-PVA: Persistent Validation and Assessment

Impact levels: LOW · MODERATE

Formerly KSI-AFR-09

>Control Description

Persistently validate, assess, and report on the effectiveness and status of security decisions and policies that are implemented within the cloud service offering in alignment with the FedRAMP 20x Persistent Validation and Assessment (PVA) process, and persistently address all related requirements and recommendations.
Defined terms:
Cloud Service Offering
Persistent Validation
Persistently

>FRMR Requirements (21)

Normative requirements from the FedRAMP Requirements and Recommendations document — 15 mandatory, 3 recommended, 3 optional.

Mandatory (15)
MUST

Persistent Validation

Providers MUST persistently perform validation of their Key Security Indicators; this process is called persistent validation and is part of vulnerability detection.

PVA-CSX-VAL
Providers
MUST

Issues As Vulnerabilities

Providers MUST treat issues detected during persistent validation and failures of the persistent validation process as vulnerabilities, then follow the requirements and recommendations in the FedRAMP Vulnerability Detection and Response process for such findings.

PVA-CSX-FAV
Providers
MUST

Report Persistent Validation

Providers MUST include persistent validation activity in the reports on vulnerability detection and response activity required by the FedRAMP Vulnerability Detection and Response process.

PVA-CSX-RPV
Providers
MUST

Independent Verification and Validation

Providers MUST have the implementation of their goals and validation processes assessed by a FedRAMP-recognized independent assessor OR by FedRAMP directly AND MUST include the results of this assessment in their authorization data without modification.

PVA-CSX-IVV
Providers

The option for assessment by FedRAMP directly is limited to cloud services that are explicitly prioritized by FedRAMP, in consultation with the FedRAMP Board and the federal Chief Information Officers Council. During 20x Phase Two this includes AI services that meet certain criteria as shown at https://fedramp.gov/ai.

FedRAMP-recognized assessors are listed on the FedRAMP Marketplace.

MUST

Non-Machine Validation

Providers MUST complete the validation processes for Key Security Indicators of non-machine-based information resources at least once every 3 months.

PVA-CSX-NMV
Providers
MUST

Persistent Machine Validation

Providers MUST complete the validation processes for Key Security Indicators of machine-based information resources at least once every 3 days.

PVA-CSX-PMV
Varies by level: low MUST · moderate MUST · high SHOULD
Providers
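The two cadence requirements above (every 3 months for non-machine resources, every 3 days for machine resources) reduce to a freshness check against each resource's last validation timestamp. In this sketch the resource names and dates are invented, and "3 months" is approximated as 91 days; a real implementation would pin down that definition.

```python
from datetime import datetime, timedelta, timezone

# Required validation cycles per PVA-CSX-PMV (machine) and PVA-CSX-NMV
# (non-machine); the 91-day figure is an illustrative stand-in for "3 months".
CADENCE = {"machine": timedelta(days=3), "non-machine": timedelta(days=91)}

def overdue(last_validated: datetime, kind: str, now: datetime) -> bool:
    return now - last_validated > CADENCE[kind]

now = datetime(2026, 2, 18, tzinfo=timezone.utc)
resources = [
    ("prod-vpc-config",    "machine",     datetime(2026, 2, 16, tzinfo=timezone.utc)),
    ("admin-group-review", "non-machine", datetime(2025, 10, 1, tzinfo=timezone.utc)),
]
stale = [name for name, kind, ts in resources if overdue(ts, kind, now)]
# The quarterly access review last ran well over 91 days ago, so it is flagged.
```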
MUST

Underlying Processes

Assessors MUST verify and validate the underlying processes (both machine-based and non-machine-based) that providers use to validate Key Security Indicators; this should include at least:

PVA-TPX-UNP
Assessors
  • The effectiveness, completeness, and integrity of the automated processes that perform validation of the cloud service offering's security posture.
  • The effectiveness, completeness, and integrity of the human processes that perform validation of the cloud service offering's security posture.
  • The coverage of these processes within the cloud service offering, including if all of the consolidated information resources listed are being validated.
MUST

Processes Derived from Key Security Indicators

Assessors MUST verify and validate the implementation of processes derived from Key Security Indicators to determine whether or not the provider has accurately documented their process and goals.

PVA-TPX-PDK
Assessors
MUST

Outcome Consistency

Assessors MUST verify and validate whether or not the underlying processes are consistently creating the desired security outcome documented by the provider.

PVA-TPX-OUC
Assessors
MUST

Mixed Methods Evaluation

Assessors MUST perform evaluation using a combination of quantitative and expert qualitative assessment as appropriate AND document which is applied to which aspect of the assessment.

PVA-TPX-MME
Assessors
MUST

Procedure Adherence

Assessors MUST assess whether or not procedures are consistently followed, including the processes in place to ensure this occurs, without relying solely on the existence of a procedure document for assessing if appropriate processes and procedures are in place.

PVA-TPX-PAD
Assessors
MUST

Assessment Summary

Assessors MUST deliver a high-level summary of their assessment process and findings for each Key Security Indicator; this summary will be included in the authorization data for the cloud service offering.

PVA-TPX-SUM
Assessors
MUST NOT

Static Evidence

Assessors MUST NOT rely on screenshots, configuration dumps, or other static output as evidence EXCEPT when evaluating the accuracy and reliability of a process that generates such artifacts.

PVA-TPX-STE
Assessors
MUST NOT

No Overall Recommendation

Assessors MUST NOT deliver an overall recommendation on whether or not the cloud service offering meets the requirements for FedRAMP authorization.

PVA-TPX-NOR
Assessors
MUST

Implementation Summaries

Providers MUST maintain simple high-level summaries of at least the following for each Key Security Indicator:

KSI-CSX-SUM
Providers
  • Goals for how it will be implemented and validated, including clear pass/fail criteria and traceability
  • The consolidated information resources that will be validated (this should include consolidated summaries such as "all employees with privileged access that are members of the Admin group")
  • The machine-based processes for validation and the persistent cycle on which they will be performed (or an explanation of why this doesn't apply)
  • The non-machine-based processes for validation and the persistent cycle on which they will be performed (or an explanation of why this doesn't apply)
  • Current implementation status
  • Any clarifications or responses to the assessment summary
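The bullet list above maps naturally onto a simple record per Key Security Indicator. The field names and example values below are assumptions for illustration; FedRAMP does not mandate a specific schema for these summaries.

```python
from dataclasses import dataclass, field

@dataclass
class KsiSummary:
    """One per-KSI implementation summary, mirroring KSI-CSX-SUM's bullets."""
    ksi_id: str
    goals: str                        # pass/fail criteria and traceability
    information_resources: list[str]  # consolidated resource descriptions
    machine_validation: str           # process + cycle, or why not applicable
    non_machine_validation: str
    status: str                       # current implementation status
    assessment_responses: list[str] = field(default_factory=list)

summary = KsiSummary(
    ksi_id="KSI-AFR-PVA",
    goals="All checks pass when evidence is fresher than the required cycle",
    information_resources=["all employees with privileged access that are "
                           "members of the Admin group"],
    machine_validation="GRC API freshness check, every 3 days",
    non_machine_validation="Access review meeting, quarterly",
    status="implemented",
)
```

Keeping these as structured data rather than prose makes the summaries easy to publish in authorization data and to diff between assessment cycles.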
Recommended (3)
SHOULD

Provide Technical Evidence

Providers SHOULD provide technical explanations, demonstrations, and other relevant supporting information to all necessary assessors for the technical capabilities they employ to meet Key Security Indicators and to provide validation.

PVA-CSX-PTE
Providers
SHOULD

Provider Experts

Assessors SHOULD engage provider experts in discussion to understand the decisions made by the provider and inform expert qualitative assessment, and SHOULD perform independent research to test such information as part of the expert qualitative assessment process.

PVA-TPX-PEX
Assessors
SHOULD

Application within MAS

Providers SHOULD apply ALL Key Security Indicators to ALL aspects of their cloud service offering that are within the FedRAMP Minimum Assessment Scope.

KSI-CSX-MAS
Providers
Optional Guidance (3)
MAY

Receiving Advice

Providers MAY ask for and accept advice from their assessor during assessment regarding techniques and procedures that will improve their security posture or the effectiveness, clarity, and accuracy of their validation and reporting procedures for Key Security Indicators, UNLESS doing so might compromise the objectivity and integrity of the assessment (see also PVA-TPX-AMA).

PVA-CSX-RAD
Providers
MAY

Sharing Advice

Assessors MAY share advice with providers they are assessing about techniques and procedures that will improve their security posture or the effectiveness, clarity, and accuracy of their validation and reporting procedures for Key Security Indicators, UNLESS doing so might compromise the objectivity and integrity of the assessment (see also PVA-CSX-RIA).

PVA-TPX-SHA
Assessors
MAY

AFR Order of Criticality

Providers MAY use the following order of criticality for approaching Authorization by FedRAMP Key Security Indicators for an initial authorization package:

KSI-CSX-ORD
Providers
  • Minimum Assessment Scope (MAS)
  • Authorization Data Sharing (ADS)
  • Using Cryptographic Modules (UCM)
  • Vulnerability Detection and Response (VDR)
  • Significant Change Notifications (SCN)
  • Persistent Validation and Assessment (PVA)
  • Secure Configuration Guide (RSC)
  • Collaborative Continuous Monitoring (CCM)
  • FedRAMP Security Inbox (FSI)
  • Incident Communications Procedures (ICP)

>Trust Center Components (3)

Ways to express your implementation of this indicator — approaches vary by organization size, complexity, and data sensitivity.

From the field: Mature implementations express persistent validation through automated evidence pipelines — GRC platforms collecting machine-generated evidence continuously, compliance dashboards showing control status derived from live system state, and assessment results published as OSCAL artifacts. Per ADS-CSO-CBF, automation must ensure consistency between formats, making point-in-time manual assessments supplementary to continuous automated validation.

Continuous Assessment Dashboard

Dashboards

Dashboard expressing ongoing validation posture — automated compliance checks, evidence freshness, and assessment status as a living view

Automated: GRC platform APIs verify evidence collection recency and completeness

Automated Compliance Evidence

Evidence Artifacts

Machine-generated evidence demonstrating continuous compliance validation — the artifacts that feed the dashboard

Automated: Evidence collection completeness verified via GRC platform APIs

Assessment Cadence and Methodology

Processes & Procedures

How persistent validation is maintained between annual assessments — the cadence and methodology behind automated evidence collection

>Programmatic Queries


CLI Commands

Get overall compliance summary
aws configservice get-compliance-summary-by-config-rule --output table
Check rule evaluation status
aws configservice describe-config-rule-evaluation-status --query "ConfigRulesEvaluationStatus[].{Rule:ConfigRuleName,LastRun:LastSuccessfulEvaluationTime,Started:FirstEvaluationStarted}" --output table
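The compliance summary returned by the first CLI command can be reduced to a single pass/fail signal for a dashboard. The sketch below mirrors the documented response shape of `get-compliance-summary-by-config-rule`; the sample counts are invented, and in practice the dict would come from boto3 or the CLI's JSON output rather than being hard-coded.

```python
# Sample response in the shape documented for the AWS Config API's
# GetComplianceSummaryByConfigRule operation (counts are illustrative).
sample = {
    "ComplianceSummary": {
        "CompliantResourceCount": {"CappedCount": 42, "CapExceeded": False},
        "NonCompliantResourceCount": {"CappedCount": 3, "CapExceeded": False},
    }
}

def summarize(resp: dict) -> tuple[int, int, bool]:
    """Return (compliant, noncompliant, all_green) from a summary response."""
    s = resp["ComplianceSummary"]
    compliant = s["CompliantResourceCount"]["CappedCount"]
    noncompliant = s["NonCompliantResourceCount"]["CappedCount"]
    return compliant, noncompliant, noncompliant == 0

compliant, noncompliant, all_green = summarize(sample)
```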

>20x Assessment Focus Areas

Aligned with FedRAMP 20x Phase Two assessment methodology

Completeness & Coverage:

  • Which KSIs or security controls are currently validated only through point-in-time assessment rather than persistent validation, and what is the plan to close those gaps?
  • How do you ensure persistent validation covers all system components — including those managed by third parties or inherited from IaaS/PaaS providers?
  • Are there security decisions or policies that are not yet subject to automated effectiveness measurement, and how are those exceptions tracked?
  • When new controls or KSIs are added to the assessment scope, how quickly are they incorporated into persistent validation?

Automation & Validation:

  • What happens if a persistent validation check returns a false-positive or false-negative — how do you detect and correct inaccurate results?
  • How do you validate that your validation tools themselves are working correctly (i.e., who watches the watchers)?
  • What automated remediation is triggered when persistent validation identifies a control that is no longer effective?
  • If a validation data source goes offline, how quickly is the gap detected and what interim measures apply?
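One common answer to the "who watches the watchers" question above is canary resources: seed the validation pipeline with resources whose correct verdict is known in advance, so a scanner that drifts into false positives or false negatives is itself caught. All names here are illustrative; the technique is the point, not the specific resources.

```python
# Canary resources with known expected verdicts. The "known-bad" resource is
# deliberately misconfigured; the "known-good" one is a vetted reference.
CANARIES = {
    "canary-known-bad-bucket": "fail",
    "canary-known-good-bucket": "pass",
}

def scanner_healthy(scan_results: dict) -> bool:
    """Return False if any canary's verdict differs from its expectation."""
    return all(scan_results.get(name) == expected
               for name, expected in CANARIES.items())

# A scanner that reports the known-bad canary as passing is malfunctioning,
# and its other results should be treated as suspect until it is fixed.
healthy = scanner_healthy({"canary-known-bad-bucket": "fail",
                           "canary-known-good-bucket": "pass"})
```

A missing canary result (e.g., the data source going offline) also fails this check, which doubles as staleness detection for the pipeline itself.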

Inventory & Integration:

  • What tools compose your persistent validation stack (CSPM, CWPP, compliance-as-code, custom scripts), and how do they feed into a unified view?
  • How does persistent validation data integrate with your ADS and CCM reporting to FedRAMP?
  • Are there resources or environments (e.g., staging, DR sites) that are not covered by your validation tooling, and how do you account for them?
  • How do validation results flow into your risk register or GRC platform for tracking and decision-making?

Continuous Evidence & Schedules:

  • How do you demonstrate that persistent validation is truly continuous rather than just running on a daily or weekly batch schedule?
  • Is validation evidence available in machine-readable format via API, or does it require manual export and formatting for assessors?
  • How do you detect when the gap between your reported security posture and actual control effectiveness is widening?
  • What evidence shows the cadence and results of persistent validation activities over the past 90 days?

Update History

2026-02-04: Removed italics and changed the ID as part of new standardization in v0.9.0-beta; no material changes.
