E003 AI failure plan for hallucinations

>Control Description

Document an AI failure plan for hallucinated AI outputs that cause substantial customer financial loss, assigning accountable owners and establishing remediation procedures with third-party support as needed (e.g. legal, PR, insurers)

Application

Mandatory

Frequency

Every 12 months

Capabilities

Text-generation, Voice-generation

>Controls & Evidence (2)

Operational Practices

E003.1
Documentation: AI failure plan for hallucinations

Core - This should include:

- Establishing compensation assessment procedures. For example, loss evaluation methods, settlement approaches, and payment authorization levels with appropriate approval requirements.
- Implementing remediation measures. For example, system freeze capabilities, model adjustments, output validation improvements, customer notification, and enhanced monitoring.

Typical evidence: Can be a standalone document or integrated into existing incident response procedures/policies
Location: AI failure plan
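One way to make the "payment authorization levels with appropriate approval requirements" called for in E003.1 concrete is to encode them as a small lookup. The sketch below is purely illustrative: the dollar thresholds and role names are assumptions, not part of the control.

```python
# Hypothetical payment authorization levels for hallucination-related
# compensation. Each entry pairs a loss ceiling with the role that may
# approve payments up to that ceiling. Thresholds and roles are
# illustrative assumptions only.
APPROVAL_LEVELS = [
    (1_000, "support_lead"),      # losses up to $1,000
    (25_000, "department_head"),  # up to $25,000
    (250_000, "cfo"),             # up to $250,000
]
FALLBACK_APPROVER = "board"       # anything larger

def required_approver(estimated_loss: float) -> str:
    """Return the role that must authorize a compensation payment."""
    for ceiling, role in APPROVAL_LEVELS:
        if estimated_loss <= ceiling:
            return role
    return FALLBACK_APPROVER
```

An organization adopting this pattern would substitute its own thresholds and keep the table under the same document control as the failure plan itself.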
E003.2
Documentation: Additional hallucination failure procedures

Supplemental - This may include:

- Defining hallucination incident types.
- Coordinating potential external support. For example, legal consultation for significant claims, financial review when needed, and insurance coverage activation.

Typical evidence: May include hallucination incident categories (e.g. factual errors, incorrect recommendations), external support contact list (legal counsel, financial reviewers, insurance providers), support engagement procedures, or escalation criteria for involving external parties.
Location: AI failure plan
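The supplemental artifacts E003.2 describes (incident categories plus escalation criteria for engaging external parties) could be captured as simple structured data. In this sketch, every category name, loss threshold, and party is a hypothetical assumption for illustration; the control itself does not prescribe any of them.

```python
# Hypothetical hallucination incident categories (per the example
# categories in the evidence note) and escalation criteria for
# engaging external support. All values are illustrative.
INCIDENT_CATEGORIES = {
    "factual_error": "Output asserts a false fact",
    "incorrect_recommendation": "Output advises a wrong or harmful action",
}

EXTERNAL_SUPPORT = {
    "legal_counsel": {"engage_if_loss_over": 50_000},
    "insurance_provider": {"engage_if_loss_over": 25_000},
    "financial_reviewer": {"engage_if_loss_over": 10_000},
}

def external_parties_to_engage(estimated_loss: float) -> list[str]:
    """Return the external parties whose escalation criteria are met."""
    return [
        party
        for party, rule in EXTERNAL_SUPPORT.items()
        if estimated_loss > rule["engage_if_loss_over"]
    ]
```

Keeping the criteria in one place like this makes the "escalation criteria for involving external parties" auditable alongside the contact list.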

>Cross-Framework Mappings
