
D001 Prevent hallucinated outputs

>Control Description

Implement safeguards or technical controls to prevent hallucinated outputs.

Application

Mandatory

Frequency

Every 12 months

Capabilities

Text-generation, Voice-generation

>Controls & Evidence (3)

Technical Implementation

D001.1
Config: Groundedness filter

Core - This should include:

- Implementing factual accuracy controls. For example, deploying available fact-checking mechanisms and flagging uncertain or low-confidence responses.

Typical evidence: Screenshot of code or configuration showing groundedness validation - may include filters checking responses against source documents, fact-checking API integration, or logic comparing generated content to retrieved context for factual accuracy.
Location: Engineering Code
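
One way to satisfy this evidence item is a post-generation filter that compares each generated sentence against the retrieved source context. The sketch below is a minimal, hypothetical illustration (the function names, the content-word overlap heuristic, and the 0.5 threshold are all assumptions, not part of the control); a production groundedness filter would typically use an NLI model or a fact-checking API instead of lexical overlap.

```python
import re


def groundedness_score(sentence: str, context: str) -> float:
    """Fraction of the sentence's content words that also appear in the context."""
    words = {w for w in re.findall(r"[a-z]+", sentence.lower()) if len(w) > 3}
    ctx = set(re.findall(r"[a-z]+", context.lower()))
    if not words:
        return 1.0  # nothing substantive to check
    return len(words & ctx) / len(words)


def filter_response(response: str, context: str, threshold: float = 0.5) -> list[str]:
    """Split a response into sentences and flag those poorly grounded in context."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", response.strip()):
        if sentence and groundedness_score(sentence, context) < threshold:
            flagged.append(sentence)
    return flagged
```

Flagged sentences could then be suppressed, rewritten, or surfaced to the user with a warning, whichever the deployment chooses; a screenshot of this logic in the codebase would be the kind of evidence the item describes.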
D001.2
Demonstration: User-facing citations & source attributions

Core - This should include:

- Establishing information source validation. For example, requiring citations for factual claims and implementing source reliability checks.

Typical evidence: Screenshot of UI or output format showing citations and source attributions provided to users - may include inline citations, source links, reference lists, or attribution labels identifying where information originated.
Location: Product
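
A product-side implementation of this item pairs each generated claim with the source it was drawn from and renders inline numeric citations plus a reference list. The sketch below is a hypothetical illustration (the `Source` type, the claim/source pairing, and the output format are all assumptions about how a given product might do this):

```python
from dataclasses import dataclass


@dataclass
class Source:
    source_id: str
    title: str
    url: str


def render_with_citations(claims: list[tuple[str, Source]]) -> str:
    """Render claims with inline numeric citations and a trailing source list."""
    lines: list[str] = []
    refs: list[str] = []
    seen: dict[str, int] = {}  # source_id -> citation number
    for text, src in claims:
        if src.source_id not in seen:
            seen[src.source_id] = len(seen) + 1
            refs.append(f"[{seen[src.source_id]}] {src.title} ({src.url})")
        lines.append(f"{text} [{seen[src.source_id]}]")
    return "\n".join(lines) + "\n\nSources:\n" + "\n".join(refs)
```

A screenshot of output in this shape, with citations resolvable to the underlying documents, matches the evidence described above.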
D001.3
Demonstration: User-facing uncertainty labels

Supplemental - This may include:

- Maintaining uncertainty communication. For example, displaying confidence levels and providing appropriate disclaimers for generated information.

Typical evidence: Screenshot of UI or output format showing confidence levels, uncertainty disclaimers, or warnings for generated information - may include confidence score displays, low-certainty warnings, or standard disclaimers about potential inaccuracies.
Location: Product
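
Uncertainty labeling can be as simple as mapping a model confidence score to a user-facing label appended to the answer. The sketch below is a hypothetical example; the thresholds (0.9 and 0.6) and label wording are assumptions a real product would tune to its own calibration data:

```python
def uncertainty_label(confidence: float) -> str:
    """Map a confidence score in [0, 1] to a user-facing uncertainty label."""
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be in [0, 1]")
    if confidence >= 0.9:
        return "High confidence"
    if confidence >= 0.6:
        return "Moderate confidence - verify important details"
    return "Low confidence - this answer may be inaccurate"


def annotate(answer: str, confidence: float) -> str:
    """Append the uncertainty label to the generated answer."""
    return f"{answer}\n[{uncertainty_label(confidence)}]"
```

A UI screenshot showing such labels or disclaimers alongside generated answers is the typical evidence for this supplemental item.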

>Cross-Framework Mappings
