The Science of Confabulation Elimination: Toward Hallucination-Free AI-Generated Clinical Notes

The rapid adoption of ambient AI for clinical documentation speaks to the tangible benefits experienced by clinicians and demonstrates the trust placed in these technologies. At Abridge, we believe it is our responsibility to continuously improve the AI platform to ensure safety, accuracy, and quality. That is why we have developed a novel system for describing and eliminating “confabulations” or “unsupported claims” (also known as “hallucinations”) in AI-generated documentation.

Concern about the accuracy of clinical documentation long predates the introduction of ambient AI into healthcare workflows. Medical informaticists have been studying documentation quality for decades. Despite these efforts, a 2020 survey of nearly 30,000 patients with access to their own ambulatory visit notes found that 21% reported a perceived mistake in their notes, and 42% of those patients characterized the mistake as serious.

With AI, there is an opportunity to reduce errors in medical documentation to the point of near-elimination. Advances in AI now make it possible to exceed prior standards of accuracy.

In this whitepaper, you will learn:

  • How Abridge categorizes “hallucinations” of different types and severity
  • How purpose-built AI models examine every claim in generated documentation and fix or remove unsupported claims
  • How Abridge ensures that these systems are effective and safe

Please fill out the form to download the whitepaper.

Download the Whitepaper

By registering, you agree to Becker's Healthcare terms of service and privacy policy.
