
How AI Powers Explainable and Auditable Medical Coding

By Kacie Geretz, Director of Growth Enablement

June 9, 2025

Key Takeaway: When auditors demand evidence behind your coding decisions, the scramble to reconstruct months-old logic can trigger millions in penalties. This article explores how transparent AI is changing the compliance landscape, creating instant audit trails that eliminate the guesswork and help hospitals defend their billing decisions with confidence.

Skip to Specific Sections

Why is AI Transparency and Auditability So Critical in Medical Coding?

Nym's Approach to Transparent Autonomous Coding

FAQ About AI Explainability in Medical Coding

Put simply, today’s healthcare payers and regulators expect hospitals to pull back the curtain on every billing decision. CMS’s Program Integrity Manual, the False Claims Act’s 60‑day overpayment rule, and the 21st Century Cures Act’s information‑blocking provisions all converge on the same demand: show your work. Hospitals must demonstrate the clinical evidence, guideline logic, and version history behind each ICD‑10 or CPT® code and do so quickly enough to survive audits, denials, and quality‑measure reviews.

Why is AI Transparency and Auditability So Critical in Medical Coding?

When you can’t produce that evidence on demand, the fallout can be swift and expensive:

  • Clawbacks & fines: The Medicare Comprehensive Error Rate Testing (CERT) program calculated over $31 billion in improper payments in 2024, with medical coding errors identified as a key driver (1). A Recovery Audit Contractor (RAC) focused review of just a few miscoded charts can balloon into million‑dollar paybacks, and False Claims Act penalties can triple the bill.
  • Denials: When a payer rejects or “pends” a code assignment in a claim, it can delay cash flow by up to 30 days (2). In addition to delayed payments, the process of appealing a claim (pulling the chart, assembling documentation, writing the letter, and tracking the case) costs hospitals an average of $118 per claim (3).
  • Quality penalties: Inaccurate coding that misrepresents patient risk or safety events contributes to widespread financial impact: over 70% of hospitals incurred penalties in 2025 under Medicare’s value-based programs, and some lost as much as 3% of inpatient revenue (4-7).

Do Traditional Processes Enable Transparent Medical Coding?

Faced with these costly risks, many organizations look to their existing coding approaches for reassurance, only to discover new gaps in transparency. Manual and first‑generation computer‑assisted coding (CAC) workflows often fall short of the transparency that auditors now expect. In purely manual processes, coders may jot only a brief note—or none at all—explaining why a diagnosis or procedure was chosen, which means that months later, a full chart re‑read is required to reconstruct intent during an audit. Early CAC tools speed throughput, but most expose only a probability score, not the underlying rule logic or guideline reference that regulators require.

This is where autonomous medical coding can change the game.

Nym's Approach to Transparent Autonomous Coding

Unlike other autonomous coding solutions that rely mainly on machine learning (ML), Nym’s autonomous medical coding engine leverages a combination of proprietary ML models and rules-based clinical and medical coding ontologies. This approach enables Nym to produce fully transparent, traceable audit trails for every code it assigns.

Nym’s audit trails include:

  • Supporting documentation from the patient encounter, provided side-by-side with every diagnosis code assigned
  • Links to the specific guidelines referenced by the engine to provide transparency into coding decisions
  • Justification for every Evaluation and Management (E/M) level assigned
  • Audit trail provided for both professional fee (profee) and facility coding results (i.e., single-path coded charts)

Preview a Nym audit trail for emergency department facility coding

What Is the Impact of AI-Powered Transparency?

By including each of these components, Nym’s audit trails create a truly comprehensive, actionable resource that RCM staff can turn to in the event of an audit, denial, or other compliance matter.

Nym customers can pull up comprehensive evidence for any coding decision in just a few clicks—no scrambling through charts or reconstructing decision logic months after the fact. The resource-intensive manual processes that once consumed entire teams for audit preparation have been replaced by instant access to detailed, defensible documentation. From audit facilitation to denials management, Nym customers across the country are discovering that transparent coding doesn't just meet regulatory demands—it makes compliance efficient, predictable, and far less painful.

FAQ About AI Explainability in Medical Coding

How does explainable AI work in medical coding?

Explainable AI systems analyze clinical documentation and provide transparent reasoning for each code assignment. Rather than simply generating codes without context, these systems identify relevant information within documentation, apply corresponding coding guidelines, and document their reasoning alongside results, enabling healthcare professionals to understand, verify, and defend coding decisions.
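The pipeline described above (identify relevant documentation, apply a guideline, record the reasoning) can be sketched with a toy keyword-based rule set. This is a hedged illustration only; real engines like Nym's combine ML models with clinical ontologies, and the rules and codes below are simplified examples.

```python
# Toy sketch of an "explainable" code-assignment step: each returned
# code carries the evidence and guideline reference behind it.
# The rule set is illustrative, not a real coding engine.
RULES = {
    "pneumonia": ("J18.9", "ICD-10-CM Official Guidelines, Section I.C.10"),
    "type 2 diabetes": ("E11.9", "ICD-10-CM Official Guidelines, Section I.C.4"),
}

def assign_codes(note: str) -> list[dict]:
    """Return each matched code along with its supporting evidence."""
    results = []
    text = note.lower()
    for phrase, (code, guideline) in RULES.items():
        if phrase in text:
            results.append({
                "code": code,
                "evidence": phrase,      # text that triggered the assignment
                "guideline": guideline,  # reference an auditor can verify
            })
    return results

note = "Assessment: community-acquired pneumonia, improving."
for r in assign_codes(note):
    print(r["code"], "<-", r["evidence"])  # → J18.9 <- pneumonia
```

The contrast with a "black box" system is that the output is not just `J18.9` but `J18.9` plus the documentation excerpt and guideline that justify it, which is what lets a human verify or defend the decision.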

Can auditors trust AI-generated codes with proper documentation?

When AI coding systems provide detailed explanations, supporting evidence, and comprehensive audit trails, auditors can verify AI-generated codes using similar processes as human-coded charts. The key requirement is comprehensive documentation that links each coding decision to specific clinical information and applicable coding guidelines.

What should healthcare organizations look for in explainable AI coding systems?

Healthcare organizations should evaluate AI coding systems based on their ability to provide comprehensive audit trails that include source documentation links, guideline references, decision logic, and quality metrics. Effective systems should generate actionable documentation that supports compliance requirements while integrating seamlessly with existing workflows.

How do explainable AI systems improve coding compliance?

Explainable AI systems improve compliance by providing the detailed documentation and reasoning required for regulatory requirements and audit defense. These systems create comprehensive records of how coding decisions were made, which guidelines were applied, and what clinical information supported each code assignment, directly addressing compliance mandates while enabling proactive identification of potential issues.

Sources:

  1. Centers for Medicare & Medicaid Services. (2024, November). Fiscal Year 2024 Improper Payments Fact Sheet. Retrieved from https://www.cms.gov/newsroom/fact-sheets/fiscal-year-2024-improper-payments-fact-sheet
  2. WhiteSpace Health. (2024, September 23). Payers' claim denials & delays hurt healthcare revenue cycles. WhiteSpace Health Blog. Retrieved from https://whitespacehealth.com/blogs/payers-rising-claim-denials/
  3. Erkan, L., & Beaudoin, B. (2024, April). Key medical coding audit topics compliance auditors should consider. Protiviti Insights. Retrieved from https://www.protiviti.com/sites/default/files/2024-04/insight-key-medical-coding-risks-for-compliance-auditors-2024-protiviti.pdf
  4. Centers for Medicare & Medicaid Services. (n.d.). Hospital value-based purchasing. CMS Medicare Quality Value-Based Programs. Retrieved from https://www.cms.gov/medicare/quality/value-based-programs/hospital-purchasing
  5. Fierce Healthcare. (2023, July 10). CMS' value-based performance programs ding hospitals for health equity factors outside of their control, study finds. Fierce Healthcare. Retrieved from https://www.fiercehealthcare.com/providers/cms-value-based-programs-penalize-hospitals-health-equity-factors-outside-their-control
  6. Watson, S. (2024). How does a DRG determine how much a hospital gets paid? Verywell Health. Retrieved from https://www.verywellhealth.com/how-does-a-drg-determine-how-much-a-hospital-gets-paid-1738874 



Kacie Geretz, Director of Growth Enablement

Kacie Geretz, RHIA, CPMA, CPC, CCA is the Director of Growth Enablement at Nym, where she aligns Nym’s product roadmap with the evolving needs of health system partners and serves as the externally-facing expert on Nym’s autonomous medical coding engine. A graduate of The Ohio State University’s Health Information Management program, Kacie brings deep expertise across the revenue cycle—having led revenue integrity programs, built managed care contracting and credentialing infrastructure, and driven denials and A/R process improvement initiatives. She is passionate about advancing healthcare automation and regularly shares insights on coding innovation and RCM transformation.
