How to Bring AI Into HIPAA Compliance With Confidence

Discover why AI tools must be included in HIPAA policies, risk analyses, and vendor management.


Introduction

Artificial intelligence is showing up across the healthcare ecosystem at a remarkable pace. From clinical decision support and real-time documentation help to automated scheduling, revenue-cycle automation, and generative AI copilots, these tools are quickly becoming part of everyday workflows. Recent industry surveys indicate that a large share of healthcare organizations are either piloting or actively using AI tools today, including generative AI applications that didn’t exist in most environments just a year or two ago.

What hasn’t changed is the regulatory expectation: HIPAA still applies. And as the technology evolves, regulators and accreditors are sharpening their focus on how AI interacts with electronic protected health information (ePHI).

Regulators Are Signaling New Expectations

The proposed updates to the HIPAA Security Rule underscore a key message: healthcare organizations need a complete, accurate, and continuously updated understanding of the technology they use.

Two elements matter especially for AI:

1. A Written Technology Asset Inventory

The proposal calls for organizations to maintain a documented inventory of any technologies that create, receive, maintain, or transmit ePHI. That includes:

  • AI-enhanced software already built into existing tools
  • Stand-alone AI applications
  • Generative AI tools used for documentation, summarization, or triage
  • Third-party AI services integrated through APIs or plugins
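
To keep that inventory reviewable, each of these can be captured as a structured entry that links back to its risk analysis. The sketch below is a minimal illustration in Python, using assumed, hypothetical field names (name, vendor, handles_ephi, and so on); it is not a prescribed HIPAA schema, and a spreadsheet with the same columns works just as well.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AIAssetRecord:
    """One entry in a written technology asset inventory (illustrative fields only)."""
    name: str                       # e.g., "Ambient documentation assistant"
    vendor: str                     # who provides or hosts the tool
    category: str                   # "embedded feature", "stand-alone app", "API integration", ...
    handles_ephi: bool              # does it create, receive, maintain, or transmit ePHI?
    integration_points: List[str] = field(default_factory=list)  # EHR, messaging, scheduling, ...
    baa_in_place: bool = False      # is a business associate agreement signed?
    risk_analysis_id: str = ""      # link to the corresponding risk analysis record

# Hypothetical example entry
inventory = [
    AIAssetRecord(
        name="Generative note summarizer",
        vendor="ExampleVendor (hypothetical)",
        category="embedded feature",
        handles_ephi=True,
        integration_points=["EHR"],
        baa_in_place=True,
        risk_analysis_id="RA-2025-014",
    ),
]
```

Whatever the format, the key is that every AI tool that touches ePHI appears somewhere in the inventory and points to a corresponding risk analysis.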

2. A Documented, Repeatable Risk Analysis

Organizations must understand the risks associated with each tool—especially when new technologies, like AI, are introduced into clinical or administrative workflows. A risk analysis should evaluate:

  • What data the AI tool can access
  • How the data flows in and out of the system
  • Whether PHI is used to train or improve models
  • Gaps or vulnerabilities created by new workflows
  • Technical safeguards like encryption, access controls, and audit logs
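
One way to make those questions repeatable is to record the answers for each tool in a consistent checklist format. The example below is a simplified sketch with an assumed structure and hypothetical values; an actual risk analysis will be more detailed and should follow your organization’s documented methodology.

```python
# Simplified risk-analysis record for one AI tool (illustrative structure and values only)
ai_risk_analysis = {
    "tool": "Generative note summarizer (hypothetical)",
    "data_accessed": ["patient demographics", "encounter notes"],         # what the tool can access
    "data_flow": "EHR -> vendor cloud API -> EHR, encrypted in transit",  # how data moves in and out
    "phi_used_for_model_training": False,                                 # confirmed in writing with the vendor
    "new_workflow_gaps": [
        "staff could paste notes into unapproved tools outside this workflow",
    ],
    "technical_safeguards": {
        "encryption_at_rest": True,
        "access_controls": "role-based, tied to EHR login",
        "audit_logging": True,
    },
    "residual_risk": "low",
    "next_review": "annually, or when the tool or workflow changes",
}
```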


At the same time, new guidance from The Joint Commission and the Coalition for Health AI (CHAI) emphasizes governance, privacy, security, and monitoring as essential to safe and trustworthy AI adoption. The common message across these groups: AI use must be intentional, well-controlled, and transparently managed.

AI Should Not Live Outside Your HIPAA Program

For many healthcare organizations—and the consultants, IT teams, and compliance partners who support them—the takeaway is clear:

AI can’t be treated as a separate experiment. It needs to be fully integrated into existing HIPAA compliance policies and procedures.

That begins with two foundational steps:

1. Include AI Systems in the Security Risk Analysis

Every AI system that touches ePHI should be evaluated just like any other information system. This includes tools that:

  • Generate clinical summaries
  • Support diagnostic decision-making
  • Automate coding or documentation
  • Process patient messages or forms
  • Analyze operational data
  • Help staff manage communication or scheduling


If it handles ePHI in any way, it belongs in the risk analysis.

2. Add AI Tools to the Asset Inventory

An accurate technology inventory is the backbone of a strong HIPAA program. As organizations adopt new AI features, sometimes bundled into existing platforms, those tools should be added to the inventory and linked to the corresponding risk analysis and mitigation strategies.

Set Clear Rules About What AI Tools Are Approved

As AI becomes more accessible, staff may try tools that weren’t vetted or approved, especially consumer-grade generative AI platforms. Organizations need clear, written policies defining:

  • Which AI tools are approved for clinical or operational workflows
  • Where PHI is allowed (enterprise-grade, secure AI environments)
  • Where PHI is not allowed (public chatbots, non-compliant apps)
  • Standard operating procedures for using approved AI tools safely
  • Documentation requirements for AI-mediated actions


For example, an organization may explicitly prohibit entering patient information into public generative AI tools, while approving use of a secure, HIPAA-aligned AI documentation assistant within the EHR.
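
One lightweight way to make that policy enforceable is to keep the approved-tool list in a machine-readable form that security or help-desk staff (or automated controls) can check against. The sketch below is a simplified, hypothetical example; the tool names and the default-deny behavior are assumptions for illustration, not an endorsed implementation.

```python
# Hypothetical approved-AI-tool registry: tool name -> whether PHI may be entered
APPROVED_AI_TOOLS = {
    "ehr_documentation_assistant": True,    # enterprise-grade, HIPAA-aligned, BAA in place
    "internal_scheduling_copilot": False,   # approved for use, but de-identified data only
}

def phi_allowed(tool_name: str) -> bool:
    """Return True only if the tool is both approved and cleared for PHI.

    Unknown tools (for example, public chatbots) default to not allowed.
    """
    return APPROVED_AI_TOOLS.get(tool_name, False)

print(phi_allowed("ehr_documentation_assistant"))  # True
print(phi_allowed("public_chatbot"))               # False: unlisted tools are denied by default
```

The design choice worth noting is the default: anything not on the approved list is treated as "no PHI allowed," which mirrors the written policy and helps prevent shadow IT.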

These policies help protect patients, reduce risk, and prevent shadow IT.

Vendor Management Must Evolve With AI

Many AI tools rely on external vendors, cloud services, or model providers. When these vendors handle PHI, even temporarily, they are considered business associates under HIPAA. That means organizations must have:

  • Signed Business Associate Agreements (BAAs)
  • Contract language addressing AI-specific concerns
  • Transparency around how data is used or retained
  • Clear statements on whether PHI is used to train models
  • Defined security controls and incident-response expectations


Stronger vendor management not only reduces risk but also gives healthcare organizations confidence that their data is protected across the full AI ecosystem.

The Goal Is Not To Slow AI Down: It’s To Make AI Safer and More Scalable

Healthcare organizations are eager to adopt AI because the benefits are real: reduced administrative burden, improved documentation, more efficient workflows, and the potential for better patient outcomes.

Integrating AI into the HIPAA program is not about adding friction. It’s about ensuring:

  • Clear documentation
  • Transparent governance
  • Secure data handling
  • Consistent risk monitoring
  • Trust among patients, providers, and regulators


A strong, proactive compliance foundation makes AI adoption faster, because the guardrails are already in place.

Preparing for What Comes Next

Expectations around AI governance in healthcare will continue to rise. Regulators, accreditors, and patients want to know that organizations are using AI responsibly and safely.

By updating HIPAA policies now, especially around asset inventories, risk analysis, AI governance, and vendor management, healthcare organizations position themselves to innovate confidently rather than cautiously.

AI is here to stay. A thoughtful, well-documented compliance approach ensures it strengthens care instead of complicating it. Contact our team to simplify your compliance program today!