HIPAA Compliance Risks with AI Scribes in Health Care: What Digital Health Leaders Need to Know
Monday, June 9, 2025

AI scribes are quickly becoming the digital sidekick of modern health care. They promise to reduce clinician burnout, streamline documentation, and improve the patient experience. But as health care providers and digital health companies race to implement AI scribe solutions, one major concern keeps surfacing: What are the HIPAA risks?

The HIPAA risk depends heavily on how the AI solution is trained, deployed, integrated, and governed. If your company is exploring or already using an AI scribe solution, pressure-test that risk with this article as your roadmap.

What is an AI Scribe?

AI scribes use machine learning models to listen to (or process recordings of) patient-provider encounters and generate structured clinical notes. These tools are marketed as seamlessly integrating into the electronic health record (EHR), reducing the need for manual charting, and allowing physicians to focus more on the patient during visits.

Behind the scenes, AI scribes handle a high volume of protected health information (PHI) in real time, across multiple modalities (e.g., audio, transcripts, structured EHR data). As a result, AI scribes and their vendors fall squarely within HIPAA's scope.
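
To make that data flow concrete, below is a minimal sketch of a typical scribe pipeline. The function names and note structure are hypothetical stand-ins for a vendor's speech-to-text and note-generation services, not any particular product's API; the point is that PHI is present at every stage.

```python
# Minimal sketch of an AI scribe pipeline. Function and field names are
# hypothetical stand-ins for vendor APIs. PHI is present at every stage:
# the audio, the transcript, and the draft note.
from dataclasses import dataclass

@dataclass
class ClinicalNote:
    patient_id: str    # PHI: identifier tied to the encounter
    encounter_id: str
    draft_text: str    # PHI: generated clinical narrative
    status: str = "draft"

def transcribe(audio: bytes) -> str:
    """Speech-to-text over the recorded visit (placeholder)."""
    raise NotImplementedError("stand-in for a vendor speech-to-text call")

def generate_note(transcript: str) -> str:
    """Turns the transcript into a structured draft note (placeholder)."""
    raise NotImplementedError("stand-in for a vendor note-generation call")

def scribe_encounter(audio: bytes, patient_id: str, encounter_id: str) -> ClinicalNote:
    transcript = transcribe(audio)      # modality 1: audio -> text
    draft = generate_note(transcript)   # modality 2: text -> draft note
    # The note stays in "draft" status until a clinician reviews it (see pitfall 4 below).
    return ClinicalNote(patient_id, encounter_id, draft)
```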

HIPAA Pitfalls in the AI Scribe Lifecycle

Below are the most common HIPAA pitfalls we have encountered while advising digital health clients, health systems, and AI vendors rolling out scribe technologies.

1. Training AI on PHI Without Proper Authorization

Many AI scribes are “fine-tuned” or retrained using real-world data, including prior encounters or clinician-edited notes. That data typically contains PHI. Under HIPAA, a covered entity health care provider generally needs patient authorization to use PHI for purposes beyond its treatment, payment, or health care operations. Use cases like model training or product improvement therefore require a strong case that the activity qualifies as the provider’s health care operations; otherwise, patient authorization is required.

Risk: If an AI vendor trains its model on customer data without patient authorization and without acting on the customer’s behalf under a defensible treatment, payment, or health care operations basis, that use must be assessed as a potential HIPAA violation. Also consider any other consents that may be required to deploy this technology, such as consents for patients or providers to be recorded under state recording laws.
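
One way to operationalize this limit is a governance gate over training data. The sketch below is a simplified illustration, assuming hypothetical fields like authorization_on_file that record the documented legal basis for each encounter; it is not a substitute for legal analysis of what qualifies as health care operations.

```python
# Hypothetical governance gate: only encounters with a documented patient
# authorization (or another documented, HIPAA-permitted basis) may enter
# a training set. Field names are illustrative, not a legal standard.
from dataclasses import dataclass

@dataclass
class EncounterRecord:
    encounter_id: str
    authorization_on_file: bool   # patient signed a HIPAA authorization for this use
    tpo_basis_documented: bool    # a defensible, documented health care operations rationale

def eligible_for_training(rec: EncounterRecord) -> bool:
    # Conservative rule: require an explicit authorization OR a documented,
    # counsel-reviewed TPO basis. When neither exists, exclude the record.
    return rec.authorization_on_file or rec.tpo_basis_documented

records = [
    EncounterRecord("enc-001", authorization_on_file=True,  tpo_basis_documented=False),
    EncounterRecord("enc-002", authorization_on_file=False, tpo_basis_documented=False),
]
training_set = [r for r in records if eligible_for_training(r)]
print([r.encounter_id for r in training_set])  # ['enc-001']
```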

2. Improper Business Associate Agreements (BAAs)

An AI scribe vendor that accesses, stores, or otherwise processes PHI on behalf of a covered entity or another business associate is almost always a business associate under HIPAA. Yet we have seen vendor contracts that (a) lack a compliant BAA, (b) contain overbroad disclaimers that essentially eliminate the vendor’s liability, or (c) fail to define permitted uses and disclosures, or include uses and disclosures not permitted by HIPAA (such as allowing the vendor to train AI models on PHI without proper authorization or another applicable HIPAA exception).

Tip: Scrutinize every AI vendor agreement. Ensure the BAA or underlying services contract clearly defines:

  - the data being accessed, stored, or otherwise processed;
  - how the data may be used, including what data may be used for training; and
  - whether data is de-identified or retained after service delivery.
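
For teams reviewing many vendor agreements, the items in the Tip above can be tracked as a simple checklist structure. This is one illustrative way to organize the review; the field names are ours, not a legal standard.

```python
# A simple structure for tracking the BAA review items from the Tip above.
from dataclasses import dataclass, field

@dataclass
class BaaReview:
    vendor: str
    data_in_scope_defined: bool = False         # what PHI is accessed/stored/processed
    permitted_uses_defined: bool = False        # how the data may be used
    training_use_addressed: bool = False        # whether PHI may be used for model training
    retention_and_deid_addressed: bool = False  # de-identification / post-service retention
    notes: list[str] = field(default_factory=list)

    def open_items(self) -> list[str]:
        """Return the review items still unresolved for this vendor."""
        return [name for name, ok in [
            ("data in scope", self.data_in_scope_defined),
            ("permitted uses", self.permitted_uses_defined),
            ("training use", self.training_use_addressed),
            ("retention/de-identification", self.retention_and_deid_addressed),
        ] if not ok]

review = BaaReview(vendor="ExampleScribeCo", data_in_scope_defined=True)
print(review.open_items())
# -> ['permitted uses', 'training use', 'retention/de-identification']
```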

3. Inadequate Security Safeguards

AI scribe platforms are high-value targets for attackers. The platforms may capture real-time audio, store draft clinical notes, or integrate via APIs into the EHR. If those platforms are not properly secured, and a data breach occurs as a result, the risks include regulatory fines and penalties, class action lawsuits, and reputational damage.

HIPAA Requirement: Covered entities and business associates must implement “reasonable and appropriate” technical, administrative, and physical safeguards to protect PHI. HIPAA-regulated entities must also update their risk analyses to account for the use of AI scribes.
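
As one concrete example of a technical safeguard, the sketch below encrypts a draft note at rest using the open-source Python cryptography library. It illustrates a single control, not a complete HIPAA security program, and the key handling is deliberately simplified: production systems would use a managed key service, not a key held in process memory.

```python
# One illustrative technical safeguard: encrypting a draft note at rest.
# Uses the `cryptography` package (pip install cryptography). In production,
# the key would live in a KMS/HSM rather than in process memory as shown here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # simplified: real systems use managed keys
fernet = Fernet(key)

draft_note = b"Patient reports intermittent chest pain ..."  # PHI
ciphertext = fernet.encrypt(draft_note)  # what gets written to disk

# Later, an authorized process decrypts the note for display in the EHR.
assert fernet.decrypt(ciphertext) == draft_note
```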

4. Model Hallucinations and Misdirected Outputs

AI scribes, especially those built on generative models, can “hallucinate” or fabricate clinical information. Worse, they can misattribute information to the wrong patient if transcription errors or patient mismatches occur. That is not just a workflow issue. If PHI is inserted into the wrong chart or disclosed to the wrong individual, that could be a breach under HIPAA and state data breach laws (and even potentially detrimental to the patient if future care is impacted by an erroneous entry).

Risk Management: Implement human-in-the-loop review for all AI-scribed notes. Make sure providers are trained to confirm the accuracy of notes before entry into the patient’s record.
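
Here is a minimal sketch of such a gate, assuming a hypothetical EHR write function: the draft cannot reach the chart unless the patient identifiers match and a provider has signed off.

```python
# Hypothetical human-in-the-loop gate: an AI-drafted note is written to the
# EHR only after (1) the patient on the note matches the open chart and
# (2) a provider has reviewed and signed off. All names are illustrative.

def ehr_write(patient_id: str, text: str) -> str:
    """Placeholder for the real EHR integration."""
    return f"note filed for {patient_id}"

def commit_note(note_patient_id: str, chart_patient_id: str,
                draft_text: str, provider_signed_off: bool) -> str:
    if note_patient_id != chart_patient_id:
        # Guards against misattribution: filing into the wrong chart
        # could itself be a reportable breach.
        raise ValueError("Patient mismatch: refusing to file note")
    if not provider_signed_off:
        # Hallucinated or inaccurate content must be caught by a human.
        raise PermissionError("Provider review required before filing")
    return ehr_write(chart_patient_id, draft_text)

print(commit_note("pt-123", "pt-123", "Reviewed draft note text", provider_signed_off=True))
# -> note filed for pt-123
```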

5. De-Identification Fallacies

Some vendors claim their AI solution is “HIPAA-compliant” because the data is de-identified. However, vendors often fail to strictly follow either of the two permissible methods of de-identification under HIPAA at 45 C.F.R. § 164.514: the Expert Determination or the Safe Harbor method. If the data is not fully de-identified under one of those methods, the data is not de-identified under HIPAA.

Compliance Check: If a vendor claims its system is outside HIPAA’s reach because only de-identified data is used, press for:

  - the method of de-identification used;
  - evidence of a re-identification risk analysis; and
  - the credentials of the de-identification expert (if applicable).

Also note that if the vendor is given PHI to de-identify, a BAA must be in place between the provider and the vendor covering that de-identification work.
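
To see why casual “scrubbing” falls short, consider the sketch below: it removes only three identifier types, while the Safe Harbor method requires removing all 18 identifier categories listed in 45 C.F.R. § 164.514(b)(2) and having no actual knowledge that the remaining information could identify the individual. Passing a filter like this does not make data de-identified under HIPAA.

```python
import re

# Illustrative only: scrubs three of the eighteen Safe Harbor identifier
# categories (names are not even attempted here). Passing this function
# does NOT make data de-identified under HIPAA.
PATTERNS = {
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def naive_scrub(text: str) -> str:
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(naive_scrub("Call 555-867-5309 or email jdoe@example.com re: SSN 123-45-6789"))
# -> Call [phone removed] or email [email removed] re: SSN [ssn removed]
```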

Practical Next Steps for Health Systems and Digital Health Companies

For companies evaluating or already implementing an AI scribe, here are tips to mitigate risk without stifling innovation:

  1. Vet vendors thoroughly: confirm a compliant BAA, the vendor's security posture, and any de-identification claims.
  2. Build governance into your EHR workflows: require human review before AI-generated notes are finalized in the record.
  3. Limit secondary use and training without authorization: contractually restrict model training on PHI absent patient authorization or another HIPAA-permitted basis.
  4. Update your risk analysis: account for AI scribe data flows, integrations, and vendors.
  5. Train your providers: on reviewing, correcting, and attesting to AI-generated notes before entry into the patient's record.

The Bottom Line

AI scribes are transforming clinical documentation. However, with great automation comes great accountability, especially under HIPAA. Vendors, digital health companies, and health systems must treat AI scribes not just as software, but as data stewards embedded into patient care. By building strong contractual safeguards, limiting use of PHI to what is permitted under HIPAA, and continuously assessing downstream risks, digital health leaders can embrace innovation without inviting unwarranted risk.
