AI in Dentistry · Practice Growth · Dental Marketing

5 HIPAA Mistakes Dental Practices Make With AI (And What They Cost)


Pete Johnson

8 min read
[Image: healthcare professional typing on a laptop at a dental office desk, representing HIPAA compliance with AI tools]

In 2022, three dental practices reached HIPAA settlements totaling $142,500 for things like sharing patient information on social media and using PHI in marketing without consent.

That was before AI.

Now your front desk is pasting patient names into ChatGPT. Your scribe vendor is processing appointment audio on someone else's servers. Your recall system has API access to your PMS without anyone knowing if there's a Business Associate Agreement (BAA) in place.

I wrote last week about the AI stack every dentist should steal from me. The feedback was great — but a few people messaged asking the same question:

"How do I actually stay HIPAA-compliant while doing this?"

This post is the answer. Five mistakes I see constantly, what each one can cost you, and exactly how to fix it.

None of this is legal advice. If you're implementing AI at scale in your practice, get a healthcare compliance attorney involved. But here's the map.

First, The Stakes

Let me calibrate you before we get into the mistakes.

HIPAA violation fines in 2026 range from $141 per violation for unknowing infractions up to $2.1 million per violation category for willful neglect. And that's just the federal penalty.

When you factor in everything else — breach notification costs ($50–$150 per affected patient), state attorney general actions, lawsuits, and lost patients — the average cost of a dental practice data breach lands between $150,000 and $500,000.

For a single-location practice, that's practice-ending money.

And here's the part that matters for this post: every company that touches your patient data needs a signed Business Associate Agreement. Your IT support. Your PMS vendor. Your scheduling software. Your scribe tool. Your AI tools.

No BAA = no permission to touch PHI. No exceptions.

Now let's go.

Mistake #1: Pasting PHI Into Consumer ChatGPT

This is the most common, and the easiest to fix.

Someone at your practice — maybe a doctor drafting a treatment explanation, maybe a front desk staffer writing an insurance appeal letter — opens ChatGPT or Claude and types something like:

"Help me write an appeal letter for Jane Smith, DOB 3/12/1974, procedure D2750, denied by Delta Dental..."

That's a HIPAA violation the second you hit enter. Consumer versions of ChatGPT, Claude, and Gemini are not HIPAA-compliant and do not offer BAAs. The data you paste in can be used to train the model. Full stop.

Real cost: If this becomes a pattern at your practice and triggers an OCR investigation, you're looking at multiple violations across multiple patients. At $1,000–$50,000 per violation, you can math out where that lands.

The fix:

  • Train your team: no patient names, DOBs, insurance IDs, diagnoses, or anything identifiable in any public AI tool.
  • Have AI write templates and frameworks. Your team fills in patient details inside your existing PMS or email tool.
  • If you need AI that handles PHI directly, use a healthcare-grade tool with a signed BAA. OpenAI launched a healthcare product in January 2026. There are also dental-specific AI platforms built around HIPAA from day one.
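If your team wants a guardrail while that training sinks in, here's a minimal sketch of a pre-prompt scrubber. The regex patterns are illustrative assumptions, not a real PHI detector — and the example deliberately shows why regex alone isn't enough:

```python
import re

# Illustrative patterns only -- real de-identification needs a vetted
# tool and compliance review, not three regexes.
PATTERNS = {
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),     # e.g. 3/12/1974
    "MEMBER_ID": re.compile(r"\b[A-Z]{2,3}\d{6,12}\b"),  # insurer-style IDs
    "CDT_CODE": re.compile(r"\bD\d{4}\b"),               # procedure codes
}

def scrub(text: str) -> str:
    """Replace obvious identifiers with placeholders before any AI prompt."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Appeal letter for Jane Smith, DOB 3/12/1974, procedure D2750."
print(scrub(prompt))
# The DOB and procedure code get masked -- but "Jane Smith" sails
# right through, because names can't be caught with a regex.
```

Which is exactly why the policy should be "no patient details at all," not "scrub and hope."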

Mistake #2: Using AI Scribes Without a BAA

AI scribes and note-takers are exploding in dentistry. They listen to appointments, transcribe the clinical conversation, and auto-populate your PMS notes. Huge time-saver for your team.

They're also one of the highest-risk AI categories in your practice — because they're processing actual patient conversations in real time.

Half the scribe vendors I talk to during sales demos can't immediately answer "do you sign a BAA?" If the answer isn't a clean "yes, and here it is," that's your sign to walk.

Real cost: A single scribe vendor without a BAA, processing audio for 20 patients a day, means ~5,000 PHI disclosures per year. If that vendor gets breached or uses that data improperly, you're on the hook for every single one.

The fix:

  • Before you sign with any scribe vendor, get the BAA in writing.
  • Ask where the audio is stored, who has access, and how long it's retained.
  • Verify they have SOC 2 Type II attestation (minimum bar for any healthcare vendor handling audio).
  • If the vendor can't check all three boxes, walk. There are good ones that can.

Mistake #3: Giving AI Tools Access to Your PMS Without Documentation

This one is sneaky because it looks like routine integration work.

You sign up for a new recall automation tool. A new online booking tool. A new insurance verification AI. Each one needs API access to your practice management software (Dentrix, Eaglesoft, Open Dental). Someone on your team enters the API credentials, flips it on, and it works.

You just gave an AI vendor persistent, authenticated access to every patient record in your PMS. That's PHI access, even if the tool "only looks at appointments." Once it's connected, it can see the full dataset.

Did you document:

  • Who the vendor is?
  • What data they can access?
  • Do they have a signed BAA?
  • What happens to the data after it leaves your network?
  • Who internally has the credentials?

Most practices can't answer any of those questions for most of their integrations.

Real cost: The average practice has 8–15 active third-party integrations with their PMS. If one of them gets breached and you can't produce a BAA or documentation, regulators treat that as willful neglect — the highest penalty tier.

The fix:

  • Build a living inventory of every tool that touches your PMS. Vendor name, what data they access, BAA status, primary contact.
  • Review it quarterly.
  • Any new integration requires a signed BAA before the credentials are entered.
  • Tools like OpenClaw that run on-premises are lower-risk because the data doesn't leave your network — but you still need to document the deployment.
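The quarterly review doesn't need fancy software — a spreadsheet plus a few lines of script will flag the gaps. Here's a sketch assuming a simple CSV inventory; the column names and vendor entries are made up for illustration:

```python
import csv
import io

# Hypothetical inventory format -- adjust columns to however your
# practice actually tracks vendors. The two rows are fictional examples.
INVENTORY_CSV = """vendor,data_access,baa_signed,contact,last_reviewed
RecallBot,appointments + patient contacts,yes,ops@recallbot.example,2026-01-15
VerifyAI,insurance + full chart read,no,sales@verifyai.example,2025-08-02
"""

def flag_missing_baas(csv_text: str) -> list[str]:
    """Return a warning line for every PMS integration without a signed BAA."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["baa_signed"].strip().lower() != "yes":
            flagged.append(f"{row['vendor']}: accesses {row['data_access']} with NO BAA")
    return flagged

for warning in flag_missing_baas(INVENTORY_CSV):
    print("FLAG --", warning)
```

Run it every quarter, and "any new integration requires a signed BAA before the credentials are entered" becomes enforceable instead of aspirational.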

Mistake #4: Assuming "Cloud-Based" Means "HIPAA-Secure"

This one drives me nuts because it's marketed as a feature.

A vendor tells your practice manager "we're built on AWS, so we're enterprise-grade secure." Your team hears "secure" and thinks "compliant."

They are not the same thing.

AWS, Google Cloud, and Azure can host HIPAA-compliant workloads — but only if the vendor has configured them correctly, signed a BAA with the cloud provider, and implemented the required technical safeguards: end-to-end encryption, role-based access, authenticated APIs, and detailed logging of every data access event.

Running on AWS without those safeguards is like saying "we're safe because we have a front door." Sure. But is it locked?

Real cost: If your cloud-based vendor isn't actually compliant and gets breached, you're responsible for every affected patient. At $50–$150 per notification, plus lost trust, plus lawsuits — a 3,000-patient breach can cost $450K+ before a single fine is assessed.

The fix: When evaluating any cloud-based AI tool, ask specifically:

  • Do you sign a BAA with us? (Non-negotiable.)
  • Do you have a BAA with your cloud provider? (Should be yes.)
  • Is all PHI encrypted in transit and at rest?
  • Do you maintain audit logs of every access event for at least 6 years?
  • Do you have SOC 2 Type II or HITRUST certification?

Five yes answers, or walk.

Mistake #5: Not Training the Team On What's Allowed

You can have the best BAAs in the world. You can have a perfect vendor inventory. None of it matters if one hygienist on lunch break uses ChatGPT on her personal phone to draft a message about a patient's difficult insurance case.

That's still a violation. And your practice still owns it.

HIPAA compliance isn't a tech problem. It's a people problem with a tech surface.

Real cost: Most of the dental HIPAA violations that make the news start here. An unauthorized tool, an untrained staffer, a well-intentioned mistake. The ADA has extensive guidance on penalties — and nearly every violation was preventable with a 20-minute training session.

The fix:

  • Write a one-page AI usage policy. What tools are approved. What data can and cannot go into them. What the consequences are for violations.
  • Train every team member on it — not just the doctors. Front desk, hygienists, treatment coordinators, assistants. Everyone.
  • Repeat the training quarterly. HIPAA expects evidence of ongoing training, not a one-time orientation.
  • Document everything. If regulators ever come knocking, the documented training is your single best defense.

What To Do This Week

You don't need to boil the ocean. You need four concrete actions:

  1. Inventory your AI tools. Write down every AI-powered tool anyone at the practice uses, officially or unofficially. Include the ChatGPT accounts your team set up without telling you.
  2. Get BAAs signed. For every tool that touches patient data, request a BAA in writing. If a vendor refuses or stalls, cut them. There are compliant alternatives for everything.
  3. Write a one-page AI policy. What's approved, what's banned, what the consequences are. Use a compliance consultant or attorney to draft or review it — the cost is a fraction of a breach.
  4. Train the team. Every team member. 20 minutes. Document it.

The Bigger Picture

AI is going to keep expanding into your practice — scribes, scheduling, phones, patient communication, ad targeting, clinical decision support. Every one of those categories will have both HIPAA-compliant and non-compliant vendors fighting for your business.

The practices that win aren't the ones who avoid AI. They're the ones who adopt it fast and compliantly, while their competitors are still paralyzed or blissfully breaching.

Get the foundation right now, and AI becomes one of the biggest operational advantages you'll ever have.

Get it wrong, and the same tools that promised to save you hours will cost you your practice.


If you want help thinking through which AI tools actually make sense for your practice — or you'd like me to walk your team or conference audience through this playbook — let's talk.

