Every AI-handled call produces several data streams: audio, transcript, structured extracted data, and metadata. All of it is PHI under HIPAA and flows through a controlled chain of BAA-covered services. Nothing leaves that chain without your explicit authorization. Here's what actually happens to each stream — start to finish.
The flow is less complicated than it sounds. Five stops. Three are always yours (your phone system, your PMS, your dashboard access). Two are the vendor's (voice infrastructure, AI inference).
The Five-Stop Data Flow
1. Telephony ingestion
Call arrives at your phone number. Your phone system (RingCentral, Nextiva, etc.) forwards it to the vendor's voice infrastructure (typically Twilio, Telnyx, or equivalent). The SIP legs of this route are encrypted in transit (TLS signaling, SRTP media); the PSTN segment carries what any ordinary phone call does. Your phone provider has its own BAA covering the PHI in transit.
2. Voice processing
Audio streams into the vendor's processing pipeline:
- Speech-to-text (transcription)
- Language model inference (the AI's responses)
- Text-to-speech (the AI's voice)
Each of these components is BAA-covered. Audio is transient in this leg — it's processed in memory, not persisted at each step.
3. Structured data extraction
The AI extracts structured fields from the conversation: patient name, phone, DOB, insurance carrier, appointment type, scheduling preferences. These are validated and prepared for writing to your PMS.
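To make step 3 concrete, here is a minimal sketch of what extraction-plus-validation might look like before the PMS write. The field names, the dataclass, and the `validate` helper are all hypothetical illustrations, not any vendor's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ExtractedCall:
    """Illustrative shape of the structured fields pulled from a call.
    Hypothetical field names, not a real vendor schema."""
    patient_name: str
    phone: str
    dob: str                 # expected as "YYYY-MM-DD"
    insurance_carrier: str
    appointment_type: str

    def validate(self) -> list[str]:
        """Return a list of problems; an empty list means the record
        is ready to be written to the PMS."""
        errors = []
        if not self.patient_name.strip():
            errors.append("missing patient name")
        digits = [c for c in self.phone if c.isdigit()]
        if len(digits) != 10:
            errors.append("phone must have 10 digits")
        try:
            datetime.strptime(self.dob, "%Y-%m-%d")
        except ValueError:
            errors.append("DOB must be YYYY-MM-DD")
        return errors

call = ExtractedCall("Jane Doe", "(555) 123-4567", "1985-03-12",
                     "Delta Dental", "new patient cleaning")
print(call.validate())  # → []
```

The point of the validation pass is that only well-formed fields ever reach the PMS; anything that fails falls back to human review rather than creating a malformed patient record.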
4. PMS write
Extracted data is written to your practice management system. This is the authoritative long-term home for the patient information; the vendor is a pass-through. Appointments appear in your PMS the same way they would from a human booking — visible to staff, subject to your PMS's own retention and access controls.
5. Vendor storage (for audit and dashboard)
Audio, transcript, and call metadata are stored in the vendor's encrypted object storage. This is what powers your admin dashboard — reviewing calls, pulling transcripts, analytics. Retention is configurable (typically 12 months default). When the retention window expires, the data is cryptographically destroyed: the encryption keys are deleted, leaving the stored ciphertext unrecoverable.
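The retention math is simple but worth making explicit, since it is what you tune to match your state's recordkeeping rules. A minimal sketch, using a hypothetical `purge_date` helper and approximating a month as 30 days purely for illustration (real systems use calendar months and purge in scheduled batches):

```python
from datetime import date, timedelta

def purge_date(call_date: date, retention_months: int = 12) -> date:
    """Hypothetical retention math: the day a stored call becomes
    eligible for cryptographic destruction. A month is approximated
    as 30 days for illustration only."""
    return call_date + timedelta(days=30 * retention_months)

print(purge_date(date(2024, 1, 15)))       # default 12-month window
print(purge_date(date(2024, 1, 15), 84))   # e.g. a seven-year state rule
```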
What Data Lands Where
| Data | Your PMS | Vendor storage | Retention |
|---|---|---|---|
| Appointment record | Yes (permanent) | Reference only | Your PMS policy |
| Patient demographics | Yes | Transient | Your PMS policy |
| Insurance details | Yes | Transient | Your PMS policy |
| Call audio | No | Yes (encrypted) | Configurable, default 12 months |
| Call transcript | Optional (summary) | Yes (encrypted) | Configurable, default 12 months |
| Structured extraction | Yes (fields) | Yes (audit) | Configurable |
| Call metadata | Optional | Yes (analytics) | Configurable |
What the Vendor Never Does With Your Data
- Never trains public AI models on your patient data. Contractually prohibited in the BAA. This is important because many consumer AI tools have vague language here; healthcare vendors must be explicit.
- Never sells aggregated or de-identified data to third parties. "Industry benchmarks" are built from opt-in anonymous telemetry only.
- Never exposes PHI to other practices on the platform. Multi-tenant isolation is enforced at the storage, compute, and network level.
- Never processes data outside the U.S. without specific authorization. Most vendors keep all PHI in U.S. regions.
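On the multi-tenant point: storage-level isolation is commonly implemented as tenant-scoped object keys that IAM policies pin to, so one practice's credentials can never resolve another practice's objects. A minimal sketch with hypothetical naming (the `tenants/<id>/calls/...` layout is illustrative, not any vendor's real scheme):

```python
def object_key(tenant_id: str, call_id: str) -> str:
    """Illustrative tenant-scoped storage key. A key prefix alone is
    not sufficient isolation; it is the unit that access policies at
    the storage, compute, and network layers scope to."""
    if not tenant_id or "/" in tenant_id:
        raise ValueError("invalid tenant id")
    return f"tenants/{tenant_id}/calls/{call_id}/audio.enc"

print(object_key("practice-042", "call-9981"))
# → tenants/practice-042/calls/call-9981/audio.enc
```

Rejecting slashes in the tenant id matters: it prevents a crafted id from escaping its own prefix into another tenant's namespace.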
The Vendor Chain (Under-Appreciated Detail)
Your AI receptionist vendor relies on subprocessors: a telephony provider, a language model provider, cloud hosting, storage, monitoring. Under HIPAA these are subcontractor business associates, and each must be covered by a BAA with your vendor (not directly with you, unless you're unusually large).
Ask for the full list:
- Telephony (Twilio, Telnyx)
- Language model (Anthropic, OpenAI, Azure OpenAI, or equivalent with BAA)
- Speech-to-text (Deepgram, AssemblyAI, internal)
- Text-to-speech (ElevenLabs, PlayHT, Azure, internal)
- Cloud hosting (AWS, GCP)
- Storage (S3, managed databases)
- Monitoring (Datadog, Sentry, configured for PHI-safe telemetry)
If the vendor can't produce the list with BAA status, that's a red flag.
Your Rights as a Practice
- Request a full data export at any time
- Request deletion of specific calls or the full data store when you leave the platform
- Audit access logs to see who at the vendor accessed your data and when
- Be notified of any breach within 24 hours
- Configure retention policy to match your state's recordkeeping rules
Patient Rights and the Practice-Patient Relationship
HIPAA's patient-rights obligations (right of access, amendment, accounting of disclosures) apply. You, the practice, are the covered entity responsible for fulfilling them. The vendor supports you by providing data exports, audit logs, and deletion on request.
Practical impact: if a patient requests a copy of their call audio, you pull it from the vendor dashboard and provide it per your standard records-request process.
FAQ
Does the AI read my existing patient records when handling a call?
Yes — limited scope. It reads the patient's upcoming appointments, scheduling history, and basic demographics to handle the current call. It doesn't read clinical notes, treatment plans, or imaging.
What happens to call audio when we switch vendors?
You export anything you want to retain, then request full deletion. Most vendors will provide a final data export and a deletion certificate within 30 days of offboarding.
Do analytics expose individual patient information to aggregate dashboards?
Good platforms separate PHI from aggregate metrics. Analytics show "37 new-patient calls this week"; the identities are only visible when you drill into specific calls with proper access controls.
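That separation can be as simple as aggregating before anything reaches the dashboard. A hedged sketch, assuming hypothetical call-record dicts with a `call_type` field; only the counts survive the aggregation, never the identities:

```python
from collections import Counter

def weekly_metrics(calls: list[dict]) -> dict:
    """Hypothetical aggregation step: count calls by type, dropping
    every identifying field before the numbers reach the dashboard."""
    return dict(Counter(c["call_type"] for c in calls))

calls = [
    {"call_type": "new_patient", "patient_name": "Jane Doe"},
    {"call_type": "new_patient", "patient_name": "John Roe"},
    {"call_type": "reschedule",  "patient_name": "Ana Liu"},
]
print(weekly_metrics(calls))  # → {'new_patient': 2, 'reschedule': 1}
```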
Is this more secure than our current answering service?
Modern AI vendors typically apply stronger encryption and more thorough audit logging than traditional answering services, yes. Whether your current answering service has a BAA at all is a good question to verify regardless.
What about data from telemedicine calls or video visits?
Those are separate systems. AI receptionists handle the phone channel only. Integration with your telemedicine platform is possible for appointment booking, but audio handling stays separate.