We finalized a Business Associate Agreement (BAA) with OpenAI, complete with the HIPAA amendment that governs how electronic protected health information (ePHI) is handled inside our AI stack. This codifies the safeguards we already built for healthcare and research clients while opening the door to deploy GPT‑powered copilots in regulated workflows.
The agreement obligates OpenAI and Alshival to maintain the physical, administrative, and technical controls required under HIPAA, including explicit breach notification windows, access logging, and encryption requirements. In practice that means every prompt and response that contains patient identifiers stays inside a monitored, audit-ready environment.
What clients can expect
- HIPAA-ready copilots: Voice and chat interfaces that can safely look up charts, summarize encounters, or draft letters without exporting PHI outside covered systems.
- Secure data pipelines: Traceable ingestion and tokenization layers that pass compliance reviews with hospital privacy, IRB, and security teams.
- Shared accountability: Joint incident response timelines with OpenAI so escalation paths stay clear if something ever looks off.
We paired the legal paperwork with engineering work: environment-level encryption keys, row-level auditing inside our feature store, and automated red-team tests that stress prompts for data leakage. The goal is not just paperwork but dependable operations.
Next steps
If you are exploring clinical automation, quality review agents, or any workflow that touches PHI, we can now launch pilots under a single stack. Contact the team to scope an engagement or request a redacted copy of the amendment.