Is It Possible to Use ChatGPT in Compliance with HIPAA?

by James Keogh

It is possible to use ChatGPT in compliance with HIPAA, but – until OpenAI makes ChatGPT HIPAA compliant – there are risks associated with implementing anonymizer software to ensure Protected Health Information is not impermissibly disclosed to ChatGPT.

In its current state, ChatGPT does not support HIPAA compliance. The program does not have the necessary safeguards to ensure the confidentiality, integrity, and availability of Protected Health Information (PHI), and OpenAI – the developer of ChatGPT – will not enter into a Business Associate Agreement with HIPAA covered entities and business associates.

However, this may soon change. In a recent article published on the OpenAI website, the company stated that it is working on offering support for HIPAA compliance in the future. The article also revealed that OpenAI will enter into Business Associate Agreements covering certain API endpoints that are eligible for zero data retention.

Until OpenAI makes ChatGPT HIPAA compliant, healthcare professionals can use the platform, but cannot disclose PHI to it when doing so. This limits the uses of ChatGPT for healthcare, but there is a solution – HIPAA compliant anonymizing software that removes PHI before prompts are received by ChatGPT.

How to Use ChatGPT in Compliance with HIPAA

Several software vendors offer HIPAA compliant anonymizing software, and most software of this nature works in the same way (a minimal sketch of the workflow follows the list):

  • A healthcare professional writes a prompt that includes PHI (e.g., name, age, and symptoms).
  • The software identifies PHI and replaces it with tokens (e.g., “Mr. Smith” becomes “Name”).
  • The anonymized prompt is forwarded to ChatGPT and an anonymized output is returned.
  • The software reverses the anonymization and replaces the tokens with data from the original prompt.
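To make this workflow concrete, below is a minimal Python sketch of the tokenize-and-restore round trip. It is an illustration only: the PHI_PATTERNS regexes and the chatgpt_stub function are hypothetical placeholders rather than any vendor's actual product, and real anonymizers detect the full set of HIPAA identifiers with far more sophisticated techniques.

```python
import re

# Illustrative PHI patterns only - production anonymizers detect all
# 18 HIPAA identifiers using much more robust techniques than regexes.
PHI_PATTERNS = {
    "NAME": re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.\s+[A-Z][a-z]+"),
    "AGE": re.compile(r"\b\d{1,3}-year-old\b"),
}

def anonymize(prompt):
    """Replace detected PHI with tokens; return the prompt and a token map."""
    token_map = {}
    for label, pattern in PHI_PATTERNS.items():
        for i, match in enumerate(pattern.findall(prompt)):
            token = f"[{label}_{i}]"
            token_map[token] = match
            prompt = prompt.replace(match, token, 1)
    return prompt, token_map

def deanonymize(text, token_map):
    """Reverse the substitution, restoring the original PHI."""
    for token, original in token_map.items():
        text = text.replace(token, original)
    return text

def chatgpt_stub(anonymized_prompt):
    # Hypothetical stand-in for the ChatGPT API call; only the
    # anonymized prompt would ever leave the organization's systems.
    return f"Assessment noted for: {anonymized_prompt} Recommend follow-up."

if __name__ == "__main__":
    original = "Mr. Smith is a 67-year-old patient reporting chest pain."
    anonymized, mapping = anonymize(original)
    print(anonymized)  # [NAME_0] is a [AGE_0] patient reporting chest pain.
    response = chatgpt_stub(anonymized)
    print(deanonymize(response, mapping))  # tokens restored to original PHI
```

The compliance-critical step is the detection: if a pattern misses an identifier, PHI is disclosed to the platform and the safeguard fails, which is why vendors typically pair pattern matching with more robust named-entity recognition.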

Not only does the software have the necessary safeguards to ensure the confidentiality, integrity, and availability of PHI, but the vendors of the software are willing to enter into a Business Associate Agreement as required by §164.308(b) of the Security Rule. (Note: in some cases, it may be necessary to subscribe to a specific plan).

The implementation of anonymizer software enables healthcare professionals to use ChatGPT in compliance with HIPAA because PHI is not being impermissibly disclosed to the platform. However, even with anonymizer software in place, it is advisable not to rely on ChatGPT for diagnosing health conditions, for the reasons discussed below.

The Risks Associated with Anonymizer Software

Older versions of ChatGPT have a poor record of diagnosing health conditions. If identifiers such as age, ethnicity, and gender are removed from prompts, even newer versions of ChatGPT might struggle to correctly diagnose Kawasaki disease, Creutzfeldt-Jakob disease (CJD), or prostate cancer, or to recommend appropriate courses of treatment. These are all conditions for which the removed identifiers are key diagnostic clues: Kawasaki disease mostly affects young children, CJD typically affects older adults, and prostate cancer only affects males.

For this reason, it is a best practice in healthcare to use ChatGPT to increase confidence in diagnoses rather than to make them. Healthcare organizations looking to use ChatGPT in compliance with HIPAA should consider the risks associated with anonymizer software before deploying it in a clinical environment.

James Keogh

James Keogh has been writing about the healthcare sector in the United States for several years and is currently the editor of HIPAAnswers. He has a particular interest in HIPAA and the intersection of healthcare privacy and information technology, and has developed specialized knowledge in HIPAA-related issues, including compliance, patient privacy, and data breaches. You can follow James on Twitter (https://x.com/JamesKeoghHIPAA), connect with him on LinkedIn (https://www.linkedin.com/in/james-keogh-89023681), or email him directly at [email protected].