AI Chatbot Warning

Everyone’s talking about AI: its uses, its dangers, and its applications across different industries. Some healthcare organizations have begun exploring ways to leverage AI tools in the workplace to supercharge efficiency.

Chatbots like Bard and ChatGPT are dramatically changing the landscape for companies everywhere, but they pose significant risks when used incorrectly.

Using AI to improve efficiency and effectiveness in healthcare isn’t a new concept. Covered entities already have Business Associate Agreements (BAAs) in place with data analytics firms, making it possible for those third parties to analyze electronic health records in order to improve patient care.

The chat feature of these tools is new, and that’s where the compliance and security issues come into play. The key piece here is the missing BAA. A provider might create organized medical notes from a patient visit incredibly quickly by pasting the transcript into ChatGPT, but in doing so they have just handed personally identifiable information to the company that owns the tool, OpenAI. That information is now stored on OpenAI’s servers, and the individual or organization has no further control over the security of that private data.

Some argue that it’s possible to simply de-identify any patient data before putting it into a chatbot, and to limit chatbot use to employees who have been trained to use these tools appropriately. Others hold that providers owe it to their patients to protect patient data to the best of their ability, avoiding even the residual risk that comes with sharing data that has been stripped of identifying factors.
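To make the first position concrete, here is a minimal, illustrative Python sketch of what stripping obvious identifiers from a transcript before it ever reaches a chatbot might look like. This is an assumption-laden example, not a validated de-identification process: the patterns, placeholder tags, and the redact helper are hypothetical, and real HIPAA de-identification (Safe Harbor or Expert Determination) covers far more than a few regular expressions.

import re

# Illustrative only: real de-identification must address all 18 HIPAA Safe
# Harbor identifiers (or use Expert Determination); these patterns are a
# small, hypothetical sample.
REDACTION_PATTERNS = {
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "[DATE]": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "[MRN]": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholder tags before the text
    ever leaves the organization's systems."""
    for placeholder, pattern in REDACTION_PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

if __name__ == "__main__":
    transcript = (
        "Patient Jane Doe, MRN: 448291, seen 04/12/2023. "
        "Callback at 509-555-0147, email jane.doe@example.com."
    )
    print(redact(transcript))
    # Prints: Patient Jane Doe, [MRN], seen [DATE]. Callback at [PHONE], email [EMAIL].

Notice that even in this toy example the patient’s name slips straight through the simple patterns. That kind of residual exposure is exactly the risk the second camp points to.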

HIPAA has no rules that specifically address AI chatbots, so it may seem as though there is no way to confirm whether using them is compliant. However, HIPAA was intentionally written to be flexible: no matter how technology changes, providers remain responsible for putting appropriate security measures in place to protect patient data.

Protections like these fall into HIPAA’s “technical safeguards” category and should be assessed and documented regularly as part of your organization’s required Security Risk Assessment (SRA). The Medcurity platform offers the most intuitive and effective SRA for healthcare practices and hospitals. Conduct an assessment of your technical, physical, and administrative safeguards using our easy-to-use platform, and instantly get a prioritized list of action items from your results. The assessment includes helpful guidance and definitions throughout, bringing clarity and confidence to the process. Save time and ditch the complicated spreadsheets for an all-in-one compliance platform. To find out more about the Medcurity Security Risk Assessment module, you can view a demo here. Find out why Medcurity is the leading HIPAA compliance platform today!

If you have questions on this topic or anything else related to HIPAA compliance, don’t hesitate to reach out to your team at Medcurity. We're here to help!