Legal Expert Raises Confidentiality Concerns Over Employees Inputting Sensitive Data into ChatGPT

Mar 31, 2023


Richard Forrest, Legal Director at the UK’s leading data breach law firm, Hayes Connor, discusses the potential risks of employees sharing sensitive data with chatbots, and provides actionable tips on how to use ChatGPT safely in the workplace.

In light of recent ChatGPT concerns in the news, Richard Forrest warns that a considerable proportion of the population lacks a proper understanding of how generative AI, such as ChatGPT, operates. This, he fears, could lead to the inadvertent disclosure of private information, and therefore a breach of GDPR.

As such, he urges businesses in all sectors, including healthcare and education, to implement compliance measures to ensure their employees remain compliant.

This comes after a recent investigation by Cyberhaven revealed that sensitive data makes up 11% of what employees copy and paste into ChatGPT. In one instance, the investigation provided details of a medical practitioner who inputted private patient details into the chatbot, the repercussions of which are still unknown. Richard Forrest says this raises serious GDPR compliance and confidentiality concerns.

Following recent praise of the chatbot’s ability to support business growth and efficiency, usage has increased across many sectors. However, concerns have arisen after a number of employees were found to be negligently submitting sensitive corporate data, as well as sensitive patient and client information, to the chatbot.

As a result of these ongoing privacy fears, several large-scale companies, including JP Morgan, Amazon, and Accenture, have since restricted the use of ChatGPT by employees.

Richard Forrest weighs in on the matter: “ChatGPT, and other similar Large Language Models (LLMs), are still very much in their infancy. This means we are in uncharted territory in terms of business compliance and the regulations surrounding their usage.

“The nature of LLMs, like ChatGPT, has sparked ongoing discussions about the integration and retrieval of data within these systems. If these services do not have appropriate data protection and security measures in place, then sensitive data could become unintentionally compromised.

“The issue at hand is that a significant proportion of the population lacks a clear understanding of how LLMs function, which can result in the inadvertent submission of private information. What’s more, the interfaces themselves may not necessarily be GDPR compliant. If company or client data becomes compromised due to its usage, current laws are blurred in terms of which party may be liable.

“Businesses that use chatbots like ChatGPT without proper training and caution may unknowingly expose themselves to GDPR data breaches, resulting in significant fines, reputational damage, and legal action. As such, usage as a workplace tool without proper training and regulatory measures is ill-advised.

“The onus is on businesses to take action to ensure regulations are drawn up within their organisation, and to educate employees on how AI chatbots integrate and retrieve data. It is also imperative that the UK engages in discussions on the development of a pro-innovation approach to AI regulation.”

To ensure that company and client data are not compromised, and that your company does not breach GDPR when using LLMs, Richard Forrest provides actionable tips on how businesses and employees can remain vigilant. These include:

  • Assume that anything you enter could later be accessible in the public domain
  • Don’t input software code or internal data
  • Revise confidentiality agreements to include the use of AI 
  • Create an explicit clause in employee contracts
  • Hold sufficient company training on the use of AI
  • Create a company policy and an employee user guide
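The “assume anything you enter could become public” and “don’t input internal data” tips can also be enforced mechanically. Below is a minimal, hypothetical sketch of a pre-submission redaction check in Python; the patterns shown (email addresses, UK National Insurance numbers, API-key-like strings) are illustrative assumptions only, and a real deployment would use a proper data-loss-prevention tool tuned to the organisation’s own data.

```python
import re

# Hypothetical patterns for illustration only -- not an exhaustive or
# production-ready definition of "sensitive data".
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-Z]\b"),
    "api_key": re.compile(r"\b(?:sk|key)[-_][A-Za-z0-9]{16,}\b"),
}

def redact(text: str) -> tuple[str, list[str]]:
    """Replace matches of known sensitive patterns and report what was found."""
    findings = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(text):
            findings.append(label)
            text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text, findings

prompt = "Summarise this: contact jane.doe@example.com, NI number QQ123456C."
clean, found = redact(prompt)
print(found)   # ['email', 'uk_ni_number']
print(clean)
```

A filter like this would sit between the employee and the chatbot, blocking or redacting a prompt before it leaves the company network; it complements, rather than replaces, the training and policy measures listed above.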

Currently, one of the biggest causes of data breaches in the UK across most sectors is human error. As AI is being utilised more frequently in the corporate sphere, it is important to make training a priority.

By making it clear what constitutes private and confidential data, and by explaining the legal consequences of sharing such sensitive information, you should be able to drastically reduce the risk of this data being leaked.

Hayes Connor Solicitors have significant expertise and experience supporting clients who have had their data exposed due to data protection negligence. They can support claims for privacy loss, identity theft, and financial losses.
