Lawyer warns of data breach risk from ChatGPT
Data breach experts are increasingly concerned about confidential client data being entered into artificial intelligence systems such as ChatGPT, according to a legal director at a law firm specialising in data breaches.
ChatGPT’s recent adoption by businesses, including law firms, for administrative tasks and efficiency gains has driven a huge increase in users. However, Richard Forrest of Hayes Connor has issued a warning after a recent investigation by Cyberhaven revealed that sensitive data makes up 11% of what employees submit to ChatGPT.
Client confidentiality agreements may also be put at risk if employees enter sensitive information into the chatbot, as may trade secrets such as code and business plans, potentially placing employees in breach of their contracts.
Several large-scale companies, including JP Morgan, Amazon, and Accenture, have now restricted the use of ChatGPT by employees.
Mr Forrest urges all businesses that use ChatGPT to implement measures to ensure employees remain GDPR-compliant. He suggests:
- Assume that anything you enter could later be accessible in the public domain.
- Don’t input software code or internal data.
- Revise confidentiality agreements to include the use of AI.
- Create an explicit clause in employee contracts.
- Hold sufficient company training on the use of AI.
- Create a company policy and an employee user guide.
“ChatGPT, and other similar large language models (LLMs), are still very much in their infancy stages”, Mr Forrest commented. “This means businesses incorporating the chatbot into work processes are in uncharted territory in terms of GDPR compliance.
“Businesses that use ChatGPT without proper training and caution may unknowingly expose themselves to GDPR data breaches, resulting in significant fines, reputational damage, and legal action taken against them. As such, usage as a workplace tool without sufficient training and regulatory measures is ill-advised.”