Let’s chat ChatGPT....
On 30 November 2022, a company called OpenAI took the technology world by storm, launching an AI chatbot with the ability to explain complex concepts across a range of topics near-instantaneously and to formulate clear, accessible answers from scratch. This is why the legal industry has been exploring its potential, and there is plenty of speculation regarding its impact.
How does it function?
ChatGPT, or, in full, “Chat Generative Pre-trained Transformer”, is an AI tool built on a large language model trained by OpenAI, using a training method known as reinforcement learning from human feedback.
However, for ChatGPT to generate any kind of content, it must first be given an appropriate prompt. It can be used in two ways. The first is through specific commands where the intention is to generate entirely new content; the second is by supplying specific data, content or text which it is asked to modify, expand or correct, and which may be protected or subject to special laws, such as personal data.
The chatbot is trained, through machine learning algorithms, to analyse large amounts of data and recognise language patterns and structures. Its model is designed to respond to human input in a conversational way and to generate human-like text based on the context of the conversation.
Its capacity to answer follow-up questions, acknowledge its own mistakes and learn from past conversations makes it a truly innovative tool. It can be used in various dialogue applications, such as virtual assistants or customer service.
Seen in this light, ChatGPT’s uses are almost limitless. Depending on what information is provided, and how, its use can expose users to different risks.
Opportunities in law?
ChatGPT has flexible uses, from complicated tasks such as explaining software code, to everyday suggestions such as providing a recipe for whatever is left in your fridge.
The use of AI is not new in the legal scene, notably through research and legal databases. In some respects ChatGPT is no different. It has been fed information from varied sources such as the internet, books, articles and magazines, which has helped it learn what “usually” comes after a word: it completes text based on what others have published in the sources on which it was trained. Some suggest that ChatGPT could be used to identify litigation strategies, review certain contracts and even draft legal documentation.
Lawyers’ work depends hugely on up-to-date research and information to build well-founded arguments and give daily advice, which is why certain limits to ChatGPT should be taken into account. OpenAI has pointed out that ChatGPT cannot provide accurate information beyond 2021, which corresponds to its knowledge cutoff. As laws and guidance change constantly, ChatGPT does not have access to the most current and updated legal data sources, so the information provided will most likely be inaccurate and out of date. Furthermore, using ChatGPT requires critical thinking, especially when the task goes beyond merely summarising information: the tool is prone to giving false or incomplete answers that nonetheless remain plausible.
Even though ChatGPT’s answers can sound convincing, the capabilities of the platform are still very limited, and the system is not designed to give any advice. Lawyers should be aware that ChatGPT is no legal expert, and in the legal field, error raises the question of responsibility: what happens when a wrong legal solution is given to a client? The answer becomes even more complicated for lawyers who practise in different jurisdictions, where the law varies between those jurisdictions.
Solicitors are bound by their duties to the profession, including the duties of diligence and competence, and bear their own civil liability in cases of professional misconduct. It is the solicitor who answers for the harm caused by incorrect or misleading legal advice: the professional is held accountable and penalised, not the chatbot used.
What are the legal implications and risks?
Although ChatGPT bears significant innovative benefits, it is by no means a perfect system and is subject to limitations. Primary legal concerns range from copyright issues to cybersecurity risks.
Confidentiality and ethical concerns
This is the most important limitation. As stated before, ChatGPT works by returning answers to prompts given by humans. These prompts unlock almost endless capabilities, which can range from answering a simple question to uploading a document and having it reviewed under a particular jurisdiction’s law.
According to OpenAI, the prompts entered can also be used for OpenAI’s own training purposes. It further recommends that users do not share any sensitive information in their prompts (Sydney Butler, “How to (Temporarily) Add New Knowledge to ChatGPT”, How-To Geek, April 2023), as it is likely that these prompts will be fed into the system and at some point surface in answers to other users, which could lead to a breach of confidentiality claim.
If we take a closer look at OpenAI’s privacy policy, it distinguishes two categories of information gathered through ChatGPT:
- Personal information you provide
OpenAI collects information much as other apps do, such as name, contact information, account credentials, payment card information and transaction history. It states: “When you use our Services, we may collect personal information that is included in the input, file uploads, or feedback that you provide to our Services (‘Content’)” (What Is ChatGPT?: OpenAI Help Center).
- “Personal information we receive automatically from your use of the Services”
OpenAI states that it “may automatically collect information about your use of the Services, such as the types of content that you view or engage with, the features you use and the actions you take” (Sydney Butler and Jordan Gloor, “6 Reasons ChatGPT Is Giving You the Wrong Answers”, How-To Geek, April 2023).
In addition, its policy states that it may use this information to “(i) provide, administer, maintain and/or analyze the Services; (ii) improve our services and conduct research; (iii) communicate with you; (iv) develop new programs and services” (Chris Dreyer, “AI For Lawyers: Transform Your Legal Practice With AI Tools”, Rankings, 26 May 2023). Although it is unclear exactly what information can be collected and how it can be used, it is evident that OpenAI has access to whatever the user inputs into the prompt.
This caused an uproar earlier this year when Samsung, an internationally recognised industry leader in technology, was caught in the middle of three separate confidentiality scandals involving ChatGPT, with employees found to have fed sensitive information into ChatGPT prompts. In the first two cases, employees shared confidential source code with ChatGPT, asking it to find a fix; in the third, an employee shared confidential meeting notes, asking it to prepare minutes (Dreyer, above). On a separate but similar occasion, ChatGPT suffered a security breach in which users could see other users’ conversations, triggering particular concern among European regulators (Janelle Miles, “What is ChatGPT and why are schools and universities so worried about students using AI to cheat?”, ABC).
Lawyers handle sensitive information on a daily basis, whether within the firm or with clients. If this information were revealed, it would affect not only the individual client but also, substantially, the business. Lawyers must therefore handle the information they are given with extra care, whether it is information they merely have access to or information that is shared with them.
With regard to ethical duties, each jurisdiction has its own guidelines and rules pertaining to ethics and the process around them. That said, all jurisdictions are likely to share the same core obligations of maintaining confidential information and providing competent legal advice to clients. Using ChatGPT without proper legal analysis and human judgment could therefore breach ethical duties owed by the lawyer and the firm to the client.
Copyright issues
There is also uncertainty around who owns the written content generated by ChatGPT. One could argue that copyright should belong to the original owners of the underlying data. However, as the chatbot is trained on vast quantities of text from an array of different sources, the result is an unrecognisable pool of content in which the original owner cannot be identified.
This begs the question: is OpenAI the rightful owner of the generated content, and if so, do we need to obtain a licence to use ChatGPT content for commercial purposes? Under its “Terms of Use”, OpenAI assigns its rights in the output to the user: “As between the parties and to the extent permitted by applicable law, you own all Input, and subject to your compliance with these Terms, OpenAI hereby assigns to you all its right, title and interest in and to Output”.
This wording suggests that users can subsequently use the generated content for any purpose, with the attendant risk that the same content may be generated for other users who ask similar questions. In either case, one would have to ensure that the AI is actually creating new content. Users should be aware of such issues and treat AI-generated outputs as a source of inspiration rather than reproducing them verbatim.
Cybersecurity setbacks
ChatGPT is able to write code instantly, which could prove a useful tool for detecting errors and vulnerabilities in complex code. On the plus side, it could become powerful cybersecurity material and be used to monitor chat conversations for suspicious activity. On the minus side, there is a concern that ChatGPT may make a cybercriminal’s life easier: it can impersonate others, write flawless text, and create code which can be misused by anyone with malicious intent.
If its capabilities can create a market in cyber incident management for cyber teams through simulations, they can also open up opportunities for malware development, ransomware, business email compromise (BEC) attacks, phishing, spam, and impersonation attempts. Its code analysis could be used to guide hackers and fuel a climate which is already very hacker-friendly.
While it is unlikely that ChatGPT itself will steal your data, anyone secretly monitoring your conversation with a chatbot could compromise your security and privacy. These risks must be properly understood and reined in to limit fallout. It is thus essential always to be cognisant of the information stored when engaging with ChatGPT, and not to share confidential information such as personal addresses. In addition, keeping software up to date and using firewalls, network detection and response, and antivirus tools will help keep your personal data, your firm’s data and clients’ data secure when using any conversational AI system.
Is there a link between ChatGPT and GDPR?
We still do not know what method OpenAI uses to collect the data on which ChatGPT is based. We do know, however, that such data are derived to a large extent from sources available on the internet.
Naturally, a portion of the data gathered on the internet may qualify as personal data. Like all processing activities, web scraping is regulated by the GDPR and, depending on national laws, may be subject to strict conditions to be legally implemented.
On 31 March 2023, the Italian Data Protection Authority issued a ban on ChatGPT’s use, stating: “no information is provided to users and data subjects whose data are collected by Open AI; more importantly, there appears to be no legal basis underpinning the massive collection and processing of personal data in order to ‘train’ the algorithms on which the platform relies”. As such, a number of issues appear to remain unresolved regarding ChatGPT’s compliance with the GDPR.
Data protection challenges also apply to interactions between ChatGPT and its users: a recent bug in an open source library led to the exposure of conversation titles to other users (OpenAI blog, 24 March 2023). It is important to note that no sensitive data should be shared in conversations, something also recommended by OpenAI.
Highlights
Artificial intelligence is not new, and companies continue to work towards making the technology more economically feasible and accessible to users. ChatGPT has so far proven to be a resource-friendly tool, enabling improved time and cost management.
Various law firms have begun introducing artificial intelligence systems into their operations; one example is “Harvey”, a generative AI developed specifically for the complex legal issues law firms face in their cases. However, legal teams in any business considering its use should ensure they have appropriate safeguards in place to govern who has access to the tool, what information can be submitted, and how the output can be used.
A lawyer’s work is highly specialised and requires multiple years of studying, training and accumulated experience to deliver results and client satisfaction. There is no doubt that lawyers will have to work with AI if they want to remain competitive, but this use must be done in a responsible manner and in observance of ethical obligations.
ChatGPT cannot replace expertise, judgment and experience. When using it, lawyers should never rely on ChatGPT as their only source of information; they should always cross-reference with other sources, and consult when unsure, to ensure the information they provide is up to date, accurate and, most importantly, applicable to the specific situation.
This article first appeared on www.thescottishlawyer.info/blog