We have all heard of Chat GPT, and many of us have tried it, but few of us understand how it works. Chat GPT is conversational software that utilises artificial intelligence, although it has many other, and perhaps even more disruptive, uses.
The great revolution of Chat GPT? It is a generative artificial intelligence, specifically a Generative Pre-trained Transformer, i.e. a type of Large Language Model.
To build it, Open AI – the company that created Chat GPT – trained the programme on large amounts of data, using supervised and reinforcement learning techniques.
To summarise, Open AI trained the model on large amounts of data, mainly publicly available data and/or other data that it may have acquired from data brokers. Based on the data it has been fed and trained on, Chat GPT predicts the most likely next word and, by doing so repeatedly, creates a text in answer to questions or requests.
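To make the idea of "predicting the most likely next word" more concrete, the short Python sketch below is a purely illustrative toy: it counts which word follows which in a tiny example sentence (our own assumption) and then chains the most frequent choices into a text. Chat GPT itself relies on a far larger transformer neural network trained on vast amounts of data, not on this counting approach, but the principle of repeatedly choosing a likely next word is the same.

```python
from collections import Counter, defaultdict

# Toy illustration of "predict the most likely next word".
# Real models such as Chat GPT use neural networks (transformers) trained on
# vast corpora; this simple bigram counter only demonstrates the principle.
corpus = "the court held that the processing was unlawful because the processing lacked a legal basis"

# Count which word tends to follow each word in the training text.
next_word_counts = defaultdict(Counter)
words = corpus.split()
for current, following in zip(words, words[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the training text."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

# Generate a short continuation by repeatedly predicting the next word.
current = "the"
generated = [current]
for _ in range(5):
    current = predict_next(current)
    generated.append(current)
print(" ".join(generated))  # e.g. "the processing was unlawful because the"
```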
Like all disruptive technologies, Chat GPT brings legal challenges and difficulties that must be met. In this article, we will focus on personal data issues.
Application of the GDPR
Chat GPT was created by Open AI LLC, based in California, and Chat GPT’s privacy terms refer to the laws of the State of California concerning its privacy obligations. However, they include an “International Users” section with specific rules and information for users in the European Union, Switzerland and the United Kingdom. This is because the GDPR applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to (i) the offering of goods or services to such data subjects in the Union, irrespective of whether a payment is required, or (ii) the monitoring of their behaviour, as far as that behaviour takes place within the European Union.
Thus, although Open AI has no establishment in the Union, it had to designate a representative in the Union and is subject to the rules of the GDPR regarding the processing of personal data of data subjects in the European Union.
The suspension in Italy
On 31 March 2023, the Italian Data Protection Authority ordered the immediate suspension of the processing of Italian users’ data by Open AI, which, in practical terms, led to Open AI suspending the Chat GPT service in Italy.
The Italian Data Protection Authority raised the following issues, which Open AI would need to resolve or demonstrate to be GDPR-compliant:
- “no information is provided to users and data subjects whose data is collected by Open AI.”
- “more importantly, there seems to be no legal basis to support the massive collection and processing of personal data to ‘train’ the algorithms on which the platform is based.”
- “incorrect personal data is processed”, as tests have shown that the information provided by Chat GPT “does not always correspond to factual circumstances.”
- “the lack of [any] age verification mechanism exposes children to receiving answers that are inappropriate to their age and awareness.”
Open AI and the Italian Data Protection Authority then entered into a dialogue, which resulted in a new decision of the Italian Data Protection Authority on 11 April 2023 establishing Open AI’s obligation to implement nine measures. Many of these measures emphasised the duty to inform data subjects and to facilitate the exercise of their rights, including the promotion of an information campaign in the leading Italian media, the implementation of an age verification tool, and a change in the legal basis of the processing of users’ data for algorithmic training, eliminating any reference to the contract and relying instead on consent or legitimate interest as legal bases.
Open AI appears to have complied with the Italian Data Protection Authority’s requests, as Chat GPT was made available to Italian users again on 28 April 2023.
Do doubts remain?
Two main issues may still generate doubts. The first relates to access by minors.
The second issue, which seems more complex to us, concerns the legal basis for collecting user (and, we may add, third-party) data.
According to the Chat GPT privacy policy, Open AI processes the following personal data:
- Technical information, including Log Data, Usage Data, Device Information, Cookies and Analytics;
- Identifiers, including contact details;
- Commercial information, including transaction history;
- Information about usage data, including Content and how users have interacted with the Services;
- Geolocation data;
- Login credentials.
Content data is the category that raises the most significant concern and the one for which we have the most doubts as to whether its processing fully complies with the provisions of the GDPR.
According to Open AI’s Privacy Policy, Content data refers to the data collected when using Chat GPT, including the inputs (questions and/or instructions given), files uploaded, and comments and feedback provided by the user. In this respect, it should be noted that Chat GPT will use this data to train and improve its service.
This Content data may, and often will, include personal data and may also include sensitive personal data. This personal data may be the user’s own or that of third parties. As for the grounds for the processing, according to the privacy policy the data is processed for the performance of a contract and, in particular for improving the language model, on the basis of Open AI’s interests.
One question that arises is whether data processing based on the pursuit of legitimate interests is lawful here; this is an issue that may still come under scrutiny. In any case, following the decision of the Italian authority, Open AI has changed its privacy settings to allow users to opt out of the processing of Content data, in order to comply with data subjects’ right to object provided for in the GDPR.
However, as mentioned above, Content data may include personal data of third parties, in which case Open AI may not be able to ensure the rights of the third parties to whom such data relates, in particular the right to object. The question also arises as to how effectively the rights to information, access or rectification can be exercised.
Confidentiality
Finally, Chat GPT users should be aware that the Content data they share will not be treated confidentially by Chat GPT but may be accessed by humans, in particular for the purpose of training and improving the language model.
Therefore, users should take care that the Content they share with Chat GPT does not contain confidential information, does not involve the violation of a trade secret, and does not include anything that should not otherwise be shared. This is a particular concern for professions bound by professional secrecy, such as lawyers or health professionals.
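As a practical illustration of this precaution, the sketch below shows one possible, purely illustrative approach: masking obvious personal identifiers in a prompt before it is sent to an external AI service. The regular expressions and the redact helper are our own assumptions, not a feature of Chat GPT or of any particular library, and such filtering can never guarantee that all confidential or personal information has been removed.

```python
import re

# Illustrative only: a simple pre-submission filter that masks obvious
# personal identifiers before a prompt is sent to an external AI service.
# The patterns below are assumptions for this example and will not catch
# every piece of confidential or personal data.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace recognisable identifiers with placeholders such as [EMAIL]."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

prompt = "Draft a letter to maria.silva@example.com about case 12345, phone +351 912 345 678."
print(redact(prompt))
# Draft a letter to [EMAIL] about case 12345, phone [PHONE].
```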
Chat GPT can be a very useful tool, but if it is used without due precaution, it can also bring personal and professional risks.
The content of this article does not constitute specific legal advice, which can only be given in respect of a specific case. Please contact us for any further clarification or information regarding the application of the law.