Since the US company OpenAI released the AI-based chatbot ChatGPT (Chat Generative Pre-trained Transformer) in November 2022, it has been surrounded by considerable hype.
This is evidenced not only by the enormous number of new registrations within the first five days after its release (one million), but also by the wave of news, articles and commentary from the media and relevant expert communities (this one included), which has not subsided since.
What is ChatGPT and what explains the hype about it?
ChatGPT is a so-called chatbot. A chatbot is a text-based application that allows people to chat with technical systems via a technical interface (API). This is of particular relevance for companies, where chatbots can be a significant factor in increasing efficiency and quality.
What makes ChatGPT special is that it works on the basis of artificial intelligence, i.e. it uses machine learning to generate answers to user questions on its own. This takes usability to a whole new level compared to, for example, rule-based chatbots.
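To illustrate what accessing such a chatbot via an API can look like in practice, here is a minimal sketch in Python. The endpoint, model name and request format follow OpenAI's public chat completions API; they are assumptions added for illustration and are not part of this article's subject matter.

```python
# Minimal sketch: sending a user question to a chat API and reading the answer.
# Endpoint, model name and payload follow OpenAI's chat completions API and are
# used here purely for illustration.
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]  # never hard-code credentials

def ask_chatbot(question: str) -> str:
    """Send a single user question and return the generated answer."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": question}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_chatbot("What are the opening hours of your support hotline?"))
```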
Data protection issues in the use of ChatGPT
Since personal data can be processed in the context of the use of chatbots, the requirements of the General Data Protection Regulation (GDPR) must be observed.
The obligation to ensure that personal data is processed in compliance with the GDPR primarily falls on the controller. The controller within the meaning of the GDPR is the natural or legal person who decides on the purposes and means of the processing. In practical terms, this means the company that decides why and how ChatGPT is used in its own interest.
Purpose limitation
With regard to the principle of purpose limitation, it is problematic that the personal data is typically used not only for the purpose determined by the controller, but also for the provider's own purposes, such as training the artificial intelligence.
This currently presents the controller with a difficult challenge when concluding a data processing agreement, since under such an agreement the data may only be processed for the controller's purposes and in accordance with the controller's instructions.
If the data is also to be processed for other purposes, a suitable legal basis must legitimize this processing.
Separation of data
In this context, technical and organizational measures should be used to prevent data from different sources from being merged. This risk could theoretically arise if a controller's customer uses ChatGPT and, in response to their request, receives information belonging to another customer.
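As a purely illustrative example of such a technical measure, the following sketch keeps one isolated conversation history per customer, so that a request made for one customer can never include data stored for another. All class and method names are hypothetical and assume an in-house integration of a chat API.

```python
# Illustrative sketch of one possible technical measure: keeping each customer's
# conversation history in a separate store so that a request for customer A can
# never include data belonging to customer B. Names are hypothetical.
from collections import defaultdict

class SeparatedChatSessions:
    def __init__(self) -> None:
        # One isolated message history per customer ID.
        self._histories: dict[str, list[dict]] = defaultdict(list)

    def build_request_messages(self, customer_id: str, question: str) -> list[dict]:
        """Return only this customer's history plus the new question."""
        return self._histories[customer_id] + [{"role": "user", "content": question}]

    def record_answer(self, customer_id: str, question: str, answer: str) -> None:
        """Store the exchange under this customer's ID only."""
        self._histories[customer_id].append({"role": "user", "content": question})
        self._histories[customer_id].append({"role": "assistant", "content": answer})

# Usage: the message list built for customer A contains no data from customer B.
sessions = SeparatedChatSessions()
messages_for_a = sessions.build_request_messages("customer-a", "What is the status of my order?")
```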
Storage limitation
OpenAI states in its FAQ that it is not able to delete certain prompts (inputs used to retrieve information) entered during the use of ChatGPT. Especially if these prompts contain personal data, this is not GDPR-compliant. The deletion of personal data is not only a right of the data subject; the GDPR also requires, as a matter of principle, that personal data be deleted as soon as the purpose of the processing has been achieved. Therefore, unless it can be guaranteed that the data processed by ChatGPT can no longer be related to a person, it should be used with caution.
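One way to exercise this caution is to reduce the amount of personal data contained in prompts before they leave the company. The following sketch is illustrative only: simple pattern matching (here for e-mail addresses and phone numbers) is not a complete anonymization technique and does not by itself establish GDPR compliance.

```python
# Illustrative sketch only: stripping obvious identifiers (e-mail addresses,
# phone numbers) from a prompt before it is sent to the chatbot. Pattern matching
# merely reduces the personal data leaving the company; it is not anonymization.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d /()-]{7,}\d")

def redact_personal_data(prompt: str) -> str:
    """Replace obvious identifiers with placeholders before sending the prompt."""
    prompt = EMAIL.sub("[email removed]", prompt)
    prompt = PHONE.sub("[phone number removed]", prompt)
    return prompt

print(redact_personal_data(
    "Customer max.mustermann@example.com, tel. +49 151 2345678, asks about his invoice."
))
```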
Data protection impact assessment
Since AI-based chatbots such as ChatGPT are considered a new technology, their use is likely to require a data protection impact assessment pursuant to Art. 35 GDPR.
In addition, the German federal and state supervisory authorities have already defined, in a so-called positive list, certain areas of application for which a data protection impact assessment is mandatory. This includes, for example, the use of AI in the context of customer support.
How a data protection impact assessment is to be carried out is not regulated in the GDPR. However, there are guidelines and online tools from German and European data protection bodies that controllers can use.
For processors
If companies act as processors and use ChatGPT to process personal data on behalf of their customers, the problem areas described above are equally relevant. The processor is bound by the controller's instructions and is ultimately obligated to support the controller in complying with the GDPR.
Impact of the upcoming AI Act on the use of ChatGPT
The planned AI Act will regulate the placing on the market and use of artificial intelligence within Europe. In addition to the opportunities offered by artificial intelligence, the new EU legislation will address the associated risks. It is therefore to be expected that, once the AI Act enters into force, providers and users of AI systems such as ChatGPT will be subject to more specific obligations regarding the aspects mentioned above.
If you have any questions around data protection and the integration of ChatGPT or similar applications, please feel free to contact us.
(This post was not created with the use of ChatGPT 😉)