After the Italian data protection authority issued a temporary ban on the ChatGPT service for Italian users, other data protection authorities have also become active. Both the joint body of the German data protection authorities (the Data Protection Conference) and the European Data Protection Board have recently set up dedicated task forces to examine the issue in more detail.
These regulatory activities have created legal uncertainty regarding the data protection compliance of ChatGPT.
This article aims to contribute to the clarification of the data protection implications and takes a closer look at the various usage scenarios and the resulting data protection assessments.
What are we looking at when assessing GDPR compliance?
Before we can address the question of whether ChatGPT is "GDPR-compliant", we must first define the parameters of the assessment.
This is important because not only companies (OpenAI LLC) and their products (the development of ChatGPT and the creation of the underlying Large Language Model ("LLM")), but also the use of those products by third parties (the use of ChatGPT by companies) can be compliant or non-compliant with data protection law.
The following article examines the third option, i.e. the question of how the service itself can be used in a privacy-compliant manner.
To answer the question of whether and how the service can be used in a GDPR-compliant manner, a distinction must be made between different usage scenarios. For this purpose, we have formed several short use cases.
Use of ChatGPT for (exclusively) personal purposes (Case study 1)
A user wants to use ChatGPT to enrich or organize their personal contact list, which they maintain in a spreadsheet.
Does using ChatGPT for personal purposes while entering personal data in a prompt violate the GDPR?
No, such use does not violate the GDPR, as the latter does not apply to data processing for purely personal purposes.
Personal use - GDPR not applicable
To the extent that processing (including personal data) is carried out exclusively for personal or household activities, it falls outside the scope of the GDPR (Art. 2(2)(c) GDPR). According to Recital 18 of the GDPR, personal or household activities exist "when processing is carried out without (any) relation to a professional or economic activity." Typical private purposes include processing data for one's own leisure activities, entertainment, and hobbies.
Business or mixed personal-business use - GDPR applicable
In contrast, processing activities are considered non-personal if they are performed in the context of an economic activity, regardless of whether there is monetary compensation. This includes advertising measures and the exchange of data for services, but also preparatory activities, e.g. the creation of a first draft of a text to be used in a professional context.
In the case of mixed personal and business use, the GDPR remains applicable even if personal use predominates, since the exceptions to the scope of application of the GDPR are generally to be interpreted narrowly.
Business use of ChatGPT Web interface (Case study 2)
The company X-Inc. would like to use ChatGPT to edit its customer lists, including address data. The customer data is to be entered via input ("prompt") into the web interface.
Does this mean that Company X-Inc. is in breach of the GDPR?
Yes - Company X-Inc. is in breach of the GDPR as it transfers personal data to a third-party recipient (OpenAI LLC) without a legal basis.
Since the service is used for economic purposes, the processing constitutes business use. Because the personal data (customer data) is forwarded to a third-party recipient, this constitutes processing of personal data within the scope of the GDPR.
The transfer requires a legal basis. In most cases, companies will not have obtained consent from customers to transfer data to OpenAI. Therefore, consent will, in most cases, not be a viable legal basis.
Justifying legitimate interest is also likely to be difficult, as the risks for data subjects regarding data processing by OpenAI are often difficult to assess. However, exceptions could arise if, for example, protective measures such as pseudonymization are taken and risks for data subjects can be excluded with sufficient certainty.
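As one illustration of the protective measures mentioned above, the following Python sketch shows how direct identifiers could be replaced with random tokens before any data is included in a prompt, while the re-identification mapping stays with the controller. The field names and token format are assumptions for illustration only, not a legally sufficient pseudonymization scheme.

```python
# Illustrative sketch only: one possible pseudonymization step before data
# leaves the company. Field names and the token scheme are assumptions,
# not a legally vetted or complete pseudonymization procedure.
import uuid

def pseudonymize(records, fields=("name", "email")):
    """Replace direct identifiers with random tokens; keep the mapping locally."""
    mapping = {}   # token -> original value; never leaves the controller
    cleaned = []
    for record in records:
        safe = dict(record)
        for field in fields:
            if field in safe:
                token = f"SUBJECT-{uuid.uuid4().hex[:8]}"
                mapping[token] = safe[field]
                safe[field] = token
        cleaned.append(safe)
    return cleaned, mapping

customers = [{"name": "Erika Mustermann", "email": "erika@example.com",
              "city": "Berlin"}]
cleaned, mapping = pseudonymize(customers)
# `cleaned` could be used in a prompt; `mapping` stays in-house.
```

Whether such a measure sufficiently reduces the risks for data subjects must still be assessed in each individual case.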
Since consent or legitimate interest is only likely to exist in special cases, only processing on behalf (Art. 28 GDPR) could be considered as a legitimization of the transfer.
However, a data processing agreement is not provided by OpenAI for this use.
OpenAI does, however, provide a form that leads to a subpage labeled "[Sales] Data Processing Addendum External", where a data processing agreement pursuant to Art. 28 GDPR can be concluded electronically with OpenAI LLC (referring to the EU Standard Contractual Clauses).
The decisive information, however, is found in the accompanying text description under that heading:
OpenAI LLC states there that the provided data processing agreement applies only to its "Business Service Offerings" (i.e., API users) and explicitly does not include ChatGPT (the web interface).
Consequently, OpenAI does not provide a data processing agreement for the use of the ChatGPT web interface; one is only offered to business customers using the API.
Although OpenAI does not prohibit the business use of the service in its TOS, it nevertheless sees ChatGPT - at least implicitly - as a consumer service, as can be derived from the description of the "[Sales] Data Processing Addendum External".
In the absence of a data processing agreement for the use of the web interface, a data transfer based on Art. 28 GDPR cannot be considered.
Accordingly, companies will generally not be permitted under the GDPR to transmit personal data to OpenAI through prompts in the web interface of ChatGPT.
Business use of the ChatGPT API (Case study 3)
A company wants to make its customer support more effective. It creates its own customer chat interface on its website, which accesses both the customer's order history in an internal database and the ChatGPT API ("API") to generate suitable answers to customer queries. For this purpose, the customer data is transferred to OpenAI via the API.
Is the described integration of the API and the transfer of customer data to ChatGPT permissible under the GDPR?
Yes, the transfer of customer data can be based on Art. 28 GDPR in conjunction with the concluded data processing agreement. However, since a data transfer to the USA takes place, a so-called Transfer Impact Assessment must also be carried out. If the result of this assessment is positive, the customer data may be transferred to OpenAI.
In contrast to users of the web interface, API users can conclude a data processing agreement with OpenAI. Provided it is effective, data can be transferred to OpenAI on this basis even without consent or a legitimate interest.
However, it must be considered that processing on behalf requires data to be processed exclusively on the instructions of and for the purposes of the controller.
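To make the case study concrete, the following Python sketch outlines how such an integration might be structured, assuming the `openai` Python package is used; the database lookup is stubbed, and the model name and helper functions are illustrative assumptions, not OpenAI's prescribed integration pattern.

```python
# Minimal sketch of the case-study integration. The database lookup is a
# stub, and function names are illustrative assumptions.

def fetch_order_history(customer_id):
    # Stand-in for the internal database lookup described in the case study.
    return ["Order #1001: printer", "Order #1002: toner"]

def build_messages(customer_id, question):
    """Combine internal order data and the customer's question into a prompt."""
    history = "\n".join(fetch_order_history(customer_id))
    return [
        {"role": "system",
         "content": "You are a customer-support assistant. "
                    "Answer based only on the order history provided."},
        {"role": "user",
         "content": f"Order history:\n{history}\n\nQuestion: {question}"},
    ]

messages = build_messages("C-42", "Where is my toner order?")

# The actual API call (requires an API key and, per the analysis above, a
# concluded data processing agreement and a positive Transfer Impact
# Assessment) would then send `messages` to OpenAI, e.g.:
#
#   import openai
#   reply = openai.ChatCompletion.create(model="gpt-3.5-turbo",
#                                        messages=messages)
```

The key data protection point is visible in the structure: the customer data only leaves the company at the final API call, so any protective measures must be applied before that step.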
Supplemental requirement - Transfer Impact Assessment (data transfer to the U.S.):
However, in addition to the conclusion of the data processing agreement, the company must carry out a so-called Transfer Impact Assessment ("TIA").
A TIA is used to assess the risks posed by the transfer of data to a third country that is not deemed to have an adequate level of data protection (here: the USA). Only if the result of this risk assessment is positive, i.e., it can be justified that there are no substantial data protection risks for data subjects, is the data transfer permissible under data protection law.
Results and recommendations
No entry of personal data in the web interface
In the absence of a data processing agreement, no transfer of personal data may take place using the ChatGPT web interface for business purposes.
Such use may exceptionally be legitimized by the data subject's consent (Art. 6(1)(a) GDPR) or by a legitimate interest (Art. 6(1)(f) GDPR).
Insofar as companies use the web interface, they should ensure that no personal data (e.g. customer data or employee data) is transmitted to OpenAI via a prompt.
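One technical safeguard companies could consider is a simple client-side screen that rejects prompts containing obvious personal data before they reach the web interface or API. The following sketch uses two assumed regex patterns (e-mail addresses and phone-number-like digit runs); it is by no means a complete PII detector and would need to be extended for real use.

```python
# Illustrative sketch of a simple prompt screen that blocks inputs containing
# obvious personal data. The patterns are assumptions and far from exhaustive.
import re

PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),   # e-mail addresses
    re.compile(r"\+?\d[\d /()-]{7,}\d"),      # phone-number-like digit runs
]

def contains_personal_data(prompt: str) -> bool:
    """Return True if the prompt matches any of the (crude) PII patterns."""
    return any(p.search(prompt) for p in PII_PATTERNS)

# Prompts flagged by this check would be rejected before being sent.
```

Such a filter can only catch obvious cases; organizational measures such as an Acceptable Use Policy (see below) remain necessary.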
No entry of trade secrets (Trade Secrets Act)
Companies should also prohibit the entry of data protected under the Trade Secrets Act into ChatGPT's web interface, regardless of whether such data is personally identifiable.
This applies to technical data and software code, as well as to all data that has an economic value and must be handled confidentially. If this is not prevented, a scenario like the recently reported Samsung incident looms.
"ChatGPT Acceptable Use Policy"
Insofar as the use of the service is permitted, the company bears responsibility for data protection compliance concerning ChatGPT.
Therefore, companies should generally prohibit the entry of personal data. This can be done, for example, through an "Acceptable Use Policy".
If a company does not take internal precautions to prevent personal data from being entered in violation of data protection law, the violation is generally attributed to the company and can lead to fines and damage claims by data subjects.
You will find a free sample of an acceptable use policy below that you can adapt to your organization’s requirements and use freely.
Further measures required for API integration
If the processing of personal data is to take place via API integration, the following additional measures must be taken:
- Conclusion of OpenAI’s data processing agreement (Art. 28 GDPR)
- Conducting a transfer impact assessment (Art. 44 ff. GDPR and requirements of the ECJ in the "Schrems II ruling")
- Carrying out a data protection impact assessment as far as necessary (Art. 35 GDPR)
- Documentation in the records of processing activities (Art. 30 GDPR)
A closer look at the issue shows that the use of ChatGPT's web interface to process personal data will regularly be non-compliant under data protection law.
Companies should therefore take precautions to prevent such use of the service to avoid the risk of violating data protection law. It is therefore recommended to prohibit the entry of personal data in the inputs (prompts) unless the customer has expressly consented to this.
In contrast, the use of the API faces fewer data protection hurdles. However, some measures must also be taken to ensure data protection-compliant processing.
Given the importance of the issue, a harmonized approach at the European level would be welcome. In any case, it would be difficult to justify why the service would not be permitted in Italy but would remain available in Germany, for example. After all, the starting point for data protection is the same for all member states: the GDPR, which is to be interpreted uniformly within the European Union.
Against the backdrop of the data protection pitfalls highlighted, Politico magazine aptly writes: "ChatGPT is entering a world of regulatory pain in Europe".
Despite the regulatory uncertainties, companies should not rush to abandon the use of the service. After all, companies that have introduced AI report significant cost reductions and revenue increases. "Companies that have adopted AI continue to pull ahead".
If the use of ChatGPT makes sense in your respective business, the data protection risks can usually be reduced to a minimum by designing the integration with data protection requirements in mind.