Updated Monday, September 23, 2024

Provider or deployer? On the correct classification of a company's role under the AI Act

With the entry into force of the AI Act, companies face the question of which requirements they must observe in order to handle AI products in a legally compliant manner. The answer largely depends on whether a company is classified as a provider or a deployer of an AI system. This article explains what that classification turns on.

Steffen Groß

Partner (Attorney-at-law)

Jakob Riediger

Research Assistant

AI system or conventional software?
Provider or deployer?
Degree of criticality
Case study: Use of a company chatbot
Sources


The AI Act came into force on August 1, 2024. For companies, this raises the question of which obligations follow when they implement AI systems, for example by integrating AI APIs into internal company software or embedding a GPT-based chatbot on their own website.

The AI Act attaches its obligations to different roles in dealing with AI systems: different requirements apply to "providers" of AI systems than to "deployers". Correctly classifying one's own role is therefore of central importance for companies seeking to handle AI systems in a legally compliant manner.


AI system or conventional software?

Before classifying a company's specific role, it must be assessed whether the product in question is an AI system within the meaning of the AI Act or merely conventional software. This is because the obligations of the Regulation apply only to AI systems.

Settling on the definition of an AI system was one of the most controversial issues in the legislative process. Ultimately, the definition in Art. 3 No. 1 AIA was adopted:

  • According to this, an AI system is "a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments".

This definition is criticized as being too broad to allow a clear distinction from conventional software. Recital 12 of the AIA should therefore be consulted in addition. It states:

  • "A key characteristic of AI systems is their capability to infer."

The central demarcation criterion is therefore the capability to "infer". Simple rule-based software thus clearly falls outside the scope of the AI Act. For particularly complex software, however, drawing a clear boundary remains difficult; "hybrid AI systems with deep learning elements" can be cited as an example.

In such borderline cases, the specific circumstances of the individual case must be considered carefully and a balanced assessment made.


Provider or deployer?

If an AI system is present, it must be determined whether the company qualifies as a provider or a deployer, because the AI Act attaches different obligations to each role. While the draft versions still used the term "user", the final text settled on "deployer".
In legal terms, the distinction is to be made on the basis of the definitions in Art. 3 AIA.

Art. 3 No. 3 AIA: Provider
For companies, the question arises as to whether implementing an AI system already makes them a provider. According to Art. 3 No. 3 AIA, a provider is "a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge".

Provider status is therefore determined in a two-step test: the company must (1) develop the AI system or have it developed, and (2) place the AI system on the market or put it into service. Both conditions must be met cumulatively ("and").

At the first step, classification as a provider requires that the company developed the AI system or had it developed. Here the question regularly arises whether an AI system has actually been developed or whether an existing AI system is merely being used. In the abstract, this can be assessed against the definition in Art. 3 No. 1 AIA: development of an AI system can be assumed where the system does not merely rely on an existing inference capability but creates its own capability to infer.

If the system is developed or has been developed, it must be placed on the market or put into service at the second level.
The term "putting into service" is defined in Art. 3 No. 11 AIA and also covers supplying the system for the company's own use. This means that companies that commission a third party to develop a system for purely internal use also become providers. A company can therefore avoid provider status only if the operation of the system is likewise left to a third party.

Similarly, for placing on the market (Art. 3 No. 9 AIA), it is sufficient to have an AI system developed by a third party and to place it on the market under one's own name.
As providers are the primary addressees of the requirements of the AI Act, it may be advantageous to commission the third party not only to develop the system but also to operate it. Contractual and corporate structuring options become relevant here.

Art. 3 No. 4 AIA: Deployer
If the company is not a provider, the question of deployer obligations arises.
According to Art. 3 No. 4 AIA, a deployer is "a natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity".

According to the wording of the definition ("under its authority"), employees are not themselves deployers; their company is.
Private individuals who do not use the system for professional purposes are also excluded from the scope of application, for example the purely private use of ChatGPT.

Apart from that, companies that are not providers generally become deployers by using the system under their own authority.
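The classification logic described above can be sketched as a simple decision procedure. This is a hypothetical illustration only, not legal advice: the `Company` type and its flags are invented for the example, and the actual assessment always depends on the circumstances of the individual case.

```python
from dataclasses import dataclass


@dataclass
class Company:
    """Hypothetical facts relevant to classification under Art. 3 AIA."""
    developed_or_commissioned: bool  # step 1: developed the AI system or had it developed
    places_on_market: bool           # step 2a: placing on the market (Art. 3 No. 9 AIA)
    puts_into_service: bool          # step 2b: putting into service, incl. own use (Art. 3 No. 11 AIA)
    uses_under_own_authority: bool   # deployer criterion (Art. 3 No. 4 AIA)


def classify(c: Company) -> str:
    # Provider status requires BOTH steps cumulatively (Art. 3 No. 3 AIA).
    if c.developed_or_commissioned and (c.places_on_market or c.puts_into_service):
        return "provider"
    # A non-provider using the system under its own authority is a deployer.
    if c.uses_under_own_authority:
        return "deployer"
    return "neither"


# Example: a company that merely embeds a third-party chatbot on its website.
print(classify(Company(False, False, False, True)))  # → deployer
```

Note how the sketch mirrors the text: commissioning development for purely internal use still yields provider status, because own use counts as putting into service.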


Degree of criticality

As the AI Act follows a risk-based approach, the type of AI system must be assessed after classifying one's role (provider or deployer): the higher the degree of criticality of the system or its deployment scenario, the more comprehensive the obligations under the Regulation.
In this respect, the Regulation differentiates between prohibited AI practices, high-risk AI systems, certain AI systems subject to transparency obligations, and general-purpose AI models.

This is discussed in detail in a separate article.


Case study: Use of a company chatbot

Case study
Y-GmbH uses a GPT chatbot to communicate with customers on its own website. It uses a GPT wrapper to implement the chatbot on its website. What role does Y-GmbH play?

Step 1: AI system or conventional software?
The GPT chatbot is an AI system within the meaning of Art. 3 No. 1 AIA: it is a machine-based system that is designed to operate with a degree of autonomy, may exhibit adaptiveness, and infers from the input it receives how to generate outputs in pursuit of explicit or implicit objectives. It is not conventional software. The material scope of the AI Act is therefore opened.

Step 2: provider or deployer?
It is questionable whether Y-GmbH is the provider or the deployer of the AI system. Classification as a provider would require that Y-GmbH (1) developed the system or had it developed and (2) places it on the market or puts it into service under its own name. Even if embedding the chatbot could amount to putting it into service, provider status is ruled out here because Y-GmbH neither developed the system itself nor commissioned its development.
In particular, integrating the GPT chatbot into its own website by means of a GPT wrapper does not amount to developing the AI system, because Art. 3 No. 3 AIA refers to the development of the AI system itself, here the underlying GPT chatbot.
As a result, Y-GmbH has the status of a deployer, as it uses the AI system under its own authority.

Step 3: Degree of criticality
The third step would be to classify the degree of criticality in order to derive the specific duties to act.


Sources

(1) Chibanguza/Steege NJW 2024, 1769.

(2) Steege/Chibanguza Metaverse-HdB § 8 Rn. 27 f.

(3) Steege MMR 2022, 926.
