
The first binding obligations under the EU Regulation on Artificial Intelligence (AI Regulation or AI Act) have applied since 2 February 2025. A central provision concerns the obligation of companies to train their employees* in the use of AI systems wherever such systems are deployed (e.g. in the HR department, in customer service or in sales).

The following article looks at what this means in concrete terms for companies and their employees, and at the consequences of a possible breach:

I. Training obligation pursuant to Art. 4 of the AI Regulation

Article 4 of the AI Regulation obliges providers and operators of AI systems to ensure that their staff have a sufficient level of AI expertise. This applies in particular to employees involved in the operation and use of AI systems. The aim is to ensure that AI is handled competently and to develop an awareness of opportunities, risks and potential damage (see Art. 3 No. 56 of the AI Regulation).


II. Who is affected by the training obligation?

The training obligation affects all companies – regardless of their size – that use or operate AI systems.

An “AI system” is “a machine-based system that is designed to operate with varying degrees of autonomy and that, once operational, can be adaptive and that derives outputs such as predictions, content, recommendations or decisions from the inputs received for explicit or implicit purposes that can influence physical or virtual environments” (Art. 3 No. 1 AI Regulation). To facilitate the application of the definition of “AI system”, the EU Commission published guidelines on 6 February 2025.

Typical AI systems include applications that integrate large language models such as ChatGPT, Gemini, Claude or DeepSeek. Applications in the areas of recruiting (e.g. automated applicant selection) and employee evaluation may also fall within the scope of the AI Regulation, depending on how typical AI models are structured and integrated, provided they are actually used in an operational context (e.g. for customer communication or in the HR area).

The training obligation applies to all “actors” under the AI Regulation. In practice, companies will typically be categorised as either “providers” or “operators”. This categorisation depends on where the company is located along the AI value chain (you can find out more about the distinction between operator and provider in our article “Provider or operator? The key roles in the AI Act decoded”). An example to illustrate: if an employee uses an AI system as part of their professional activity for the company, the company can be considered an operator within the meaning of Art. 4 of the AI Regulation.

III. Content and scope of the training

The AI Regulation does not specify any concrete training content, but requires that the measures be tailored to the technical knowledge, experience, training and specific context of use of the employees. Companies must therefore develop customised training concepts that cover technical as well as ethical and legal aspects of dealing with AI, because the requirements differ considerably depending on the area of application and the intensity of AI use. The HR department, for example, regularly requires more extensive training than other departments, especially when so-called “high-risk AI systems” as defined in Art. 6 of the AI Regulation are used. According to Art. 6 (2) of the AI Regulation in conjunction with Annex III No. 4, high-risk AI systems in the area of employment and personnel management include:

  • AI systems that are intended to be used for the recruitment or selection of natural persons, in particular to place targeted job advertisements, screen or filter applications and evaluate applicants;
  • AI systems intended to be used for decisions affecting the terms of employment relationships (e.g. salary adjustments), promotions and terminations of employment contracts, for the assignment of tasks based on individual behaviour or personal characteristics, or for the monitoring and evaluation of the performance and behaviour of individuals in such employment relationships.

When using “high-risk AI systems”, Art. 86 (1) of the AI Regulation is also relevant. It provides that employees (and also applicants) who are affected by a decision that the company has made on the basis of a “high-risk AI system” have the right to receive a clear and meaningful explanation from the company of the role of the AI system in the decision-making process and of the main elements of the decision taken.


IV. Consequences of a possible violation

A breach of the training obligation is not itself subject to fines or penalties, so there are (currently) no direct financial sanctions to fear. However, indirect sanctions are conceivable for non-compliance with information obligations vis-à-vis the competent national authorities pursuant to Art. 99 (5) of the AI Regulation. Whether this theoretical possibility will materialise in practice remains to be seen.

Irrespective of this, inadequate or omitted mandatory training may expose companies to civil liability risks if damage is caused by the incorrect use of an AI system. Such a failure could constitute a breach of the company’s general duty of care (here: the obligation to provide mandatory training) and entitle the injured party to claim damages.

V. Classification under labour law: remuneration, working hours, evidence

If the employer is obliged by law or on the basis of a law (here: Art. 4 of the AI Regulation) to offer employees further training required for the performance of their work (here: the mandatory training), the costs may not be imposed on the employees pursuant to Section 111 of the German Industrial Code (Gewerbeordnung, GewO). The costs of the mandatory training are therefore borne by the employer.

Furthermore, Section 111 (2) GewO stipulates that the mandatory training should take place during regular working hours. Mandatory training held outside regular working hours nevertheless counts as working time and must be remunerated.

Finally, the implementation of the mandatory training and employees’ participation in it should be documented and verifiable, e.g. by certificates of attendance, in order to be prepared for potential legal disputes or enquiries from the authorities.

VI. Recommendations for action

If they have not already done so, companies should:

  • identify which employees use AI systems in the course of their work and, if necessary, categorise them into groups according to their level of knowledge,
  • analyse the specific training needs of these employees,
  • develop and implement suitable training programmes (including follow-up training),
  • introduce a policy or code of conduct on the use of AI.

Companies can also pool the required AI expertise by appointing an AI officer, thereby ensuring that employees have the necessary expertise for their specific field of AI (for more information, see our article “Do companies need an AI officer for sufficient AI expertise?”).

VII. Conclusion: Take training obligations seriously – avoid risks

The training obligation under Art. 4 of the AI Regulation applies to companies of all sizes that use AI systems in an operational context. Standardised training for all employees is generally not sufficient, as the training requirements depend in particular on the area of application. There is an acute need for action, especially when using high-risk AI systems – for example in the HR department.

Even if an infringement is not yet subject to a fine, there is a risk of civil liability in the event of incorrect use.

Would you like to know whether your company is affected – or do you need support in implementing the training obligation?

We will be happy to support you with the legal categorisation and the development of suitable training concepts – pragmatically, purposefully and in a legally compliant manner.

We are also available as an external AI officer. Get in touch with us!

Or take advantage of the training programme set up by Blackboat and us to become a certified AI Officer in accordance with Art. 4 of the AI Regulation.

* The personal designations used in this article refer equally to persons of all genders (f/m/d). Multiple forms are avoided in the interest of readability.