The works council has no co-determination rights unless the company introduces its own AI systems. As long as employees use their own personal AI accounts, there is no need for a works agreement. In particular, use of the AI application does not trigger a right of co-determination under Section 87 Paragraph 1 No. 6 of the Works Constitution Act if the employer does not receive any information suitable for monitoring. This follows from a recent decision of the Hamburg Labor Court.
However, under Section 90 Paragraph 1 No. 3 of the Works Constitution Act, the works council must be informed in good time and provided with the necessary documents, as the use of AI affects work procedures and workflows.
Under Section 80 Paragraph 3 Sentences 2 and 3 of the Works Constitution Act, the works council may always call in an expert (at the employer's expense) if it has to assess the introduction or application of artificial intelligence in order to carry out its tasks.
Introduction of company-specific AI solutions
Effective risk management with regard to data protection and the protection of company secrets is only possible if an internal AI solution is introduced. Microsoft 365, for example, offers "Copilot" as a "daily AI companion" and thus a relatively accessible option. For companies with large data sets, it can be attractive to have their own databases prepared and processed by a secure AI solution.
Risks relating to copyright and other intellectual property rights can likewise only be mitigated by in-house AI solutions. To the extent that usage data is collected and this gives the employer the (abstract) possibility of monitoring the performance and/or behavior of employees, the right of co-determination under Section 87 Paragraph 1 No. 6 of the Works Constitution Act is affected. In most cases, this cannot be avoided with the AI solutions currently available on the market. In addition, a right of co-determination under Section 87 Paragraph 1 No. 1 of the Works Constitution Act comes into consideration if orderly conduct in the company - in the sense of "company cooperation" - is regulated.
The Hamburg Labor Court correctly held that guidelines on the use of ChatGPT and comparable tools do not concern orderly conduct in the company, which would be subject to co-determination. Ultimately, these tools are simply new work equipment, so what is affected is work conduct, which is not subject to co-determination.
If there are specific risks to the health of employees, a right of co-determination under Section 87 Paragraph 1 No. 7 of the Works Constitution Act (BetrVG) – combined with a risk assessment under Section 3 of the Industrial Safety Ordinance (BetrSichV) and Section 5 Paragraph 3 No. 6 of the Occupational Health and Safety Act (ArbSchG) – may also come into consideration.
In the future, all company-internal AI solutions will also be regulated by the AI Regulation. Regardless of their risk classification, these AI systems must then meet the information obligations of Article 50 of the AI Regulation: employees must always be able to recognize that they are dealing with an AI, unless this is obvious. High-risk systems - in particular applications in the human resources area in connection with hiring, termination, promotion or other individual monitoring and assessment of employees - are additionally subject to special provider and operator obligations. As operator of such a system, the employer must above all ensure that its use is organized safely and properly supervised.