Geissler Legal - Legal Advisor for Compliance & Commercial Law

AI tools as a risk for know-how and business ideas - a false sense of security from (conventional) NDAs?

Compliance-focused legal advice for your business activities - personal, cross-border, cost-efficient.

I. Introduction

Nowadays, almost every collaboration begins with the conclusion of a confidentiality or non-disclosure agreement (NDA). While in some cases this merely formalizes the collaboration as a first sign of trust, in other agreements - e.g. in R&D cooperation partnerships - the effective protection of confidential information is critical from the very first second of the mutual exchange.

But with the advent of powerful AI tools such as ChatGPT or DeepSeek, companies face new challenges, particularly with regard to confidentiality, that call into question the effectiveness of traditional NDAs.

II. The supposed security of NDAs

NDAs are legally binding contracts that oblige the parties to keep sensitive information confidential. They are designed to protect trade secrets, innovations and strategic plans from unauthorized disclosure. Many companies see NDAs as a panacea for protecting their confidential data, but fail to recognize the need to define what exactly the parties agree to treat as confidential, as well as the risks that may arise from negligent information handling by the other contracting party.

One thing to consider is that – despite the NDA's obligations being extended to all employees – not all of your business partner's employees are briefed, let alone trained, on AI, and they may thoughtlessly disclose information to the data-hungry AI tools mentioned above.

Even if this is not done with malicious intent, from that moment on you have a confidentiality problem and, objectively, little more to rely on than a claim for breach of the NDA.

III. The underestimated risk of AI and AI tools

The increasing use of AI tools in companies brings with it new risks, most notably:

  1. Data collection by AI: AI systems such as ChatGPT may store user input and use it to improve their models. This can lead to the inadvertent disclosure of confidential information.

  2. Reproduction of sensitive data: There is a risk that AI systems reproduce confidential information in response to targeted prompting, even if this was never intended.

  3. Lack of control: Companies often have no complete control over how their data is processed and stored by AI systems.

IV. Why NDAs alone are not enough

  1. Technological gaps: Traditional NDAs are not designed to handle the complexity of AI systems and cannot effectively prevent data leaks caused by these technologies.

  2. Limits of enforceability: Where a violation involves an AI system, it can be difficult to establish responsibility and take legal action.

  3. False security: The mere existence of an NDA can create a false sense of security and lead companies to be less cautious with sensitive information, especially when the other party uses generative AI.

V. Solutions for the protection of confidential data

To minimize risks, companies should consider the following steps:

  1. Adaptation of NDAs: Develop specific clauses that address the use of AI tools and the associated risks for the contractual partners, if necessary backed by a contractual penalty.

  2. Training and guidelines: Raise awareness among employees about the correct handling of confidential information when using AI tools.

  3. Technical measures: Implement security systems that control access to and processing of sensitive data by AI tools (see the sketch after this list).

  4. Regular checks: Conduct audits to ensure that confidential information is not inadvertently fed into AI systems.
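The "technical measures" from point 3 of the list can start small. Below is a minimal sketch in Python, assuming a hypothetical, company-maintained list of confidential terms (CONFIDENTIAL_TERMS is an illustrative name, not an existing product or standard); in practice, a dedicated data-loss-prevention gateway would take this role, but the principle of checking and redacting prompts before they leave the company network is the same.

    import re

    # Hypothetical, company-maintained list of confidential terms
    # (project code names, client identifiers, unreleased products, ...).
    CONFIDENTIAL_TERMS = ["Project Aurora", "Client-4711", "patent draft"]

    def redact(prompt: str) -> str:
        """Replace known confidential terms before a prompt leaves the company."""
        for term in CONFIDENTIAL_TERMS:
            prompt = re.sub(re.escape(term), "[REDACTED]", prompt, flags=re.IGNORECASE)
        return prompt

    def safe_to_send(prompt: str) -> bool:
        """Block prompts that still contain confidential material after redaction."""
        return "[REDACTED]" not in redact(prompt)

    # Example: "Summarize the patent draft for Project Aurora." is redacted to
    # "Summarize the [REDACTED] for [REDACTED]." and flagged as not safe to send.

Such a filter does not replace the contractual measures described above; it merely reduces the chance that an NDA violation happens thoughtlessly in the first place.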

VI. Conclusion

A combination of legal measures and technical and organizational measures (TOMs) is required to ensure comprehensive protection of confidential information. Only in this way can companies take advantage of AI tools without compromising their most valuable secrets. These TOMs must be explicitly set out in the NDA. Oblige the contractual partner who concludes the NDA to verifiably conduct in-house training and to ensure that employees and all third parties, such as external service providers in tax, accounting or HR, are up to date with the latest legislation and regulations. If necessary, treat AI risks as a new category in your liability clauses and general terms and conditions, and incorporate recourse scenarios into your standard contract templates.

Last but not least, I strongly recommend that every company appoint an internal or external AI officer (e.g. a lawyer), similar to a data protection officer. Such an appointment alone helps to mitigate liability risks for management and boards of directors.

Contact me if you need support!

Make an inquiry now
We will be happy to advise you comprehensively and personally on your concerns.

International Lawyer & certified Compliance Officer


Happy to help you

Contact

Your law firm Geißler Legal.

Address

Gertrudenstr. 30-36 (Willy-Millowitsch-Platz)
D-50667 Cologne
Phone:
0221-42482831
Phone:
0171-2211612

Opening hours

Mon-Sat: 10:00 am – 1:00 pm
Mon-Fri: 2:00 pm – 8:00 pm 
and by telephone appointment

