AI in SaaS contracts: Innovation and regulation


AI features allow SaaS product providers to offer intelligent support to their users. However, this raises many legal questions. What specific services are owed? What about liability, data protection and copyright? Our article provides a concise overview of the key legal considerations for drafting your contract.


Contractual classification: What is owed?

SaaS contracts are not a distinct type of contract under the German Civil Code (BGB), but typically incorporate elements of rental, service, and work contract law.

Accordingly, there is no separate type of contract for extending SaaS contracts to include additional AI features.

This has an impact on the description of performance obligations to be fulfilled in the contract and the applicability of statutory provisions.

If a contract falls under one of the statutory contract types, the specified rules for the scope and quality of the performance owed apply (unless modified by agreement).

If such a classification is not possible, the law of the contract type that most closely matches the subject matter of the contract applies.

To avoid legal uncertainties and have a clear basis in the event of a dispute, the respective performance obligations should be defined as precisely as possible.

The contract should contain clear definitions of the AI functionalities and the data processed.

To create clarity, it can be helpful to exclude certain functionalities entirely and describe any restrictions transparently.

Additionally, the availability of AI features must be specified, bearing in mind that 100% availability cannot be guaranteed in practice.


Does the Digital Content Directive play a role?

Insofar as the SaaS solution is offered to consumers, additional obligations may arise from the Digital Content Directive (implemented in Germany in Sections 327 et seq. of the German Civil Code (BGB)).

For example, it must be ensured that the AI function works as advertised.

In the B2B sector, there is more scope for individual agreements in this respect.

However, even here, it is not advisable to allow the AI function and the customer's expectations to diverge too widely.

What contractual limitations of liability are possible?

With the introduction of AI elements, there is an increased risk of being held liable by users for incorrect or misleading outputs.

To mitigate these risks, clear limitations of liability should be included in the general terms and conditions (GTC) as well as in individual contracts.

It is important to point out the limits of what is feasible: AI usually provides probabilities rather than infallible results.

In addition to contractual liability, the provider may also be liable in tort for violations of personal rights caused by AI-generated content, for example.

However, when it comes to possible liability disclaimers, legal barriers (e.g. relating to intent or gross negligence) and numerous court rulings must be considered.

Unusually broad disclaimers of liability for AI products are particularly likely to be held invalid when included in GTC.

Does product liability law apply here?

According to the new EU Product Liability Directive (ProdHaftRL), software — including AI components — is considered a 'product' under product liability law.

This means that the new directive extends classic product liability to SaaS products, provided they are made available on the market in a business context.

Providers are therefore liable, regardless of fault, if software errors result in damage to a natural person's health or property, or in the destruction or corruption of their data.

Products are considered defective if damage results from a failure to provide security-related updates.

The provisions of product liability law cannot be negotiated in contracts, and any liability exclusions are invalid.

Providers can therefore only counter this by designing secure services.

However, product liability law generally has no direct effect on contract drafting; it applies regardless of what the parties agree.

At present, the new Product Liability Directive has not yet been transposed into German law through an amendment to the Product Liability Act (ProdHaftG).

Which aspects of data need to be covered by data protection law?

AI components in SaaS can process personal data, for example when users enter data into the tool.

The basic principles of the GDPR must therefore be observed: there must be a clear legal basis for processing, and the principles of purpose limitation and data minimisation must be adhered to.

Where SaaS providers act as processors, they must conclude a data processing agreement with their customers (Art. 28 GDPR).

If other external service providers are involved, their roles in terms of data protection law must also be clarified and defined in the contract.

AI applications such as ChatGPT are often operated in the USA or other third countries.

As soon as personal data is transferred to a country outside the EU, providers must ensure that an adequate level of data protection is in place for this transfer.

In addition to adequacy decisions — for example, those relating to the USA or Switzerland — the standard contractual clauses of the EU Commission are particularly relevant in this regard.

In any case, providers must keep their documentation and processes relating to third-country transfers up to date in order to fulfil their accountability obligations under the GDPR.

Who owns the output of AI?

AI-generated content is not protected by German copyright law because it is not the intellectual creation of a human being.

However, it is possible that AI models could generate content incorporating protected works, or edit them in a way that is subject to copyright law.

In such cases, copyright protection also applies, meaning that certain uses, such as public reproduction, require the consent of the copyright holder.

Providers should take this into account when making contractual arrangements regarding the use of generated content by users, as well as the distribution of responsibility in the event of copyright claims by third parties.

Outside of such cases, however, AI output is in the public domain and can generally be used by anyone without consent.

If providers wish to protect AI-generated content, this must be regulated within the framework of a licence agreement.

Does the Data Act impact contract design?

The Data Act primarily affects networked devices and services by regulating data access and the shared use of generated data.

Therefore, not every AI feature in SaaS applications is automatically subject to Data Act regulations.

However, the regulation does apply wherever companies are obliged to transfer data or meet technical portability requirements.

Therefore, anyone integrating AI functions into their SaaS product should carefully check whether the Data Act is relevant, and if so, in which respects.

This may be the case when integrating third-party providers or when IoT data is generated, for example.

In such cases, providers should include appropriate contractual clauses and ensure that the data and interface structure complies with the requirements.

Additionally, the provisions of the Data Act must be considered if the contract includes a clause on using data for AI training purposes.

The Data Act sets out requirements for such contractual content.

What are the consequences of the AI Act?

The AI Act sets out a series of obligations for AI applications, which vary depending on the risk posed by the AI system and the role of the relevant party.

Therefore, the regulation is particularly relevant for providers whose AI features fall under the definition of high-risk AI.

A specific case-by-case assessment is required, but SaaS solutions are unlikely to be considered high-risk AI in most instances.

If an AI feature does qualify as high-risk, it must then be determined whether the SaaS provider is a 'provider' within the meaning of Art. 3 No. 3 of the AI Act, or an 'operator' (deployer) as defined in Art. 3 No. 4 of the AI Act.

The consequences primarily relate to documentation and compliance obligations, with possible contractual implications.

In a nutshell: Practical tips for drafting contracts

Clear service description

Define the AI functions offered, their extent and quality.

Adjust liability rules

Limit liability for AI errors within the legally permissible scope.

Ensure data protection compliance

Check, document and design order processing in accordance with data protection regulations.

Establish copyright usage rules

Contracts should clarify responsibilities and rights for AI-generated content.

Consider the Data Act

Reflect technical requirements and data access rights in contracts.

Plan for AI regulation

Review the relevance of the AI Act and prepare the necessary compliance measures.

How can we assist you with drafting contracts?

While the integration of AI features into SaaS applications offers exciting opportunities, it also presents challenges for providers.

The scope of services and liability issues are often the same as with SaaS models without AI.

Drawing on our experience and expertise, we can advise you on the optimal contractual protection for your AI applications.

Schedule your initial consultation

Describe your situation to us in a no-obligation phone call, and our lawyers will work with you to find the best solution.
