Regulation · AI Act

Is your SaaS using AI without realising it? The obligations that apply.

A SaaS platform that uses an AI API for recommendations or scoring automatically becomes a provider under the AI Act. Here are the concrete steps to achieve compliance and avoid tender exclusions.

Jérémy Pierre
AI Act compliance expert
18 June 2026 · 8 min read
In brief · 4 key figures to remember
85%
of B2B SaaS platforms integrate AI without declaring it
2 Aug. 2026
deadline for Article 50 transparency
Annex III
candidate scoring and HR management = high-risk
30%
of 2026 RFPs will include AI Act clauses
01 - Regulation

Why your SaaS falls under the AI Act, even without in-house AI

A SaaS vendor that integrates an AI API for recommendations, scoring or automation automatically becomes a provider under the AI Act.

Article 3(3) of the Regulation is clear: as soon as an AI system is integrated into a product and placed on the market under the vendor's name or trademark, that vendor becomes the provider. It does not matter whether the underlying model comes from OpenAI, Mistral or Anthropic; what matters is the final commercialised use.

Concrete examples:

  • A project management SaaS that uses OpenAI's API to automatically generate meeting summaries.
  • A marketing tool that integrates a lead scoring model via a third-party library.
  • An HR platform that offers a training recommendation module based on a language model.

In these cases, the SaaS vendor must comply with the provider's obligations, including classification of the AI system and technical documentation if the system is considered high-risk.

Article 3(3) AI Act - Definition of provider
AiActo Glossary - Provider vs deployer
02 - Compliance

The 3 steps to achieve compliance with the AI Act

AI Act compliance for a B2B SaaS platform is based on three pillars: classification, documentation and transparency.

1. Classify your AI system

The first step is to determine whether the integrated AI system is considered high-risk. Annex III of the AI Act lists the cases concerned, including:

  • Human resources management (candidate scoring, performance evaluation).
  • Access to essential services (credit scoring, insurance).
  • Education and vocational training (automatic grading).

If the system falls into one of these categories, the vendor must produce technical documentation in accordance with Annex IV.
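The classification step above can be sketched as a simple internal triage helper. This is an illustrative sketch only: the category keys and Annex III point numbers are assumptions of ours for readability, not an official taxonomy, and a real classification always requires a legal reading of Annex III.

```python
# Illustrative triage helper mirroring the Annex III categories listed above.
# Category keys and point numbers are assumptions, not the regulation's wording.
HIGH_RISK_USE_CASES = {
    "hr_management": "Annex III, point 4 - employment and workers management",
    "essential_services": "Annex III, point 5 - access to essential services",
    "education": "Annex III, point 3 - education and vocational training",
}

def classify(use_case: str) -> str:
    """Return a coarse risk label for a declared use case."""
    if use_case in HIGH_RISK_USE_CASES:
        return f"high-risk ({HIGH_RISK_USE_CASES[use_case]})"
    return "not listed here - still check transparency duties (Art. 50)"

print(classify("hr_management"))
print(classify("meeting_summaries"))
```

A helper like this is only useful as a first screen, for example when product teams declare a new AI feature in an internal register.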

2. Produce technical documentation

For high-risk systems, Annex IV requires detailed documentation, including:

  • A description of the system's functionalities and its limitations.
  • The training data used and their origin.
  • The cybersecurity and robustness measures implemented.
  • The post-market monitoring procedures.

This documentation must be made available to the competent authorities upon request.
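In practice, it helps to maintain this file as structured data from day one so it can be exported on request. The sketch below is one possible internal structure: every field name is an assumption of ours, not Annex IV's wording.

```python
import json

# Hedged sketch of an internal Annex IV-style technical file.
# Field names are illustrative assumptions, not the regulation's wording.
technical_file = {
    "system": "meeting-summary module",
    "description": {
        "functionality": "Generates meeting summaries from transcripts",
        "known_limitations": ["may invent action items", "EN/FR only"],
    },
    "training_data": {
        "source": "third-party GPAI provider (see upstream model card)",
        "fine_tuning": None,
    },
    "robustness_and_cybersecurity": ["prompt-injection filtering", "rate limiting"],
    "post_market_monitoring": {"audit_frequency": "quarterly"},
}

# Export as JSON when an authority or enterprise client asks for it.
print(json.dumps(technical_file, indent=2))
```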

3. Comply with transparency obligations

Article 50 of the AI Act requires providers of AI systems to inform end-users that they are interacting with AI. For a B2B SaaS, this means:

  • Informing clients that certain functionalities use AI.
  • Describing the system's capabilities and limitations in the information notice (Article 13).
  • Indicating whether the system is considered high-risk.
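A minimal sketch of the resulting client-facing notice, assuming a home-grown `ai_notice` helper of our own design (none of this wording is mandated by the regulation):

```python
# Hypothetical helper that renders an in-app AI disclosure.
def ai_notice(feature: str, high_risk: bool, limitations: list[str]) -> str:
    lines = [f"The '{feature}' feature uses an AI system."]
    if high_risk:
        lines.append("This system is classified as high-risk under the EU AI Act.")
    lines.append("Known limitations: " + "; ".join(limitations))
    return "\n".join(lines)

print(ai_notice("candidate scoring", True, ["possible bias on atypical CVs"]))
```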
Annex III AI Act - List of high-risk systems
Annex IV AI Act - Technical documentation
03 - Case study

Case study: an HR SaaS with candidate scoring

An HR software vendor integrates an AI API to score CVs. Here is how it must achieve compliance.

Scenario: an HR SaaS offers an automatic candidate scoring feature, based on OpenAI's API. This feature is marketed as a premium module.

Step 1: System classification

Candidate scoring falls under human resources management, listed in Annex III of the AI Act. The system is therefore considered high-risk.

Step 2: Technical documentation

The vendor must produce documentation in accordance with Annex IV, including:

  • A description of the scoring criteria and their respective weights.
  • The training data used for the model (e.g. historical CVs, job offers).
  • Measures to protect against discriminatory biases.
  • Post-deployment monitoring procedures (e.g. regular audits).

Step 3: Transparency to clients

The vendor must:

  • Inform clients that scoring uses AI.
  • Provide an information notice describing the system's capabilities and limitations (Article 13).
  • Indicate that the system is considered high-risk.
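The anti-bias measures and regular audits mentioned above can include a simple selection-rate screen. The "four-fifths rule" used below is a common fairness heuristic borrowed from US employment practice, not an AI Act requirement, and the numbers are invented for illustration.

```python
# Illustrative bias screen for a scoring module (four-fifths rule sketch).
def selection_rate(selected: int, total: int) -> float:
    """Share of candidates in a group that the scoring module selects."""
    return selected / total

def four_fifths_check(rate_group_a: float, rate_group_b: float) -> bool:
    """Flag potential adverse impact when the rate ratio drops below 0.8."""
    ratio = min(rate_group_a, rate_group_b) / max(rate_group_a, rate_group_b)
    return ratio >= 0.8

rate_a = selection_rate(30, 100)  # 0.30
rate_b = selection_rate(18, 100)  # 0.18
print(four_fifths_check(rate_a, rate_b))  # 0.18 / 0.30 = 0.6 -> False
```

A failing check does not prove discrimination, but it is the kind of quantitative trigger an Annex IV monitoring procedure can document.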
"An HR SaaS that does not provide this documentation will be excluded from tenders by major companies, which are increasingly demanding on AI Act compliance."
CNIL - AI Act guide - Concrete examples
AiActo diagnostic - Check if your system is high-risk
04 - Transparency

Article 50: what your clients really expect

B2B clients, particularly large companies, are increasingly demanding transparency about the AI systems used in SaaS platforms.

Article 50 of the AI Act requires providers to inform end-users that they are interacting with AI. For a B2B SaaS, enterprise clients typically expect:

  • A clear mention in the terms and conditions or technical documentation.
  • An accessible information notice describing:
    • The AI functionalities used.
    • The system's limitations (e.g. error rate, potential biases).
    • The protection measures implemented.
  • A contact channel for questions related to AI.

Enterprise clients, particularly in regulated sectors (banking, insurance, healthcare), are increasingly including AI Act clauses in their RFPs. A SaaS without AI Act documentation will be systematically excluded.

AI Office - Transparency - Practical guide
Article 50 AI Act - Transparency obligations
05 - GPAI

Obligations if your SaaS uses a base model (GPAI)

If your SaaS integrates a language model such as GPT-4 or Mistral, additional obligations apply.

Article 25(4) of the AI Act requires providers of systems integrating a GPAI model (General-Purpose AI) to:

  • Inform downstream deployers (your clients) that the system uses a GPAI model.
  • Provide technical documentation on the model used, including:
    • The model's capabilities and limitations.
    • The training data (if available).
    • Cybersecurity measures.
  • Comply with the transparency obligations of Article 50.

Example: a content generation SaaS that uses Mistral's API must inform its clients that the system relies on a GPAI model and provide documentation on the model's limitations (e.g. bias risks, hallucinations).
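One pragmatic way to pass this information downstream is response metadata. The sketch below is purely illustrative: the field names, the wrapper function and the model identifier are our assumptions, not anything prescribed by the regulation or by a model provider.

```python
import json

# Hypothetical sketch: surfacing upstream-model information to clients
# via response metadata. Field names and model id are illustrative.
def with_gpai_metadata(payload: dict, model_name: str) -> dict:
    payload["_ai_metadata"] = {
        "uses_gpai_model": True,
        "upstream_model": model_name,
        "known_limitations": ["may hallucinate facts", "training-data bias"],
    }
    return payload

response = with_gpai_metadata({"summary": "Q3 targets agreed."}, "mistral-large")
print(json.dumps(response["_ai_metadata"], indent=2))
```

Shipping the disclosure alongside every AI-generated payload makes it easy for downstream deployers to propagate it to their own users.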

Article 25(4) AI Act - GPAI obligations
AI Act Explorer - Interactive guide
06 - Business

AI Act compliance as a commercial argument

SaaS vendors can turn AI Act compliance into a competitive advantage, particularly in tenders.

According to several studies, large companies (CAC 40 groups, mid-caps) are increasingly including AI Act clauses in their RFPs. SaaS vendors without AI Act documentation are systematically excluded.

Strategies to leverage compliance:

  • Premium option: Offer AI Act documentation as a paid option for enterprise clients.
  • Enterprise tier: Reserve compliant features for high-end clients.
  • Commercial argument: Highlight compliance in pitches and marketing materials (e.g. "Our SaaS is AI Act compliant, guaranteeing transparency and security for your data").
  • Certification: Obtain third-party certification to strengthen credibility (e.g. CNIL label, ISO 42001 certification).

Example: a credit scoring SaaS can offer AI Act-compliant documentation as a premium option, priced 20% higher than the standard rate. Enterprise clients, subject to strict regulatory obligations, will pay this premium to avoid risks.

Identify your obligations in 3 minutes

Our free diagnostic tells you whether your SaaS falls under the AI Act and which steps to follow to achieve compliance.

07 - FAQ

Frequently asked questions

Answers to the most common questions from SaaS vendors about the AI Act.

Does the AI Act apply to my SaaS if I only use a third-party API?

Yes. As soon as you integrate an AI feature into your SaaS and place it on the market under your name or brand, you become a provider under the AI Act. This applies even if you use a third-party API such as OpenAI or Mistral. Article 3(3) of the Regulation is explicit on this point.

What are the penalties for non-compliance?

Penalties can go up to €35 million or 7% of global turnover, whichever is higher. For SMEs, fines are capped at €15 million or 3% of turnover. The competent authorities may also order the withdrawal of the system from the market.

How do I know whether my AI system is high-risk?

Consult Annex III of the AI Act, which lists the cases considered high-risk. The most common categories for SaaS include human resources management, access to essential services (credit, insurance) and education. You can also use our free diagnostic to check whether your system is covered.

What is the difference between a provider and a deployer?

A provider is an entity that develops or places on the market an AI system under its own name or brand. A deployer is an entity that uses an AI system as part of its activities, without placing it on the market. For example, a SaaS vendor that integrates AI is a provider, while a company that uses this SaaS is a deployer.

Do I have to tell my clients that my SaaS uses AI?

Yes. Article 50 of the AI Act requires providers to inform end-users that they are interacting with AI. For a B2B SaaS, this means informing your clients that certain features use AI and describing the system's capabilities and limitations in an information notice (Article 13).

What changes if my SaaS uses a general-purpose AI (GPAI) model?

If your SaaS integrates a GPAI model (General-Purpose AI), you must inform your clients that the system uses such a model and provide technical documentation on its capabilities and limitations. Article 25(4) of the AI Act details these obligations, which are in addition to those of Article 50 on transparency.

How can I turn AI Act compliance into a commercial advantage?

You can offer AI Act documentation as a premium option or enterprise tier, charged as an extra. Highlight compliance in your commercial pitches and marketing materials to reassure enterprise clients, who are increasingly demanding on this point. Third-party certification (e.g. CNIL label) can also enhance your credibility.

Jérémy Pierre
Founder of aiacto.eu · AI Act compliance expert

Supports AI providers and deployers in their regulatory compliance journey.
