
SMEs: Are you affected by the transparency obligations of the AI Act?

Most SMEs using AI are subject to the EU Regulation without knowing it. Take the test in 5 questions.

The AI Act and SMEs: what you need to know

Regulation (EU) 2024/1689 on artificial intelligence — the 'AI Act' — applies to any organisation that develops or uses AI systems in the European Union, regardless of its size.

Contrary to popular belief, SMEs do not benefit from a general exemption. If you use a chatbot, generate AI content, or make AI-assisted decisions, you are likely subject to transparency obligations.

Obligations vary depending on your role (provider or deployer) and the type of system used. Some have already been in effect since February 2025.

Did you know?

82% of European SMEs using AI are subject to at least one AI Act transparency obligation, according to a European Commission study (2024).

The 3 transparency obligations affecting SMEs

For each obligation that applies to you, you must produce a compliance document. Here they are.

Art. 50.1

AI interaction notification

People must be informed that they are interacting with an AI system, unless it is obvious from the context.

E.g.: Your customer service chatbot must display a message indicating that responses are AI-generated.
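As a sketch, here is one way a deployer might attach that disclosure in code. This is illustrative only: the helper names and the disclosure wording are our assumptions, not taken from the regulation, and `generate_reply` stands in for whatever chatbot backend you actually use.

```python
# Minimal sketch of an Art. 50.1-style disclosure for a chatbot reply.
# All names and wording here are illustrative assumptions.

AI_DISCLOSURE = "You are chatting with an AI assistant; responses are AI-generated."

def generate_reply(user_message: str) -> str:
    # Placeholder for a call to your actual chatbot model or API.
    return f"Echo: {user_message}"

def reply_with_disclosure(user_message: str, first_turn: bool) -> dict:
    """Return the reply plus a disclosure the UI can render.

    The notice is shown on the first turn, where it is not yet
    obvious from context that the user is talking to an AI.
    """
    return {
        "text": generate_reply(user_message),
        "disclosure": AI_DISCLOSURE if first_turn else None,
        "ai_generated": True,  # machine-readable flag for logs and audits
    }

msg = reply_with_disclosure("What are your opening hours?", first_turn=True)
print(msg["disclosure"])
```

Whether the banner is shown once per session or on every message is a product decision; the point is that the notice exists and is not hidden.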

Art. 50.2

Synthetic content labelling

Content generated or modified by AI (text, images, audio, video) must be labelled as such in a machine-readable format.

E.g.: Images created by your generation tool must contain metadata indicating their artificial origin.
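For machine-readable labelling, the emerging industry approach is C2PA / Content Credentials metadata embedded in the asset itself. As a simplified illustration only, the sketch below writes a JSON sidecar file carrying an `ai_generated` flag; the field names are our assumptions, not a standard schema.

```python
# Simplified illustration of machine-readable AI-provenance labelling.
# Field names are assumptions, not a standard schema; for production,
# embed C2PA / Content Credentials metadata in the asset itself.
import json
from pathlib import Path

def write_provenance_sidecar(image_path: str, tool_name: str) -> Path:
    """Write a machine-readable sidecar declaring AI provenance."""
    sidecar = Path(image_path).with_suffix(".provenance.json")
    sidecar.write_text(json.dumps({
        "asset": Path(image_path).name,
        "ai_generated": True,   # the key machine-readable signal
        "generator": tool_name,
    }, indent=2))
    return sidecar

path = write_provenance_sidecar("campaign_visual.png", "image-gen-tool")
print(json.loads(path.read_text())["ai_generated"])  # prints True
```

A sidecar can be stripped when a file is re-shared, which is why in-asset metadata such as C2PA manifests is the more robust option.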

Art. 4

AI literacy

Every organisation must ensure that its personnel using AI systems have a sufficient level of AI literacy.

E.g.: Your employees using AI tools daily must receive appropriate training.

In practice, you need to produce documentation proving you comply with these obligations. AiActo generates it automatically from your answers.

Take the quiz and generate my document

See a real example

We use AI ourselves — here is the transparency document generated by AiActo for AiActo. This is exactly what you will get.

Does the AI Act apply to you?

5 yes/no questions. Instant result.

1

Do your employees use AI tools in their daily work?

E.g.: drafting emails with ChatGPT, answering clients, building case files, translating documents


Your personalised diagnosis

Based on your answers, each obligation below is marked as applying to you or not:

Art. 4 — AI literacy

Your staff must be trained in the use of the AI tools they employ.

Art. 50.2 — Synthetic content labelling

AI-generated content must be labelled as artificial (machine-readable metadata).

Art. 50.3 — Emotion recognition and biometric categorisation

People must be informed when they are exposed to an emotion recognition or biometric categorisation system.

Art. 50.1 — AI interaction notification

You must inform users that they are interacting with an AI system.

Provider or deployer?

You use a third-party AI service: you are a 'deployer'. Your obligations focus on transparency and usage monitoring.

Based on your answers, we create your personalised compliance documentation.

1 credit is enough (€29) — or Deployer plan at €49/month

Already have an account? Log in

Prefer a full diagnostic?

The full diagnostic analyses your AI system in detail and identifies all your obligations (not just transparency).

Concrete examples by sector

How the AI Act applies across different industries.

E-commerce — Customer chatbot

A customer service chatbot must clearly display that it is an AI. Personalised product recommendations must be transparent.

Art. 50.1 · Art. 4

Creative agency — Image generation

AI-created visuals for marketing campaigns must be labelled with C2PA metadata indicating their artificial origin.

Art. 50.2 · Art. 4

HR firm — Candidate scoring

An AI-powered CV screening tool requires transparency obligations towards candidates and heightened vigilance (potentially high-risk system).

Art. 50.1 · Art. 50.3 · Art. 4

SaaS startup — Embedded AI

If you integrate an AI model into your product, you take on provider obligations on top of deployer ones and must document your system (Annex IV).

Art. 50.1 · Art. 50.2 · Art. 4

Timeline and penalties

2 Feb 2025

Prohibited practices + AI literacy

Ban on unacceptable-risk AI systems. AI literacy obligation (Art. 4) in force.

2 Aug 2025

GPAI rules

Obligations for general-purpose AI models (foundation model providers).

2 Aug 2026

High-risk systems

All obligations for Annex III high-risk systems come into effect.

Expected penalties

Up to €15 million or 3% of global annual turnover (whichever is higher) for non-compliance with transparency obligations. For SMEs and start-ups, the lower of the two amounts applies.

3 steps to achieve compliance

1

Diagnose

Identify your AI systems and determine your specific obligations with our automated diagnostic.

2

Document

Generate the technical documentation and transparency notices required by the regulation.

3

Maintain

Track your compliance progress and update your documents with each modification.

Frequently asked questions

Are SMEs exempt from the AI Act?
No. The AI Act applies to all organisations that develop or use AI systems in the EU, regardless of their size. However, SMEs benefit from certain accommodations: priority access to regulatory sandboxes, proportionate penalty caps, and support measures under Article 62.
When do transparency obligations come into effect?
Some have been in force since 2 February 2025 (prohibited practices and AI literacy). Article 50 transparency obligations apply from 2 August 2026, the Regulation's general application date.
What does an SME risk for non-compliance?
Fines can reach €15 million or 3% of global annual turnover for transparency obligation violations. For SMEs and start-ups, the AI Act caps the fine at the lower of the two amounts.
I use ChatGPT or Copilot in my company. Am I affected?
Yes. As a deployer of an AI system, you are subject to the AI literacy obligation (Art. 4) and must ensure your employees understand the limitations and risks of these tools.
What is the difference between a provider and a deployer?
The provider develops or places an AI system on the market. The deployer uses an AI system under its own authority. The same entity can be both. The provider has more extensive obligations (technical documentation, conformity assessment).
Does the transparency obligation apply to AI-generated emails?
Yes: if content is AI-generated and presented to people, it must be labelled. However, AI-generated text that has undergone human review, with a person or organisation holding editorial responsibility for it, may be exempt under certain conditions.
Must I label all AI-generated content?
Article 50.2 requires synthetic content (text, images, audio, video) to be labelled in a machine-readable format. This covers deepfakes and artificial content, with exceptions for creative or satirical uses that are clearly identifiable.
How does AiActo help me concretely?
AiActo automates the diagnosis of your obligations, generates the technical documentation required by the AI Act, and guides you step by step through compliance. The initial diagnostic is completely free.

Ready to check your compliance?

Start your free diagnostic and get a compliance plan tailored to your SME in minutes.