SMEs: Are you affected by the transparency obligations of the AI Act?
Most SMEs using AI are subject to the EU Regulation without knowing it. Take the test in 5 questions.
The AI Act and SMEs: what you need to know
Regulation (EU) 2024/1689 on artificial intelligence — the 'AI Act' — applies to any organisation that develops or uses AI systems in the European Union, regardless of its size.
Contrary to popular belief, SMEs do not benefit from a general exemption. If you use a chatbot, generate AI content, or make AI-assisted decisions, you are likely subject to transparency obligations.
Obligations vary depending on your role (provider or deployer) and the type of system used. Some have been in force since February 2025.
Did you know?
82% of European SMEs using AI are subject to at least one AI Act transparency obligation, according to a European Commission study (2024).
The 3 transparency obligations affecting SMEs
For each obligation that applies to you, you must produce a compliance document. Here they are.
AI interaction notification
People must be informed that they are interacting with an AI system, unless it is obvious from the context.
E.g.: Your customer service chatbot must display a message indicating that responses are AI-generated.
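The disclosure above can be as simple as a message prepended to the first reply of a conversation. Here is a minimal sketch, assuming a hypothetical `generate_answer` stand-in for your chatbot's actual model call:

```python
AI_DISCLOSURE = "You are chatting with an AI assistant. Responses are AI-generated."

def generate_answer(message: str) -> str:
    # Placeholder for a real model call
    return f"(model answer to: {message})"

def reply(message: str, session: dict) -> str:
    """Prepend the disclosure to the first reply of each session,
    so the user is informed before the interaction continues."""
    answer = generate_answer(message)
    if not session.get("ai_disclosed"):
        session["ai_disclosed"] = True
        return f"{AI_DISCLOSURE}\n\n{answer}"
    return answer

session = {}
print(reply("Where is my order?", session))  # first reply carries the disclosure
print(reply("Thanks!", session))             # follow-ups do not repeat it
```

A persistent banner in the chat widget works just as well; the point is that the user is informed before or at the start of the interaction.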
Synthetic content labelling
Content generated or modified by AI (text, images, audio, video) must be labelled as such in a machine-readable format.
E.g.: Images created by your generation tool must contain metadata indicating their artificial origin.
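"Machine-readable" means the label must survive as structured data, not just a visible watermark. As an illustration only, here is a sketch that builds a minimal provenance record for a generated asset; the field names are invented for this example, and real deployments would use a standard such as C2PA:

```python
import json
import hashlib
from datetime import datetime, timezone

def provenance_record(content: bytes, generator: str) -> dict:
    """Build a minimal machine-readable provenance record for a piece
    of AI-generated content. Field names are illustrative, not an
    official schema."""
    return {
        "ai_generated": True,
        "generator": generator,
        "sha256": hashlib.sha256(content).hexdigest(),
        "created": datetime.now(timezone.utc).isoformat(),
    }

image_bytes = b"\x89PNG..."  # stand-in for generated image data
record = provenance_record(image_bytes, "example-image-model")
print(json.dumps(record, indent=2))  # store alongside, or embed in, the asset
```

The hash ties the record to the exact file, so the label cannot silently be reattached to different content.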
AI literacy
Every organisation must ensure that its personnel using AI systems have a sufficient level of AI literacy.
E.g.: Your employees using AI tools daily must receive appropriate training.
In practice, you need to produce documentation proving you comply with these obligations. AiActo generates it automatically from your answers.
Start the quick diagnostic and generate my document
See a real example
We use AI ourselves — here is the transparency document generated by AiActo for AiActo. This is exactly what you will get.
Does the AI Act apply to you?
5 yes/no questions. Instant result.
Do your employees use AI tools in their daily work?
E.g.: drafting emails with ChatGPT, answering clients, building case files, translating documents
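In spirit, a diagnostic like this maps yes/no answers onto the three obligations described above. A simplified sketch (illustrative only; not AiActo's actual logic and not legal advice):

```python
# Maps a diagnostic answer key to the transparency obligation it triggers.
OBLIGATION_RULES = {
    "employees_use_ai_tools": "AI literacy (Art. 4)",
    "customer_facing_chatbot": "AI interaction notification (Art. 50)",
    "generates_synthetic_content": "Synthetic content labelling (Art. 50)",
}

def applicable_obligations(answers: dict) -> list:
    """Return the obligations triggered by 'yes' answers."""
    return [obligation for key, obligation in OBLIGATION_RULES.items()
            if answers.get(key)]

answers = {
    "employees_use_ai_tools": True,
    "customer_facing_chatbot": False,
    "generates_synthetic_content": True,
}
print(applicable_obligations(answers))
```

A real assessment also weighs your role (provider vs deployer) and whether the system is high-risk, which is why the full diagnostic asks more than a lookup table can capture.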
Concrete examples by sector
How the AI Act applies across different industries.
E-commerce — Customer chatbot
A customer service chatbot must clearly display that it is an AI. Personalised product recommendations must be transparent.
Creative agency — Image generation
AI-created visuals for marketing campaigns must be labelled with C2PA metadata indicating their artificial origin.
HR firm — Candidate scoring
An AI-powered CV screening tool requires transparency obligations towards candidates and heightened vigilance (potentially high-risk system).
SaaS startup — Embedded AI
If you integrate an AI model into your product, you take on provider obligations on top of deployer ones and must produce technical documentation for your system (Annex IV).
Timeline and penalties
2 February 2025: Prohibited practices + AI literacy
Ban on unacceptable-risk AI practices. AI literacy obligation (Art. 4) in force.
2 August 2025: GPAI rules
Obligations for general-purpose AI models (foundation model providers).
2 August 2026: High-risk (Annex III) + Transparency (Art. 50)
All obligations for Annex III high-risk systems and the Article 50 transparency requirements come into effect together.
Expected penalties
Up to 3% of global turnover or €15 million (whichever is higher) for non-compliance with transparency obligations. SMEs benefit from proportionate caps.
3 steps to achieve compliance
Diagnose
Identify your AI systems and determine your specific obligations with our automated diagnostic.
Document
Generate the technical documentation and transparency notices required by the regulation.
Maintain
Track your compliance progress and update your documents with each modification.
Frequently asked questions
Are SMEs exempt from the AI Act?
No. The AI Act applies to all organisations that develop or use AI systems in the EU, regardless of their size. However, SMEs benefit from certain accommodations: priority access to regulatory sandboxes, proportionate penalty caps, and support measures under Article 62.
When do the obligations take effect?
Some have been in force since 2 February 2025 (prohibited practices and AI literacy). Article 50 transparency obligations and high-risk system obligations (Annex III) will apply from 2 August 2026. The omnibus proposal provides for postponements but is not yet adopted.
What penalties do SMEs risk?
Fines can reach 3% of global turnover or €15 million, whichever is higher, for transparency obligation violations. The AI Act provides proportionate caps for SMEs and startups.
Do we have obligations if we only use off-the-shelf AI tools?
Yes. As a deployer of an AI system, you are subject to the AI literacy obligation (Art. 4) and must ensure your employees understand the limitations and risks of these tools.
What is the difference between a provider and a deployer?
The provider develops or places an AI system on the market. The deployer uses an AI system under its own authority. The same entity can be both. The provider has more extensive obligations (technical documentation, conformity assessment).
Must AI-generated content always be labelled?
Yes, if content is AI-generated and presented to people, it must be labelled. However, the regulation provides a carve-out for text that has undergone human review and for which a person or organisation holds editorial responsibility, which may be exempt under certain conditions.
What counts as synthetic content?
Article 50(2) requires synthetic content (text, images, audio, video) to be labelled in a machine-readable format. This covers deepfakes and artificial content, with exceptions for creative or satirical uses that are clearly identifiable.
How does AiActo help?
AiActo automates the diagnosis of your obligations, generates the technical documentation required by the AI Act, and guides you step by step through compliance. The initial diagnostic is completely free.
Ready to check your compliance?
Start your free diagnostic and get a compliance plan tailored to your SME in minutes.