Regulation (EU) 2024/1689 · AI Act Compliance

Your AI Act compliance, structured and guided.

Identify your obligations in 3 minutes with a free diagnostic. Write your compliant documentation step by step with AI-guided assistance, including review, validation and audit-ready PDF export.

CNIL · GDPR compliant · Data hosted in France
aiacto.eu
aiacto interface — AI Act compliance diagnostic

The AI Act compliance challenge

The AI Act imposes complex obligations — don't face them alone

100+ Key articles
4 Risk levels
€35M Maximum fine
3 min Express diagnostic

Costly manual process

Weeks of work with specialized consultants to produce documentation that will be outdated at the next update.

Major financial penalties

Non-compliance fines can reach 35 million euros or 7% of annual global turnover, whichever is higher.

Regulatory complexity

Over 100 articles, technical annexes, staggered deadlines — a legal maze difficult to navigate alone.

Start your free diagnostic

Results in 3 minutes — no commitment

Diagnostic · Free · 3 minutes

Identify your obligations in 3 minutes.

Answer a dozen or so questions about your AI system. Get your risk level, your regulatory role, and the exact list of your obligations.

Start the diagnostic
Technical Documentation · AI Act Compliant

Write your technical documentation, section by section.

AI generates each section using the exact European regulatory vocabulary. You review, adjust, and export.

See the documentation
AI Writing · Regulatory vocabulary

Assisted writing, not a black box.

Every paragraph is generated with exact references to the Regulation's articles. You stay in control, AI accelerates the writing.

Discover AI writing
Project management

Manage every step of your compliance

From client management to diagnostic version tracking, every tool is designed to save you time.

Who is affected by the AI Act?

Identify your role and regulatory obligations

Provider

Provider — Art. 16 to 24

You develop or place on the market an artificial intelligence system in the European Union.

Art. 16 — 'The provider shall ensure that its high-risk AI systems are compliant with the requirements [...] before placing them on the market or putting them into service.'

  • Technical documentation (Annex IV)
  • Risk management system (Art. 9)
  • Data governance (Art. 10)
  • EU declaration of conformity (Art. 47)

Deployer

Deployer — Art. 26-27

You use a high-risk AI system as part of your professional activity.

Art. 26 — 'Deployers shall use high-risk AI systems in accordance with the instructions of use accompanying the systems.'

  • Human oversight (Art. 26.2)
  • Fundamental rights impact assessment — FRIA (Art. 27)
  • Registry of AI systems used
  • Log retention (Art. 26.6)

Importer

Importer — Art. 23

You import an AI system from outside the European Economic Area.

Art. 23 — 'Importers shall ensure that the appropriate conformity assessment procedure has been carried out by the provider.'

  • Conformity verification
  • Documentation retention
  • Cooperation with authorities
  • Non-conformity reporting

Distributor

Distributor — Art. 24

You make an AI system available on the market without modifying its characteristics.

Art. 24 — 'The distributor shall verify that the high-risk AI system bears the required CE conformity marking.'

  • CE marking verification
  • Compliant storage conditions
  • Cooperation with authorities
  • Market withdrawal if necessary

"AI Act compliance is the regulatory challenge of the decade for European businesses."

AI Office — European Commission, 2024

Clever Cloud · Mistral AI · Stripe · Bunny CDN

Plans for every need

Start for free, scale as you grow

Free

€0 forever

Start evaluating your AI systems

  • Unlimited diagnostics
  • Risk classification
  • Obligations identification
Popular

Provider

€149 / month

For AI system providers

  • Everything in Free
  • Technical documentation (Art. 11)
  • Conformity declaration
  • Professional PDF export
  • Priority support

GPAI

€199 / month

For GPAI model providers

  • Everything in Provider
  • Model Documentation Form
  • Training data summary
  • Downstream documentation
  • Copyright compliance

Need one-time credits? Discover our pay-per-document offer.

Secure payment via Stripe · Hosted in France

AI Act Timeline

Key compliance dates for Regulation (EU) 2024/1689

February 2, 2025

Prohibited practices + AI Literacy

Effective

Prohibition of unacceptable-risk AI systems (social scoring, subliminal manipulation). AI literacy obligation for organizations deploying AI systems.

August 2, 2025

GPAI Obligations

Effective

Entry into force of rules for general-purpose AI model providers (GPAI). Technical documentation, training data summary, copyright compliance policy.

August 2, 2026

High-risk systems + Transparency

Next deadline

Full application of obligations for high-risk AI systems (Annex III): technical documentation, conformity assessment, human oversight. Transparency rules for all AI systems.

August 2, 2027

Regulated products (Annex I)

Upcoming

Extension of obligations to AI systems integrated into products already covered by sectoral European legislation (medical devices, machinery, toys, etc.).

Frequently asked questions about the AI Act

Everything you need to know about the European Artificial Intelligence Regulation

The AI Act, officially Regulation (EU) 2024/1689, is the world's first comprehensive legal framework dedicated to artificial intelligence. Adopted by the European Union on June 13, 2024, it establishes harmonized rules for the development, placing on the market, and use of AI systems. It classifies systems into four risk levels — minimal, limited, high, and unacceptable — and imposes proportionate obligations for each category.

If your company develops, deploys, imports, or distributes artificial intelligence systems within the European Union, you are affected by the AI Act. This includes companies established outside the EU whose AI systems are used on European territory. Our free diagnostic identifies your exact regulatory role and specific obligations in under 3 minutes.

Not necessarily. Classification depends on usage, not technology. A standard customer service chatbot is generally limited risk (transparency obligation, Art. 50). However, a chatbot used for recruitment, credit assessment, or access to essential public services will be classified as high-risk (Annex III). Our diagnostic analyzes your specific use case to determine your exact classification.
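The "usage, not technology" principle can be sketched as a simple rule — a hypothetical illustration only, with a non-exhaustive sample of Annex III use cases; a real assessment requires the full diagnostic questionnaire:

```python
# Simplified illustration — not legal advice. The same chatbot technology
# lands in a different risk tier depending on what it is used for.
# This list is a small, non-exhaustive sample of Annex III use cases.
HIGH_RISK_USES = {"recruitment", "credit_assessment", "essential_public_services"}

def chatbot_risk_level(use_case: str) -> str:
    """Map a chatbot's use case to an AI Act risk tier (simplified)."""
    if use_case in HIGH_RISK_USES:
        return "high"     # Annex III — full high-risk obligations apply
    return "limited"      # Art. 50 — transparency obligation only

print(chatbot_risk_level("customer_service"))  # limited
print(chatbot_risk_level("recruitment"))       # high
```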

The penalties under the AI Act are significant: up to 35 million euros or 7% of annual global turnover for the most serious infringements (use of prohibited systems), 15 million or 3% for violation of provider and deployer obligations, and 7.5 million or 1% for providing incorrect information. Proportionally reduced caps are provided for SMEs and startups.
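Each cap works as "fixed amount or share of turnover, whichever is higher" (Art. 99). A minimal sketch of that arithmetic — the tier names are our own shorthand, and this ignores the reduced caps for SMEs:

```python
# Illustrative sketch only — not legal advice. Each AI Act penalty tier
# caps fines at a fixed amount OR a share of worldwide annual turnover,
# whichever is higher (Art. 99). Tier names are informal shorthand.
TIERS = {
    "prohibited_practices": (35_000_000, 0.07),
    "obligation_violations": (15_000_000, 0.03),
    "incorrect_information": (7_500_000, 0.01),
}

def max_fine(tier: str, annual_turnover_eur: float) -> float:
    """Return the upper penalty cap in euros for a tier and turnover."""
    fixed, pct = TIERS[tier]
    return max(fixed, pct * annual_turnover_eur)

# For a company with €1bn turnover, 7% (€70M) exceeds the €35M floor.
print(max_fine("prohibited_practices", 1_000_000_000))  # 70000000.0
```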

The application timeline is progressive: since February 2, 2025 for prohibited practices and AI literacy obligation, August 2, 2025 for general-purpose AI models (GPAI), August 2, 2026 for high-risk systems and transparency obligations, and August 2, 2027 for systems integrated into regulated products. We recommend starting your compliance journey now to calmly anticipate the deadlines.

Yes, as a deployer of a GPAI model-based system. You must comply with transparency obligations (inform users they are interacting with AI) and, depending on your use case, potentially high-risk obligations if your application falls under Annex III. The model provider (Mistral AI, OpenAI, Anthropic, etc.) has its own GPAI obligations, but that does not exempt you from yours.

Providers of high-risk AI systems must produce comprehensive technical documentation compliant with Annex IV of the Regulation, covering the general system description, development process, training data governance, risk management system, performance metrics, and instructions for use. Deployers must additionally maintain a registry of AI systems used and conduct a fundamental rights impact assessment (FRIA). AiActo automatically generates all of these documents.

The provider develops (or has developed) the AI system and places it on the market under its own name. The deployer uses an AI system as part of its professional activity. The same company can be both: provider of its own AI tool and deployer of third-party tools. Obligations differ significantly: the provider is responsible for the system's technical compliance, the deployer for its compliant use.

No, and this is an important nuance. AiActo is a technology platform that automates the most time-consuming part of compliance: producing technical documentation. Where a law firm provides strategic advice, we provide immediate operational efficiency. We enable your teams (or your lawyers) to focus on what matters by eliminating weeks of manual drafting. For very specific cases, legal expertise remains complementary to our tool.

Ready to structure your AI Act compliance?

Start with a free diagnostic in 3 minutes — identify your obligations and begin your documentation.

100% European infrastructure
Coming soon

Be among the first to know

Sign up for priority access to the AI Act compliance platform.