Identify your obligations in 3 minutes with a free diagnostic. Write your compliant documentation step by step with AI-guided assistance, including review, validation and audit-ready PDF export.

The AI Act imposes complex obligations — don't face them alone
Weeks of work with specialized consultants to produce documentation that will be outdated at the next update.
Non-compliance fines can reach 35 million euros or 7% of annual global turnover.
Over 100 articles, technical annexes, staggered deadlines — a legal maze difficult to navigate alone.
Results in 3 minutes — no commitment
Answer about a dozen questions about your AI system. Get your risk level, your regulatory role, and the exact list of your obligations.
Start the diagnostic
AI generates each section using the exact European regulatory vocabulary. You review, adjust, and export.
See the documentation
Every paragraph is generated with exact references to the Regulation's articles. You stay in control, AI accelerates the writing.
Discover AI writing
From client management to diagnostic version tracking, every tool is designed to save you time.
Identify your role and regulatory obligations
Provider — Art. 16 to 22
You develop or place on the market an artificial intelligence system in the European Union.
Art. 16 — 'The provider shall ensure that its high-risk AI systems are compliant with the requirements [...] before placing them on the market or putting them into service.'
Deployer — Art. 26-27
You use a high-risk AI system as part of your professional activity.
Art. 26 — 'Deployers shall use high-risk AI systems in accordance with the instructions of use accompanying the systems.'
Importer — Art. 23
You import an AI system from outside the European Economic Area.
Art. 23 — 'Importers shall ensure that the appropriate conformity assessment procedure has been carried out by the provider.'
Distributor — Art. 24
You make an AI system available on the market without modifying its characteristics.
Art. 24 — 'The distributor shall verify that the high-risk AI system bears the required CE conformity marking.'
"AI Act compliance is the regulatory challenge of the decade for European businesses."
AI Office — European Commission, 2024
Start for free, scale as you grow
Need one-time credits? Discover our pay-per-document offer.
Key compliance dates for Regulation (EU) 2024/1689
February 2, 2025
Prohibition of unacceptable-risk AI systems (social scoring, subliminal manipulation). AI literacy obligation for organizations deploying AI systems.
August 2, 2025
Rules apply for general-purpose AI model providers (GPAI). Technical documentation, training data summary, copyright compliance policy.
August 2, 2026
Full application of obligations for high-risk AI systems (Annex III): technical documentation, conformity assessment, human oversight. Transparency rules for all AI systems.
August 2, 2027
Extension of obligations to AI systems integrated into products already covered by sectoral European legislation (medical devices, machinery, toys, etc.).
Everything you need to know about the European Artificial Intelligence Regulation
The AI Act, officially Regulation (EU) 2024/1689, is the world's first comprehensive legal framework dedicated to artificial intelligence. Adopted on June 13, 2024 by the European Union, it establishes harmonized rules for the development, placing on the market, and use of AI systems. It classifies systems into four risk levels — minimal, limited, high, and unacceptable — and imposes proportionate obligations for each category.
If your company develops, deploys, imports, or distributes artificial intelligence systems within the European Union, you are affected by the AI Act. This includes companies established outside the EU whose AI systems are used on European territory. Our free diagnostic identifies your exact regulatory role and specific obligations in under 3 minutes.
Not necessarily. Classification depends on usage, not technology. A standard customer service chatbot is generally limited risk (transparency obligation Art. 50). However, a chatbot used for recruitment, credit assessment, or access to essential public services will be classified as high-risk (Annex III). Our diagnostic analyzes your specific use case to determine your exact classification.
The penalties under the AI Act are significant: up to 35 million euros or 7% of annual global turnover for the most serious infringements (use of prohibited systems), 15 million or 3% for violation of provider and deployer obligations, and 7.5 million or 1.5% for providing incorrect information. Proportionally reduced caps are provided for SMEs and startups.
The application timeline is staggered: February 2, 2025 for prohibited practices and the AI literacy obligation, August 2, 2025 for general-purpose AI models (GPAI), August 2, 2026 for high-risk systems and transparency obligations, and August 2, 2027 for systems integrated into regulated products. We recommend starting your compliance journey now so you can stay ahead of each deadline.
Yes, as a deployer of a GPAI model-based system. You must comply with transparency obligations (inform users they are interacting with AI) and, depending on your use case, potentially high-risk obligations if your application falls under Annex III. The model provider (Mistral AI, OpenAI, Anthropic, etc.) has their own GPAI obligations, but this does not exempt you from yours.
Providers of high-risk AI systems must produce comprehensive technical documentation compliant with Annex IV of the Regulation, covering the general system description, development process, training data governance, risk management system, performance metrics, and instructions for use. Deployers must additionally maintain a registry of AI systems used and conduct a fundamental rights impact assessment (FRIA). AiActo automatically generates all of these documents.
The provider develops or has developed the AI system and places it on the market under their own name. The deployer uses an AI system as part of their professional activity. The same company can be both: provider of their own AI tool and deployer of third-party tools. Obligations differ significantly: the provider is responsible for the system's technical compliance, the deployer for its compliant use.
No, and this is an important nuance. AiActo is a technology platform that automates the most time-consuming part of compliance: producing technical documentation. Where a law firm provides strategic advice, we provide immediate operational efficiency. We enable your teams (or your lawyers) to focus on what matters by eliminating weeks of manual drafting. For very specific cases, legal expertise remains complementary to our tool.
Start with a free diagnostic in 3 minutes — identify your obligations and begin your documentation.
Sign up for priority access to the AI Act compliance platform.