EU AI Act: Why 'It Doesn't Apply to Us' Is the Most Dangerous Phrase for Your Business

Key takeaways
- 42% of European SMEs: already use AI-powered tools without realising it (Eurostat 2024).
- AI Act criterion: it's not the company size that matters, but the nature and risk level of the AI system in use.
- Concrete examples: a CRM with prospect scoring, accounting software with cashflow forecasting, or an ATS recruitment tool may all be affected.
- Severe penalties: up to €15 million or 3% of global turnover for non-compliance with high-risk system obligations.
- Quick quiz: 5 questions to determine in 2 minutes whether your business is affected by the AI Act.
- Simple solution: AiActo's free diagnostic checks your compliance in under 3 minutes.
Imagine receiving a letter from the AI Office notifying your company of a breach of Regulation (EU) 2024/1689, the EU AI Act. The reason? A non-compliant AI system embedded in a tool you use daily, without even knowing it contained artificial intelligence. A far-fetched scenario? Not necessarily. According to Eurostat (2024), 42% of European companies with more than 10 employees use at least one tool incorporating AI. Yet many SME leaders still believe the EU AI Act "doesn't apply to us." Spoiler: it probably does.
Why "It Doesn't Apply to Us" Is Dangerous
The EU AI Act entered into force on 1 August 2024, with its first obligations already applicable since February 2025. Yet a 2025 study by McKinsey reveals that 68% of European SME leaders underestimate the regulation's impact on their operations. The reason? Four persistent but false beliefs that could prove costly: up to €15 million in fines or 3% of global turnover, as set out in Article 99 of the Regulation.
Here's why these beliefs are dangerous, and how to debunk them.
Myth #1: "We're Too Small to Be Affected"
Many SME leaders assume the EU AI Act only applies to large corporations or tech giants. Yet the Act's criteria are not based on company size, but on the nature and risk level of the AI system in use. A five-person SME using a credit scoring tool to assess customers is just as affected as a major bank.
Concrete examples:
- A three-person online shop: uses dynamic pricing software to adjust prices based on demand. This tool incorporates AI to analyse purchasing behaviour and predict trends. Risk: if the system is deemed "high-risk" (e.g., if it significantly influences prices of essential products), the SME must comply with AI Act obligations such as technical documentation or risk assessments.
- A 10-employee accountancy firm: uses accounting software with a cashflow forecasting module. This module, powered by AI algorithms, analyses past financial flows to anticipate liquidity issues. Risk: if the module is used to make significant financial decisions (e.g., granting credit or delaying payments), it could be classified as "high-risk" and require strict compliance.
- A five-person real estate agency: uses a CRM with a prospect scoring module. This AI-powered module evaluates the likelihood of a prospect purchasing a property based on browsing history, interactions with the agency, and demographic data. Risk: if the scoring significantly influences commercial decisions (e.g., prioritising certain prospects over others), the system could be deemed "limited-risk" and require transparency obligations.
In all three cases, company size is irrelevant. What matters is the use of an AI system and its risk level. Non-compliance penalties can be existential for an SME: up to €15 million or 3% of global turnover.
Myth #2: "We Don't Actually Use AI"
Many leaders believe their company doesn't use AI simply because they haven't developed in-house models or deployed chatbots. Yet AI is often embedded in everyday tools, unbeknownst to users. According to Gartner (2024), 75% of enterprise software will incorporate some form of AI by 2028, often invisibly to the end user.
Concrete examples:
- Accounting software: with a cashflow forecasting module. This module uses AI algorithms to analyse past financial flows and predict liquidity issues. Issue: if the module is used to make financial decisions (e.g., granting credit or delaying payments), it could be classified as "high-risk" and require strict compliance.
- A CRM (Customer Relationship Management) system: with a prospect scoring module. This AI-powered module assesses the likelihood of a prospect becoming a customer based on browsing history, interactions with the company, and demographic data. Issue: if the scoring significantly influences commercial decisions (e.g., prioritising certain prospects), the system could be deemed "limited-risk" and require transparency obligations.
- An ATS (Applicant Tracking System) recruitment tool: with an automatic CV sorting module. This module uses AI algorithms to analyse CVs and rank them based on job fit. Issue: if the system is used to automatically exclude certain candidates, it is classified as "high-risk" under the AI Act and requires strict compliance, including bias risk assessments and detailed technical documentation.
In all three cases, the company uses AI without even realising it. If these tools are classified as "high-risk," non-compliance can prove costly: up to €15 million in fines or 3% of global turnover.
Myth #3: "Our Sector Isn't Affected"
Many leaders assume the EU AI Act only applies to "tech" or "innovative" sectors. Yet the regulation applies to all sectors as soon as an AI system is used. An artisanal bakery using AI-powered order planning software is potentially affected. A recruitment agency using an ATS with automatic scoring is even more so.
Concrete examples by sector:
- Healthcare: a physiotherapy practice uses appointment management software with a cancellation prediction module. This AI-based module analyses historical appointment data to anticipate cancellations and optimise scheduling. Risk: if the system significantly influences appointment scheduling (e.g., prioritising certain patients), it could be deemed "limited-risk" and require transparency obligations.
- Retail: a clothing boutique uses dynamic pricing software to adjust prices based on demand. This tool incorporates AI to analyse purchasing behaviour and predict trends. Risk: if the system is deemed "high-risk" (e.g., if it significantly influences prices of essential products), the boutique must comply with AI Act obligations.
- Construction: an SME uses project management software with a delay prediction module. This AI-based module analyses historical project data to anticipate delays and optimise planning. Risk: if the system is used to make significant decisions (e.g., resource allocation or deadline planning), it could be classified as "high-risk" and require strict compliance.
- Hospitality: a restaurant uses inventory management software with a sales forecasting module. This AI-powered module analyses historical sales data to anticipate stock needs and prevent shortages. Risk: if the system significantly influences purchasing decisions (e.g., automatically ordering products), it could be deemed "limited-risk" and require transparency obligations.
In all these cases, the sector is irrelevant. What matters is the use of an AI system and its risk level. Non-compliance penalties are the same for all: up to €15 million or 3% of global turnover.
Myth #4: "The AI Act Is Only for Tech Startups"
Many leaders assume the EU AI Act only applies to tech startups or companies developing AI models. Yet the regulation applies to all companies using AI systems, whether they develop them or not. An SME using a credit scoring tool to assess customers is just as affected as a startup developing a deep learning model.
Concrete examples:
- An electrician: uses service management software with a fault prediction module. This AI-based module analyses historical service data to anticipate faults and optimise scheduling. Risk: if the system significantly influences decision-making (e.g., prioritising certain jobs), it could be deemed "limited-risk" and require transparency obligations.
- A travel agency: uses a destination recommendation tool with an AI-powered personalisation module. This module analyses browsing history and customer preferences to suggest suitable destinations. Risk: if the system significantly influences customer choices (e.g., suggesting more expensive or less suitable destinations), it could be deemed "limited-risk" and require transparency obligations.
- A law firm: uses legal research software with a jurisprudence prediction module. This AI-powered module analyses past court decisions to anticipate trends and suggest legal arguments. Risk: if the system is used to make significant decisions (e.g., defence strategy or assessing case success chances), it could be classified as "high-risk" and require strict compliance.
In all three cases, the company is not a tech startup, yet it uses AI systems that may be affected by the AI Act. Non-compliance penalties remain the same for all: up to €15 million or 3% of global turnover.
Quiz: Are You Affected by the EU AI Act?
To determine whether your business is affected by the EU AI Act, answer these five yes/no questions. If you answer "yes" to at least one, your company is likely affected.
- Do you use a tool that automatically analyses data to make decisions (e.g., a CRM with prospect scoring, recruitment software with automatic CV sorting, or dynamic pricing software)?
- Do you use a tool that predicts trends or behaviours (e.g., accounting software with cashflow forecasting, inventory management software with sales forecasting, or project management software with delay prediction)?
- Do you use a tool that personalises recommendations or content for customers or employees (e.g., product recommendation software, training software with personalised learning paths, or HR software with training suggestions)?
- Do you use a tool that automates decision-making processes (e.g., recruitment software that automatically excludes certain candidates, credit scoring software that automatically rejects loan applications, or surveillance systems that trigger automatic alerts)?
- Do you use a tool that analyses images, videos, or voices to make decisions (e.g., facial recognition software for access control, medical image analysis tools, or video surveillance systems with automatic behaviour detection)?
If you answered "yes" to at least one question, your business is likely affected by the EU AI Act. To assess your risk level and obligations, use AiActo's free diagnostic, which classifies your AI system in under 3 minutes.
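As a back-of-the-envelope illustration, the quiz's "one yes is enough" rule can be sketched in a few lines of Python (the question texts are paraphrased from the list above; this is a rough self-check, not a legal determination):

```python
# Paraphrased versions of the article's five screening questions.
QUIZ_QUESTIONS = [
    "Automatically analyses data to make decisions (e.g. prospect scoring, CV sorting, dynamic pricing)?",
    "Predicts trends or behaviours (e.g. cashflow or sales forecasting, delay prediction)?",
    "Personalises recommendations or content (e.g. product suggestions, learning paths)?",
    "Automates decision-making (e.g. auto-rejecting candidates or loan applications)?",
    "Analyses images, video or voice to make decisions (e.g. facial recognition, behaviour detection)?",
]

def likely_in_scope(answers: list[bool]) -> bool:
    """Apply the article's rule: one 'yes' means the business is likely affected."""
    return any(answers)

# Example: a shop using only dynamic pricing answers "yes" to question 1 alone.
print(likely_in_scope([True, False, False, False, False]))  # True
```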
What to Do If Your Business Is Affected
If you've discovered your business is affected by the EU AI Act, don't panic. Here's how to achieve compliance:
- Classify your AI system: determine whether your system is deemed "minimal-risk," "limited-risk," "high-risk," or "prohibited" under the AI Act. Use AiActo's free diagnostic for step-by-step guidance.
- Identify your obligations: depending on your system's risk level, you'll need to meet different requirements. For example:
  - Minimal-risk systems: no specific obligations, but it's advisable to document your AI use to demonstrate good faith in case of an inspection.
  - Limited-risk systems: transparency obligations (e.g., informing users they're interacting with an AI system).
  - High-risk systems: strict obligations, including technical documentation, risk assessments, governance measures, and registration with the competent authorities.
- Document your compliance: for high-risk systems, you'll need to produce detailed technical documentation compliant with Annex IV of the Regulation. This documentation must include:
  - A description of the AI system and its objectives.
  - A description of the data used to train the system.
  - A risk assessment and measures implemented to mitigate risks.
  - A description of governance and monitoring measures.
- Implement AI governance: appoint a responsible person for AI compliance within your company and establish processes to regularly monitor and evaluate your AI system.
- Register your system: "high-risk" systems must be registered in the EU database provided for by the Regulation (Article 49). If you develop or supply the system, this obligation falls on you; if you merely use it, check that your provider has registered it.
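The risk tiers and their headline obligations can be summarised as a simple lookup table, sketched here in Python (the summaries paraphrase this article, not the legal text of the Regulation):

```python
# Headline obligations per AI Act risk tier, paraphrased from the article.
OBLIGATIONS = {
    "minimal-risk": ["no specific obligations (documenting AI use is still advisable)"],
    "limited-risk": ["transparency (inform users they are interacting with AI)"],
    "high-risk": [
        "technical documentation (Annex IV)",
        "risk assessment and mitigation measures",
        "governance and monitoring measures",
        "registration with the competent authorities",
    ],
    "prohibited": ["the practice may not be placed on the market or used"],
}

def obligations_for(risk_tier: str) -> list[str]:
    """Look up the headline obligations for a given risk tier."""
    return OBLIGATIONS[risk_tier]

print(obligations_for("limited-risk"))
```

The point of the table: the work is not uniform. A limited-risk classification mostly means user-facing notices, while a high-risk one triggers the full documentation and governance programme.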
To simplify these steps, AiActo offers a comprehensive platform guiding you through AI Act compliance step by step. With AiActo, you can:
- Classify your AI system in seconds using a free diagnostic.
- Automatically generate technical documentation compliant with Annex IV of the Regulation.
- Receive tailored support to implement AI governance suited to your business.
- Export professional PDF documentation ready for inspection.
Conclusion: Is the EU AI Act a Hidden Opportunity?
The EU AI Act isn't just a regulatory constraint: it's also an opportunity for SMEs to differentiate themselves, build customer trust, and future-proof their operations. By achieving compliance now, you can:
- Avoid severe penalties: up to €15 million or 3% of global turnover.
- Strengthen customer trust: by demonstrating responsible and transparent AI use.
- Future-proof your business: AI is becoming ubiquitous across all sectors. Compliance now prepares you for regulatory and technological evolution.
- Access new markets: some companies and public institutions already require AI Act compliance from suppliers and partners.
Rather than thinking "it doesn't apply to us," ask: "What if the EU AI Act is an opportunity for my business?" To find out, start with AiActo's free diagnostic. In under 3 minutes, you'll know whether your business is affected and what steps to take next for compliance.
"AI Act compliance isn't a constraint; it's an investment in your company's future. By achieving compliance now, you avoid severe penalties, build customer trust, and prepare for the era of responsible AI."
- AiActo Glossary
Frequently Asked Questions
Does the EU AI Act apply to small businesses?
Yes, the EU AI Act applies to all businesses, regardless of size, as soon as they use an AI system. The criterion isn't company size, but the nature and risk level of the AI system in use. For example, a five-person SME using a credit scoring tool to assess customers is just as affected as a major bank.
How can I tell if my business uses AI without realising it?
Many everyday tools incorporate AI without users realising it. Examples include accounting software with cashflow forecasting, a CRM with prospect scoring, or recruitment software with automatic CV sorting. To check if your business uses AI, ask yourself:
- Do you use a tool that automatically analyses data to make decisions?
- Do you use a tool that predicts trends or behaviours?
- Do you use a tool that personalises recommendations or content?
If you answered "yes" to at least one question, you're likely using AI. To confirm, take AiActo's free diagnostic.
Which sectors are affected by the EU AI Act?
The EU AI Act applies to all sectors as soon as an AI system is used. Examples include:
- Healthcare: appointment management software with cancellation prediction.
- Retail: dynamic pricing software to adjust prices based on demand.
- Construction: project management software with delay prediction.
- Hospitality: inventory management software with sales forecasting.
In all these cases, the sector is irrelevant. What matters is the use of an AI system and its risk level.
What are the penalties for non-compliance with the EU AI Act?
Penalties for non-compliance with the EU AI Act can be severe: up to €15 million or 3% of global turnover for breaches of obligations related to AI systems, particularly "high-risk" systems, as set out in Article 99 of the Regulation. Prohibited practices carry an even higher ceiling: up to €35 million or 7% of global turnover.
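As a rough arithmetic sketch (an illustration, not legal advice): for breaches of high-risk obligations, Article 99 sets the ceiling at €15 million or 3% of worldwide annual turnover, applying the higher of the two in general but the lower of the two for SMEs and start-ups:

```python
def fine_ceiling_eur(global_turnover_eur: float, is_sme: bool) -> float:
    """Fine ceiling for breaching high-risk obligations under Article 99:
    EUR 15 million or 3% of worldwide annual turnover; the higher amount
    applies in general, the lower for SMEs and start-ups (Art. 99(6))."""
    fixed = 15_000_000
    pct = 0.03 * global_turnover_eur
    return min(fixed, pct) if is_sme else max(fixed, pct)

# A large firm with EUR 1bn turnover: 3% = EUR 30m, which exceeds the EUR 15m floor.
print(fine_ceiling_eur(1_000_000_000, is_sme=False))
# The same turnover as an SME: the lower of the two amounts applies.
print(fine_ceiling_eur(1_000_000_000, is_sme=True))
```

For a typical SME with a few million euros of turnover, the two rules converge: 3% of turnover is well below €15 million, so the percentage-based cap is what matters in practice.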
How can I check if my business is compliant with the EU AI Act?
To check your business's compliance with the EU AI Act, start by classifying your AI system based on its risk level. Use AiActo's free diagnostic for step-by-step guidance. Then, identify your obligations based on your system's risk level and implement the necessary measures for compliance (technical documentation, risk assessments, AI governance, etc.).
Does the EU AI Act apply to tools I purchase (rather than develop myself)?
Yes, the EU AI Act applies to all AI systems used, whether developed in-house or purchased from a provider. For example, if you use a CRM with a prospect scoring module or recruitment software with automatic CV sorting, you're affected by the AI Act-even if you didn't develop these tools yourself. In this case, it's your responsibility to ensure the tools you use comply with the regulation.
What are the key upcoming deadlines for the EU AI Act?
Key upcoming deadlines for the EU AI Act include:
- 2 August 2025: obligations for general-purpose AI models, along with the Regulation's governance and penalties provisions.
- 2 August 2026: the bulk of remaining obligations, including transparency for generative AI systems (Article 50) and the "high-risk" AI systems listed in Annex III (e.g., recruitment systems, credit scoring, or biometric surveillance).
- 2 August 2027: obligations for high-risk AI systems embedded in regulated products (e.g., medical devices or autonomous vehicles).
To prepare for these deadlines, start classifying your AI systems and identifying your obligations now. AiActo's free diagnostic can help clarify your next steps.
How can AiActo help me achieve compliance with the EU AI Act?
AiActo offers a comprehensive platform to help you achieve compliance with the EU AI Act. With AiActo, you can:
- Classify your AI system in seconds using a free diagnostic.
- Automatically generate technical documentation compliant with Annex IV of the Regulation.
- Receive tailored support to implement AI governance suited to your business.
- Export professional PDF documentation ready for inspection.
AiActo is designed to be simple and accessible, even for SMEs without regulatory compliance expertise. To get started, take the free diagnostic.