Compliance · AI Act

Microsoft Copilot and the AI Act: is your company exposed?

Microsoft 365 Copilot is now integrated into the tools of thousands of French companies. But its use entails AI Act obligations, which vary depending on the application. Here's what you need to check right away.

Jérémy Pierre
AI Act compliance expert
12 June 2025 · 8 minute read
Microsoft Copilot and the AI Act: What are your company's obligations?
In brief · 4 key figures to remember

  • 2 August 2025: GPAI obligations for Microsoft enter into force
  • Art. 26: Your responsibility as a deployer
  • Annex III: HR or credit uses potentially high-risk
  • EU Data Boundary: Data hosting in Europe to verify
01 - Context

Why Copilot exposes you to the AI Act

Microsoft 365 Copilot is a generative AI tool integrated into Word, Excel, or Outlook. Deploying it in your company automatically places you under the AI Act regime.

The AI Act distinguishes between two roles: the provider (Microsoft) and the deployer (your company). As a deployer, you are responsible for ensuring the compliance of how you use Copilot. This includes risk classification, documentation, and human oversight, even if Microsoft provides the underlying model.

Article 26 of the AI Act states that the deployer must ensure the system is used in accordance with its intended purpose. If you repurpose Copilot for HR or financial decisions, you become responsible for the compliance of that specific use.

02 - Responsibilities

What Microsoft covers and what remains your responsibility

Microsoft, as the provider of the GPAI model (GPT-4), must comply with Articles 51 to 56 of the AI Act. However, this does not cover your internal uses.

Microsoft (provider)
  • Compliance of the GPAI model (Articles 51-56)
  • Technical documentation and transparency (AI Transparency Report)
  • Management of systemic risks
Your company (deployer)
  • Risk classification based on use case (Article 6)
  • Documentation of internal processes (Article 26)
  • Human oversight and user training
  • Transparency towards affected third parties (Article 50)

Microsoft publishes AI Act compliance documentation for its services. This documentation is useful, but it does not replace your own risk assessment of your specific uses.

03 - Obligations

Your obligations by use case

Obligations vary depending on whether Copilot is used for general productivity or sensitive decisions.

1. Standard use (productivity, drafting, research)

In this case, Copilot is likely classified as limited-risk (Article 50). Your obligations are light but real:

  • Inform users that Copilot is an AI system.
  • Document internal use cases (e.g., generating meeting minutes).
  • Train employees in responsible use (e.g., not sharing sensitive data).
  • Ensure outputs do not affect third parties without transparency (e.g., automatically generated emails).

2. Sensitive use (HR decisions, credit, performance evaluation)

If Copilot is used for automated decisions in areas listed in Annex III of the AI Act, it may be classified as high-risk. Your obligations then become strict:

  • Risk assessment before deployment (Article 9).
  • Detailed technical documentation (Article 11).
  • Mandatory human oversight (Article 14).
  • Activity logging (Article 12).
  • Clear information to affected individuals (Article 13).

"A company using Copilot to screen CVs or evaluate loan applications automatically becomes a provider of a high-risk system, even if it did not develop the model."
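The two obligation checklists above lend themselves to a simple lookup, for instance as the core of an internal compliance tracker. The sketch below is illustrative only, not legal advice; the function and constant names are hypothetical, and the entries are exactly the articles cited in this section.

```python
# Hypothetical helper mapping the AI Act articles cited above to the
# deployer obligations they impose. Illustrative sketch, not legal advice.

HIGH_RISK_OBLIGATIONS = {
    "Art. 9":  "Risk assessment before deployment",
    "Art. 11": "Detailed technical documentation",
    "Art. 12": "Activity logging",
    "Art. 13": "Clear information to affected individuals",
    "Art. 14": "Mandatory human oversight",
}

LIMITED_RISK_OBLIGATIONS = {
    "Art. 50": "Inform users and affected third parties that an AI system is in use",
}

def obligations_for(risk_level: str) -> dict[str, str]:
    """Return the obligations checklist for a given risk classification."""
    if risk_level == "high":
        return HIGH_RISK_OBLIGATIONS
    if risk_level == "limited":
        return LIMITED_RISK_OBLIGATIONS
    raise ValueError(f"unknown risk level: {risk_level}")
```

A compliance team could extend each entry with an owner and a status field; the point is simply that the high-risk checklist is a fixed, auditable list, not a judgment call made per document.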

04 - Risks

Practical cases where Copilot becomes high-risk

Here are concrete examples where your use of Copilot may fall into the high-risk category.

1. Recruitment and human resources management

If Copilot is used for:

  • Screening or pre-selecting CVs.
  • Generating performance evaluations.
  • Proposing promotions or pay rises.

These uses fall under Annex III, point 4 of the AI Act (management of employment relationships). You must then apply the obligations for high-risk systems, even if Copilot was not designed for this purpose.

2. Credit and financial scoring

If Copilot is used for:

  • Analysing loan or credit applications.
  • Assessing a customer's creditworthiness.
  • Proposing personalised pricing terms.

These uses fall under Annex III, point 5 (access to essential services). Again, high-risk obligations apply.

3. Health and insurance

If Copilot is used for:

  • Analysing medical records (even partially).
  • Proposing personalised insurance premiums.

These uses fall under Annex III, points 1 and 6. They are considered high-risk and require strict compliance.

05 - Data

Data hosting: what Microsoft says

Since 2023, Microsoft has offered the EU Data Boundary option, which allows European customer data to be stored exclusively in EU-based data centres.

For Copilot, this means:

  • Data processed by Copilot (documents, emails, prompts) remains within the EU.
  • This facilitates GDPR compliance but does not exempt you from AI Act obligations.
  • Check your Microsoft 365 contract to confirm this option is activated.

The AI Office notes that hosting data within the EU is an important criterion, but not sufficient to guarantee full compliance with the AI Act.

Identify your obligations in 3 minutes

Our free diagnostic analyses your use of Copilot and tells you which compliance steps to follow.

06 - FAQ

Frequently asked questions

Answers to the most common questions about Copilot and the AI Act.

Who is responsible: Microsoft or my company?

Microsoft is responsible as the provider of the GPAI model (Articles 51-56). However, your company remains responsible for how you use Copilot, particularly if that use is classified as high-risk (Article 26).

How do I know if my use of Copilot is high-risk?

Refer to Annex III of the AI Act. If Copilot is used for decision-making in listed areas (HR, credit, health, etc.), it is likely high-risk. A precise diagnostic is recommended.

Do I need to train my employees?

Yes. Article 4 of the AI Act requires AI literacy for users. This includes training on Copilot's limitations, bias risks, and best practices to avoid non-compliant uses.

What should I do if my use is classified as high-risk?

You must apply the obligations for high-risk systems (Articles 9 to 15): risk assessment, technical documentation, human oversight, logging, and informing affected individuals. A compliance audit is strongly recommended.

Is the EU Data Boundary enough to guarantee compliance?

The EU Data Boundary facilitates GDPR compliance by limiting data transfers outside the EU. However, it does not cover all aspects of GDPR (e.g., access or rectification rights) or the specific obligations of the AI Act. A full verification is necessary.

Can I use Copilot to analyse customer data?

It depends on the purpose. If the analysis serves automated decision-making (e.g., credit scoring), it may be high-risk. In any case, you must inform customers and document the use (Article 50). Avoid processing sensitive data without prior analysis.

What are the penalties for non-compliance?

Sanctions can reach up to €35 million or 7% of global turnover (Article 99). For SMEs, fines are capped at €15 million or 3% of turnover. Regulatory authorities (CNIL in France) may also impose corrective measures.

Jérémy Pierre
Founder aiacto.eu · AI Act compliance expert

Supports AI providers and deployers in meeting regulatory compliance requirements.
