AI Act and GDPR: a comparative table of cross-cutting obligations.
The AI Act and GDPR coexist without prejudice to one another. How can their requirements be aligned to avoid duplication and leverage synergies? Detailed correspondence table and analysis of convergence points.

Two regulations, simultaneous application
The AI Act and GDPR form an integrated regulatory framework for AI in Europe.
Article 2(7) of the AI Act specifies that the regulation applies without prejudice to the GDPR. This wording indicates coexistence without hierarchy. Both texts apply simultaneously whenever an AI system processes personal data, which covers the majority of use cases.
Recital 10 of the AI Act emphasises the need for cooperation between AI supervisory authorities and data protection authorities. This coordination aims to avoid contradictions and optimise controls. In France, the CNIL was designated as the national competent authority for the AI Act in February 2026, strengthening this institutional synergy.
Companies must therefore adopt a unified approach to compliance. Separate documentation for each regulation would result in costly redundancies. Conversely, intelligent integration allows for shared efforts and reduces the risk of non-compliance.
Comparative table of cross-cutting obligations
Points of convergence and divergence between the AI Act and GDPR, structured by theme.
| Theme | AI Act (Regulation EU 2024/1689) | GDPR (Regulation EU 2016/679) | Points of convergence |
|---|---|---|---|
| Scope | Applies to AI systems placed on the market or put into service in the EU, regardless of the provider's place of establishment (Art. 2). | Applies to the processing of personal data carried out in the context of the activities of an establishment of a controller or processor in the EU (Art. 3). | Cumulative scope for AI systems processing personal data. The same activity may fall under both regulations. |
| Roles and responsibilities | Provider: designs or develops the AI system (Art. 3(3)). Deployer: uses the AI system under its authority (Art. 3(4)). Importer/distributor: makes the AI system available in the EU (Art. 3(6)-(7)). | Controller: determines the purposes and means of processing (Art. 4(7)). Processor: processes data on behalf of the controller (Art. 4(8)). | An AI provider may be a GDPR controller. A deployer may be a controller or processor depending on its role in data processing. |
| Risk management | FRIA (Fundamental Rights Impact Assessment) mandatory for certain deployers of high-risk AI systems (Art. 27). Assesses risks to health, safety, and fundamental rights. Must be updated throughout the system's lifecycle. | DPIA (Data Protection Impact Assessment) mandatory for processing likely to result in high risk (Art. 35). Assesses risks to the rights and freedoms of data subjects. Must be carried out prior to processing and updated if necessary. | Both assessments may be combined for high-risk AI systems processing personal data. The FRIA may incorporate DPIA elements and vice versa. |
| Transparency | Enhanced transparency obligations for certain AI systems (Art. 50). Machine-readable marking mandatory for AI-generated or manipulated content (Art. 50(2)). Users must be informed that they are interacting with an AI system (Art. 50(1)). | Right to information for data subjects (Art. 13-14). Transparency obligation on the logic of automated processing (Art. 13(2)(f)). | Transparency obligations complement each other. Information required by the AI Act may be integrated into GDPR privacy notices to avoid redundancy. |
| Data subject rights | Right to an explanation of decisions taken on the basis of high-risk AI systems (Art. 86). Right to lodge a complaint with a market surveillance authority (Art. 85). | Right of access (Art. 15), rectification (Art. 16), erasure (Art. 17). Right to restriction of processing (Art. 18). Right to data portability (Art. 20). Right not to be subject to automated individual decision-making (Art. 22). | Data subject rights reinforce each other. The AI Act's right to explanation complements the GDPR's safeguards for automated decisions. |
| Documentation and records | Registration of high-risk AI systems in the EU database by providers and certain deployers (Art. 49). Detailed technical documentation for high-risk systems (Art. 11 and Annex IV). | Record of processing activities mandatory for controllers and processors (Art. 30). Documentation of data breaches (Art. 33(5)). | Records may be merged for AI systems processing personal data. AI Act technical documentation may enrich the GDPR record. |
| Incidents and breaches | Notification of serious incidents to the competent authorities within 15 days of awareness (Art. 73). Broad definition: incidents causing serious harm to health, safety, or fundamental rights (Art. 3(49)). | Notification of personal data breaches to the data protection authority within 72 hours (Art. 33). Notification to data subjects if high risk (Art. 34). | The same incident may trigger both obligations. The AI Act notification may cover the GDPR notification if it includes the elements required by the GDPR. |
| Sanctions | Up to €35M or 7% of global turnover for prohibited practices (Art. 99(3)). Up to €15M or 3% of global turnover for other infringements (Art. 99(4)). | Up to €20M or 4% of global turnover for serious violations (Art. 83(5)). Up to €10M or 2% of global turnover for other infringements (Art. 83(4)). | Sanctions may be cumulative for the same set of facts. Maximum penalties add up in case of simultaneous violation of both regulations. |
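To illustrate how the maximum penalties combine: under both regulations the ceiling for undertakings is the higher of a fixed amount and a percentage of worldwide annual turnover, so for large companies the percentage dominates. A minimal sketch, assuming a hypothetical €2bn annual turnover (the turnover figure and scenario are illustrative only):

```python
def max_fine(turnover_eur: int, fixed_cap_eur: int, pct_cap: int) -> int:
    """Fine ceiling for an undertaking: the higher of the fixed amount
    and pct_cap % of worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_eur * pct_cap // 100)

turnover = 2_000_000_000  # hypothetical €2bn worldwide annual turnover

# AI Act, prohibited practices (Art. 99(3)): €35M or 7%
ai_act_cap = max_fine(turnover, 35_000_000, 7)
# GDPR, serious violations (Art. 83(5)): €20M or 4%
gdpr_cap = max_fine(turnover, 20_000_000, 4)

# Cumulative worst case if the same facts breach both regulations
combined = ai_act_cap + gdpr_cap
```

For this hypothetical turnover, the combined ceiling is €220M, i.e. 11% of turnover, matching the 7% + 4% worst case for a simultaneous prohibited-practice and serious GDPR violation.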
"The convergence between the AI Act and GDPR is not a coincidence, but a political choice. Both regulations share a common philosophy: regulating technologies to safeguard fundamental rights." - Paul Nemitz, Principal Adviser to the European Commission
Documentary synergies to leverage
How to pool compliance efforts between the AI Act and GDPR.
Companies can optimise compliance by identifying documentary convergence points. Four major synergies emerge:
Data governance
Article 10 of the AI Act imposes strict requirements on the quality of data used to train AI systems. These requirements overlap with GDPR provisions on data minimisation (Art. 5(1)(c)) and data protection by design (Art. 25). Unified documentation can cover both aspects.
Risk assessments
The FRIA (AI Act) and DPIA (GDPR) share a similar methodology. Both assessments may be merged for high-risk AI systems processing personal data. This approach reduces administrative burden while ensuring comprehensive risk coverage.
Records of activities
The record of AI systems (Art. 49 AI Act) and the record of processing activities (Art. 30 GDPR) may be combined. AI-specific information, such as system classification or transparency measures, may be added to existing GDPR record entries.
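As a sketch of what a merged register entry could look like, the snippet below combines GDPR Art. 30 record fields with AI Act-specific fields. Every field name and value is hypothetical: neither regulation prescribes a data format, so this only illustrates the principle of keeping one entry per processing activity while tracing each field back to its source regulation.

```python
# Hypothetical GDPR Art. 30 record fields for one processing activity.
GDPR_FIELDS = {
    "processing_purpose": "CV screening for recruitment",
    "data_categories": ["identity", "professional history"],
    "recipients": ["HR department"],
    "retention_period": "2 years after end of recruitment campaign",
    "security_measures": ["encryption at rest", "access control"],
}

# Hypothetical AI Act fields for the AI system supporting that activity.
AI_ACT_FIELDS = {
    "ai_system_name": "resume-ranker-v2",
    "risk_classification": "high-risk (Annex III - employment)",
    "provider": "ExampleVendor SAS",
    "deployer_role": "controller",
    "transparency_measures": ["candidate notice on AI-assisted screening"],
    "human_oversight": "recruiter reviews every automated ranking",
}

def merged_record_entry(gdpr: dict, ai_act: dict) -> dict:
    """Combine both records into a single entry, prefixing AI Act keys
    so the origin of each obligation stays traceable."""
    entry = dict(gdpr)
    entry.update({f"ai_act__{key}": value for key, value in ai_act.items()})
    return entry

record = merged_record_entry(GDPR_FIELDS, AI_ACT_FIELDS)
```

The key prefix is one possible design choice: it keeps the merged record auditable under either regulation without duplicating the entry.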
Transparency and information
The AI Act's transparency obligations (Art. 50) may be integrated into GDPR privacy notices (Art. 13-14). For example, information on the use of an AI system may be included in the same notice as information on personal data processing.
These synergies enable streamlined compliance. An integrated approach reduces costs and limits the risk of contradictions between the two regulatory frameworks.
AI Act-specific technical obligations
Certain AI Act requirements have no equivalent in the GDPR.
The AI Act introduces specific technical obligations that go beyond GDPR requirements. These measures aim to ensure the robustness, transparency, and human oversight of AI systems.
Human oversight
High-risk AI systems must be designed to enable effective human oversight (Art. 14). This obligation includes appropriate interfaces and deactivation mechanisms. The GDPR does not provide equivalent measures.
Robustness and cybersecurity
AI systems must be resilient to attacks and errors (Art. 15). The AI Act imposes specific resilience tests and cybersecurity measures, distinct from the GDPR's general security requirements (Art. 32).
Accuracy
AI systems must achieve an appropriate level of accuracy, declared in the accompanying instructions for use (Art. 15(1) and (3)). This performance obligation has no direct equivalent in the GDPR, which focuses on data protection rather than system performance.
Automatic logging
High-risk AI systems must incorporate automatic logging features to ensure traceability of operations (Art. 12). These logs must be retained for a period appropriate to the context of use. The GDPR does not provide such technical requirements.
AI content watermarking
Article 50(2) of the AI Act requires clear marking of AI-generated or manipulated content. This obligation aims to combat disinformation and ensure transparency. The GDPR does not address this issue.
These technical specificities require particular attention. Companies must integrate them into their development and deployment processes, in addition to data protection measures.
Role of supervisory authorities in France
The CNIL supervises both the AI Act and GDPR, facilitating coordination.
In France, the CNIL was designated as the national competent authority for the application of the AI Act by decree in February 2026. This designation aligns with a logic of consistency, as the CNIL is already the reference authority for the GDPR. It coordinates its actions with ARCOM and the DGCCRF to cover all aspects of the regulation.
The CNIL published guidelines in March 2026 on the interplay between the AI Act and GDPR. These guidelines clarify expectations for integrated compliance. They notably highlight:
- The possibility of merging risk assessments (FRIA and DPIA).
- Integration of AI Act transparency obligations into GDPR privacy notices.
- Coordination of incident notifications between the two regulatory frameworks.
Companies may rely on these guidelines to structure their compliance. The CNIL also provides tools and templates to facilitate the implementation of cross-cutting obligations. These resources are available on its AI-dedicated website: www.cnil.fr/fr/intelligence-artificielle.
Coordination between European authorities is also strengthened. The AI Office, established in 2025, works closely with the European Data Protection Board (EDPB) to harmonise interpretations and control practices.
Frequently asked questions
Answers to the most common questions on the interplay between the AI Act and GDPR.
Does the GDPR apply to every AI system?
No. The GDPR only applies to the processing of personal data. An AI system that does not use personal data, such as a predictive maintenance system for industrial machinery, is not subject to the GDPR. However, the AI Act may apply if the system is placed on the market or put into service in the EU, depending on its classification (high-risk or not).
Must the FRIA and the DPIA be carried out separately?
Not necessarily. The FRIA (AI Act) and DPIA (GDPR) may be merged for high-risk AI systems processing personal data. Both assessments share a similar methodology and may be documented in a single report, provided all requirements of both regulations are covered. The CNIL recommends this integrated approach in its 2026 guidelines.
Does the AI Act provide for a role equivalent to the DPO?
The AI Act does not provide for a role equivalent to that of the DPO. However, the DPO may play a key role in AI Act compliance, particularly for data protection aspects. Their remit may be expanded to include supervision of AI Act obligations, especially for systems processing personal data. This approach is encouraged by the CNIL to ensure overall consistency.
How should an incident affecting both regulations be notified?
The same incident may trigger both notification obligations. The AI Act notification must be sent to the competent authority (the CNIL in France) within 15 days. If the incident involves a personal data breach, a GDPR notification must also be made within 72 hours. To avoid redundancies, the AI Act notification may include the elements required by the GDPR, provided the 72-hour deadline for the latter is respected.
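The two deadlines, both counted from the moment the organisation becomes aware of the incident, can be sketched as:

```python
from datetime import datetime, timedelta

def notification_deadlines(awareness: datetime) -> dict:
    """Deadlines counted from awareness of the incident: 72 hours for the
    GDPR breach notification (Art. 33 GDPR), 15 days for the AI Act
    serious-incident report (Art. 73 AI Act)."""
    return {
        "gdpr_breach_notification": awareness + timedelta(hours=72),
        "ai_act_incident_report": awareness + timedelta(days=15),
    }

# Illustrative awareness date (arbitrary)
deadlines = notification_deadlines(datetime(2026, 3, 1, 9, 0))
```

Because the GDPR deadline always expires first, a joint notification process must be driven by the 72-hour clock.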
Can AI Act and GDPR sanctions be cumulated for the same infringement?
Yes. Sanctions under the AI Act and GDPR may be cumulative for the same infringement. For example, a breach of transparency obligations for a high-risk AI system processing personal data could result in sanctions under both regulations. Maximum penalties add up: up to €55M or 11% of global turnover in case of cumulative maximum sanctions for prohibited practices.
