Article 9 (High-risk)

Risk management system

Provider
Deadline: August 2, 2026

Article 9 requires providers of high-risk AI systems to establish a continuous and iterative risk management system covering the entire lifecycle of the system. This system must enable the identification, analysis and mitigation of risks to health, safety and fundamental rights.

Key points

1. Continuous, iterative process throughout the system lifecycle
2. Identification and analysis of known and reasonably foreseeable risks
3. Estimation and evaluation of risks arising from intended use and reasonably foreseeable misuse
4. Adoption of appropriate and proportionate risk management measures
5. Testing to identify the most appropriate risk management measures
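The identify, estimate, and mitigate cycle described above could, for illustration, be tracked in a simple risk register. The class and field names below are hypothetical, not part of the AI Act or any standard; this is a minimal sketch of one way a provider might record risks and their mitigation status.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    # One identified risk to health, safety, or fundamental rights.
    description: str
    source: str                 # e.g. "intended use" or "foreseeable misuse"
    severity: int               # estimated score, 1 (low) .. 5 (high)
    mitigations: list = field(default_factory=list)

class RiskRegister:
    """Minimal register for the identify -> estimate -> mitigate cycle."""

    def __init__(self):
        self.risks = []

    def identify(self, risk: Risk):
        self.risks.append(risk)

    def open_risks(self, threshold: int = 3):
        # Risks at or above the threshold that still lack mitigation measures.
        return [r for r in self.risks if r.severity >= threshold and not r.mitigations]

    def mitigate(self, risk: Risk, measure: str):
        risk.mitigations.append(measure)

# Example of one iteration of the cycle:
register = RiskRegister()
r = Risk("Biased output affecting loan decisions", "intended use", severity=4)
register.identify(r)
print(len(register.open_risks()))   # the risk is open until a measure is recorded
register.mitigate(r, "Fairness testing before each release")
print(len(register.open_risks()))   # no unmitigated high-severity risks remain
```

Because Article 9 requires a continuous process, such a register would be revisited throughout the system lifecycle rather than filled in once.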


Check your compliance with Article 9

Our free diagnostic identifies the obligations applicable to your AI system and guides you to the necessary documentation.
