
Shadow AI in the workplace: how to inventory unauthorised AI tools.

Employees are using ChatGPT, Gemini or other AI tools without informing the IT department or DPO. This phenomenon, known as shadow AI, undermines compliance with the AI Act and exposes the organisation to legal and operational risks.

Jérémy Pierre
AI Act compliance expert
04 May 2026 8 min read
In brief · 4 key figures
- 41% of organisations report unauthorised AI usage (Gartner, 2024)
- 65-78% of large organisations affected (Microsoft, 2025)
- Art. 26(6): register and logging obligations for high-risk systems
- 3 steps to turn shadow AI into a compliance opportunity
01 - Definition

What is shadow AI?

Shadow AI refers to the use of artificial intelligence tools by employees without validation or oversight from the IT department or DPO.

These tools include chatbots such as ChatGPT, writing assistants, image generators or coding solutions like GitHub Copilot. They are often accessible in just a few clicks, via personal accounts or freemium versions. According to several studies, between 41% and 78% of organisations are affected, with a higher prevalence in large enterprises.

Shadow AI is not a marginal phenomenon. It reflects a broader trend: the rapid adoption of technologies by business functions, outside traditional IT channels. This practice poses major challenges for data governance and regulatory compliance.

02 - Causes

Why is shadow AI growing?

Three factors explain the growth of shadow AI: the accessibility of tools, productivity pressures and the lack of official alternatives.

Consumer AI tools are designed for immediate adoption. A simple email address is enough to create an account and start using services like ChatGPT or Gemini. Freemium versions, with no financial commitment, further reduce barriers to entry.

Productivity pressures push employees to seek quick solutions. In a context of high workload, AI appears as a lever to save time, whether for drafting an email, analysing data or generating code. Business functions, often ahead of IT services, adopt these tools without waiting for internal approvals.

Finally, the lack of official alternatives encourages shadow AI. If the organisation does not offer approved, easy-to-use AI tools, employees turn to external solutions. This phenomenon is particularly marked in sectors where AI needs are emerging, such as legal or communications.

03 - Risks

Concrete risks under the AI Act

Shadow AI undermines the organisation's ability to comply with the AI Act and exposes it to legal, operational and reputational risks.

The AI Act imposes strict obligations on AI systems, particularly regarding transparency, documentation and supervision. An undeclared tool cannot be assessed, documented or audited. For example, Article 26(6) imposes register and logging obligations for high-risk systems, obligations that cannot be met if the system is never identified in the first place.
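
To make the register obligation concrete, here is a minimal sketch of what an internal AI-system inventory entry could look like. The field names are illustrative choices for an internal register, not terminology taken from the AI Act itself.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical minimal register entry for an internal AI-system inventory.
# Field names are illustrative, not drawn from the AI Act text.
@dataclass
class AISystemRecord:
    name: str                 # e.g. "ChatGPT (personal account)"
    vendor: str
    business_owner: str       # team or function using the tool
    risk_category: str        # internal triage: "minimal", "limited", "high"
    data_processed: list = field(default_factory=list)
    declared_on: date = field(default_factory=date.today)
    approved: bool = False

register = [
    AISystemRecord("ChatGPT", "OpenAI", "Marketing", "limited",
                   data_processed=["marketing copy drafts"]),
]

# Tools awaiting review surface immediately from the register.
undeclared = [r for r in register if not r.approved]
print(f"{len(undeclared)} tool(s) awaiting review")
```

A spreadsheet works just as well to start; the point is that each shadow tool found during the inventory gets an entry before any risk assessment can happen.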

Operational risks are equally concerning. Consumer AI tools are not designed for professional use. They may process sensitive data, such as customer information or trade secrets, without any guarantee of confidentiality. Prompts sent to external APIs may contain personal data, exposing the organisation to GDPR breaches.

"A prompt containing customer data sent to a non-EU API may constitute a GDPR breach, even if the tool is used for internal purposes."

Finally, shadow AI introduces undocumented biases into business processes. An AI text generation or data analysis tool may produce biased results without the organisation being aware. These biases can have legal consequences, particularly in areas such as recruitment or risk assessment.

04 - Method

5-step inventory method

Identifying unauthorised AI tools requires a structured approach, combining collaborative enquiry and technical analysis.

Here is a 5-step method for conducting a comprehensive inventory:

1

Employee survey

Send an anonymous questionnaire to identify the tools used. Ask simple questions: "Which AI tools do you use at work?", "For what tasks?", "Have you created a personal account to access them?".

2

IT expenditure analysis

Review corporate card statements and invoices to identify subscriptions to AI tools. Paid versions of ChatGPT, Midjourney or other services are often purchased directly by business functions.
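
Assuming card statements can be exported as CSV with a merchant column, this kind of review is easy to automate. The vendor list and column names below are illustrative; adapt them to your statement format.

```python
import csv
import io

# Merchant keywords to flag -- extend with vendors relevant to your organisation.
AI_VENDORS = {"openai", "anthropic", "midjourney"}

def flag_ai_expenses(csv_text):
    """Return statement rows whose merchant matches a known AI vendor."""
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        merchant = row["merchant"].lower()
        if any(vendor in merchant for vendor in AI_VENDORS):
            flagged.append(row)
    return flagged

statement = """date,merchant,amount
2026-03-02,OPENAI *CHATGPT SUBSCR,23.00
2026-03-05,OFFICE SUPPLIES LTD,54.10
2026-03-11,MIDJOURNEY INC,11.90
"""
for row in flag_ai_expenses(statement):
    print(row["date"], row["merchant"], row["amount"])
```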

3

Network and log scanning

Use monitoring tools to detect connections to AI APIs or services. Firewall and proxy logs can reveal undeclared usage, particularly to endpoints such as api.openai.com.
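
As a sketch, a pass over plain-text proxy logs can count hits per AI endpoint. The endpoint patterns below are common public API hosts given as examples; match them against whatever your firewall or proxy actually records.

```python
import re
from collections import Counter

# Example endpoint patterns for common AI services -- adjust to your logs.
AI_ENDPOINTS = re.compile(
    r"(api\.openai\.com|generativelanguage\.googleapis\.com|"
    r"api\.anthropic\.com)"
)

def scan_proxy_log(lines):
    """Count occurrences of known AI endpoints in proxy log lines."""
    hits = Counter()
    for line in lines:
        match = AI_ENDPOINTS.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

log = [
    "2026-05-01 09:12 user42 CONNECT api.openai.com:443",
    "2026-05-01 09:13 user17 GET https://intranet.example.com/home",
    "2026-05-01 09:15 user42 CONNECT api.openai.com:443",
]
print(scan_proxy_log(log))  # Counter({'api.openai.com': 2})
```

Aggregate counts are enough for the inventory; avoid tying hits to named individuals, which would undercut the non-punitive message of the declaration channel.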

4

Interviews with business functions

Organise workshops with teams to understand their needs and usage. These exchanges help identify tools unknown to IT and gather feedback on their effectiveness.

5

Voluntary declaration form

Set up a simple channel for employees to declare their AI usage. This form should be accessible, non-judgemental and accompanied by clear communication on the objectives.

05 - Tools

Sample AI tool declaration form

A simple, accessible form encourages employees to declare their AI usage without fear.

Here is a sample form you can adapt and deploy in your organisation:

AI tool declaration form

Objective: Identify AI tools used in the organisation to ensure compliance and secure data.

Instructions: Complete this form for each AI tool you use at work. Your responses will remain confidential.

This form can be deployed using tools such as Google Forms, Microsoft Forms or internal solutions. The key is to make it accessible and to clearly communicate its purpose: to secure AI usage, not to penalise employees.
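
One way to keep the form portable across Google Forms, Microsoft Forms or an internal tool is to maintain the question set as structured data. The questions below are a suggested wording based on the survey questions earlier in this article, not an official template.

```python
# Illustrative question set for the declaration form; ids and wording
# are suggestions to adapt, not a fixed standard.
FORM_QUESTIONS = [
    {"id": "tool_name", "label": "Which AI tool do you use?",
     "type": "text", "required": True},
    {"id": "tasks", "label": "For which tasks do you use it?",
     "type": "text", "required": True},
    {"id": "account", "label": "Do you access it via a personal or corporate account?",
     "type": "choice", "options": ["Personal", "Corporate"], "required": True},
    {"id": "data_types", "label": "What types of data do you enter into it?",
     "type": "text", "required": False},
]

# Basic sanity check before export: every question has a label and a type.
assert all("label" in q and "type" in q for q in FORM_QUESTIONS)
```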

06 - Strategy

Carrot or stick: which approach to take?

Banning shadow AI without offering alternatives is ineffective. A balanced approach combines awareness-raising, official solutions and clear frameworks.

07 - FAQ

Frequently asked questions

Which AI tools are most commonly used in shadow AI?

The tools most frequently used in shadow AI are chatbots such as ChatGPT and Gemini, writing assistants, and coding solutions like GitHub Copilot. These tools are popular due to their accessibility and perceived efficiency for routine tasks.

How can I detect shadow AI usage in my organisation?

To detect shadow AI usage, combine several methods: an anonymous survey of employees, analysis of IT expenditure to identify subscriptions to AI tools, scanning of network logs to detect connections to external APIs, and interviews with business functions to understand their needs and usage.

What are the legal risks of shadow AI under the AI Act?

Shadow AI exposes the organisation to several legal risks under the AI Act. An undeclared tool cannot be documented, audited or supervised, violating transparency and register obligations. Additionally, using consumer AI tools may lead to leaks of sensitive data, breaching GDPR.

How can I raise awareness among employees about the risks of shadow AI?

To raise awareness, organise training sessions on the risks of shadow AI and best practices. Explain the obligations under the AI Act, such as the need to declare tools used. Offer official, secure alternatives and set up a simple, confidential declaration channel.

What is an Acceptable Use Policy for AI?

An Acceptable Use Policy for AI is a document that sets out the rules for using AI tools within the organisation. It specifies which tools are permitted or prohibited, what types of data can be processed, and the declaration and approval procedures. This policy is often accompanied by AI literacy training.

How can I create a catalogue of approved AI tools?

To create a catalogue of approved AI tools, first identify business needs through interviews or surveys. Then evaluate available tools based on compliance, security and efficiency criteria. Select solutions such as Microsoft 365 Copilot or GitHub Copilot Enterprise, which offer compliance guarantees. Clearly communicate this catalogue and train employees in its use.

What are the alternatives to consumer AI tools?

Alternatives to consumer AI tools include professional solutions such as Microsoft 365 Copilot, GitHub Copilot Enterprise, or tools tailored to specific business functions. These solutions offer compliance, security and confidentiality guarantees while meeting employees' needs.
