Shadow AI: The Hidden Compliance Risk in Your Company
Artificial intelligence has entered Italian companies faster than many managers expected. The problem isn't adoption itself; it's that adoption is happening without a clear structure, which exposes organizations to concrete risks, including regulatory ones.
AI adoption in Italian companies is growing fast, but so are the risks
The numbers speak for themselves. According to the Istat report “Enterprises and ICT – Year 2025,” 16.4% of Italian companies with at least 10 employees already use at least one artificial intelligence technology, almost double compared to 8.2% in 2024, and more than three times the level in 2023. This is no longer a pilot project reserved for large corporations: AI is already inside marketing teams, operational processes, and data analysis.
The phenomenon is even more evident in large enterprises: 71% have launched at least one AI project, and 84% have purchased generative AI tool licenses, an increase of 31 percentage points in a single year. In parallel, 47% of Italian workers already use AI tools in their daily work.
The critical point, however, is not how many are using it but how they are using it.
What is Shadow AI and why it's a hidden compliance risk
Behind the positive numbers lies a figure that should concern every business leader: according to Istat, the share of companies that declare they use AI without being able to identify even a single specific business area has gone from 15.5% in 2024 to 33.4% in 2025. In practice, one in three companies is using AI without really knowing where or how.
This phenomenon has a name: Shadow AI. Employees using ChatGPT to process client data, automations built without supervision, third-party tools integrated into processes without a risk assessment. Globally, 30–40% of AI use within companies occurs in an unmonitored way. The result? In 2025, 41% of large enterprises experienced at least one privacy incident related to AI use.
When AI is used without clear policies, defined accountability, and adequate training, very concrete problems arise:
Sensitive client or employee data processed in a non-compliant manner
Business decisions that are neither traceable nor justifiable
Lack of control over automated processes
Exposure to legal and reputational risks
And the most insidious problem is precisely this: these risks are not immediately visible. They surface when it’s already too late.
EU AI Act and GDPR: what AI Compliance really means for your business
Being compliant doesn’t simply mean following a law. It means knowing how AI is being used within your organization, having effective control over processes, and ensuring transparency in automated decisions.
But ignoring regulation carries a very concrete cost today. The EU AI Act (EU Regulation 2024/1689) is already in force: the first obligations, including mandatory AI literacy for employees, took effect on February 2, 2025, while those concerning high-risk systems will apply from August 2, 2026. Fines for non-compliance can reach up to €35 million or 7% of annual global turnover, figures even higher than the penalties provided for under the GDPR.
And the GDPR remains a parallel risk: in 2025, breach notifications in Europe grew by 22% year on year, exceeding 443 reported incidents per day for the first time. Cumulative fines since 2018 have now reached €7.1 billion.
The AI Governance gap: why tools alone are not enough
The gap is structural: the point isn't just what AI does, but how people interpret it, use it, and make decisions based on its outputs. According to the EY Italy AI Barometer, 74% of managers are familiar with the ethical framework on AI, but only 47% of employees are aware of it, an awareness gap that translates directly into operational risk.
Companies are investing in tools, but they are not building:
Governance — who is responsible for decisions made with AI?
Competencies — do teams know how to use AI safely and compliantly?
Structure — are there policies, audit trails, control processes?
Only 16% of Italian SMEs use AI in a structured way. For all the others, spontaneous and ungoverned adoption widens the gap every day between what AI does and what the company is able to control.
The question is: are you in control of it?
Discover how to govern AI compliantly