The $100 Million Adoption Gap: Why Your AI Budget Doesn't Matter
At a panel discussion in London a while back, Eleanor Bertin from Pigment dropped a line that should make every C-suite executive uncomfortable:
“Most companies have big budgets right now for AI. So that’s not the problem anymore. Every business leader, every CEO wants to implement AI. I think the problem is that they don’t necessarily get yet the adoption, like the bottom-up adoption from every single contributor.”
Let that sink in. The money’s there. The tools are there. The executive commitment is there. But the people? They’re not showing up.
And here’s the part that should really keep you up at night: 78% of organizations now use AI in at least one business function, yet 70-85% of AI initiatives fail to meet expected outcomes. Companies are spending billions (AI investment hit $109.1 billion in the U.S. alone in 2024) on tools that employees either won’t touch or can’t use effectively.
This isn’t a technology problem. It’s not even really a budget problem. It’s a leadership problem masquerading as an implementation challenge.
The Brutal Math of the Adoption Gap
The statistics paint a devastating picture: 42% of companies abandoned most of their AI initiatives in 2025, up from just 17% in 2024. The average organization scrapped 46% of AI proof-of-concepts before they reached production.
Think about what that means in real terms. If your company spent $10 million on AI last year, statistically $4.6 million of that went into the dumpster before you even got to production. And of the $5.4 million that made it through? Most of it is sitting unused in systems your employees have already learned to work around.
I’ve seen this pattern repeat itself over and over. A company announces a major AI initiative. Six months later, they’ve deployed the shiny new tools. A year later? The tools are there, the dashboards are there, but when you talk to the people actually doing the work, they’re still using spreadsheets and workarounds because the AI “doesn’t quite fit how we really work.”
The problem Eleanor identified is that companies are treating AI adoption like a deployment problem: Buy the right tools. Set them up correctly. Train the users. Check the box.
But deployment isn’t adoption. Deployment is what IT does. Adoption is what leaders create.
The Four Stages of AI Maturity (And Why Most Companies Get Stuck at Stage 1)
During the panel, Eleanor and the other leaders revealed something fascinating: there’s actually a maturity spectrum for how companies think about AI value. Most organizations don’t even realize they’re stuck at the bottom.
Stage 1: Productivity Focus
The Question: “Can we do the same work faster?”
Example Companies: ServiceNow, Uber (according to Eleanor)
What It Looks Like: Measuring hours saved, tasks automated, clicks reduced
This is where most companies live. They’re asking: Can AI help us create reports faster? Can it summarize meetings? Can it draft responses quicker?
There’s nothing wrong with productivity gains. But here’s the trap: productivity metrics are easy to measure and easy to sell to the CFO, so companies stop there. They declare victory when they’ve shaved 15 minutes off someone’s daily workflow, never realizing they’re leaving 90% of the value on the table.
Stage 2: Business Outcomes
The Question: “What’s the ROI?”
Example Companies: Unilever, Coca-Cola (according to Eleanor)
What It Looks Like: Revenue impact, cost savings, margin improvement
At this stage, companies start connecting AI to actual business results. It’s not about “hours saved,” it’s about “revenue generated” or “costs eliminated.”
The challenge? Only 6% of organizations qualify as “AI high performers,” achieving 5%+ EBIT impact. Most companies that try to measure business outcomes discover they can’t actually draw a clean line from their AI investment to their bottom line.
Stage 3: Deep Adoption
The Question: “How many people are using it well?”
Example Companies: Anthropic (according to Eleanor)
What It Looks Like: User engagement, proficiency levels, workflow integration
Here’s where it gets interesting. Shobie Ramakrishnan from GSK shared a telling example during the panel: “We gave out AI Companions to 70,000 people in our company and we didn’t do any marketing on it. We put it out and we let people use it. 52,000 people log in every month because it makes their lives better.”
Do the math. That’s a 74% voluntary adoption rate. No mandates. No training quotas. No executive emails about “driving AI adoption.” Just people choosing to use something because it actually helps them do their job.
That’s what deep adoption looks like. And it’s the prerequisite for Stage 4.
Stage 4: Growth Acceleration
The Question: “How much can we really accelerate growth with AI?”
What It Looks Like: New capabilities, new markets, new business models
This is the transformation everyone’s chasing. It’s when AI doesn’t just make your current business faster, it makes your business fundamentally different. It’s GetYourGuide using AI to scale from local experiences to global reach. It’s GSK identifying which patients will benefit from which medicines, fundamentally changing how they design clinical trials.
But here’s the reality: you cannot skip to Stage 4; it’s a maturity curve. And most companies are trying to.
They want transformation (Stage 4) while only measuring productivity (Stage 1), without ever building deep adoption (Stage 3). It’s like trying to run before you can walk, except you haven’t even learned to crawl yet.
Why Smart Leaders Are Measuring the Wrong Things
The panel revealed something crucial about why this gap persists. When asked about metrics, the investors said “revenue growth.” The product leaders said “adoption rates” and “productivity gains.” But Shobie from GSK gave the most sophisticated answer:
“We look at metrics through three lenses. One is outcome metrics. Has it improved? Does it help us make money, save money, move the medicine faster through the pipeline? We are absolutely relentless in chasing the outcome metrics. The second one that I look for are learning loops beginning to form between AI and humans, because without that, you can’t really say you’ve got diffusion going on. And the third is fluency metrics, which is how many people are learning to use it.”
Read that middle part again. Learning loops between AI and humans.
Most companies are measuring lagging indicators (revenue, costs) without any visibility into the leading indicators (are people actually learning? Are patterns emerging? Is the system getting smarter through use?).
It’s like trying to drive by only looking in the rearview mirror. You can see where you’ve been, but you have no idea where you are or where you’re going.
The Leadership Failure Mode That’s Costing You Millions
Here’s the pattern I see constantly:
1. The C-suite decides AI is a strategic priority
2. IT/procurement deploys the best tools money can buy
3. Training happens (usually one-time, usually generic)
4. Six months later, executives ask why adoption is at 15-20%
5. Everyone blames the users (“they’re resistant to change”) or the technology (“it wasn’t ready”)
But the real failure happened in step 1. The C-suite treated AI adoption as a deployment problem instead of a leadership problem.
Eleanor nailed it: “The way we’ve been thinking about it is really to think it through the pain of the individual contributors and the way we can really insert ourselves into their day-to-day workflow... It’s about being really seamless and helping integrate and not adding pain to what they do.”
Notice what she didn’t say. She didn’t say “we deployed the best AI tools.” She didn’t say “we mandated usage.” She said they focused on actual workflow pain and seamless integration.
A Barclays leader she spoke with put it even more bluntly: “That’s really where they see value today is only when we are able to truly integrate into their day-to-day workflow without changing the output.”
Think about that. The best AI doesn’t change what people do, it makes what they already do better. But most companies try to force people to change everything about how they work, then wonder why adoption fails.
The Uncomfortable Truth About Your AI Strategy
If you’re stuck at Stage 1 (productivity focus), measuring hours saved and tasks automated, here’s what’s really happening:
You’re deploying solutions before understanding problems
You’re buying tools before talking to the people who’ll use them
You’re measuring outputs (tasks completed) instead of outcomes (business impact)
You’re treating adoption as a training problem instead of a design problem
And here’s the most uncomfortable truth: Your employees already know your AI strategy isn’t working. According to research, over 90% of employees secretly use personal AI tools like ChatGPT at work, often with higher ROI than your official enterprise deployments.
They’re not resistant to AI. They’re resistant to your AI.
How to Actually Bridge the Adoption Gap
The companies getting this right, the ones in Stages 3 and 4, share common patterns:
1. They Start With Jobs, Not Tools
GSK didn’t start with “we need AI.” They started with “we need to identify which patients benefit from which medicines.” The AI was the answer to a real problem, not a solution looking for a problem.
As I wrote in “It’s Not The What, It’s The Why”, understanding the job people are trying to do is foundational. Until you know why someone is doing something, you can’t build tools that actually help them do it better.
2. They Measure Learning, Not Just Usage
Tracking logins isn’t enough. Are people getting better at using the tools? Are they discovering new capabilities? Are feedback loops forming where the AI learns from the humans and the humans learn from the AI?
If you can’t answer these questions, you’re flying blind.
3. They Treat Adoption as a Leadership Challenge
MIT’s research shows that 95% of AI pilots fail, and the primary reasons aren’t technical. They’re organizational: poor change management, lack of executive alignment, insufficient training, and failure to redesign workflows.
These are leadership problems. Full stop.
4. They Integrate, Don’t Add
The Barclays principle: integrate into existing workflows without changing the output. This requires deep understanding of how work actually gets done, not how you think it gets done or how the process map says it gets done.
5. They Build Proof Through Voluntary Adoption
When 74% of GSK employees voluntarily use AI tools without marketing or mandates, that’s proof the tools work. When you have to mandate usage and track compliance, that’s proof they don’t.
The Questions You Should Be Asking
If you’re a leader responsible for AI adoption, stop asking “How do we increase AI usage?” and start asking:
What job are people actually trying to do? (Not what an idealized process map says, what they actually do)
Where does our AI create friction instead of reducing it? (Be brutally honest)
Who are our power users and what do they know that we don’t? (The people using it well can teach you what’s working)
Are we measuring learning or just measuring logins? (Usage metrics lie)
What would have to be true for people to choose our AI tools over their workarounds? (This question cuts through all the BS)
The Hard Truth
Your AI budget doesn’t matter if your people won’t use the tools.
Your executive commitment doesn’t matter if you’re not willing to do the hard work of understanding how work actually gets done.
Your technology stack doesn’t matter if you’re treating adoption as a deployment problem instead of a leadership challenge.
The $100 million adoption gap isn’t about the technology. It’s not even about the money. It’s about leaders who are buying solutions before they understand the problems, deploying tools before they understand the workflows, and measuring productivity when they should be measuring learning.
The companies winning with AI, the ones at Stages 3 and 4, aren’t the ones spending the most. They’re the ones who recognize that adoption is a leadership problem that requires understanding people, workflows, and systems at a level most executives aren’t comfortable with.
Eleanor was right. The budget isn’t the problem anymore.
The question is: are you ready to solve the real problem?
What stage is your company stuck at? And more importantly, what are you going to do about it?
For more on AI leadership and bridging the gap between technology and actual business transformation, subscribe to Decoding the Signal.