Locked, stocked, and losing budget: AI vendor lock-in bites back
Execs in the C-suite thought they could swap models in a week. They were hallucinating
Opinion The days when you could jump from one frontier AI model to another at the drop of a hat are going away as vendor lock-in starts to kick in, and prices increase.
Once upon a time, say last month, people thought nothing of jumping from one AI frontier model to another. One week, the hottest AI model was Gemini 3.1 Pro, then it was Claude 4.6, now, maybe, it's GPT-5.5. Next month? Who knows. That's fine for Joe Amateur Programmer, but for Janet Pro Programmer, it's another story.
You see, enterprise AI buyers face two converging problems. First, it's proving much harder to switch between AI vendors than people expected. At the same time, AI vendors are pushing through price increases that are reshaping software economics. We always knew this would happen. AI prices have been loss leaders for years now, and the bills are finally coming due.
A recent survey of 542 US executives with active AI vendor contracts, conducted by AI orchestration platform provider Zapier, found that nearly 90 percent believed they could switch AI vendors within four weeks, and 41 percent said they could do it in just 2–5 business days. Now who's hallucinating?
I've long thought that behind all the lip service company brass gives AI, most senior executives are completely clueless about what AI is and how to deploy it. This kind of delusional thinking is proof.
According to Zapier's report, only 42 percent of organizations that attempted to migrate between AI platforms report that it went smoothly. The remaining 58 percent? They say the process either failed outright or required significantly more effort than expected. Really. Who'd have thought it?
The trouble stems from all the layers of technical dependency that early adopters underestimated. AI implementations require vendor-specific APIs, proprietary training data, custom tooling for model deployment, and deep integrations into existing workflows, none of which transfer cleanly between providers.
According to Zapier: "The problem is that when AI is already woven into internal processes, connected to other systems, and tuned to specific workflows, it has dependencies, edge cases, and little adaptations that nobody documented because they were 'temporary.'"
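To make that coupling concrete, consider what even a "simple" chat call looks like. Most codebases write directly against one vendor's SDK, so provider-specific request shapes get baked in everywhere. Here's a minimal sketch of the provider-neutral layer teams wish they'd written on day one; the model names are the hypothetical ones from this article, and the payload shapes are simplified for illustration:

```python
from dataclasses import dataclass

# A provider-neutral request type. In practice, most codebases skip this
# step and call one vendor's SDK directly -- which is where lock-in starts.
@dataclass
class ChatRequest:
    system: str
    user: str
    max_tokens: int = 512

def to_openai_style(req: ChatRequest) -> dict:
    # OpenAI-style chat payload: the system prompt is just another message.
    return {
        "model": "gpt-5.2",  # hypothetical model name from the article
        "messages": [
            {"role": "system", "content": req.system},
            {"role": "user", "content": req.user},
        ],
        "max_tokens": req.max_tokens,
    }

def to_anthropic_style(req: ChatRequest) -> dict:
    # Anthropic-style payload: the system prompt is a top-level field,
    # not a message -- the two shapes do not line up one-to-one.
    return {
        "model": "claude-4.6",  # hypothetical model name from the article
        "system": req.system,
        "messages": [{"role": "user", "content": req.user}],
        "max_tokens": req.max_tokens,
    }

req = ChatRequest(system="You are terse.", user="Summarize this contract.")
assert to_openai_style(req)["messages"][0]["role"] == "system"
assert to_anthropic_style(req)["system"] == "You are terse."
```

And that's just the request payload. Multiply this by tool definitions, fine-tuned prompts, retrieval pipelines, and evaluation harnesses, and you see why "an API migration" understates the job.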
It's not just the software that's making it harder to move. As AI consultant Haroon Choudery pointed out: "Switching model vendors is no longer just an API migration. It is context, workflows, and institutional memory." Moving any of that from one vendor's platform to another isn't easy, and it's even worse if you don't have a handle on what you've got locked into those three areas. Guess what? As Choudery observed, "Most operators I talk to haven't mapped any of them."
I'm not surprised. This is yet more proof that your C-level executives don't have a clue about what they're doing by pouring their resources into AI as fast as they can.
Some people I've spoken to seem to think that, even if moving from one model to another is expensive, they can afford it because the models themselves are so cheap.
Yet AI providers, which are losing money hand over fist, are finally raising prices across the board. For example, OpenAI raised the price developers pay for its flagship GPT-5.2 model to $5.75 per million input tokens, up from $1.25 for the previous GPT-5.1. Ouch!
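Some back-of-the-envelope math shows what that jump means in practice. Assuming those figures are per million input tokens (the standard pricing unit) and an illustrative workload of two billion input tokens a month:

```python
OLD_PRICE_PER_M = 1.25   # $ per million input tokens (GPT-5.1 figure above)
NEW_PRICE_PER_M = 5.75   # $ per million input tokens (GPT-5.2 figure above)
MONTHLY_INPUT_TOKENS = 2_000_000_000  # assumed workload: 2B tokens/month

old_bill = MONTHLY_INPUT_TOKENS / 1_000_000 * OLD_PRICE_PER_M
new_bill = MONTHLY_INPUT_TOKENS / 1_000_000 * NEW_PRICE_PER_M

print(f"before: ${old_bill:,.0f}/mo, after: ${new_bill:,.0f}/mo "
      f"({NEW_PRICE_PER_M / OLD_PRICE_PER_M:.1f}x)")
# before: $2,500/mo, after: $11,500/mo (4.6x)
```

Same workload, same code, 4.6 times the bill. That's before you count output tokens, which usually cost several times more than input.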
It's not just OpenAI. Anthropic confirmed a de facto price increase for its Claude enterprise edition on April 15, 2026, when it moved from fixed pricing to a dynamic usage-based model. Experts think this could double or triple costs for heavy-duty users.
You don't have to be a hardcore AI developer to see this. For example, when I wrote this, you could no longer get a new GitHub Copilot subscription. GitHub is also restricting the compute you'll get from its individual subscription plans, while dropping access to Opus models entirely. I do hope you weren't planning on launching your business around GitHub Copilot.
It's not just pure AI programs where you'll see this. AI costs are also pushing up prices for programs such as Microsoft 365.
Everyone is going to do this. There will still be fixed-price tiers as sweeteners, but you'll get less compute in them. Like it or lump it, we're heading to a token-based pricing structure and the end of fixed-price tiers.
As Nick Turley, an OpenAI executive, said recently, "There's no world in which pricing doesn't significantly evolve." You think? As for those all-you-can-eat plans? Forget about them. They're history.
These pricing changes reflect fundamental realities of infrastructure. Memory chip prices, in case you haven't noticed, are giving gold a run for its money. All those gigawatt AI data centers aren't going to pay for themselves either.
As Datos Insights CEO and co-founder Eli Goodman told Reworked last year: "The most common myth is that AI works like regular software. That's not true; every query has a real cost. The provider's bill goes up when you use more."
AI is not like Software-as-a-Service (SaaS), where per-user costs shrink with scale. We talk a lot about how much AI training costs, but every query you make and every agent you launch burns inference tokens. In short, under the new pricing structure, the more you use AI, the more it's going to cost you.
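The difference is easy to model. Under classic SaaS, infrastructure cost is mostly fixed, so the marginal cost of another user trends toward zero; under token-based AI pricing, the bill grows roughly linearly with usage. A toy comparison, with made-up but plausible numbers (flat infrastructure cost, tokens per user, and the per-million price are all assumptions for illustration):

```python
def saas_cost(users: int, flat_infra: float = 10_000.0) -> float:
    # Classic SaaS: infrastructure is mostly fixed, so total cost stays
    # flat and cost per user shrinks as the user base grows.
    return flat_infra

def ai_inference_cost(users: int, tokens_per_user: int = 500_000,
                      price_per_m: float = 5.75) -> float:
    # Token-based AI pricing: every query burns tokens, so the bill
    # scales with usage instead of flattening out.
    return users * tokens_per_user / 1_000_000 * price_per_m

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} users: SaaS ${saas_cost(n):,.0f}/mo vs "
          f"AI ${ai_inference_cost(n):,.0f}/mo")

# 10x the users means ~10x the inference bill; the SaaS bill stays flat.
assert ai_inference_cost(10_000) == 10 * ai_inference_cost(1_000)
```

Success, in other words, is exactly what blows up your budget: the more your AI feature gets used, the more you pay.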
Nik Kale, Cisco principal engineer and product architect, added: "Microsoft's increases aren't a temporary spike — they're the beginning of a new price baseline for the AI era. GPU capacity, inference scaling, and the rising energy demands of large-model workloads have become structural, recurring costs. Vendors can't absorb them anymore."
Can you? Well, you're going to find out.
But wait, there's more! Say you're running Meta's Llama on your own hardware. You're safe then, right? Right? Wrong. First, Llama was never really open source. So when Meta decided to turn it into abandonware in favor of its proprietary Muse Spark, you were left in the lurch.
"The question isn't whether AI is useful," the Zapier report noted. "What happens when the AI you depend on disappears, spikes its prices, or gets acquired by a private equity firm that's going to strip it for parts?"
That's a darn good question, isn't it? Do you have an answer? You'd better start working on one. The more you've already invested in AI, the more you're almost certainly locked into specific vendors, and I guarantee you their prices are going to rise to everything the market will bear, and then some. ®