OpenAI jumps out of Microsoft's bed, into Amazon's Bedrock
Altman's gaggle of GPTs now available in limited preview in an AWS region near you
OpenAI's top models are officially available on Amazon Web Services' Bedrock managed inference and agent platform.
The collaboration, announced at an AWS event in San Francisco on Tuesday, provides an alternative avenue for accessing Altman and company’s growing library of GPTs without having to expose your data to OpenAI's APIs.
Amazon contends that enterprises want to build agents and other AI-augmented tools using OpenAI's models, but have been held back by security policy, data privacy, and sovereignty concerns.
By opening its models up to a trusted third party, OpenAI can sidestep many of these concerns. Bringing its models to AWS also means that customers don't need to jump through nearly as many hoops to adopt its models, since Amazon has already put the effort into connecting its services to Bedrock.
Alongside the managed inference service, OpenAI's models will also be made available on Amazon's Bedrock Managed Agents and AgentCore platforms, which provide tools and blueprints for building enterprise agents and connecting them to enterprise data and services. At the same event, AWS announced a whole host of new agentic AI tools for its own end-customers. These include Quick, a personalized assistant akin to Microsoft Copilot but spanning apps from multiple vendors, and various new flavors of Connect, which began life as Amazon's hosted CRM product but is expanding to help customers automate tasks in HR, health care, and supply chain management.
Finally, enterprises will be able to connect OpenAI's Codex code agent to models running in AWS datacenters, providing some level of assurance that their codebases won't end up in Altman's next model.
For the moment, access to OpenAI's models on AWS remains in limited preview, with the LLM-maker's second-newest GPT-5.4 model available now; the later GPT-5.5 is coming within the next couple of weeks, according to AWS CEO Matt Garman.
Tuesday's announcement makes good on OpenAI's promise in February to make its models available on AWS in exchange for up to $35 billion in new financing. However, to claim all of it, OpenAI will have to spin up two gigawatts of Amazon's Trainium accelerators.
It also seems that much of this was possible thanks to Microsoft's willingness to open its relationship with OpenAI in exchange for being freed from its revenue sharing commitments.
Under the new terms, Microsoft remains OpenAI's primary cloud provider and retains access to the model dev's tech. OpenAI, meanwhile, is free to get in bed with anyone it likes, be it Amazon or someone else.
As such, the new terms mean OpenAI's tie-up with Amazon may not be a one-off, but rather a blueprint for future infrastructure and services deals. ®