Sam Altman Exclusive Interview: Why Is OpenAI Breaking Up with Microsoft?
- Core Insight: The focus of AI competition is shifting from "the most powerful model" to "enterprise-grade infrastructure." The Bedrock Managed Agents launched through OpenAI's partnership with AWS are not simply about listing models on a platform; they deeply embed AI into AWS's native identity, permission, data, and security systems, enabling enterprises to build "virtual colleagues" capable of executing internal tasks.
- Key Elements:
- Revised OpenAI-Microsoft Agreement: Azure no longer has exclusive rights to OpenAI models. OpenAI can now expand its products to other cloud platforms like AWS. Microsoft has ceased paying revenue shares and has relinquished exclusive IP licensing.
- Product Core: Bedrock Managed Agents encapsulate OpenAI's frontier models within AWS's agent runtime, including native services like identity authentication, permission management, logging, and governance, lowering the barrier for enterprises to deploy AI agents.
- Enterprise Deployment Challenges: Enterprise AI agents need to navigate databases, SaaS applications, permission systems, and compliance requirements—a far more complex environment than local setups. This collaboration aims to provide a one-click enterprise-grade solution.
- Shifting Competitive Landscape: Future AI competition will hinge on who can build a platform that enterprises trust to use safely, scale continuously, and execute real work, rather than just competing on API pricing, chip performance, or model leaderboards.
- Roles of Both Parties: Enterprise customers can contact AWS directly for support, with data remaining within their AWS VPC. OpenAI and AWS are co-building the product and have committed to deploying models on AWS's custom Trainium chips.
Original title: An Interview with OpenAI CEO Sam Altman and AWS CEO Matt Garman About Bedrock Managed Agents
Original author: Ben Thompson, Stratechery
Original translation: Peggy, BlockBeats
Editor's note: On April 27, OpenAI and Microsoft amended their agreement: Azure will no longer have exclusive access to OpenAI models, allowing OpenAI to expand its products to other cloud platforms like AWS.
Note: Azure is Microsoft's cloud computing platform, usually referred to as Microsoft Azure. Like AWS and Google Cloud, it primarily provides enterprises with cloud services such as servers, databases, storage, networking, security, and AI model deployment.
To the outside world, this might seem like a mere change in cloud service distribution channels. However, based on the discussion between Sam Altman and AWS CEO Matt Garman, the more critical shift is that AI is moving from "model invocation" to the "enterprise workflow" stage.
This article is compiled from an interview that the tech and business analysis publication Stratechery conducted with Sam Altman and Matt Garman. It focuses on the launch of Bedrock Managed Agents powered by OpenAI, discussing the parallels between cloud computing and the AI platform shift, the deployment challenges of enterprise agents, the difference between AgentCore and managed services, and AWS's position in the AI infrastructure competition.
Note: Stratechery was founded by tech analyst Ben Thompson. It has long focused on tech company strategy, platform economics, cloud computing, AI, and media industry changes. Its content features in-depth analysis and executive interviews; it holds significant influence in Silicon Valley tech and investment circles and is often seen as a key window into the strategic directions of major tech companies.
The core of Bedrock Managed Agents is not just allowing AWS customers to use OpenAI models, but embedding the models into AWS's native identity, permissions, logging, governance, deployment, and security systems. In other words, what enterprises truly need is not a smarter chat window, but a system of "virtual colleagues" that can operate within the organization, access data, execute tasks, and adhere to permission boundaries.
This is the most noteworthy aspect of this collaboration: the focus of the AI competition is shifting from "who has the strongest model" to "who can turn models into usable enterprise infrastructure." In individual developer scenarios, Codex can leverage the local environment to solve many complex problems. But in enterprise scenarios, agents must contend with databases, SaaS, permission systems, security boundaries, and compliance requirements.
In a sense, this collaboration echoes the logic of the early cloud computing era. AWS lowered the startup costs for entrepreneurs, allowing small teams to build internet products without owning servers. Today, OpenAI and AWS are attempting to lower the barrier for enterprises to deploy AI agents, enabling companies to integrate AI into real business processes without having to piece together models, permissions, data, and security systems themselves. The difference is that this time, adoption is faster, and the enterprise demand is more urgent.
Therefore, this article is not really about OpenAI models being "listed" on AWS. It's about the AI infrastructure entering its next phase: the deep coupling of models, cloud, data, and enterprise permission systems. Future competition may no longer be just about API prices, chip performance, or model benchmark rankings, but about who can build an AI platform that enterprises trust, can scale, and can actually execute work.
Below is the original interview:
Introduction
Good morning, as I mentioned yesterday, today's Stratechery interview is early in terms of my posting schedule (moved from Thursday to Tuesday); but it's also late in terms of actual delivery time (pushed from 6 AM ET to 1 PM ET) because the topic was under an embargo.
Over the past few days, this embargo placed me in a slightly awkward position: last Friday, I interviewed OpenAI CEO Sam Altman and AWS CEO Matt Garman about Bedrock Managed Agents, powered by OpenAI. Naturally, one of my questions was: how does this collaboration reconcile with the agreement between OpenAI and Microsoft that gave Azure exclusive access to OpenAI models?
Note: Bedrock Managed Agents is a managed AI agent service launched by AWS, powered by OpenAI models. It's not just about letting enterprises call OpenAI models on AWS; it embeds the models into AWS's native identity authentication, permission management, logging, security, governance, and deployment systems. This allows enterprises to build AI agents within their cloud environment that can execute tasks, access internal data, and adhere to permission boundaries. Simply put, think of it as OpenAI agent infrastructure running within the AWS enterprise environment.
Late Sunday, I heard through the grapevine that Microsoft would announce something on Monday. At the time, I wondered if it might be a preemptive lawsuit!
By Monday, Microsoft and OpenAI announced they had revised their agreement, allowing OpenAI to offer its products on other cloud providers, including AWS.
And so, here is this interview.
I believe the new arrangement between Microsoft and OpenAI makes good sense for both parties. Here are the key points from the new agreement listed in Microsoft's official announcement:
· Microsoft remains OpenAI's primary cloud partner; OpenAI products will be released on Azure first, unless Microsoft cannot or chooses not to support the necessary capabilities. OpenAI can now offer its full product suite to customers through any cloud provider.
· Microsoft will continue to receive licenses to OpenAI model and product-related IP, valid until 2032. However, Microsoft's license will no longer be exclusive.
· Microsoft will no longer pay revenue share to OpenAI.
· The revenue share paid by OpenAI to Microsoft will continue until 2030, unaffected by OpenAI's technological progress. The percentage remains the same but is capped.
· Microsoft, as a major shareholder, will continue to participate directly in OpenAI's growth.
I think the last point is the most important. Previously, Azure had a real competitive advantage as the only hyperscaler offering OpenAI models. However, this exclusivity was also constraining OpenAI, especially as more and more enterprises cared most about accessing the model on their current cloud platform. I've pointed out multiple times that this was a significant competitive advantage for Anthropic. In other words, Azure's exclusive rights were actually hurting Microsoft's investment in OpenAI. Given Anthropic's rapid growth this year, Microsoft needs to protect that investment, even if it means diluting Azure's differentiation.
At the same time, OpenAI clearly sees AWS as a massive opportunity – big enough to forgo some revenue related to Azure over the next few years. Combined with the point above, this also makes it easier for Azure management to accept losing exclusivity: after all, without paying revenue share to OpenAI, Azure's profit and loss statement looks much better. OpenAI has also released Microsoft from the AGI provisions; now, regardless of what happens, the agreement between the two companies will last until 2032.
It now seems quite clear that OpenAI's next focus will be on AWS. And the strongest evidence is the subject of this interview: Bedrock Managed Agents powered by OpenAI. The simplest way to understand this product is as Codex for AWS. Codex works well largely because it's local, which naturally solves many complex problems, especially security. But getting agents to work across departments and systems within an organization is a completely different story. The goal of this product is to make it easier for organizations that already have most of their data on AWS to use such workflows.
In this context, this interview explores how AWS pioneered the entire cloud computing category and its impact on startups; the similarities and differences between AI and that paradigm shift. We then discuss Bedrock Managed Agents: what it is, and how it differs from Amazon's existing AgentCore product. We also talk about Trainium, why chips won't matter that much for most AI users, and why partnership is a logical choice compared to Google's emphasis on full-stack integration.
A reminder, all Stratechery content, including interviews, is available as a podcast; click the link at the top of this email to add Stratechery to your podcast player.
On to the interview.
Interview Content
This interview has been lightly edited for clarity.
OpenAI Enters AWS, Azure Exclusive Era Ends
Ben Thompson (Host): Matt Garman, Sam Altman — Matt, welcome to Stratechery; Sam, welcome back. I've interviewed you previously, in October 2025, March 2025, and February 2023.
Sam Altman (OpenAI CEO): Thank you.
Matt Garman (AWS CEO): Thank you for having me.
Host: Matt, this is your first time on Stratechery. Unfortunately, I think Sam's presence will prevent us from doing our usual "getting to know the guest" segment. Besides, he probably doesn't want to hear us reminisce about our days at Kellogg. But it's great to have a fellow alum on the podcast.
Matt Garman: Yes, I'm happy to be here. I can come back another time, and we can go deeper.
Host: That would be great. You started at AWS as an intern and are now running the whole organization during the AI wave. In what ways do you think building the AI business is similar to building the original general-purpose computing business — let's call it that for now? And in what ways is it truly different?
Matt Garman: I think the similarity lies in seeing the same excitement and seeing builders outside being able to do things they couldn't do before. Back when we started AWS, the cool thing was that developers suddenly had access to infrastructure that only the largest companies could use before. You had to have a multi-million dollar budget for a data center. Now, a developer with just a credit card and a few dollars could launch an application. It massively expanded what internet builders could do.
Our philosophy was that people could build whatever they wanted. We wouldn't presume to dictate what they should build. We believed in the creativity of people worldwide; put powerful tools in their hands, and they would create interesting and amazing things.
I think AI is at least as transformative for builders, probably even more so. Think about what's possible now: you don't need to go to school for ten years to learn programming to build an app; you don't need a huge team of hundreds; you don't need months and months to build something. You can build quickly and iterate quickly with a small team. AI is unlocking innovation across every sector. In many ways, it feels very similar. It's truly exciting to see the capabilities it's giving our customer base.
Host: But back when AWS emerged, you were the only player, so the benefits and drawbacks naturally accrued to you. Is there a sense that in the AWS era, it was about general-purpose computing — making compute fungible, elastic, cheap? But in AI, especially during the training phase, the winning abstraction seems more like the highly vertically integrated supercluster, very advanced networking, extremely tight coupling between software and hardware. Was this a surprise for you? Because you weren't starting from zero, not in a "we're the only ones here" scenario. You had a specific understanding of large-scale computing, but it didn't seem to align perfectly in the early years of AI.
Matt Garman: I'm not sure it was that different for us. I think the truly different thing is how incredibly fast the adoption has been. I think that surprised everyone. Sam, feel free to disagree. But the rate at which people have grasped these capabilities and run with them, I think, has exceeded everyone's expectations.
It was very different when we started cloud computing. We spent a very long time explaining why a book-selling company was offering compute. We had to work very hard to explain what cloud computing was. There was a lot of heavy lifting, which people often forget now. In 2006, no one took it for granted that the world's computing would move to the cloud. There was a lot of hard explanation and pushing.
Host: Do you feel that need for explanation now? Because many people were initially anchored in the training era, and you might say, "we're thinking about the inference era," which is a different beast. Do you need to re-deploy that explanatory capability?
Matt Garman: Yes, but the speed at which people understand what you're saying is completely different. So I think, yes, there is some education needed when moving people from "this looks cool, I can talk to a smart chatbot" to "this can actually do work inside your company." But that educational process is happening relatively fast given the pace of technological evolution.
Host: I promise we'll get to today's product soon. Sam, from an entrepreneurial ecosystem perspective, AWS was obviously transformative. It completely changed the barrier to starting a company. Suddenly anyone could start; seed investors and angels emerged. The point at which you needed serious funding was pushed back. You didn't need "we're going to buy servers" in your pitch deck; you could build an app first, then go raise your Series A.
From your perspective, how is the world AWS enabled different from the world AI is enabling now, and how is it similar?
Sam Altman: I think there have been four major platform shifts that massively enabled startups: the internet, cloud, mobile, and AI. Of those four, the first I experienced as an adult was the cloud. In the early days of YC, it's hard to overstate how much that changed things for startups.
Before that, startups had to rent colocation space, assemble their own servers, and rack and stack them. It was incredibly complex, and you had to raise a lot of money first. Then suddenly, the cloud arrived. YC actually started right before the cloud really took off; AWS launched maybe in our second year.
Host: I was going to ask that — ultimately, are YC and the cloud more intertwined than you realized at the time?
Sam Altman: We felt they were highly intertwined at the time. It felt like YC was riding this wave from the start, even though there were some earlier examples of cloud services before AWS.
Host: If AWS exists, the capital required to start a startup is indeed significantly less than before.
Sam Altman: It was a huge enabling change, and it's why YC sounded so crazy at the time. People would say, "You can't invest a few tens of thousands in a startup; it's impossible, their server costs alone are more than that." So it completely changed what startups could do with small amounts of capital.
Generally, when there's a major platform shift and you can do things faster with less capital, startups win. That's the classic way startups beat incumbents. I saw that firsthand early in my career with the cloud. Now watching companies build with AI, it feels very similar in direction. But as Matt said, the speed this time is crazy.
Host: Is there a sense that incumbents, the big companies, are adopting AI much faster than they adopted cloud computing?
Sam Altman: That is more the case. But I'm also talking about the speed of startup revenue growth. I gave a talk at YC recently and asked at the end, "What's the expected revenue for a good company at demo day now?" They said, "That answer changes every month. Maybe even between the start and end of a single batch." I've never seen that happen before. The speed at which people are building substantial businesses on this new platform is unlike anything I've seen.
Host: Matt, throughout the cloud era, AWS was essentially the default cloud for all startups, which gave you a huge advantage. What makes you the default cloud today? Because now many people are building on the OpenAI API. Do you feel, "we're coming at this from a very different angle. We have this massive installed base demanding AI, but we're less visible to the startup community Sam is talking about"?
Matt Garman: I think there are several aspects. First, we're very excited about this partnership, and I think it will be very significant for startups too. But even today, if you talk to startups, most scaling startups are still scaling on AWS for many reasons. The scale is there, the availability is there, the security is there, the reliability is there, the partner ecosystem of other ISVs is on AWS, and their customers are on AWS.
Host: (Laughs) Whether they like it or not, they've all used the AWS console, so they're used to it.
Matt Garman: And we help them. We spend a ton of time enabling startups, not just with credits, but with advice on how to architect their systems, how to think about go-to-market, and many things like that. I think a lot of startups appreciate that. We invest a huge amount of time and effort to ensure that, because we truly believe startups are the lifeblood of AWS. They always have been, as Sam said, and they still are today. I still go to the Valley or other places every quarter to meet directly with startups, understand what they're doing, and make sure we're building things that actually meet their needs.
So the competition for startup mindshare today is definitely higher than it was 20 years ago. But it remains just as important to us. We invest a lot of time in trying to make sure we meet those startup needs.
Host: Is it fair to say that people building directly on the OpenAI API, rather than using the Azure version of the OpenAI service, are more likely to use a stack where their general compute is on AWS and their AI is with OpenAI?
Matt Garman: I think that's a very common pattern for many startups today. Absolutely.
Bedrock Managed Agents: Bringing AI Agents into Enterprise Workflows
Host: That brings us to today's announcement: Bedrock Managed Agents powered by OpenAI. I think I got that right. My understanding is that the selling point isn't just that OpenAI models are available within AWS — which I didn't think was allowed yet — but that OpenAI's frontier models are wrapped in an AWS-native agent runtime that includes identity, permissions state, logging, governance, and deployment. Sam, is that an accurate description?
Sam Altman: Yes, that's a good summary.
Host: Great. So what is it? Explain it in plain English.
Sam Altman: I think the next phase of AI is moving from "you give an