Sam Altman's recent interview: Why does OpenAI want to "break up" with Microsoft?
- Core viewpoint: The focus of AI competition is shifting from "the most powerful model" to "enterprise infrastructure." The OpenAI-AWS partnership launching Bedrock Managed Agents is not simply about putting models on the shelf, but about deeply integrating AI into AWS's native identity, permission, data, and security systems, helping enterprises build "virtual colleagues" capable of executing internal tasks.
- Key points:
- Revised OpenAI-Microsoft agreement: Azure no longer has exclusive access to OpenAI models, and OpenAI can expand its products to other cloud platforms such as AWS; Microsoft stops paying a revenue share and gives up its exclusive IP license.
- Product core: Bedrock Managed Agents packages OpenAI's frontier models into AWS's agent runtime environment, including native services such as identity authentication, permission management, logging, and governance, lowering the barrier for enterprises to deploy AI agents.
- Enterprise deployment challenges: Enterprise AI agents must contend with databases, SaaS systems, permission frameworks, and compliance requirements, which are far more complex than a local environment. This partnership aims to provide a "one-stop" enterprise solution.
- Shift in the competitive landscape: Future AI competition will depend on who can build a platform that enterprises trust, can scale continuously, and that actually executes work, rather than merely comparing API prices, chip performance, or model leaderboards.
- Roles of the two parties: Enterprise customers can contact AWS support directly, and data stays within AWS VPCs; OpenAI and AWS are co-building the product and have committed to deploying models on AWS's in-house Trainium chips.
Original Title: An Interview with OpenAI CEO Sam Altman and AWS CEO Matt Garman About Bedrock Managed Agents
Original Author: Ben Thompson, Stratechery
Original Translation: Peggy, BlockBeats
Editor's Note: On April 27, OpenAI and Microsoft revised their cooperation agreement, meaning Azure no longer has exclusive access to OpenAI models. This allows OpenAI to expand its products to other cloud platforms like AWS.
Note: Azure is Microsoft's cloud computing platform, officially known as Microsoft Azure. Like AWS and Google Cloud, it provides enterprises with cloud services such as servers, databases, storage, networking, security, and AI model deployment.
To the outside world, this might seem like a simple change in cloud service distribution channels. However, judging from the discussion between Sam Altman and AWS CEO Matt Garman, the more critical shift is that AI is moving from "model invocation" to the "enterprise-grade workflow" phase.
This article is compiled from an interview by Stratechery, a tech business analysis media outlet, with Sam Altman and Matt Garman. It focuses on the Bedrock Managed Agents launched through the OpenAI and AWS partnership, discussing the parallels between the cloud computing and AI platform shifts, the challenges of deploying enterprise-grade agents, the difference between AgentCore and managed services, and AWS's position in the AI infrastructure competition.
Note: Stratechery was founded by tech analyst Ben Thompson. It focuses on tech company strategy, platform economy, cloud computing, AI, and media industry changes. Its content primarily features in-depth analysis and executive interviews, holding significant influence in Silicon Valley's tech and investment circles and is often seen as a key window for observing the strategic moves of major tech companies.
The core of Bedrock Managed Agents is not just about making OpenAI models available to AWS customers, but embedding these models into AWS's native identity, permission, logging, governance, and security systems. In other words, what enterprises truly need is not a smarter chat window, but a system of "virtual colleagues" that can operate within the organization, access data, execute tasks, and adhere to permission boundaries.
This is the most noteworthy aspect of this partnership: the focus of AI competition is shifting from "who has the most powerful model" to "who can turn models into usable enterprise infrastructure." In individual developer scenarios, Codex can leverage the local environment to solve many complex problems. But in enterprise scenarios, agents must contend with databases, SaaS systems, permission frameworks, security boundaries, and compliance requirements.
In a sense, this collaboration re-enacts the early logic of cloud computing. AWS lowered the startup costs for companies, allowing small teams to build internet products without building their own servers. Now, OpenAI and AWS are trying to lower the barriers for enterprises to deploy AI agents, enabling companies to integrate AI into real business processes without having to piece together models, permissions, data, and security systems themselves. The difference this time is the faster adoption rate and the more urgent enterprise demand.
Therefore, this article is not really about OpenAI models being "listed" on AWS, but about AI infrastructure entering its next phase: models, cloud, data, and enterprise permission systems beginning to deeply integrate. Future competition may no longer be just about API prices, chip performance, or model leaderboards, but about who can build an AI platform that enterprises can confidently use, continuously scale, and trust to do real work.
Below is the original text:
Introduction
Good morning. As I mentioned yesterday, today's Stratechery interview is early in terms of my publishing schedule (moved from Thursday to Tuesday), but late in terms of actual delivery (delayed from 6 AM to 1 PM Eastern Time), as the topic was subject to an embargo.
Over the past few days, this embargo put me in a slightly delicate position. Last Friday, I interviewed OpenAI CEO Sam Altman and AWS CEO Matt Garman about Bedrock Managed Agents, powered by OpenAI. Naturally, one of my questions was how this collaboration squares with the agreement between OpenAI and Microsoft that gave Azure exclusive access to OpenAI models.
Note: Bedrock Managed Agents is a managed AI agent service launched by AWS, powered by OpenAI models. It's not just about allowing enterprises to call OpenAI models on AWS, but embedding the models into AWS's native identity authentication, permission management, logging, security, governance, and deployment systems. This enables enterprises to build AI agents within their own cloud environment that can execute tasks, access internal data, and adhere to permission boundaries. Simply put, think of it as OpenAI agent infrastructure running within an AWS enterprise environment.
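As a concrete illustration of what "calling a model inside the AWS environment" looks like today, Amazon Bedrock's runtime exposes a model-agnostic Converse API via boto3; a managed-agent invocation would presumably build on a similar request shape. The model ID below is hypothetical, and this sketch only constructs the request payload rather than invoking the service, since the interview does not describe the actual Bedrock Managed Agents API surface.

```python
# A minimal sketch of a Bedrock Converse-API-shaped request payload.
# The model ID "openai.gpt-example-v1" is hypothetical, used only for
# illustration. An actual call would need AWS credentials and would run:
#   boto3.client("bedrock-runtime").converse(**request)

def build_converse_request(model_id: str, user_text: str, max_tokens: int = 512) -> dict:
    """Assemble a Converse-API-shaped request for a Bedrock-hosted model."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

request = build_converse_request(
    "openai.gpt-example-v1",  # hypothetical model ID
    "Summarize yesterday's failed deployments from the incident log.",
)
```

Note that in the scenario the interview describes, identity and permission context would be attached by the agent runtime itself (for example, through IAM roles scoped to the agent) rather than by application code, which is precisely the "native integration" being emphasized.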
Late Sunday, I heard through the grapevine that Microsoft would announce something on Monday. I was wondering if it might be a preemptive lawsuit!
On Monday, Microsoft and OpenAI announced they had revised their agreement, allowing OpenAI to offer its products on other cloud providers, including AWS.
Thus, this interview came to be.
I believe this new arrangement between Microsoft and OpenAI makes a lot of sense for both parties. Here are the key points from the new agreement outlined in Microsoft's official post:
· Microsoft remains OpenAI's primary cloud partner; OpenAI products will launch first on Azure, unless Microsoft cannot or chooses not to support the necessary capabilities. OpenAI can now offer its full product suite to customers through any cloud provider.
· Microsoft will continue to receive licenses for IP related to OpenAI models and products until 2032. However, Microsoft's license will no longer be exclusive.
· Microsoft will no longer pay OpenAI a revenue share.
· The revenue share OpenAI pays to Microsoft continues until 2030, regardless of OpenAI's technological progress; the percentage stays the same but is now subject to a total cap.
· Microsoft, as a major shareholder, will continue to directly participate in OpenAI's growth.
Most importantly, I think, is the last point. Previously, Azure had a real competitive advantage as the only hyperscaler offering OpenAI models. But this exclusivity was also constraining OpenAI, especially as more enterprises care most about accessing models on the cloud platform they already use. I have pointed out multiple times that this was a significant competitive advantage for Anthropic. In other words, Azure's exclusive rights were actually hurting Microsoft's investment in OpenAI. Given Anthropic's rapid growth this year, Microsoft needs to protect that investment, even if it means Azure's differentiation is weakened.
At the same time, OpenAI clearly sees AWS as a massive opportunity—one big enough to forgo some Azure-related revenue in the coming years. Combined with the previous point, this also makes it easier for Azure management to accept losing exclusivity: after all, without paying revenue share to OpenAI, Azure's profit and loss statement looks much better. OpenAI has also released Microsoft from the AGI clause; now, regardless of what happens, the agreement between the two companies will last until 2032.
It now seems quite clear that OpenAI's next focus will be on AWS. The strongest evidence is the subject of this interview: Bedrock Managed Agents, powered by OpenAI. The simplest way to understand this product is as Codex within AWS. Codex works well largely because it is local, which naturally resolves many complex issues, especially security. But making agents operate across departments and systems within an organization is a completely different matter. The goal of this product is to make it easier for organizations that already have most of their data on AWS to use such workflows.
Around this point, our discussion covered how AWS pioneered the entire cloud computing category and its impact on startups, the similarities and differences between AI and that paradigm shift, what Bedrock Managed Agents is and how it differs from Amazon's existing AgentCore product, Trainium, why chips might not be so important for most AI users, and why cooperation is a reasonable choice compared to Google's emphasis on full-stack integration.
As a reminder, all Stratechery content, including interviews, is available as a podcast. Click the link at the top of this email to add Stratechery to your podcast player.
Let's get into the interview.
Interview Content
This interview has been lightly edited for clarity.
OpenAI Comes to AWS, End of the Azure Exclusivity Era
Ben Thompson (Host): Matt Garman, Sam Altman—Matt, welcome to Stratechery; Sam, welcome back. I've previously interviewed Altman in October 2025, March 2025, and February 2023.
Sam Altman (OpenAI CEO): Thank you.
Matt Garman (AWS CEO): Thanks for having me.
Host: Matt, this is your first time on Stratechery. Unfortunately, I think Sam's presence will prevent us from doing our usual "getting to know the guest" segment. Besides, he probably doesn't want to hear us reminisce about our days at Kellogg School of Management. Still, it's great to have a fellow alumnus on the podcast.
Matt Garman: Yes, I'm glad to be here. I can come back another time, and we can go deeper.
Host: That would be great. You've been with AWS since your internship, and now you're leading the entire AWS organization through the AI wave. How do you see the similarities and true differences between building the AI business now and building the original general-purpose computing business back then—if we can call it that for now?
Matt Garman: I think the similarity is the same excitement, and seeing builders out there able to do things they couldn't do before. When we started AWS, the cool thing was that developers suddenly had access to infrastructure that only the largest companies could use before. Previously, only companies with multi-million dollar budgets for data centers had such capabilities. Now, a developer could start an application with just a credit card and a few dollars. It massively expanded what internet builders could do.
Our idea was that people could build whatever they wanted. We wouldn't prescribe what they should do. We believed that creativity exists everywhere in the world; put powerful tools in their hands, and they would create interesting and amazing things.
I think AI's empowerment of builders is at least as transformative, possibly even more so. Think about what's possible now: you don't need to study programming for ten years to build an application; you don't need huge teams of hundreds; you don't need months and months to build things. You can build quickly and iterate fast with a small team. AI is unlocking innovation across every sector of the world. In many ways, it feels very similar. Seeing its impact on the customer base is genuinely exciting.
Host: But back when AWS emerged, you were the only player, so the benefits—and drawbacks—naturally fell to you. Was there a feeling that in the AWS era, a lot was about general-purpose computing—making compute fungible, elastic, cheap—while in AI, especially in the training phase, the winning abstraction seems more like highly vertically integrated superclusters, very advanced networking, and extremely tight coupling between software and hardware? Was that a surprise for you? Because this time, you weren't starting from scratch, or being "the only ones here"; you had a specific understanding of large-scale computing, but at least in the early years of AI, it didn't seem to perfectly align.
Matt Garman: I'm not sure how different it is for us. I think what's truly different is that the speed of adoption has been astonishing; I think it surprised everyone. Sam, feel free to disagree. But the rate at which people adopted and grasped these capabilities exceeded all expectations, I believe.
It's very different from when we started cloud computing. Back then, it took a very long time to explain why a book-selling company would offer computing power. We had to work hard to explain what cloud computing was. There was a lot of heavy lifting that people often forget now. But in 2006, no one took it for granted that the world's computing would move to the cloud. There was a lot of difficult explaining and pushing.
Host: Do you feel the need to do a similar kind of explaining now? Because many people initially anchored on the training era, and you guys would say, "We're thinking about the inference era," which would be something different. Are you having to reinvigorate that explanatory capability?
Matt Garman: Yes, but the speed at which people understand what you're saying is completely different. So, I think there's still some education needed to move people from "this looks cool, I can chat with a smart bot" to "it can actually do work in your enterprise." But given the speed of technical evolution, it's relatively fast.
Host: I promise we'll get to today's product soon. But Sam, from the startup ecosystem perspective, looking back, AWS was clearly transformative in lowering the barrier to entry for startups. Now anyone can start. Seed rounds and angel investors followed, pushing formal fundraising later in a company's life. You didn't need to write "we're buying servers" in your pitch deck; you could build an app first and then raise a Series A or whatever.
From your perspective, how is the world AWS opened up different from what AI is opening up today? And what are the similarities?
Sam Altman: I think there have been four major platform moments for empowering startups: the internet, cloud, mobile, and AI. Of these four, the one I experienced as an adult was the cloud. It's hard to overstate how much it changed things for startups in the early days of YC.
Before that, startups had to rent colocation space, assemble their own servers, and rack the equipment. It was incredibly complex, and you had to raise a lot of money first. Then, suddenly, the cloud appeared. The cloud arrived after YC was founded; it must have been around YC's second year.
Host: That's what I was going to ask—ultimately, are YC and the cloud inextricably linked more than you realized at the time?
Sam Altman: We felt they were highly intertwined at the time. It felt like YC was riding the cloud wave from the beginning, even though there were some earlier examples of cloud services before AWS.
Host: Once AWS existed, the capital required to get a startup off the ground was indeed much less than before.
Sam Altman: It was a massive enabling shift, and part of why YC sounded so crazy back then. People would say, "You can't start a company with a few tens of thousands of dollars; it's impossible; the server costs alone are more than that." So it fundamentally changed what startups could accomplish with small amounts of capital.
Typically, when there's a major platform shift where you can do things faster and with less capital than before, startups win. This is the classic way startups beat incumbents. Early in my career, I saw this happen with the cloud. Now, watching companies build on AI feels very similar in direction. But as Matt said, the speed this time is crazy.
Host: Is there a sense that existing large companies, the incumbents, are adopting AI much faster than they adopted cloud computing?
Sam Altman: There's certainly more of that. But I'm also talking about the speed of revenue growth for startups. I was speaking at YC recently, and I asked at the end: "What's the revenue expectation now for a good company at the end of YC?" They said, "The answer changes every month. It might even be different between the start and end of the same YC batch." That's never happened before. The speed at which people are building scalable businesses on this new platform is unlike anything I've seen.
Host: Matt, throughout the cloud era, AWS was essentially the default cloud for all startups, giving you a huge advantage. What makes you the default cloud today? Because a lot of people are building on the OpenAI API now. Do you feel like "we're coming at this from a very different angle; we have this massive installed base of customers who are all asking us for AI capabilities, but we have less visibility into this whole startup cohort Sam is talking about"?
Matt Garman: I think there are a few aspects. First, we're very excited about this partnership, and I think it will be very significant for many startups. But even today, if you talk to startups, most scaling startups are still scaling on AWS for many reasons. The scale is there, the availability is there, the security is there, the reliability is there, the partner ecosystem of other ISVs is on AWS, and their customers are on AWS.
Host: (Laughs) Whether they like it or not, people have used the AWS console, so they're also used to it.
Matt Garman: And we help them. We spend a lot of time empowering startups, not just with credits, but with advice on how to build systems, how to think about go-to-market, and many similar things. I think a lot of startups appreciate that. We invest a lot of time and effort to ensure this because we truly believe startups are the lifeblood of AWS. It's been that way from the beginning, as Sam said, and it remains true today. I still go to Silicon Valley or other places every quarter to meet directly with startups, hear what they're doing, and confirm that what we're building actually meets their needs.
So, the competition for startup attention is certainly more intense today than it was 20 years ago. But it remains as important to us as ever. We invest heavily to ensure we can meet the needs of these startups.
Host: Is it fair to say that people building directly on the OpenAI API, rather than using the Azure version of the OpenAI service, are more likely to employ a tech stack where general computing is on AWS and the AI part uses OpenAI?
Matt Garman: I think that's a very common pattern for many startups today. Absolutely.
Bedrock Managed Agents: Bringing AI Agents into Enterprise Workflows
Host: That brings us to today's announcement: Bedrock Managed Agents, powered by OpenAI. I think I have that right. As I understand it, the selling point isn't just that OpenAI models are available in AWS—I think that wasn't allowed before—but that OpenAI's frontier models are encapsulated in an AWS-native agent runtime that includes identity, permission state, logging, governance, and deployment. Sam, is that accurate?
Sam Altman: Yes, that's a good summary.
Host: Thanks.


