Behind OpenAI's $110 Billion Financing: The Game Between Amazon and Microsoft
- Core View: By bringing in Amazon as a new investor and signing distinct cooperation agreements with Microsoft (covering "stateless APIs") and Amazon (covering "stateful runtime environments"), OpenAI has diversified its cloud infrastructure bets, significantly strengthening its strategic initiative and bargaining power.
- Key Elements:
- OpenAI completed a $110 billion financing round, with Amazon, Nvidia, and SoftBank as the main contributors, achieving a pre-money valuation of $730 billion.
- Microsoft and Amazon have locked in different technological paths: Microsoft exclusively hosts OpenAI's "stateless APIs" (the current mainstream business model), ensuring predictable cash flow.
- Amazon is collaborating with OpenAI to build a "stateful runtime environment" (AI Agent), betting on the future transformation of enterprise productivity paradigms.
- Analysts point out that future enterprise AI investments will lean more towards purchasing Agent workflows capable of long-term operation and cross-tool collaboration, rather than single API calls.
- By introducing Amazon and clearly delineating cooperation boundaries, OpenAI has broken its previous singular reliance on Microsoft's cloud services, leveraging competition among giants to strengthen its own bargaining power.
Original | Odaily (@OdailyChina)
Author|Azuma (@azuma_eth)

On the evening of February 27th, OpenAI announced it had completed its latest funding round of $110 billion at a pre-money valuation of $730 billion.
The funding for this round came from three tech giants: Amazon invested $50 billion (an initial $15 billion, with the remaining $35 billion to be delivered over the coming months once specific conditions are met), Nvidia invested $30 billion (which it will effectively recoup through OpenAI's procurement of a total of 5 GW of computing power), and SoftBank also invested $30 billion.
Following the funding round, OpenAI founder Sam Altman thanked the investors on his personal X account. Notably, his order of thanks was Amazon, Microsoft, Nvidia, SoftBank: Microsoft, the longtime shareholder and key partner that did not invest in this round, was named immediately after Amazon, the round's largest backer.

Commenting on this, Aakash Gupta, a blogger who has long tracked the AI sector, pointed out that while most people are fixated on the astronomical figure of $110 billion, the most important information in Sam Altman's statement lies in two overlooked technical terms: "Stateless API" and "Stateful Runtime Environment," secured by Microsoft and Amazon respectively.
Behind the Technical Terms Lies the Present and Future of AI
The core difference between Stateless API and Stateful Runtime Environment hinges on the words "Stateless" and "Stateful."
The "Stateless" nature of a Stateless API means the server retains no persistent state across requests — one call completes one inference; you ask a question, the AI gives an answer. Once that request's lifecycle ends, the system keeps no context and does not continue running. The "Stateful" nature of a Runtime Environment implies a persistent execution environment — an Agent retains historical memory, persists across sessions, collaborates across tasks, and executes work over the long term.
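The contrast can be sketched in a few lines of code. This is a toy illustration only — the stub model, function names, and agent class are invented for this sketch, not any real OpenAI, Azure, or AWS API:

```python
# Minimal sketch of the stateless vs. stateful distinction, using a stub
# in place of a real model. All names here are illustrative.

def stub_model(prompt: str) -> str:
    """Stand-in for an LLM inference call."""
    return f"answer to: {prompt}"

def stateless_call(prompt: str) -> str:
    # Stateless API: the request carries everything the model needs,
    # and the server keeps nothing once the response is returned.
    return stub_model(prompt)

class StatefulAgent:
    """Stateful runtime: the agent persists between calls and accumulates
    memory, so later tasks can build on earlier ones."""

    def __init__(self) -> None:
        self.memory: list[str] = []

    def run_task(self, task: str) -> str:
        context = " | ".join(self.memory)
        result = stub_model(f"[context: {context}] {task}")
        self.memory.append(task)  # state survives the request lifecycle
        return result

# Two identical stateless calls are fully independent...
a = stateless_call("summarize the Q3 report")
b = stateless_call("summarize the Q3 report")

# ...while the agent's second task sees the first in its context.
agent = StatefulAgent()
agent.run_task("read the Q3 report")
followup = agent.run_task("draft a follow-up email")
```

The design point is where state lives: in the stateless case the caller must resend any needed context with every request, while the stateful agent carries it forward itself.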
The Stateless API is currently the mainstream form of LLM commercialization. Industries like finance, retail, manufacturing, and healthcare mostly integrate AI by embedding it into existing systems in this form (e.g., Q&A assistants, document summarization, search enhancement). The advantage of this model is that enterprises can quickly layer AI capabilities onto their existing architecture with minimal organizational and process restructuring, achieving functional optimization with low friction. However, as model capabilities converge, computing power costs continue to fall, and price competition intensifies, the token-based billing of Stateless APIs makes them prone to standardization and commoditization, leaving their margins exposed to continued compression.
In contrast, Stateful Runtime Environment currently has limited commercial scale, but it represents not just simple "functional optimization" but a shift in business paradigm — it is not only capable of answering questions but can be viewed as digital labor performing concrete tasks. This means the budgets it touches will extend from mere API call fees to automation, process management, and even part of labor costs. Precisely because of this, market expectations for Stateful Runtime Environment far exceed its current scale.
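The "digital labor" idea above — an agent running a multi-step workflow across tools rather than answering one question per request — can be illustrated with a toy workflow. The tool names and step list below are invented for this sketch:

```python
# Illustrative agent workflow: each step calls a tool, and its output is
# kept in shared state so later steps can reuse earlier results.
# The tools and steps are invented for illustration.

def search_tool(query: str) -> str:
    return f"results for {query}"

def draft_tool(topic: str) -> str:
    return f"draft about {topic}"

TOOLS = {"search": search_tool, "draft": draft_tool}

def run_workflow(steps: list[tuple[str, str]]) -> list[str]:
    state: list[str] = []          # persists across tool calls
    for tool_name, arg in steps:
        output = TOOLS[tool_name](arg)
        state.append(output)       # cross-tool collaboration via shared state
    return state

log = run_workflow([
    ("search", "competitor pricing"),
    ("draft", "pricing summary"),
])
```

Billing such a system by the task completed, rather than by the token, is what extends the budget it touches from API fees toward automation and labor costs.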
Aakash Gupta also stated on this matter that in 2026 and 2027, almost every enterprise's roadmap will revolve around "autonomous agent workflows," not one-off API calls. Companies heavily investing in AI in the future will increasingly prefer to purchase systems that can run sustainably, collaborate across tools, and maintain context over the long term.
To put it in the simplest terms, Stateless API represents the present, while Stateful Runtime Environment represents the future.
What Did Microsoft and Amazon Take Away?
On the day the funding was completed, Microsoft and Amazon respectively announced their latest cooperation agreements with OpenAI.
Microsoft stated in its announcement that the terms of the partnership jointly announced by Microsoft and OpenAI in October 2025 will remain unchanged (the terms include OpenAI purchasing $250 billion worth of Azure services). Azure remains the exclusive cloud provider for OpenAI's Stateless API; any Stateless API calls to OpenAI models generated through cooperation with any third party (including Amazon) will be hosted on Azure; OpenAI's first-party products, including Frontier, will also continue to be hosted on Azure.
Amazon stated in its announcement that AWS and OpenAI will jointly build a Stateful Runtime Environment powered by OpenAI models and offer it to AWS customers through Amazon Bedrock, helping enterprises build generative AI applications and Agents at production scale; AWS will also become the exclusive third-party cloud distribution provider for OpenAI Frontier; the existing $38 billion multi-year cooperation agreement between AWS and OpenAI will be expanded to $100 billion over eight years, under which OpenAI will consume 2 GW of Trainium computing power on AWS infrastructure to support the Stateful Runtime Environment, Frontier, and other advanced workloads; and OpenAI and Amazon will also develop customized models to support Amazon's customer-facing applications.
Comparing the two announcements, the current situation becomes clear.
Microsoft, with its $250 billion agreement and exclusive service rights, has locked in the current traffic engine. Whenever OpenAI's Stateless API is called, Azure will be charging behind the scenes — regardless of who the customer is or where the channel is, the traffic ultimately flows back to Azure. This is a highly predictable cash flow, but the issue lies in the trend of shrinking profit margins for Stateless APIs. Call volume may continue to grow, but actual profits may not remain stable in the long run.
On the other side, Amazon, with its $50 billion in real capital and the $100 billion expansion agreement, has secured the underlying hosting rights for the AI Agent era for AWS. Once Agents become the core carriers of enterprise productivity, the resources that will be truly consumed long-term — computing power, storage, scheduling systems, workflow orchestration, and cross-tool collaboration — will all settle on AWS's runtime environment.
One controls the present cash flow, the other bets on the future structure of productivity.
OpenAI's Diversified Bet
Before the future truly arrives, no one knows whether Microsoft's or Amazon's choice is right. But what is certain is that under these two clearly defined cooperation agreements with distinct interests, OpenAI's initiative is significantly increasing.
In recent years, OpenAI has been highly dependent on Microsoft for cloud infrastructure. Microsoft is not only a major shareholder holding 27% of the company but also the controller of its infrastructure. This tie-up gave OpenAI decisive early-stage resource advantages, but it also meant the balance of bargaining power naturally tilted toward Microsoft. With Amazon's forceful entry, the two are bound to compete directly for OpenAI's future service rights.
For OpenAI, this is a typical diversification strategy — not being deeply bound to any single cloud service provider, not letting future growth be entirely subject to one party, and using future business as leverage to secure better terms.
Neither Microsoft nor Amazon can afford to give up OpenAI at present. When neither party can leave the table, bargaining power naturally returns to OpenAI's hands.