AI Earnings Showdown: $650 Billion Poured into AGI
- Core Thesis: In Q1 2026, the combined capital expenditure guidance from the four tech giants (Microsoft, Google, Meta, and Amazon) totals nearly $650 billion, roughly Sweden's annual GDP. The AI arms race has thus entered a deep strategic contest between physical infrastructure (power grids, data centers) and capital efficiency (credible return-on-investment paths). Rumors that OpenAI missed its revenue targets triggered market turmoil, marking the AI narrative's shift from faith-based investing to cash-flow validation.
- Key Takeaways:
- The combined CapEx guidance of the Big Four reaches $650 billion, a scale comparable to Sweden's GDP, directly aimed at securing a ticket to AGI.
- Rumors of OpenAI missing revenue targets caused Oracle to drop 4%, CoreWeave to fall 5.8%, and SoftBank to decline 12%, as the market questions the return on massive AI compute investments.
- Google Cloud's order backlog has reached $462 billion, but its CEO admitted that physical infrastructure limitations prevent meeting demand, with AI growth constrained by power, land, and construction lead times.
- Meta exceeded earnings expectations (revenue up 33% YoY), but its stock fell 7% after it raised 2026 CapEx guidance to $125-145 billion; unlike its rivals, it has no cloud business to give that spending a clear monetization path.
- AI has not disrupted traditional search; Google's advertising revenue grew 15% year-over-year, demonstrating the 'Jevons paradox': efficiency gains expand demand rather than reduce it.
- Qualcomm officially enters the data center market, reflecting the blurring of industry boundaries caused by AI computing workloads redistributing between the cloud and the edge.
- Valuation logic for AI diverges between A-shares and US stocks: KLA was sold off due to its China exposure, while Cambricon, despite a surge in performance, saw super-investor Zhang Jianping quietly reduce his stake, revealing a market divided between the "domestic substitution" narrative premium and the reality of execution.
Original author: Sleepy.md
On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their quarterly earnings on the same day. Add up the capital expenditure guidance from these four companies, and the figure approaches $650 billion. That scale is equivalent to the entire annual GDP of a country like Sweden.
In other words, the four wealthiest tech companies in the world are preparing to use the economic output of a moderately developed country for an entire year to buy their ticket to the era of AGI.
Now, everyone's eyes are fixed on that ticket to the AGI ship. On this night, playfully dubbed the "Night of the Final Showdown" for global AI assets, shift your gaze slightly away from the grand narratives and into the unnoticed corners, and you'll find a shadow war over physical constraints, capital anxiety, and industry restructuring, one that has already reached its decisive stage.
How Did a Company That Didn't Report Earnings Crash the US Stock Market?
What truly controls market sentiment isn't necessarily the most profitable company on paper, but the one everyone treats as a "totem of faith."
April 29 was originally supposed to be the most important day of the US earnings season. But before the listed companies turned in their reports, the market experienced an inexplicable stampede. Goldman Sachs data shows this was the second-worst trading day for AI assets so far this year.
The trigger wasn't a profit warning from any listed company, but rather a report from *The Wall Street Journal* the day before. According to the report, OpenAI failed to meet its 2025 revenue targets, and the goal of reaching 1 billion weekly active users remains distant. What stung market nerves more was the report mentioning that OpenAI CFO Sarah Friar had internally warned that if revenue growth continues to underperform, the company might struggle to support its massive $600 billion computing procurement commitment in the future.
A company that isn't publicly listed and doesn't need to release earnings reports, based solely on a rumor, caused Oracle's stock to fall 4%, CoreWeave to drop 5.8%, and even SoftBank, far across the Pacific, to plunge 12% in over-the-counter trading.
When a $600 billion computing commitment collides with revenue growth that hasn't materialized simultaneously, the market suddenly realizes the most dangerous part of the AI narrative isn't that no one believes in the future, but that the future is too expensive.

For the past two years, OpenAI has been the religion of Silicon Valley.
GPU procurement, data center construction, cloud provider expansion, startup valuations – many seemingly disparate decisions all hinge on the same judgment: model capabilities will continue to leapfrog, user scale will keep expanding, and AGI will eventually turn today's expensive investments into tomorrow's tickets.

The strongest aspect of this logic is its self-reinforcing nature. The more people believe, the higher the valuation; the higher the valuation, the more people dare not disbelieve.
But around April 29, the market seriously questioned the cash flow of this faith for the first time. Even OpenAI must face customer acquisition costs, user retention, revenue growth rates, and computing bills.
The Printing Press and the Cooling Water
The most captivating aspect of the internet era was that growth seemed virtually limitless.
Write a piece of code, copy it to ten million users, and marginal costs become incredibly low. For the past two decades, Silicon Valley's audacity to disrupt traditional industries by "burning cash for growth" relied on this belief: as long as network effects are strong enough, scale will swallow costs.
But in the AI era, the digital world's printing press finds its throat tightly gripped by the physical world's cooling water pipes.
During the April 29 earnings call, despite Google Cloud's astonishing 63% growth rate (quarterly revenue exceeding $20 billion for the first time), Google CEO Sundar Pichai's tone carried a hint of helplessness: "If we could meet demand, cloud revenue could have been even higher."

Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is ruthlessly constrained by the physical world.
Google holds a massive $462 billion cloud order backlog, nearly doubling quarter-over-quarter. AI solution products grew nearly 800% year-over-year, Gemini Enterprise paid users increased 40% quarter-over-quarter, and API token usage skyrocketed from 10 billion to 16 billion per minute.
These numbers would be cause for celebration at any internet company. But in Pichai's words, we hear a new type of dilemma emerging in the AI era: customers are already lining up, money is on the way, but the servers aren't built yet, power isn't connected yet, and the advanced chips haven't been manufactured in the fabs.
It's not a lack of demand. It's that there's too much demand, dragging growth back into the physical world.
Microsoft faces a similar predicament. Azure grew 40%, and AI annualized revenue surpassed $37 billion. This figure was only $13 billion in January 2025, nearly tripling in 15 months.
However, Microsoft's capital expenditure fell sequentially to $31.9 billion, a decrease of nearly $6 billion from the previous quarter's $37.5 billion. Microsoft explained this as "infrastructure construction timing." The implication is that money can be approved today, but a data center won't grow overnight; GPUs can be ordered, but electricity, land, cooling systems, and construction timelines cannot be accelerated by capital markets.
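The "nearly tripling" claim above is easy to sanity-check. Here is a minimal back-of-envelope sketch using only the run-rate figures quoted in the article; the constant-compounding model is an illustrative assumption, not Microsoft's own accounting:

```python
# Sanity-check of the Azure AI run-rate figures quoted above.
# Inputs come from the article; smooth monthly compounding is an assumption.

def implied_monthly_growth(start: float, end: float, months: int) -> float:
    """Constant monthly growth rate that takes `start` to `end` over `months`."""
    return (end / start) ** (1 / months) - 1

ai_run_rate_jan_2025 = 13.0   # $bn annualized AI revenue, January 2025
ai_run_rate_now = 37.0        # $bn annualized AI revenue, ~15 months later

multiple = ai_run_rate_now / ai_run_rate_jan_2025
rate = implied_monthly_growth(ai_run_rate_jan_2025, ai_run_rate_now, months=15)

print(f"growth multiple: {multiple:.2f}x")    # 2.85x, i.e. "nearly tripling"
print(f"implied monthly growth: {rate:.1%}")
```

Whether the growth was actually smooth hardly matters; the point is that a ~2.85x multiple over 15 months implies roughly 7% month-over-month growth, a pace the physical build-out must race to keep up with.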
Just when everyone thought we were sprinting toward a virtual world, it is still the most ancient heavy assets and physical laws that ultimately determine the outcome.
Computing power is becoming a new type of "land resource": limited in the short term, slow to build, location-dependent, where the first mover locks in supply. In this land grab, the four tech giants dare to push capital expenditure to the $650 billion level not because they have calculated the returns, but because they fear that if they don't hoard this "land" now, they might not even have a seat at the table tomorrow.
The Posture of Burning Cash
After the market close on April 29, despite both companies reporting earnings above expectations and raising capital expenditure guidance, Google's stock rose 7%, while Meta's plummeted 7%.
To be fair, Meta delivered an impressive report: revenue of $56.31 billion, up 33% year-over-year, the fastest growth rate since 2021; EPS of $10.44, well above Wall Street expectations.
But Mark Zuckerberg broke an unspoken taboo. Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the performance, the more nervous the market became: what investors truly fear isn't whether Meta is profitable now, but that it plans to use the cash generated by today's advertising business to fuel an AI gamble with an unclear payback path.
The market's punishment was merciless. The difference between Google's and Meta's receptions lies in the granularity of commercial monetization.
Google, Amazon, and Microsoft's AI spending can at least be placed within a relatively clear ledger.
Google has a $462 billion cloud order backlog, Amazon has AWS's AI annualized revenue, and Microsoft has Copilot paid users and high Remaining Performance Obligations (RPO). Every dollar they burn may not be recovered immediately, but Wall Street at least knows roughly where it will come back from: enterprise clients, cloud contracts, software subscriptions, computing leases.
This is why the capital market is willing to keep listening to their stories. The story can be far off, but the path to payment must not be completely invisible.
Meta's problem is that it doesn't have a cloud business to sell externally.
The hundreds of billions it pours in must eventually be realized through a more circuitous route: the Meta AI assistant must increase user stickiness, recommendation algorithms must improve ad conversion, AI-generated content must lengthen user engagement, and smart glasses and future hardware must become new entry points.

This logic isn't invalid; the chain is just too long for Wall Street's patience. When cloud providers burn cash, they put GPUs into an already-signed contract. When Meta burns cash, it puts GPUs into an as-yet-unproven model of ad efficiency. The former can be discounted; the latter must first be believed.
In the capital market, patience is a luxury. Especially when capital expenditure reaches the hundreds of billions level, investors are willing to pay for the future, but not indefinitely for ambiguity.
More anxiety-inducing is the time lag.
Amazon CEO Andy Jassy admitted on the call that the vast majority of funds invested in 2026 won't generate returns until 2027 or even 2028.
This means the tech giants are betting today's cash flow on production capacity that won't be realized for two years. In between lies data center construction, chip supply, power connection, customer demand, and model iteration. Any deviation in any link will be repriced by the capital market.
This is the most dangerous aspect of the AI arms race: the money is spent today, the story is told today, but the answer won't be known for two years.
Blurring Industry Boundaries
AI hasn't quickly pushed search off the table, as many expected it would two years ago.
When ChatGPT first emerged, the market briefly believed that search advertising would be consumed by direct answers, and companies like Perplexity were highly anticipated. However, in the April 29 earnings report, Google's data showed search query volume hit an all-time high, with advertising revenue reaching $77.25 billion, up 15% year-over-year.
This feels like the "Jevons paradox" of the AI era. In 1865, British economist William Stanley Jevons observed that improvements in steam engine efficiency didn't reduce coal consumption but massively increased it, because efficiency made steam engines affordable to more people, igniting overall demand. Similarly, AI has made search queries more sophisticated while prompting users to ask more of them.
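The Jevons argument can be made concrete with a toy constant-elasticity demand curve. Everything below is a made-up illustration (the elasticity value and cost units are assumptions, not Google's data); it only shows why, when demand is elastic enough, halving the cost of an answer raises total consumption:

```python
# Toy Jevons-effect illustration with constant-elasticity demand.
# All numbers are illustrative assumptions, not empirical values.

def total_spend(cost_per_query: float, elasticity: float, k: float = 100.0) -> float:
    """Queries demanded follow q = k * cost^(-elasticity); spend = q * cost."""
    queries = k * cost_per_query ** (-elasticity)
    return queries * cost_per_query

# Suppose AI halves the effective cost of getting an answer.
before = total_spend(cost_per_query=1.0, elasticity=1.5)
after = total_spend(cost_per_query=0.5, elasticity=1.5)

print(before, after)  # 100.0 vs ~141.4: total spend rises as cost falls
```

With elasticity above 1, the number of queries grows faster than the per-query cost falls, so total consumption rises, which is exactly the pattern Google's record query volume and 15% ad growth suggest.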
This is another reason why Google finds it easier to convince the market compared to Meta. It has both the cash flow from its legacy search business and the new ledger of its cloud business; it makes money from advertising as well as from enterprise computing needs. AI hasn't dismantled its moat; at least so far, it has seemingly thickened it.
A similar boundary reconstruction is happening in the chip industry. On the same day, mobile chip king Qualcomm reported revenue of $10.6 billion. During the call, CEO Cristiano Amon announced a major decision: Qualcomm is officially entering the data center market, with custom chips for a leading hyperscaler expected to start shipping later this year.

Qualcomm's primary battlefield has always been mobile devices. But as AI computing loads begin to redistribute between the cloud and the edge, it too must redefine its position.
If future AI is entirely handled by cloud-based large models, the value of mobile chips will be compressed. If edge-side AI becomes standard, Qualcomm must prove it belongs not only in phones but also in inference, terminals, and low-power data centers.
Its foray into the data center is less an offensive move than a defensive one.
As AI transitions from a "cloud luxury" to an "edge standard," all industry boundaries begin to blur. Mobile chip companies try to enter data centers, cloud providers start designing their own chips, and chip companies explore models. Qualcomm's "defection" is just the tip of the iceberg of this massive restructuring.
The Same Gold Rush, Two Valuation Languages
In the same AI gold rush, the US stock market has entered a stringent "monetization verification period." Even KLA Corporation, the leader in semiconductor process control and inspection equipment, gets repriced the moment it shows a hint of geopolitical or tariff risk. After the April 29 close, KLA reported revenue of $3.415 billion, beating expectations, and non-GAAP EPS of $9.40, above the expected $9.16.
However, its stock fell as much as 8% in after-hours trading.
The reason wasn't poor performance but market concern over tariffs and China exposure. KLA's customer list includes numerous Chinese wafer fabs. Against the backdrop of US-China tech decoupling, this "China exposure" hangs like the Sword of Damocles. No matter how strong the performance, it cannot offset the market's instinctive fear of geopolitical risk.
In the A-share market, however, a different language is spoken.
Performance certainly matters here too, but often it's just fuel. The real igniter is the narrative – whether you hold the ticket labeled "Domestic Substitution."
On the evening of April 29, Cambricon Technologies released a striking Q1 report: revenue of 2.885 billion yuan, a staggering 159.56% year-over-year increase, its first time exceeding the 2 billion yuan mark in a single quarter; net profit of 1.013 billion yuan, up 185.04% year-over-year. The next day, Cambricon's stock surged, pushing its total market cap past 670 billion yuan to an all-time high, with year-to-date gains exceeding 62%.

MetaX, a GPU company that went public only in December 2025, also reported on the same day: revenue of 562 million yuan, up 75% year-over-year, with its loss narrowing significantly from 233 million yuan in the same period last year to 98.84 million yuan.
Both are in the AI infrastructure chain, yet the US and A-share markets gave completely different pricing reactions.
KLA faces the complex ledger of the global supply chain: performance, orders, tariffs, China exposure, export controls – each item can enter the valuation model.
Cambricon and MetaX face a different narrative environment. The stronger the external restrictions, the more the strategic value of domestic computing power is amplified. The US market applies a discount for risk; the A-share market applies a premium for scarcity.
The Exit of Smart Money
But just as the market cheered for Cambricon, one detail stood out starkly.
At the end of 2025, super retail investor Zhang Jianping still held 6.8149 million shares of Cambricon, worth about 9.2 billion yuan, making him the second-largest individual shareholder. By this Q1 report, he had quietly exited the top ten shareholders list.
Based on a rough estimate of the stock's Q1 trading range, the proceeds from this reduction likely total at least several billion yuan. The exact sale prices are unknown, but what is certain is that, before the performance explosion and the stock's new highs, the earliest beneficiary of this narrative cycle chose to lock in profits and leave.
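The article's own figures support a quick estimate. The sketch below uses only the share count and year-end valuation quoted above; the per-share prices in the loop are hypothetical assumptions, since the actual sale prices are unknown:

```python
# Rough estimate of the stake's value, from the figures quoted above.
# The assumed Q1 price points are hypothetical; actual sale prices are unknown.

shares = 6_814_900                 # shares held at end of 2025 (from the article)
reported_value_yuan = 9.2e9        # ~9.2 bn yuan valuation at that date

implied_price = reported_value_yuan / shares
print(f"implied year-end price: ~{implied_price:,.0f} yuan/share")  # ~1,350

# At any plausible price, full liquidation lands in the
# "at least several billion yuan" ballpark the article cites:
for assumed_price in (1_000, 1_350, 1_700):   # hypothetical yuan/share
    print(f"at {assumed_price} yuan: ~{shares * assumed_price / 1e9:.1f} bn yuan")
```

Even at a deep discount to the implied year-end price, the proceeds comfortably clear several billion yuan, consistent with the estimate in the text.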
There are always two types of people in the market: those who pay for the narrative, and those who price the narrative.
Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a universal consensus and then turned around and left after it was written into the grand story of a "Domestic Computing Power Leader."
On this $650 billion earnings night, Silicon Valley giants are anxious over computing power shortages, Wall Street analysts are agonizing over the time lag in monetization, and the A-share market is busy repricing domestic computing power.
In the same AI gold rush, every market speaks its own language. The US market talks about return cycles, the A-share market discusses domestic substitution. Cloud providers talk about order backlogs, Meta discusses ad efficiency. OpenAI didn't release earnings, yet still tugs at the nerves of the entire computing chain.
Everyone is confident they bought a ticket to the AGI era. But no one knows when the show will end or where the exit is. The ticket to the AI era is certainly expensive. But more expensive than the ticket is knowing when it's time to leave.