AI Earnings Showdown Night: $650 Billion Bet on AGI

BlockBeats
Guest Columnist
2026-04-30 08:13
More expensive than the entry ticket is knowing when it's time to leave.
AI Summary
  • Core Thesis: In Q1 2026, the combined capital expenditure guidance from tech giants Microsoft, Google, Meta, and Amazon totals nearly $650 billion (equivalent to Sweden's annual GDP), signaling that the AI arms race has entered a deep game-theory phase between physical constraints (e.g., electricity, data centers) and capital efficiency (paths to return on investment). Rumors that OpenAI failed to meet its revenue targets triggered market turmoil, highlighting a shift in the AI narrative from faith-based investing to cash flow validation.
  • Key Points:
    1. The combined CapEx guidance from the four tech giants reaches $650 billion, a scale comparable to Sweden's GDP, with the clear aim of securing a ticket to AGI.
    2. Rumors of OpenAI missing revenue targets led Oracle shares to fall 4%, CoreWeave to drop 5.8%, and SoftBank to decline 12%, as markets questioned the return path on massive AI compute investments.
    3. Google Cloud's order backlog stands at $462 billion, but the CEO admits demand cannot be met due to physical infrastructure bottlenecks; AI growth is constrained by electricity, land, and construction timelines.
    4. Despite Meta beating earnings estimates (revenue up 33% YoY), its stock fell 7% due to elevated 2026 CapEx guidance of $125-145 billion and a lack of a clear cloud monetization path.
    5. AI has not disrupted traditional search; Google's ad revenue grew 15% YoY, illustrating the "Jevons Paradox": efficiency gains actually expand demand.
    6. Qualcomm officially enters the data center market, reflecting the blurring of industry boundaries driven by the redistribution of AI compute workloads between the cloud and the edge.
    7. Divergent AI valuation logic between A-shares and US stocks: KLA was sold off due to China exposure, while Cambricon, despite strong earnings performance, saw a quiet reduction in holdings by prominent retail investor Zhang Jianping, revealing the tension between the "Domestic Substitution" narrative premium and real-world execution.

Original author: Sleepy.md

On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their first-quarter financial results on the same day. Looking at the capital expenditure guidance provided by these four companies alone, the figure approaches $650 billion. This scale is equivalent to the entire annual GDP of a country like Sweden.

In other words, the world's four most valuable tech companies are preparing to spend the economic output of a moderately developed nation for an entire year to buy their ticket to the era of AGI.

Right now, everyone's eyes are fixed on that ticket to AGI. On this night, playfully dubbed the "final showdown night" for global AI assets, shift your gaze slightly away from the grand narratives and into the less conspicuous corners, and you will find a shadow war of physical constraints, capital anxiety, and industrial restructuring that has already reached a decisive point.

How Did a Company That Didn't Report Earnings Cause the US Stock Market to Tumble?

What truly controls market sentiment isn't necessarily the most profitable company on paper, but the enterprise considered a "spiritual totem" by everyone.

April 29 was originally supposed to be the most significant day of the US earnings season. But before these listed companies submitted their reports, the market first experienced an unexpected stampede. Goldman Sachs data indicates this was the second worst trading day for AI assets so far this year.

The trigger wasn't a profit warning from some listed company, but rather a report in the *Wall Street Journal* the day before. According to the report, OpenAI failed to meet its 2025 revenue targets, and its goal of reaching 1 billion weekly active users remains distant. What further rattled market nerves was the report's mention that OpenAI CFO Sarah Friar had internally warned that if revenue growth continues to underperform, the company might struggle to support its massive $600 billion computing procurement commitment in the future.

A company that isn't publicly listed and doesn't need to release earnings reports, relying solely on a rumor, caused Oracle's stock to drop 4%, CoreWeave to fall 5.8%, and even SoftBank, far across the Pacific, to plummet 12% in over-the-counter trading.

When a $600 billion computing commitment collides with revenue growth that isn't materializing synchronously, the market suddenly realizes the most dangerous aspect of the AI narrative isn't that no one believes in the future, but that the future is too expensive.

For the past two years, OpenAI has been the religion of Silicon Valley.

GPU procurement, data center construction, cloud provider expansion, startup valuations – many seemingly disparate decisions are all ultimately based on the same judgment: model capabilities will continue to leapfrog, user scale will keep expanding, and AGI will eventually turn all of today's expensive investments into tomorrow's tickets.

The strongest aspect of this logic is its ability to self-reinforce. The more people believe, the higher the valuations; the higher the valuations, the more people dare not disbelieve.

But around April 29, the market seriously questioned the cash flow of this faith for the first time. Even OpenAI must face customer acquisition costs, user retention, revenue growth rates, and computing bills.

The Printing Press and the Cooling Water

The most captivating aspect of the internet era was the seemingly limitless growth.

Write a piece of code, replicate it to ten million users, and the marginal cost becomes extremely low. For the past two decades, Silicon Valley's courage to use "burning cash for growth" to disrupt traditional industries relied on this belief: if the network effect is strong enough, scale will absorb costs.

But in the AI era, the digital world's printing press is firmly strangled by the physical world's cooling water pipes.

During the earnings call on April 29, despite Google Cloud's impressive 63% growth rate (with quarterly revenue exceeding $20 billion for the first time), CEO Sundar Pichai's tone revealed a sense of helplessness: "If we could meet demand, cloud revenue could have been even higher."

Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is ruthlessly constrained by the physical world.

Google holds a massive $462 billion backlog of cloud orders, nearly doubling quarter-over-quarter. AI solution products grew nearly 800% year-over-year, Gemini Enterprise paid users increased 40% quarter-over-quarter, and API token usage skyrocketed from 10 billion to 16 billion per minute.

For any internet company, these numbers would be cause for celebration. But in Pichai's words, we can hear a new type of dilemma emerging in the AI era: customers are already queuing up, money is on its way, but the servers aren't built yet, the power isn't connected, and the advanced chips haven't been manufactured in the fabs.

It's not a lack of demand, but too much demand, dragging growth back into the physical world.

Microsoft faces the same predicament. Azure grew 40%, and its annualized AI revenue surpassed $37 billion – a figure that was only $13 billion in January 2025, nearly tripling in 15 months.
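As a rough sanity check on that growth figure, a back-of-the-envelope sketch (the $13 billion and $37 billion run-rates come from the article; the annualization math is our own illustration):

```python
# Microsoft's annualized AI revenue run-rate, per the article:
# ~$13B in January 2025, ~$37B roughly 15 months later.
start_run_rate = 13.0   # $B, Jan 2025
end_run_rate = 37.0     # $B, ~15 months later
months = 15

multiple = end_run_rate / start_run_rate   # ~2.85x over the period
annualized = multiple ** (12 / months)     # ~2.3x per year, i.e. ~130% annualized

print(f"{multiple:.2f}x over {months} months ≈ {annualized:.2f}x annualized")
```

Even annualized, that is roughly 130% year-over-year growth, which is why "nearly tripling in 15 months" is the headline framing.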

However, Microsoft's capital expenditure decreased sequentially to $31.9 billion, down nearly $6 billion from the previous quarter's $37.5 billion. Microsoft explained this in its earnings report as "the timing of infrastructure build-out." The implication is that money can be approved today, but data centers won't materialize tomorrow; GPUs can be ordered, but electricity, land, cooling systems, and construction lead times cannot be accelerated by the capital markets.

Just when everyone thought we were sprinting towards the virtual world, the ultimate decider remains the most ancient of heavy assets and physical laws.

Computing power is becoming a new type of "land resource": limited in the short term, slow to build, location-dependent, with first movers locking in supply. In this land grab, the four tech giants dare to push capital expenditure to the $650 billion level not because they have fully calculated the return, but because they fear that if they don't hoard this "land" now, they might not even have a seat at the table tomorrow.

The Posture of Burning Cash

After the market close on April 29, both companies beat earnings expectations and raised capital expenditure guidance, yet Google's stock rose 7% while Meta's plunged 7%.

To be fair, Meta delivered an impressive report: revenue of $56.31 billion, up 33% year-over-year, the fastest growth rate since 2021; EPS of $10.44, significantly exceeding Wall Street estimates.

But Mark Zuckerberg made a misstep. Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the performance, the more nervous the market became. What investors truly fear isn't whether Meta is making money now, but that it plans to use the cash generated by today's advertising business to fuel an AI gamble with an unclear path to recovery.

The market's punishment was swift and severe. The difference between the two reactions lies in the granularity of commercial monetization.

At the very least, the AI spending of Google, Amazon, and Microsoft can be placed into a relatively clear ledger.

Google has a $462 billion cloud order backlog, Amazon has AWS's annualized AI revenue, Microsoft has Copilot paid users and a high Remaining Performance Obligation (RPO). For every dollar they burn, it might not yield immediate returns, but Wall Street at least knows roughly where the money will come back from: enterprise customers, cloud contracts, software subscriptions, computing power leasing.

This is why the capital market is still willing to listen to their stories. The story can be long-term, but the path to payback cannot be completely invisible.

Meta's problem is that it doesn't have a cloud business to sell externally.

The hundreds of billions it pours in must ultimately be realized through a more circuitous path: the Meta AI assistant must increase user stickiness, recommendation algorithms must improve ad conversion, AI-generated content must extend user dwell time, and smart glasses and future hardware must become new entry points.

This logic isn't invalid, but the chain is too long for Wall Street's patience. When cloud providers burn cash, they are putting GPUs into already-signed contracts. When Meta burns cash, it is putting GPUs into an advertising-efficiency model that hasn't been fully proven. The former can be discounted; the latter must first be believed in.

In the capital markets, patience is a luxury. Especially when capital expenditure reaches the hundreds of billions, investors are willing to pay for the future, but they won't pay indefinitely for ambiguity.

More anxiety-inducing is the time lag.

Amazon CEO Andy Jassy admitted on the call that most of the funds invested in 2026 won't generate returns until 2027 or even 2028.

This means the tech giants are betting today's cash flow on capacity realization two years from now. In between lie data center construction, chip supply, power access, customer demand, and model iteration. Any deviation in any link will lead to re-pricing by the capital markets.

This is the most dangerous aspect of the AI arms race: the money is spent today, the story is told today, but the answer won't be revealed for two years.
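The time-lag risk can be made concrete with a minimal discounting sketch. The numbers here are hypothetical assumptions of ours (the article gives no discount rate): a dollar of CapEx spent today must earn back more than a dollar two years out just to break even in present-value terms.

```python
# Hypothetical illustration of why a two-year payback lag matters.
# Assumption: investors discount future cash flows at ~10% per year.
DISCOUNT_RATE = 0.10

def breakeven_future_return(spend_today: float, years: int) -> float:
    """Cash a project must return `years` years out to justify today's spend."""
    return spend_today * (1 + DISCOUNT_RATE) ** years

# $100B spent in 2026, with returns arriving in 2028 (2 years out):
required = breakeven_future_return(100.0, 2)
print(f"${required:.1f}B needed in 2028 just to break even on $100B today")
```

Every extra year of delay compounds the hurdle, which is why "returns in 2027 or even 2028" reads to investors as re-pricing risk rather than a neutral timeline.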

Blurring Industry Boundaries

AI hasn't quickly driven search off the table, as many predicted two years ago.

When ChatGPT first emerged, the market believed search advertising would be swallowed by direct answers, and companies like Perplexity were thus highly anticipated. However, in the April 29 earnings report, Google's data showed search query volumes hitting an all-time high, with ad revenue reaching $77.25 billion, up 15% year-over-year.

This feels more like the "Jevons paradox" of the AI era. In 1865, British economist William Stanley Jevons discovered that improvements in steam engine efficiency didn't reduce coal consumption; instead, consumption rose sharply, because efficiency made steam engines affordable to more people and thus ignited overall demand. Similarly, AI makes each search more capable and prompts users to ask more questions.
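The paradox can be sketched numerically (purely illustrative numbers, not data from the article): if efficiency cuts the cost of serving a query fivefold, but cheaper answers lead users to ask eight times as many questions, total compute spend rises rather than falls.

```python
# Jevons-paradox toy model with hypothetical numbers.
cost_per_query_old = 1.0        # arbitrary cost units
efficiency_gain = 5.0           # serving a query becomes 5x cheaper
cost_per_query_new = cost_per_query_old / efficiency_gain

queries_old = 100               # baseline query volume
queries_new = 800               # demand expands faster than efficiency improved

spend_old = queries_old * cost_per_query_old   # 100.0
spend_new = queries_new * cost_per_query_new   # 160.0 -- total spend went UP

print(f"spend before: {spend_old}, after: {spend_new}")
```

The rebound only occurs when demand elasticity outpaces the efficiency gain; the article's point is that AI search, so far, appears to be in exactly that regime.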

This is also why Google finds it easier to convince the market compared to Meta. It possesses both the cash flow from the old search gateway and the new ledger of its cloud business; it can make money from advertising as well as from enterprise computing needs. AI hasn't dismantled its moat; at least so far, it has actually helped thicken it.

A similar boundary restructuring is happening in the chip industry. On the same day, mobile chip king Qualcomm reported revenue of $10.6 billion. On the earnings call, CEO Cristiano Amon announced a major decision: Qualcomm is formally entering the data center market, with custom chips designed for a major hyperscale cloud provider expected to start shipping later this year.

Qualcomm's main battleground has always been mobile devices. But as AI's computational load begins to redistribute between the cloud and the edge, it too must redefine its position.

If future AI is entirely handled by large cloud models, the value of mobile phone chips will be compressed. If edge-side AI becomes standard, Qualcomm must prove it belongs not only in phones but also in inference, terminals, and low-power data centers.

Its move into data centers is less an offense and more a defense.

As AI transforms from a "cloud luxury" to an "edge standard," all industry boundaries begin to blur. Mobile chip companies try to enter data centers, cloud providers start developing their own chips, and chip companies explore models. Qualcomm's "defection" is just the tip of the iceberg of this great restructuring.

The Same Gold Rush, Two Valuation Languages

The same AI gold rush has entered a stringent period of monetization scrutiny in the US stock market: even a leader in semiconductor process control and inspection equipment faces re-pricing at the slightest hint of geopolitical or tariff risk. After the market close on April 29, KLA Corporation reported revenue of $3.415 billion, beating expectations, with Non-GAAP EPS of $9.40, higher than the expected $9.16.

Yet, its stock price plummeted 8% in after-hours trading.

The reason wasn't poor performance, but market concerns about tariffs and exposure to China. KLA's customer list includes numerous Chinese wafer fabs. Against the backdrop of US-China tech decoupling, this "China exposure" hangs like the Sword of Damocles. No matter how brilliant the performance, it cannot offset the market's instinctive fear of geopolitical risk.

In the A-share market, a different language is spoken.

Performance certainly matters here too, but often, performance is just the fuel; what truly ignites the narrative is whether you hold the ticket labeled "domestic substitution."

On the evening of April 29, Cambricon Technologies released a remarkable Q1 report: revenue of 2.885 billion RMB, a staggering 159.56% year-over-year increase, the first time exceeding the 2 billion mark in a single quarter; net profit of 1.013 billion RMB, up 185.04% year-over-year. The next day, Cambricon's stock surged, pushing its total market cap past 670 billion RMB, hitting an all-time high, with gains exceeding 62% year-to-date.

On the same day, Muxi Co., Ltd., a GPU company listed only in December 2025, also released its first Q1 report. Revenue was 562 million RMB, up 75% year-over-year, and its loss narrowed significantly from 233 million RMB in the same period last year to 98.84 million RMB.

Both are in the AI infrastructure chain, yet the US and A-share markets gave completely different pricing reactions.

KLA faces the complex ledger of a globalized supply chain: performance, orders, tariffs, China exposure, export controls – each item can enter the valuation model.

Cambricon and Muxi face a different narrative environment: the stronger the external restrictions, the more easily the strategic value of domestic computing power is amplified. The US stock market discounts risk; the A-share market pays a premium for scarcity.

The Departure of Smart Money

But just as the market celebrated Cambricon, one detail stands out starkly.

At the end of 2025, super retail investor Zhang Jianping held 6.8149 million shares of Cambricon, worth approximately 9.2 billion RMB, making him the company's second-largest individual shareholder. By the time of this Q1 report, he had quietly exited the list of top ten shareholders.

Calculating roughly from the Q1 stock price range, this reduction corresponds to at least several billion RMB. The exact price is unknown, but one thing is certain: before the performance exploded and the stock price hit new highs, the person who first tasted the dividends of this narrative chose to cash out.

There are always two types of people in the market: those who pay for the narrative, and those who price the narrative.

Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a universal consensus and walked away after it was written into the grand story of being a "domestic computing power leader."

On this $650 billion earnings night, Silicon Valley giants are anxious amidst computing shortages, Wall Street analysts suffer through the time lag of monetization, while the A-share market busies itself re-pricing domestic computing power.

In the same AI gold rush, each market speaks its own language. The US stock market talks about return cycles, the A-share market talks about domestic substitution; cloud providers talk about order backlogs, Meta talks about advertising efficiency; OpenAI didn't release earnings, yet still holds the entire computing chain's nerves hostage.

Everyone is convinced they bought a ticket to the AGI era. But no one knows when the show will end, or where the exit is. The ticket to the AI era is undoubtedly expensive. But more expensive than the ticket is knowing when it's time to leave.
