AI's Night of Reckoning: $650 Billion Pours into AGI
- Main point: In Q1 2026, the combined capital expenditure guidance of the four tech giants Microsoft, Google, Meta, and Amazon reached nearly $650 billion (equivalent to Sweden's annual GDP), marking the AI arms race's entry into a phase of deep confrontation between the physical dimension (power, data centers) and capital efficiency (the path from investment to returns). Rumors that OpenAI missed its revenue targets shook the market, underscoring the AI narrative's shift from faith to cash-flow verification.
- Key points:
- The four giants' combined capex guidance reaches $650 billion, a scale comparable to Sweden's GDP, aimed squarely at the ticket to AGI.
- Rumors that OpenAI's revenue missed targets sent Oracle down 4%, CoreWeave down 5.8%, and SoftBank down 12%, as the market questioned the returns on massive investments in AI computing power.
- Google Cloud's order backlog reached $462 billion, but its CEO admitted the company cannot meet demand due to physical infrastructure constraints; AI growth is throttled by power, land, and construction cycles.
- Meta's results beat expectations (revenue up 33% year-over-year), but with 2026 capex guidance raised to $125-145 billion and no monetization path from a cloud business, its stock fell 7%.
- AI has not upended traditional search: Google's ad revenue grew 15% year-over-year, illustrating the "Jevons paradox," in which efficiency gains expand demand.
- Qualcomm officially entered the data center market, reflecting the blurring of industry boundaries as AI computing workloads redistribute between the cloud and edge devices.
- AI valuations diverge between China's A-share market and the US: KLA sold off over China-exposure risk, while Cambricon, despite explosive revenue growth, saw super retail investor Zhang Jianping quietly trim his stake, revealing a split between the premium placed on the "domestic substitution" narrative and the realization of profits.
Original author: Sleepy.md
On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their first-quarter financial results on the same day. Looking solely at the capital expenditure guidance provided by these four companies, the combined figure approaches $650 billion, a scale equivalent to the entire annual GDP of a country like Sweden.
In other words, the world's four richest tech companies are preparing to spend an amount equal to the annual economic output of a medium-sized developed nation to buy their ticket to the AGI era.
All eyes are now fixed on that ticket to AGI. But on this day, playfully dubbed the "night of reckoning" for global AI assets, if we shift our gaze slightly away from the grand narratives and into the less conspicuous corners, we find that a covert war over physical constraints, capital anxiety, and industrial restructuring has already reached a decisive point.
How Did a Company That Didn't Report Earnings Cause a US Stock Market Rout?
The true controller of market sentiment isn't necessarily the most profitable company on paper, but the one treated as a "symbol of faith" by everyone.
April 29 was initially the most anticipated day of the US earnings season. However, before the listed companies could report, the market experienced an unexpected stampede. Goldman Sachs data indicates this was the second worst trading day for AI assets this year.
The trigger wasn't a disappointing earnings report from a listed company, but rather a report from the *Wall Street Journal* the day before. According to the report, OpenAI failed to meet its 2025 revenue targets, and its goal of reaching 1 billion weekly active users remains distant. What further jolted the market nerves was the report stating that OpenAI CFO Sarah Friar had internally warned that if revenue growth continues to underperform, the company might struggle to sustain its massive $600 billion computing capacity procurement commitment in the future.
On the strength of nothing more than a rumor, a company that isn't publicly listed and never has to report earnings sent Oracle's stock down 4%, CoreWeave down 5.8%, and even SoftBank, far across the Pacific, plunging 12% in over-the-counter trading.
When a $600 billion computing commitment collides with revenue growth that hasn't materialized synchronously, the market suddenly realizes the most dangerous aspect of the AI narrative isn't a lack of belief in the future, but that the future is too expensive.

For the past two years, OpenAI has been the religion of Silicon Valley.
Decisions regarding GPU procurement, data center construction, cloud provider expansion, and startup valuations, while seemingly disparate, all rested on the same underlying bet: model capabilities will continue to leapfrog, user scale will continue to expand, and AGI will eventually turn today's expensive investments into tomorrow's entry ticket.

The greatest strength of this logic is its self-reinforcing nature. The more people believe, the higher the valuation; the higher the valuation, the more people are afraid not to believe.
But around April 29, the market seriously questioned the cash flow fundamentals of this faith for the first time. Even OpenAI must confront customer acquisition costs, user retention, revenue growth rates, and computing bills.
The Printing Press and the Cooling Water
The most enchanting aspect of the internet era was the seemingly infinite potential for growth.
A piece of code, once written, could be replicated for millions of users with marginal costs driven extremely low. Over the past two decades, Silicon Valley's boldness to "burn cash for growth" and disrupt traditional industries was built on this belief: if the network effect is strong enough, scale will swallow costs.
But in the AI era, the digital world's printing press has its neck firmly gripped by the physical world's cooling water pipes.
During the earnings call on April 29, despite a staggering 63% growth in its cloud business (with quarterly revenue exceeding $20 billion for the first time), Google CEO Sundar Pichai's tone betrayed a sense of helplessness: "Cloud revenue could have been higher if we could meet the demand."

Behind this statement lies the strangest business dilemma of the AI era: demand far exceeds supply, yet growth is ruthlessly constrained by the physical world.
Google holds a massive $462 billion backlog of cloud orders, nearly doubling quarter-over-quarter. AI solution product revenue grew nearly 800% year-over-year. Paid users for Gemini Enterprise increased 40% quarter-over-quarter, and API token usage surged from 10 billion to 16 billion per minute.
These numbers would be cause for celebration at any internet company. But in Pichai's words, we hear a new type of predicament arising in the AI era: customers are already queuing, money is already on its way, but servers aren't built yet, power isn't connected yet, and advanced chips haven't come off the fab lines yet.
The problem isn't a lack of demand; it's that there is too much demand, pulling growth back into the physical world.
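To give the token figure above a sense of scale, here is a minimal back-of-envelope conversion. The 10 billion and 16 billion per-minute endpoints come from the earnings numbers cited above; the per-day extrapolation is purely illustrative:

```python
# Back-of-envelope scale check for Google's cited API token throughput.
tokens_per_minute_before = 10e9   # 10 billion tokens per minute
tokens_per_minute_after = 16e9    # 16 billion tokens per minute

# Relative growth in per-minute throughput
growth = tokens_per_minute_after / tokens_per_minute_before - 1

# Sustained for a full day, 16B tokens/minute works out to:
tokens_per_day = tokens_per_minute_after * 60 * 24

print(f"throughput growth: {growth:.0%}")                       # 60%
print(f"tokens per day: {tokens_per_day / 1e12:.1f} trillion")  # ~23 trillion
```

Roughly 23 trillion tokens a day, every day, is the kind of load that has to be served by physical buildings, chips, and megawatts rather than by replicated code.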
Microsoft faces a similar dilemma. Azure grew 40%, and annualized AI revenue surpassed $37 billion. This figure was just $13 billion in January 2025, nearly tripling in 15 months.
However, Microsoft's capital expenditure fell sequentially to $31.9 billion, a decrease of nearly $6 billion from the previous quarter's $37.5 billion. Microsoft explained this in its earnings report as "infrastructure build timing." The implication is that money can be approved today, but data centers won't be built tomorrow; GPUs can be ordered, but electricity, land, cooling systems, and construction timelines cannot be rushed by the capital markets.
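For a sense of the growth rate implied by Microsoft's two AI revenue figures, a quick illustrative calculation. The ~$13B and ~$37B endpoints come from the text above; the compounding math is a sketch, not anything Microsoft disclosed:

```python
# Implied growth from ~$13B (January 2025) to ~$37B of annualized
# AI revenue roughly 15 months later.
start_revenue = 13e9
end_revenue = 37e9
months = 15

multiple = end_revenue / start_revenue          # ~2.85x, "nearly tripling"
monthly_rate = multiple ** (1 / months) - 1     # implied compound monthly rate
annualized = (1 + monthly_rate) ** 12 - 1       # implied year-over-year rate

print(f"growth multiple: {multiple:.2f}x over {months} months")
print(f"implied annualized growth: {annualized:.0%}")
```

Sustaining a triple-digit annualized growth rate at the tens-of-billions scale is exactly why the capex question dominates these calls.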
Just when everyone thought we were racing towards the virtual world, it's the oldest forms of heavy assets and the laws of physics that will ultimately determine the winner.
Computing power is becoming a new form of "land resource": limited in the short term, slow to build, location-dependent, and critical for early lock-in. In this land grab, the reason the four tech giants dare to push capital expenditure to the $650 billion level is not because they have all calculated the returns, but because they fear that if they don't hoard this "land" now, they might not even have a seat at the table tomorrow.
The Posture of Burning Cash
After the market close on April 29, despite both companies beating earnings expectations and raising capital expenditure guidance, Google's stock rose 7%, while Meta's plummeted 7%.
To be fair, Meta delivered an impressive report: revenue of $56.31 billion, up 33% year-over-year, its fastest growth rate since 2021; EPS of $10.44, significantly beating Wall Street expectations.
But Mark Zuckerberg broke a taboo: Meta raised its 2026 capital expenditure guidance to $125-145 billion. The stronger the performance, the more nervous the market became. The real fear among investors isn't whether Meta is profitable now, but that it's using the cash generated from today's advertising business to fund an AI gamble with unclear payback pathways.
The market's punishment was swift and severe. The difference between Google's reception and Meta's lies in the granularity of commercial monetization.
AI spending by Google, Amazon, and Microsoft can at least be placed into a relatively clear ledger.
Google has its $462 billion cloud order backlog, Amazon has the annualized AI revenue from AWS, and Microsoft has Copilot paid users and high remaining performance obligations (RPO). For every dollar they burn, while immediate returns aren't guaranteed, Wall Street at least knows roughly where the money will come back from: enterprise clients, cloud contracts, software subscriptions, computing power leasing.
This is why the capital market is willing to continue listening to their stories. The story can be distant, but the payback path cannot be entirely invisible.
Meta's problem is that it doesn't have a cloud business to sell externally.
The hundreds of billions it's pouring in must ultimately be realized through a more circuitous path: the Meta AI assistant needs to increase user engagement, recommendation algorithms need to improve ad conversion, AI-generated content needs to extend user dwell time, and smart glasses and future hardware need to become new entry points.

This logic isn't invalid; it's just that the chain is too long. When cloud providers burn cash, they put GPUs behind an already-signed order. When Meta burns cash, it puts GPUs behind a model of ad efficiency that hasn't been fully proven. The former can be discounted; the latter must first be believed.
And in the capital market, patience is a luxury. Especially when capital expenditure reaches the hundreds of billions level, investors are willing to pay for the future, but not indefinitely for ambiguity.
What's more anxiety-inducing is the time lag.
Amazon CEO Andy Jassy admitted during the call that the vast majority of funds invested in 2026 will not generate returns until 2027 or even 2028.
This means the tech giants are betting today's cash flow on capacity that will materialize two years from now. The gap is filled with data center construction, chip supply, power connectivity, customer demand, and model iteration. Any deviation in any link will be repriced by the capital market.
This is the most dangerous aspect of the AI arms race: money is spent today, stories are told today, but the answers won't be revealed for another two years.
Blurring Industry Boundaries
AI hasn't quickly pushed search off the table, as many predicted two years ago.
When ChatGPT first emerged, the market briefly believed that search advertising would be swallowed by direct answers, placing high hopes on companies like Perplexity. However, in the April 29 earnings report, Google's data showed search queries hitting an all-time high, with ad revenue reaching $77.25 billion, up 15% year-over-year.
This looks more like the "Jevons paradox" of the AI era. In 1865, British economist William Stanley Jevons found that improvements in steam engine efficiency didn't reduce coal consumption; they dramatically increased it, because efficiency made steam engines more affordable, igniting overall demand. Similarly, AI has made search more capable, prompting users to ask more, and more complex, questions.
This is another reason why Google finds it easier to persuade the market compared to Meta. It possesses both the cash flow from an old entry point (search) and the new ledger of its cloud business; it profits from both advertising and enterprise computing needs. AI hasn't dismantled its moat; so far, it has actually thickened it.
A similar boundary restructuring is occurring in the chip industry. On the same day, mobile chip giant Qualcomm reported revenue of $10.6 billion. During the call, CEO Cristiano Amon announced a major decision: Qualcomm is officially entering the data center market. Custom chips developed in collaboration with a major hyperscaler cloud provider are expected to start shipping later this year.

Qualcomm's primary battlefield has always been mobile devices. But as AI computing loads begin to redistribute between the cloud and the edge, it must also redefine its position.
If all future AI is handled by massive cloud models, the value of mobile chips will be compressed. If on-device AI becomes standard, Qualcomm must prove it belongs not only in phones, but also in inference, edge devices, and low-power data centers.
Its foray into the data center is less an offensive move and more a defensive one.
As AI transitions from a "cloud luxury" to an "edge standard," all industry boundaries begin to blur. Mobile chip companies try to enter data centers, cloud providers start designing their own chips, and chip companies explore models. Qualcomm's "defection" is just the tip of the iceberg in this massive restructuring.
One Gold Rush, Two Valuation Languages
In the same AI gold rush, the US stock market has entered a stringent "monetization verification period." Even a leader in semiconductor process control and inspection equipment, like KLA Corporation, faces market repricing if it exposes even a hint of geopolitical or tariff risk. After the market close on April 29, KLA reported better-than-expected revenue of $3.415 billion and Non-GAAP EPS of $9.40, exceeding the expected $9.16.
However, its stock price initially fell 8% in after-hours trading.
The reason wasn't poor performance, but market concerns about tariffs and exposure to China. KLA's client list includes numerous Chinese wafer fabs. Against the backdrop of US-China tech decoupling, this "China exposure" hangs like the Sword of Damocles. Strong performance cannot offset the market's instinctive fear of geopolitical risk.
In the A-share market (China), however, a different language is used.
Performance matters here too, but often it is merely fuel; the real ignition source is the narrative—whether you hold the ticket called "domestic substitution."
On the evening of April 29, Cambricon Technologies released an impressive first-quarter report: revenue of 2.885 billion RMB, a staggering 159.56% year-over-year increase, surpassing the 2 billion mark for the first time in a single quarter. Net profit reached 1.013 billion RMB, up 185.04% year-over-year. The next day, Cambricon's stock surged, pushing its total market capitalization above 670 billion RMB, a new all-time high, with year-to-date gains exceeding 62%.

Also reporting on the same day was Muxi (MetaX/沐曦股份), with revenue of 562 million RMB, up 75% year-over-year, and a significantly narrowed net loss from 233 million RMB to 98.84 million RMB. This was the first quarterly report for this GPU company, which was only listed in December 2025.
Though all of these companies sit in the AI infrastructure chain, the US and Chinese stock markets gave them completely different pricing reactions.
KLA faces the complex ledger of global supply chains, where performance, orders, tariffs, China exposure, and export controls can all enter the valuation model.
Cambricon and Muxi face a different narrative environment: the stronger the external restrictions, the more amplified the strategic value of domestic computing power becomes. The US market discounts risk; the A-share market pays a premium for scarcity.
The Exit of Smart Money
But amidst the market's cheers for Cambricon, one detail stands out sharply.
At the end of 2025, super retail investor Zhang Jianping still held 6.8149 million shares of Cambricon, worth approximately 9.2 billion RMB, making him the company's second-largest individual shareholder. By the time of this first-quarter report, he had quietly exited the list of top ten shareholders.
Based on a rough estimation using the stock price range for the first quarter, the amount of capital involved in this reduction is at least in the billions of RMB. The exact price is unknown to the public, but it is certain that before the performance explosion and the stock price hitting new highs, the person who first capitalized on this round of narrative dividends chose to cash out.
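The scale of that claim can be sanity-checked with the two figures the article does give. Note that the 1-million-share partial sale below is a purely hypothetical illustration, not a reported number:

```python
# End-2025 figures cited above: 6.8149M shares worth ~9.2B RMB.
shares_held = 6.8149e6
stake_value_rmb = 9.2e9

# Implied per-share value at end-2025
implied_price = stake_value_rmb / shares_held   # ~1,350 RMB per share

# Hypothetical: selling even 1M of those shares near that price
# already exceeds 1B RMB, consistent with "at least in the billions".
hypothetical_sale = 1e6 * implied_price

print(f"implied price: ~{implied_price:,.0f} RMB/share")
print(f"hypothetical 1M-share sale: ~{hypothetical_sale / 1e9:.2f}B RMB")
```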
There are always two types of people in the market: those who pay for the narrative, and those who price the narrative.
Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a consensus, and walked away after it was written into the grand story of being the "leader in domestic computing power."
On this $650 billion earnings night, Silicon Valley giants are anxious over computing power shortages, Wall Street analysts are agonizing over the time lag in monetization, and the A-share market is busy repricing domestic computing power.
In this same AI gold rush, each market speaks its own language. The US market talks about return cycles, the A-share market talks about domestic substitution. Cloud providers discuss order backlogs, Meta discusses ad efficiency. OpenAI didn't release earnings, yet still influences the entire computing chain's nerves.
Everyone is convinced they have bought the entry ticket to the AGI era. But no one knows when the show will end, or where the exit is. The ticket to the AI era is indeed expensive. But more expensive than the ticket is knowing when it's time to leave.