Cerebras IPO: $48.8 Billion Valuation – Is the "Nvidia Challenger" a Bubble or a New King?
- Core Thesis: This article analyzes the Cerebras (CBRS) IPO prospectus, revealing three paradoxes beneath its superficial "AI challenger" narrative: accounting profits stem from one-time adjustments, customer dependence is highly concentrated on UAE-related parties, and technological advantages are limited to the inference niche. Together, these imply that the valuation carries extreme risk.
- Key Elements:
- Cerebras had 2025 revenue of $510 million and GAAP net income of $237.8 million. However, excluding one-time accounting adjustments ($363.3 million) and stock-based compensation, the non-GAAP net loss was $75.7 million, up 247% year-over-year.
- In 2025, 86% of revenue came from UAE-related entities MBZUAI (62%) and G42 (24%). The single customer MBZUAI accounted for 77.9% of accounts receivable, indicating extremely high customer concentration risk.
- OpenAI serves simultaneously as a customer ($20 billion contract), lender ($1 billion), future shareholder (33 million warrants), and strategic controller (including exclusivity clauses), creating a circular, interlocking relationship.
- Technological advantages are evident in inference scenarios. Its CS-3 chip is 2.5 times faster than Nvidia’s B200 in Llama 4 Maverick inference speed. However, it cannot challenge Nvidia's CUDA ecosystem in model training and general-purpose computing.
- The prospectus discloses two material weaknesses in internal controls. Furthermore, CEO Andrew Feldman is a "product-sales" type founder rather than a "technology visionary," and his historical project, SeaMicro, was unsuccessful.
- The IPO pricing implies a valuation of $48.8 billion, with a price-to-sales ratio of 95x, far exceeding CoreWeave (15x) and Nvidia (25x). Sustaining this valuation would require revenue to grow to $3-4 billion while achieving sustained profitability.
Original Author: Xiao Hei, Deep Tide TechFlow
Pricing on May 13, trading begins on May 14, Nasdaq ticker: CBRS.
This is the largest IPO globally so far in 2026. The underwriting syndicate includes Morgan Stanley, Citigroup, Barclays, and UBS. This lineup secured 20x oversubscription during the roadshow, pushing the offer price from the initial $115-$125 range up to $150-$160, for an expected raise of roughly $4.8 billion and a corresponding valuation of $48.8 billion.
Just three months ago, Cerebras' secondary-market valuation was around $23 billion. That means, in the final stretch before the IPO, the company's paper valuation more than doubled.
The story's "selling points" have been repeated countless times: Nvidia challenger, wafer-scale chips, inference speed 2.5 times faster than the B200, a computing contract with OpenAI starting at $1 billion and potentially reaching $20 billion. It's a perfect "AI challenger" script, with every component – technological narrative, geopolitical narrative, marquee customers, massive orders – precisely aligned with the 2026 AI infrastructure theme.
But read the S-1 filing page by page, and you'll find something strange: all the public reports tell one story, while the prospectus tells another.
The Triple Paradox
Breaking down the prospectus item by item, Cerebras presents an investment target defined by a "triple paradox."
Paradox One: Technologically a True Alpha, Financially an Accounting Illusion.
The prospectus discloses: 2025 revenue of $510 million, up 76% year-over-year, and a GAAP net profit of $237.8 million. This sounds excellent – a rapidly growing, profitable AI hardware company is practically a "mythical" target in the current valuation environment. CoreWeave was still loss-making when it went public in March this year; Cerebras delivers a 47% net profit margin.
However, this $237.8 million "net profit" includes $363.3 million from a one-time, non-cash accounting adjustment – a paper gain from a forward contract liability extinguishment related to G42. Excluding this item and adding back $49.8 million in stock-based compensation, the real non-GAAP net loss for 2025 is $75.7 million, a 247% deterioration from the $21.8 million loss in 2024.
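The reconciliation above is simple arithmetic worth checking. A minimal sketch, using only the figures the article cites from the prospectus (not independently verified):

```python
# Back-of-envelope GAAP-to-non-GAAP reconciliation
# (all amounts in millions of USD, as cited in the article).

gaap_net_income = 237.8   # 2025 GAAP net income
one_time_gain = 363.3     # non-cash gain on G42 forward-contract extinguishment
stock_based_comp = 49.8   # stock-based compensation added back

# Strip the one-time gain, add back SBC
non_gaap = gaap_net_income - one_time_gain + stock_based_comp
print(f"2025 non-GAAP net result: {non_gaap:.1f}M")  # -75.7M

# Compare against the 2024 non-GAAP loss to get the deterioration
loss_2024 = -21.8
loss_widening = (abs(non_gaap) - abs(loss_2024)) / abs(loss_2024)
print(f"Loss widened {loss_widening:.0%} year-over-year")  # 247%
```

The numbers tie out: a $75.7 million non-GAAP loss, roughly 247% wider than 2024's.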
In other words, the market sees a "profitable + 76% growth" IPO golden child, while the prospectus reveals a rapidly growing company with widening losses. Neither version is strictly wrong; the difference lies in which one the market chooses to believe.
Paradox Two: Superficially Shed G42, Actually Entwined with OpenAI's Circular Structure.
The story of Cerebras' first failed IPO in 2024 is straightforward: client G42, with its UAE background, contributed 85% of first-half revenue. CFIUS launched a review, forcing the company to withdraw its application.
Returning to the fight a year and a half later, the client list appears diversified, adding heavyweights like OpenAI and AWS. But flipping through the May 2026 S-1, the 2025 client structure looks like this:
- MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
- G42: 24%
- Combined Total: 86%
G42 merely ceded its "weight" to MBZUAI, which is also located in the UAE and is a related party to G42. MBZUAI, a single customer, accounts for 77.9% of accounts receivable.
Furthermore, the so-called "salvation line" with OpenAI is itself a nested structure. This contract is valued at over $20 billion, with OpenAI committing to purchase 750 megawatts of computing power. But the same document reveals several other things: OpenAI provided Cerebras with a $1 billion loan; OpenAI received nearly 33 million virtually free warrants in Cerebras; OpenAI's Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain "named competitors."
In essence, OpenAI is simultaneously Cerebras' customer, lender, soon-to-be shareholder, and, to some extent, strategic controller. An anonymous analyst, in a piece on Medium, put it harshly: when revenue is circular, valuation is circular, and the IPO exists to let those generating the revenue cash out, this isn't a market; it's financial engineering.
The wording may be overly sharp, but factually, this statement is hard to refute.
Paradox Three: Superficially an Nvidia "Challenger," Essentially an Nvidia "Niche Filler."
This point is most easily overlooked by the market.
Cerebras' technology is indeed robust. The WSE-3 features 4 trillion transistors, 900,000 AI cores, and 44GB of on-chip SRAM, making an entire wafer into a single chip, bypassing the inter-chip communication bottleneck that all GPU clusters face. Independent Artificial Analysis benchmarks show that running Llama 4 Maverick (400 billion parameters), the CS-3 outputs over 2500 tokens per second per user, while Nvidia's flagship DGX B200 manages around 1000 tokens, and Groq and SambaNova achieve 549 and 794, respectively.
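To make the gap concrete, here is a small sketch computing the relative throughput implied by the Artificial Analysis figures cited above (tokens per second per user on Llama 4 Maverick, as quoted in the article):

```python
# Relative inference throughput, normalized against Nvidia's DGX B200.

tokens_per_sec = {
    "Cerebras CS-3": 2500,
    "Nvidia DGX B200": 1000,
    "SambaNova": 794,
    "Groq": 549,
}

baseline = tokens_per_sec["Nvidia DGX B200"]
for system, tps in tokens_per_sec.items():
    print(f"{system}: {tps} tok/s ({tps / baseline:.1f}x vs B200)")
```

That ratio is where the "2.5x faster than the B200" figure comes from.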
Numbers don't lie. Cerebras has a generational advantage over GPUs in the specific scenario of inference.
The keyword is "inference." Cerebras' own prospectus clearly states its strength lies in latency-sensitive inference workloads. For large model training and general-purpose computing, it lacks the ability or intention to challenge Nvidia. The CUDA ecosystem, built over nearly 20 years since 2007, comprising toolchains for model training, developer communities, and third-party libraries, remains firmly within Nvidia's moat.
More critically, the market is not static. Nvidia's Vera Rubin architecture, announced at GTC 2026, features 336 billion transistors and claims a 5x performance jump over Blackwell. AMD's MI400 has already reached 320 billion transistors. Google TPU v6, Amazon Trainium 3, Microsoft Maia 2 – hyperscalers are all developing custom chips. Nvidia spent over $18 billion on R&D in fiscal 2025, acquired AI inference startup Groq's assets for $20 billion last December, and invested another $4 billion in two photonics companies in March.
Therefore, a more accurate description is: Cerebras isn't aiming to replace Nvidia; it's carving out a differentiated position within Nvidia's "inference" niche. This is a real business, but a $48.8 billion valuation on $510 million in revenue implies a price-to-sales ratio of 95x.
Andrew Feldman's Third "Product Sale"
Beyond the numbers, it's worth understanding the company's key figure.
Andrew Feldman is an underrated "serial entrepreneur" in Silicon Valley. He is neither a technical-genius founder nor an academic. A Stanford Graduate School of Business alumnus, he was VP of Marketing at Riverstone Networks (which IPO'd in 2001) and VP of Products at Force10 Networks (sold to Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach, building "energy-efficient servers" by clustering low-power, small-core processors, challenging the mainstream high-power, large-core servers. The idea was ahead of its time, but the market wasn't ready. AMD bought SeaMicro for $334 million in 2012. Feldman stayed as a VP at AMD for two years before leaving.
Then he built Cerebras.
Looking at Feldman's trajectory reveals something interesting: he is not a "chip designer" but an alternative-path bettor on compute infrastructure. SeaMicro bet that "small cores beat large cores," and that bet proved only half right: AMD bought the company mainly for its Freedom Fabric interconnect technology, intending to use it in its server CPU platform, that path never succeeded, and the SeaMicro brand quietly faded. Cerebras bets that "big chips beat small chips," the exact opposite proposition.
In a sense, Feldman does the same thing: identify overlooked, seemingly "impossible" paths in computing architecture, place heavy bets, and leverage exceptional salesmanship to push them to market. At SeaMicro, he held onto the Force10 sales team; AMD coveted his sales network. At Cerebras, his most crucial achievement was securing G42, allowing a hardware company that still sourced 80% of 2024 revenue from a single Middle Eastern client to ultimately sign a $20 billion contract with OpenAI.
The footnote to this story: Feldman is a product-selling CEO, not a technology-visionary CEO. He excels at selling a "seemingly crazy" product to clients willing to pay a premium for differentiation. This is his alpha.
Understanding this is vital because it directly determines the judgment on Cerebras' investment value.
So, Is CBRS Worth Investing In?
Overlaying the above triple paradox, the answer is far more complex than a simple "buy" or "don't buy."
If the goal is to capture the IPO first-day frenzy, with 20x oversubscription, the red-hot AI hardware sector, and a lack of pure-play public Nvidia alternatives, CBRS will likely surge on day one. This is event-driven short-term trading and doesn't require deep judgment.
However, for a "long-term hold" investment decision, three things must be carefully considered:
First, is Cerebras worth a 95x price-to-sales ratio?
CoreWeave, which IPO'd in March this year, trades at a P/S ratio of around 15x. Nvidia's current P/S ratio is about 25x. A company with $510 million in 2025 revenue, 86% customer concentration, and still operating at a real loss is being priced at 95x sales. This implies the market demands the company grow revenue to $3-4 billion within the next three to four years while achieving sustained profitability.
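A quick check of the multiples above, treating the article's figures as given (the peer multiples are the article's snapshots, not live market data):

```python
# Price-to-sales math: what revenue would justify this valuation at peer multiples?

valuation = 48.8e9     # implied IPO valuation, USD
revenue_2025 = 510e6   # 2025 revenue, USD

ps_ratio = valuation / revenue_2025
print(f"Implied P/S: {ps_ratio:.0f}x")  # ~96x, i.e. the ~95x cited

# Revenue needed for the same valuation to trade at the peers' multiples
for peer, multiple in [("Nvidia", 25), ("CoreWeave", 15)]:
    required = valuation / multiple
    print(f"At {peer}'s {multiple}x: ${required / 1e9:.1f}B revenue required")
```

Holding the valuation fixed, compressing to peer multiples requires roughly $2-3.3 billion in revenue, which is where the $3-4 billion target (allowing for some further valuation growth) comes from.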
Can this happen? The key is whether OpenAI's $20 billion contract materializes on schedule. According to the prospectus, approximately 15% of remaining performance obligations – roughly $3.5 billion – will be recognized in 2026 and 2027. If this pace holds, Cerebras' 2027 revenue could exceed $2 billion, potentially compressing the P/S ratio into a reasonable range. However, any delay, any strategic shift by OpenAI, or any new customer loss could instantly shatter this valuation.
Second, how wide is Cerebras' moat?
The architectural advantage of the WSE-3 is real, but how long will it last? Nvidia Vera Rubin, AMD MI400, and Google TPU v6 are all advancing. The generational cycle in the chip industry is 18-24 months. If Cerebras slows down for even one cycle, its technical edge will be neutralized. While its R&D spending as a percentage of revenue is already high, the absolute amount remains orders of magnitude lower than the giants.
A deeper question is: is the wafer-scale chip path a mainstream route that will be widely adopted, or a "special forces" unit that can only survive in niche scenarios forever? There is no definitive answer. An optimistic view: as inference workloads grow from 30% of total AI computing today to over 70% in the future, Cerebras' niche could become the main battlefield. A pessimistic view: as long as Nvidia improves Rubin's inference performance, the niche will remain just a niche.
Third, Governance Structure and Geopolitical Risk
The prospectus reveals two often-overlooked but important things:
First, Cerebras uses a Class A/Class B dual-class stock structure. Post-IPO, insiders will hold 99.2% of voting power. Even if the founding team only holds 5% of outstanding shares in the future, they will still control the company. This means external minority shareholders have almost no voice in corporate governance.
Second, the company discloses two "material weaknesses in internal control over financial reporting." As an emerging growth company, it can be exempt from SOX 404(b) auditor attestation for five years post-IPO. This is a red flag – not a major one, but worth noting.
Geopolitically, CFIUS has resolved the G42 voting rights issue this time. However, export controls (export licenses for CS-2, CS-3, CS-4 to the UAE) remain a long-term variable. The Trump administration's policy direction on AI chip exports to the Middle East is not yet fully stabilized. Any policy swing could re-ignite tail risks for CBRS.
Conclusion
The CBRS IPO, as an event, is the most noteworthy AI hardware capital event of 2026. It defines the valuation anchor for the AI infrastructure theme in the secondary market, and its performance will ripple through the pricing of all related assets.
As a long-term holding, it is a classic "high odds, high uncertainty" bet. It wagers on the macro narrative of "inference is king," the micro execution of "Cerebras leveraging OpenAI to achieve niche monopoly," and the valuation assumption that "the market will continue to pay a 95x P/S premium for AI hardware." If all three conditions hold simultaneously, returns could be massive. If any one of them collapses, the drawdown will be brutal.
For institutional investors, the typical strategy is to avoid chasing on day one, waiting for Q3 earnings, key customer progress, and valuation digestion. For individual investors, treating it as a small tail-risk allocation within AI hardware holdings is acceptable. Treating it as an all-in conviction play? Please re-read the triple paradox above.
More important than whether CBRS skyrockets on its first trading day is the broader significance of this event: When a company generating 86% of its revenue from two related entities in the UAE and still operating at a real loss can be priced by the market at $48.8 billion, this fact alone tells everyone exactly where the capital frenzy in the AI infrastructure sector has reached.


