Cerebras IPO: $48.8 Billion Valuation – Is the "Nvidia Challenger" a Bubble or a New King?
- Core Thesis: This article analyzes the Cerebras (CBRS) IPO prospectus and finds three paradoxes beneath the surface-level "AI challenger" narrative: its accounting profit stems from a one-time adjustment, its customers are heavily concentrated among UAE-affiliated entities, and its technological advantage is confined to inference; together, these make the implied valuation extremely risky.
- Key Elements:
- Cerebras reported 2025 revenue of $510 million and GAAP net income of $237.8 million. However, excluding one-time accounting adjustments ($363.3 million) and stock-based compensation, its non-GAAP net loss was $75.7 million, a 247% increase year-over-year.
- In 2025, 86% of revenue came from UAE-affiliated entities: MBZUAI (62%) and G42 (24%). A single customer, MBZUAI, accounted for 77.9% of accounts receivable, presenting extreme customer concentration risk.
- OpenAI serves simultaneously as a customer ($20 billion contract), lender ($1 billion), future shareholder (33 million warrants), and strategic controller (including exclusivity clauses), creating a circular, interlocking relationship.
- Its technological strength lies in the inference segment. The CS-3 achieves 2.5 times the inference speed of Nvidia's B200 on Llama 4 Maverick. However, it cannot challenge Nvidia's CUDA ecosystem in model training and general-purpose computing.
- The prospectus discloses two material weaknesses in internal controls. Furthermore, CEO Andrew Feldman is a "product sales" type founder rather than a "technological visionary," and his prior project, SeaMicro, was not successful.
- The IPO pricing implies a valuation of $48.8 billion, a price-to-sales ratio of 95x, far exceeding CoreWeave (15x) and Nvidia (25x). Justifying this valuation requires revenue reaching $3–4 billion along with sustained profitability.
Original Author: Xiao Hei, Deep Tide TechFlow
Priced on May 13; trading begins May 14 on Nasdaq under the ticker CBRS.
This is the largest IPO globally so far in 2026. The underwriting syndicate includes Morgan Stanley, Citigroup, Barclays, and UBS. The roadshow drew 20x oversubscription, pushing the offering price from the initial $115–$125 range up to $150–$160, with expected proceeds of $4.8 billion, corresponding to a valuation of $48.8 billion.
Just three months ago, Cerebras' secondary-market valuation was around $23 billion. In the final leg before the IPO, the company's paper valuation more than doubled.
The "selling points" of the story have been repeated countless times: Nvidia challenger, wafer-scale chips, inference speed 21x faster than the B200, a computing power contract with OpenAI starting at $1 billion and potentially reaching $20 billion. It's a perfect "AI Challenger" script, with technology narrative, geopolitics narrative, star clients, and massive orders, every component precisely aligning with the 2026 AI infrastructure theme.
But reading the S-1 filing page by page reveals a strange contradiction: all public reports tell one story, while the prospectus tells another.
The Triple Paradox
Breaking down the prospectus item by item, Cerebras presents itself as an investment target built on a "triple paradox."
First Paradox: Technologically a true Alpha, financially it's accounting magic.
The prospectus discloses: 2025 revenue of $510 million, up 76% year-over-year, and GAAP net profit of $237.8 million. It sounds fantastic—a rapidly growing, already profitable AI hardware company is almost a "mythical" target in the current valuation environment. CoreWeave was still losing money when it IPO'd in March this year; Cerebras directly delivers a 47% net profit margin.
However, of that $237.8 million "net profit," $363.3 million came from a one-time, non-cash accounting adjustment—a paper gain from the extinguishment of a forward contract liability related to G42. Excluding this and adding back $49.8 million in stock-based compensation, the real non-GAAP net loss for 2025 was $75.7 million, a 247% deterioration from the $21.8 million loss in 2024.
In other words, the market sees a "profitable + 76% growth" IPO golden child, while the prospectus reveals a "fast-growing company with continuously expanding losses." Neither version is wrong; the difference lies in which one the market chooses to believe.
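The reconciliation is simple arithmetic; a minimal sketch using only the figures cited from the prospectus (all in $ millions):

```python
# Reconciling the headline GAAP figure to the non-GAAP loss described above.
# All inputs are the prospectus figures cited in the article ($ millions).
gaap_net_income = 237.8   # reported 2025 GAAP net income
one_time_gain = 363.3     # non-cash gain from extinguishing the G42 forward-contract liability
stock_based_comp = 49.8   # stock-based compensation added back in the article's math

non_gaap = gaap_net_income - one_time_gain + stock_based_comp
prior_year_loss = -21.8   # 2024 non-GAAP loss
deterioration = non_gaap / prior_year_loss - 1

print(f"2025 non-GAAP result: {non_gaap:+.1f}M")           # -75.7M, matching the disclosed loss
print(f"Year-over-year loss growth: {deterioration:.0%}")  # ~247%
```

The same inputs thus support both readings: a 47% GAAP net margin or a loss that more than tripled, depending on which line items one keeps.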
Second Paradox: On the surface, it has weaned itself off G42; in reality, it has locked itself into a circular structure with OpenAI.
The story of Cerebras' first failed IPO in 2024 isn't complicated: G42, a UAE-based client, contributed 85% of H1 revenue, prompting a CFIUS investigation and forcing the company to withdraw its application.
Coming back a year and a half later, the client list seems diversified, adding heavyweights like OpenAI and AWS. But looking at the May 2026 S-1, the 2025 client structure is as follows:
- MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
- G42: 24%
- Combined total: 86%
G42 merely transferred its "weight" to MBZUAI, which is also located in the UAE and is a related party to G42. MBZUAI, as a single client, accounts for 77.9% of accounts receivable.
And the so-called "redemption line" with OpenAI is itself a nested structure. The contract value exceeds $20 billion, with OpenAI committing to purchase 750 megawatts of computing power. But the same document also discloses several things: OpenAI provided a $1 billion loan to Cerebras; OpenAI received nearly 33 million warrants in Cerebras at a negligible price; OpenAI's Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain "named competitors."
In other words, OpenAI is simultaneously Cerebras' client, lender, soon-to-be shareholder, and, to some extent, strategic controller. An anonymous analyst put it sharply in a Medium analysis: when revenues are circular, valuations are circular, and the IPO exists so that those generating the revenues can cash out, it's not a market—it's financial engineering.
The phrasing might be overly harsh, but factually, this statement is hard to refute.
Third Paradox: On the surface, it's Nvidia's "challenger," but in essence, it's Nvidia's "narrow-band filler."
This point is most easily overlooked by the market.
Cerebras' technology is undeniably solid. The WSE-3 packs 4 trillion transistors, 900,000 AI cores, and 44GB of on-chip SRAM, turning an entire wafer into a single chip and bypassing the inter-chip communication bottlenecks that every GPU cluster faces. Independent benchmarks from Artificial Analysis show that, running Llama 4 Maverick (400 billion parameters), the CS-3 outputs over 2,500 tokens per second per user, versus around 1,000 for Nvidia's flagship DGX B200, 549 for Groq, and 794 for SambaNova.
Numbers don't lie. In the specific scenario of inference, Cerebras holds a generational advantage over GPUs.
The keyword is "inference." Cerebras' own prospectus clearly states that it excels at latency-sensitive inference workloads. For large model training and general-purpose computing, it lacks the ability or intention to challenge Nvidia. The CUDA ecosystem, accumulated over nearly 20 years since 2007, with its toolchains, developer community, and third-party libraries, remains firmly within Nvidia's moat.
More critically, the market hasn't stood still. Nvidia's Vera Rubin architecture, announced at GTC 2026, packs 336 billion transistors and claims a 5x performance jump over Blackwell. AMD's MI400 has already reached 320 billion transistors. Google's TPU v6, Amazon's Trainium 3, Microsoft's Maia 2—the hyperscalers are all building custom chips. Nvidia's fiscal-2025 R&D spending exceeded $18 billion; it spent $20 billion last December acquiring assets from AI inference startup Groq, then invested $4 billion in two photonics companies in March.
A more accurate description is: Cerebras isn't trying to replace Nvidia; it's carving out a differentiated position within the narrow band of Nvidia's "inference" sector. It's a real business, but a $48.8 billion valuation against $510 million in revenue implies a price-to-sales ratio of 95x.
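The multiple is easy to verify from the article's own figures; a sketch for illustration, not live market data:

```python
# Price-to-sales multiples implied by the figures cited in the article.
cbrs_valuation_b = 48.8   # implied IPO valuation, $ billions
cbrs_revenue_b = 0.510    # 2025 revenue, $ billions

cbrs_ps = cbrs_valuation_b / cbrs_revenue_b
print(f"CBRS implied P/S: {cbrs_ps:.1f}x")  # ~95.7x, i.e. the ~95x the article cites

# Reference multiples cited in the article for comparison
for name, multiple in (("CoreWeave", 15), ("Nvidia", 25)):
    print(f"{name}: ~{multiple}x")
```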
Andrew Feldman's Third Time "Selling a Product"
Beyond the numbers, it's worth discussing the company's soul.
Andrew Feldman is an underrated "serial entrepreneur" in Silicon Valley. He isn't a genius technical founder, nor did he emerge from academia. A graduate of Stanford's business school, he served as VP of Marketing at Riverstone Networks (IPO'd in 2001) and VP of Products at Force10 Networks (acquired by Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach, focusing on "energy-efficient servers" that clustered many low-power, small-core processors to challenge the mainstream high-power, large-core servers. The idea was forward-thinking, but it arrived too early for the market. AMD acquired SeaMicro for $334 million in 2012, and Feldman left after two years as a VP at AMD.
Then he built Cerebras.
Looking at Feldman's trajectory reveals an interesting pattern: he's not a "chip designer" but an "alternative bettor on compute infrastructure." SeaMicro was a bet on "small cores beating large cores"—half wrong; AMD bought it aiming to use its Freedom Fabric interconnect technology for its server CPU platform, but that path failed, and the SeaMicro brand faded away. Cerebras is a bet on "big chips beating small chips," the exact opposite proposition of SeaMicro.
In a sense, Feldman does the same thing each time: find a seemingly "impossible" computing-architecture path ignored by the mainstream, bet heavily on it, and push it to market with strong sales ability. At SeaMicro, he brought over Force10's sales team; AMD's acquisition was partly for his sales network. At Cerebras, his most crucial achievement was landing G42, turning a hardware company that drew 80% of its 2024 revenue from a single Middle Eastern client into one that eventually signed a $20 billion contract with OpenAI.
The footnote to this story: Feldman is a product-selling CEO, not a technology-visionary CEO. He excels at selling a "seemingly crazy" product to clients willing to pay a premium for differentiation—that's his alpha.
Understanding this is crucial because it directly determines the judgment on Cerebras' investment value.
So, Is CBRS Worth Investing In?
Layering the triple paradox above, the answer is much more complex than simply "buy" or "don't buy."
If the goal is to chase the first-day frenzy, the 20x oversubscription, a red-hot AI hardware sector, and the scarcity of listed pure-play "Nvidia alternative" names make a day-one pop highly likely. This is event-driven short-term trading and requires little deep judgment.
But for a "long-term hold" investment decision, three things must be clearly considered:
First, is Cerebras worth a 95x price-to-sales ratio?
CoreWeave's IPO in March this year had a P/S ratio of around 15x. Nvidia's current P/S ratio is about 25x. A company with $510 million in 2025 revenue, 86% client concentration, and real operational losses being priced at 95x P/S implies the market expects it to achieve $3–4 billion in revenue over the next three to four years, along with sustained profitability.
Can it happen? It hinges on whether the $20 billion OpenAI contract materializes as planned. According to the prospectus, around 15% of remaining performance obligations (roughly $3.5 billion) should be recognized in 2026 and 2027. At this pace, Cerebras' 2027 revenue could exceed $2 billion, potentially compressing the P/S ratio into a reasonable range. However, any delay, any strategic shift by OpenAI, or any new client loss could render this valuation fragile instantly.
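The compression math can be sketched directly; the 2027 revenue figure below is the article's "could exceed $2 billion" case, taken as an assumption rather than a forecast:

```python
# Forward-P/S scenario for the OpenAI revenue path described above.
valuation_b = 48.8     # IPO valuation, $ billions
revenue_2027_b = 2.0   # assumed 2027 revenue if the OpenAI contract ramps on schedule

forward_ps = valuation_b / revenue_2027_b
print(f"Forward P/S on 2027 revenue: {forward_ps:.1f}x")  # 24.4x, near the Nvidia reference multiple
```

At that assumed revenue, the multiple lands near the 25x Nvidia reference; any slippage in recognition pushes it back toward bubble territory.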
Second, how wide is Cerebras' moat?
The architectural advantage of the WSE-3 is real, but how long will it last? Nvidia's Vera Rubin, AMD's MI400, and Google's TPU v6 are all advancing. The generational replacement cycle in the chip industry is 18–24 months. If Cerebras slows down even slightly, its technological edge will be neutralized. While its R&D spending as a percentage of revenue is already high, its absolute expenditure remains orders of magnitude lower than the major players.
A deeper question is: will the wafer-scale chip route become a widely adopted mainstream path, or will it forever remain a "special forces" approach viable only in niche scenarios? There is no definitive answer. An optimistic view: as inference workloads grow from 30% of total AI computing today to over 70% in the future, Cerebras' niche could become the main battleground. A pessimistic view: as long as Nvidia improves Rubin's inference performance, the niche will forever remain just a niche.
Third, governance structure and geopolitical risks
The prospectus discloses two easily overlooked but crucial points:
First, Cerebras has a Class A/Class B dual-class share structure, granting insiders 99.2% of voting power post-IPO. Even if the founding team holds only 5% of outstanding shares in the future, they will still control the company. This means external minority shareholders have virtually no say in corporate governance.
Second, the company discloses two "material weaknesses in internal control over financial reporting." As an emerging growth company, it is exempt from SOX 404(b) auditor attestation requirements for up to five years post-IPO. This is a red flag—not a major one, but worth noting.
Geopolitically, CFIUS has cleared the G42 voting rights issue this time, but export controls (export licenses for CS-2, CS-3, CS-4 to the UAE) remain a long-term variable. The Trump administration's policy direction on Middle East AI chip exports hasn't fully stabilized. Any policy swing could reignite tail risks for CBRS.
Conclusion
As an event, the CBRS IPO is the most significant AI hardware capital market event of 2026. It defines the valuation anchor for the AI infrastructure theme in the secondary market, and its performance will influence the pricing of all related assets.
As a long-term holding, it's a classic "high payoff, high uncertainty" bet. It wagers on the macro narrative of "inference is king," the micro execution of "Cerebras leveraging OpenAI to achieve narrow-band dominance," and the valuation assumption that "the market will continue paying a 95x P/S premium for AI hardware." All three conditions must hold simultaneously for substantial returns; if any one collapses, the drawdown will be brutal.
For institutional investors, the typical approach is to avoid the first-day frenzy, wait for Q3 reports, monitor key client progress, and let the valuation digest. For individual investors, treating it as a small tail-end asset in an AI hardware allocation is feasible. But treating it as an all-in conviction stock? Please re-read the triple paradox above.
More noteworthy than whether CBRS skyrockets on its first trading day is the broader implication: When a company with 86% of its revenue coming from two related entities in the UAE, still operating at a real loss, can be priced by the market at $48.8 billion, this fact alone tells everyone just how extreme the level of capital frenzy in the AI infrastructure sector has become.


