Circuit-Breaker Halts on Debut and a Single-Day Surge of Over 108%: Is Cerebras Really "the Next Nvidia"?
- Key takeaway: Cerebras (CBRS), a rising star in AI chips, claims its wafer-scale chip technology is 20 times faster than Nvidia's; its stock surged over 100% on its IPO debut. Its core advantages are its ultra-large-scale chip architecture and its deep ties to OpenAI (including a $20 billion order and a potential equity stake), but it cannot displace Nvidia in the short term, and its market position is closer to that of a high-end niche player.
- Key points:
- IPO performance: Cerebras opened for trading last night, surging from its $185 IPO price to an intraday high of $385 (a gain of over 108%) before retreating to $311, still up more than 68%.
- Technical differentiation: its flagship WSE-3 chip measures 46,225 mm² (about the size of a dinner plate), packs 4 trillion transistors and 900,000 AI cores, and delivers 125 petaflops of compute; its single-wafer design avoids multi-GPU interconnect bottlenecks.
- Business results: 2025 revenue of $510 million (up 76% year-over-year) and already profitable; it has signed a framework contract with OpenAI for over $20 billion of compute, under which OpenAI plans to purchase its servers over three years and receive equity as part of the deal.
- Relationship with OpenAI: OpenAI founders Sam Altman and Greg Brockman were early angel investors in Cerebras; in December 2025, OpenAI extended a $1 billion working-capital loan; the prospectus shows OpenAI may receive roughly 33.44 million warrants at an extremely low exercise price, amounting to a 10%-11% stake if fully exercised.
- Competitive limits: four main reasons Cerebras cannot displace Nvidia in the short term: the vast CUDA ecosystem gap (developers are hard to switch), weak economies of scale (Nvidia's 2025 revenue is in the tens of billions of dollars versus Cerebras' $510 million), high manufacturing costs (wafer-scale chips face major yield challenges and high unit prices), and direct competition from Groq, AMD, Google TPU, and others.
- Stock outlook: buoyed by the AI boom and the compute gap, there is still short-term upside; over the next 2-3 years, order conversion and shifts in inference demand will be the key variables, and the stock will come under pressure if results fall short.
Original | Odaily Planet Daily (@OdailyChina)
Author | Wenser (@wenser2010)
Last night, Cerebras (CBRS), touted as "the next Nvidia," officially opened for trading. Shortly after the open, its share price surged from its IPO price of $185 to $350, then hit an intraday high of $385, a gain of over 108%. Although the stock has since retreated to around $311, it is still up more than 68%. Previously, Cerebras CEO Andrew Feldman had stated in a CNBC interview: "Our chip is the size of a dinner plate and is 20 times faster than Nvidia's chips."
What gives this chipmaker, which has raised $5.5 billion, the confidence to boast such bold claims of being "faster than Nvidia's chips"? How did it manage to secure a $20 billion order from OpenAI in a fiercely competitive landscape? Will its stock price continue its upward trend in the short term? Odaily Planet Daily will provide its own analysis on these questions in this article.
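As a quick sanity check, the headline percentages follow directly from the figures above relative to the $185 IPO price:

```python
# Sanity check on the quoted gains, relative to the $185 IPO price.
ipo_price = 185
intraday_high = 385
current_price = 311

gain_at_high = (intraday_high - ipo_price) / ipo_price * 100
gain_now = (current_price - ipo_price) / ipo_price * 100
print(f"intraday high: +{gain_at_high:.0f}%")  # -> +108%
print(f"current:       +{gain_now:.0f}%")      # -> +68%
```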

Cerebras' Confidence in Challenging Nvidia: Opening a New AI World with Wafer-Scale Chips
As the gap in AI computing power continues to widen, surging market demand has propelled Nvidia to become the world's most valuable publicly listed company.
Recently, Nvidia's stock price hit new highs, with its market capitalization briefly surpassing $5.5 trillion. In terms of its market cap, it has become an economic entity second only to the GDP of the United States and China, far exceeding major global economies like Germany and Japan, truly making it "richer than a nation."

But unlike the decades-old "veteran powerhouse" Nvidia, Cerebras (CBRS) is a relative newcomer in the chip manufacturing industry.
In 2016, industry veterans Andrew Feldman, Gary Lauterbach, Sean Lie, Michael James, and JP Fricker co-founded Cerebras Systems, headquartered in Sunnyvale, California. Unlike Nvidia's technical strategy of building general-purpose GPUs to maximize market demand, Cerebras' core innovation lies in its Wafer Scale Engine (WSE), currently the world's largest AI chip.

Cerebras founding team of five (2022)
Its core products include:
- WSE-3: Approximately 46,225 mm² in size (about the size of a dinner plate), containing 4 trillion transistors and 900,000 AI-optimized cores, delivering 125 petaflops of computing power. Unlike traditional GPUs, it turns an entire wafer into a single massive processor, avoiding the bottlenecks of multi-GPU interconnects, with 44GB of on-chip SRAM and extremely high memory bandwidth.
- CS-3 System: An AI supercomputer based on the WSE-3, supporting both training and inference. Currently, Cerebras not only sells chips but also offers cloud services (Cerebras Inference), dedicated data centers, and local deployment technical support.
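For a sense of scale, a back-of-the-envelope comparison against a conventional flagship GPU die lands close to the company's quoted ~58x size figure. The ~814 mm² baseline below is an assumption (the published Nvidia H100 die size), not a figure from the article; Cerebras' own comparisons may use a different baseline:

```python
# Die-size comparison using the article's WSE-3 figures plus an assumed
# conventional GPU die of ~814 mm^2 (the published Nvidia H100 die size).
WSE3_AREA_MM2 = 46_225        # roughly a dinner plate
GPU_DIE_MM2 = 814             # assumption, not from the article
TRANSISTORS = 4e12            # 4 trillion, per the article

area_ratio = WSE3_AREA_MM2 / GPU_DIE_MM2
density = TRANSISTORS / WSE3_AREA_MM2 / 1e6   # millions per mm^2

print(f"WSE-3 is ~{area_ratio:.0f}x the area of a single GPU die")  # ~57x
print(f"~{density:.0f}M transistors per mm^2")
```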
In terms of business model, Cerebras primarily provides ultra-low latency inference for clients like OpenAI, Meta, Perplexity, Mistral, GSK, and the Mayo Clinic. In 2025, Cerebras generated annual revenue of $510 million (up 76% year-over-year), achieved profitability, and is backed by massive order backlogs (including a multi-year, multi-hundred-megawatt compute contract with OpenAI).

Cerebras WSE-3 chip diagram
On the day of its IPO on May 14, Cerebras CEO Andrew Feldman also addressed the company's operational status, its technological moat, and future market direction in a CNBC "Squawk Box" interview:
- Firstly, Feldman stated that the IPO was "the right way to fund our growth," emphasizing that the company is mature and public markets can support significant growth opportunities. He called it the culmination of a decade of effort, expressed immense pride, and noted that the market "understood our story and responded positively."
- Secondly, he repeatedly stressed that Cerebras is the only company in 70 years to have successfully manufactured a "giant chip," with all other attempts failing, making its "technical moat both wide and deep." It was here he mentioned that Cerebras' chip is 58 times larger than competitors like Nvidia's and runs 15-20 times faster, significantly accelerating AI inference and training.
- Finally, addressing concerns about the sustainability of AI spending, Feldman stated that demand is "enormous and continuously growing." The company's chips transform the AI experience (faster responses, real-time agents, etc.). He highlighted key partnerships with OpenAI, AWS, and others, and expressed optimism about the overall AI hardware environment.

On a related note, similar to Musk and Anthropic's bet on "space data centers" (recommended reading: "Musk and Anthropic Are Going to Space to Find Electricity"), Feldman boldly predicted that "data centers in space could very well become a reality within 15 years," showcasing his unwavering confidence in the long-term buildout and rapid expansion of AI infrastructure.
Thus, as the "speed demon" of the AI chip field, Cerebras has broken through by focusing on extreme performance for large models, emerging as a formidable challenger to Nvidia in large-model inference and large-scale training applications.
In this regard, OpenAI's $20 billion order provides ample confidence for its development, and the partnership between the two goes far beyond a simple "chipmaker" and "chip buyer" relationship.
The Complex Relationship Between Cerebras and OpenAI: Customer, Creditor, and Potential Major Shareholder
The connection between Cerebras and OpenAI has a long history. Beyond corporate collaborations, OpenAI founder Sam Altman and co-founder Greg Brockman were early angel investors in Cerebras, holding small stakes. This is likely a significant reason for the deep, multi-faceted ties the two companies now share.
In December 2025, OpenAI provided Cerebras with a $1 billion Working Capital Loan, establishing a creditor-borrower relationship between the two parties.
In January of this year, the "750MW Inference Compute Procurement Agreement" between Cerebras and OpenAI was officially unveiled, with subsequent announcements emphasizing an option to expand the partnership to 2GW; the news was confirmed again in April. According to media reports, OpenAI plans to spend over $20 billion over the next three years to purchase servers powered by Cerebras chips and will receive equity in the company as part of the deal. This makes OpenAI Cerebras' largest customer, without question.

Image source: @Xingpt
Cerebras' subsequent S-1 filing and IPO documents show that OpenAI is expected to obtain approximately 33.44 million warrants for Cerebras shares at an extremely low exercise price of $0.00001 per share. Some of these warrants include vesting conditions, such as compute delivery dates and milestone requirements like Cerebras achieving a market cap exceeding $40 billion.
If all warrants are exercised and conditions met, OpenAI could obtain approximately 10%-11% equity (the exact percentage depends on the total share count post-IPO). Based on the ~$56 billion valuation at the time of the IPO pricing, this stake is worth approximately $5-6 billion; based on the current market cap (which neared $95 billion after the first day of trading), this stake is worth over $10.3 billion. Although not fully exercised yet, referring to OpenAI as a "potential major shareholder of Cerebras" is beyond doubt.
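Using the article's figures, the implied value of the stake is straightforward arithmetic. This is a sketch: the 10%-11% range is the filing's estimate, and the exact figure depends on the post-IPO share count:

```python
# Implied value of OpenAI's potential stake, using the article's figures.
warrants = 33_440_000            # ~33.44M warrants
exercise_price = 0.00001         # per share, per the S-1
stake_range = (0.10, 0.11)       # estimated stake if fully exercised

for label, market_cap in [("IPO pricing (~$56B)", 56e9),
                          ("post-debut (~$95B)", 95e9)]:
    low, high = (s * market_cap / 1e9 for s in stake_range)
    print(f"{label}: stake worth ${low:.1f}B-${high:.1f}B")

# The cost to exercise is negligible by comparison (a few hundred dollars):
print(f"total exercise cost: ${warrants * exercise_price:,.2f}")
```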

Image source: @Xingpt
Whether Cerebras Can Become the Next Nvidia Remains Uncertain, but Short-Term Stock Price May Continue to Rise
Returning to the third question posed at the outset: can Cerebras become the next Nvidia?
From an industry landscape perspective, the answer is likely no. There are four main reasons:
- First, the ecosystem gap is enormous: As the absolute leader in chip manufacturing, Nvidia's CUDA software stack is the undisputed industry standard, with countless developers, technical frameworks, and toolchains reliant upon it. While Cerebras has its own software stack, it is far from achieving the maturity and compatibility of CUDA, making the switching cost prohibitively high for many developers and enterprises.
- Second, their scale and diversification paths differ vastly: In 2025, Nvidia's revenue was tens of billions of dollars, and its GPUs cover full-stack scenarios including training, inference, graphics, automotive, and data centers. Jensen Huang even boldly predicted at CES 2026 that "the AI chip and infrastructure market could reach $1 trillion by 2027," with Nvidia poised to claim the largest share. In contrast, Cerebras' 2025 revenue was only $510 million, with its customer base relatively concentrated on a single giant like OpenAI, making it less resilient to risk.
- Third, their manufacturing and cost structures differ: an extremely large AI chip means not only greater speed but also greater manufacturing difficulty and cost. Cerebras' wafer-scale chip consumes an entire wafer per chip, leading to greater yield challenges and higher unit costs at TSMC (a single CS-3 system costs significantly more than a single GPU). Nvidia, by contrast, can cut dozens of GPUs from a single wafer, achieving stronger economies of scale and better unit economics.
- Fourth, they face different competitive pressures: Unlike Nvidia's dominant industry position, Cerebras faces direct competition from numerous players like Groq, AMD, Google TPU, and AWS Trainium. Despite its current strong momentum, Cerebras is constrained by time, capital, and resources, positioning it more like a "high-end niche player" rather than a "market dominator."
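The manufacturing argument in the third point can be made concrete with a standard dies-per-wafer approximation. This is a simplified sketch assuming a 300 mm wafer and an ~814 mm² conventional GPU die (neither figure is from the article); real foundry yield models are considerably more involved:

```python
import math

# Standard dies-per-wafer approximation for a 300 mm wafer; ignores
# defect density, scribe lines, and rectangular-die packing details.
WAFER_DIAMETER_MM = 300
GPU_DIE_MM2 = 814              # assumed flagship GPU die area

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
dies_per_wafer = int(wafer_area / GPU_DIE_MM2
                     - math.pi * WAFER_DIAMETER_MM
                       / math.sqrt(2 * GPU_DIE_MM2))

print(f"~{dies_per_wafer} conventional GPU dies per wafer, "
      f"versus exactly 1 wafer-scale engine")
```

At dozens of candidate dies per wafer, a single defect typically scraps only one small die; a wafer-scale part must instead build in enough redundancy to route around every defect on the wafer, which is precisely the yield challenge described above.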
Based on the above, Cerebras cannot grow into an industry giant like Nvidia in the short term, nor can it disrupt the current competitive landscape. However, from a stock price comparison perspective, its per-share price has already surpassed Nvidia's. Furthermore, fueled by the booming AI trend and the growing computing power gap, Cerebras' stock price and market cap may still have some room to rise this year, before OpenAI and Anthropic go public.
Over the next 2-3 years, if Cerebras can successfully convert orders from OpenAI, AWS, and others into actual revenue, its stock price could climb further. However, if order performance falls short of market expectations or if demand for AI model inference changes, its stock price could face significant downward pressure.
In summary, within 1-3 years, Cerebras is unlikely to replace Nvidia, but it can carve out a significant share in the AI infrastructure niche market, becoming the "King of AI Chip Speed." As for the longer-term competitive landscape, it still needs time to play out.
Recommended Reading
A Decade-Long Bet on Cerebras: How 'Wafer-Scale AI Chips' Landed on Nasdaq
Cerebras AI Chip Breaks Nvidia's Monopoly: A Deep Dive into Cerebras' Technical Design


