Listing Immediately Triggered a Circuit Breaker with a Single-Day Surge of Over 108% – Is Cerebras Truly the "Next Nvidia"?

Wenser
Senior Odaily Author
@wenser2010
2026-05-15 10:32
This article is about 4,096 words; reading it in full takes about 6 minutes.
Deeply intertwined with OpenAI, its stock price may continue to soar.
AI Summary
  • Core Thesis: Cerebras (CBRS), a rising star in AI chips, claims its wafer-scale chip technology is 20 times faster than Nvidia's. Its stock price surged over 100% on its IPO debut. Its core advantages lie in its ultra-large-scale chip architecture and deep ties with OpenAI (including a $20 billion order and a potential equity stake). However, it cannot replace Nvidia in the short term, and its market position is more akin to that of a high-end niche player.
  • Key Elements:
    1. IPO Performance: Cerebras opened for trading last night, surging from its $185 offering price to a high of $385, an increase of over 108%. It has since retreated to $311, still holding gains of over 68%.
    2. Technology Differentiator: Its core product, the WSE-3 chip, has a die area of 46,225 mm² (the size of a dinner plate), containing 4 trillion transistors and 900,000 AI cores, delivering 125 petaflops of computing power. Its single-wafer design avoids the bottlenecks associated with interconnecting multiple GPUs.
    3. Business Metrics: Revenue in 2025 reached $510 million (up 76% year-over-year), and the company has achieved profitability. It signed a framework agreement for over $20 billion in computing power with OpenAI, under which OpenAI plans to purchase its servers over three years and receive equity as part of the transaction.
    4. Relationship with OpenAI: OpenAI founders Sam Altman and Greg Brockman were early angel investors in Cerebras. In December 2025, OpenAI provided a $1 billion working capital loan. The prospectus indicates OpenAI may receive approximately 33.44 million warrants at an extremely low exercise price, which would grant it a 10%-11% stake upon exercise.
    5. Competitive Limitations: Four main reasons prevent Cerebras from replacing Nvidia in the short term: the vast gap in the CUDA ecosystem (making it difficult for developers to switch), weaker economies of scale (Nvidia's 2025 revenue was hundreds of billions of dollars compared to Cerebras's $510 million), high chip manufacturing costs (wafer-scale chips face significant yield challenges and high unit prices), and direct competition from Groq, AMD, and Google TPUs.
    6. Stock Price Outlook: Benefiting from the AI boom and computing power shortages, the stock likely has short-term upside potential. The conversion of orders over the next 2-3 years and changes in inference demand will be key variables; failure to meet expectations could pressure the stock price.

Original|Odaily (@OdailyChina)

Author|Wenser (@wenser2010)

Last night, Cerebras (CBRS), touted as the "next Nvidia," officially began trading. Shortly after opening, its price surged from the IPO price of $185 to $350, briefly hitting a high of $385 during the session—a gain of over 108%. Although the stock has since pulled back to around $311, it still holds a gain of over 68%. Previously, Cerebras CEO Andrew Feldman stated in a CNBC interview: "Our chip is the size of a dinner plate and 20 times faster than Nvidia's chips."

What gives this chipmaker, which has raised $5.5 billion, the confidence to make such bold claims about being faster than Nvidia? How did it secure a $20 billion order from OpenAI amidst fierce competition? Will its stock price continue its upward trend in the short term? Odaily will provide its own answers to these questions in this article.

Cerebras' Confidence Against Nvidia: Opening a New AI World with Wafer-Scale Chips

As the gap in AI computing power continues to widen, surging market demand has propelled Nvidia to become the world's most valuable publicly traded company.

Recently, Nvidia's stock hit new highs, pushing its market cap past $5.5 trillion. By market cap alone, it would rank as an economy behind only the GDP of the US and China, far surpassing major global economies like Germany and Japan: truly "a country of its own."

But unlike the decades-old "established behemoth" Nvidia, Cerebras (CBRS) is merely an upstart in the chip manufacturing industry.

In 2016, industry veterans Andrew Feldman, Gary Lauterbach, Sean Lie, Michael James, and JP Fricker co-founded Cerebras Systems, headquartered in Sunnyvale, California. Unlike Nvidia's strategy of building general-purpose GPUs to maximize market demand, Cerebras' core innovation is the Wafer Scale Engine (WSE), currently the world's largest AI chip.

Cerebras Founding Team of Five (2022)

Its core products include:

  • WSE-3: Approximately 46,225 mm² (dinner-plate sized), containing 4 trillion transistors and 900,000 AI-optimized cores, delivering 125 petaflops of computing power. Unlike traditional GPUs, it fabricates an entire wafer as a single giant processor, avoiding multi-GPU interconnection bottlenecks, with 44GB of on-chip SRAM and extremely high memory bandwidth.
  • CS-3 System: An AI supercomputer built around the WSE-3, supporting both training and inference. Beyond hardware, Cerebras also offers cloud services (Cerebras Inference), dedicated data centers, and on-premises deployment support.
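To put the single-wafer design in perspective, a back-of-envelope comparison helps. This is our own illustration, not a Cerebras figure: the GPU die area below is an assumed H100-class value, and the die count ignores edge loss and scribe lines.

```python
import math

# Compare one wafer-scale die against cutting many GPU-sized dies
# from the same 300 mm wafer.
WAFER_DIAMETER_MM = 300
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2

WSE3_AREA = 46_225    # mm^2, from the article
GPU_DIE_AREA = 814    # mm^2, roughly an H100-class die (assumption)

# Crude upper bound: ignores edge loss and scribe lines.
gpu_dies_per_wafer = int(wafer_area // GPU_DIE_AREA)

print(f"wafer area:           {wafer_area:,.0f} mm^2")
print(f"WSE-3 share of wafer: {WSE3_AREA / wafer_area:.0%}")
print(f"GPU-class dies/wafer: ~{gpu_dies_per_wafer}")
```

The WSE-3 occupies roughly two thirds of a 300 mm wafer's gross area, whereas the same wafer could in principle be diced into dozens of GPU-class dies: the trade-off behind both the interconnect advantage and the cost discussion later in this article.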

In terms of business model, Cerebras primarily provides ultra-low latency inference for OpenAI, Meta, Perplexity, Mistral, GSK, Mayo Clinic, among others. In 2025, Cerebras generated $510 million in revenue (up 76% year-over-year), achieved profitability, and is backed by massive orders (including a multi-year, hundreds of megawatts computing contract with OpenAI).

Cerebras WSE-3 Chip Diagram

On the day of its IPO, May 14, Cerebras CEO Andrew Feldman also gave positive responses on CNBC's "Squawk Box" regarding the company's operational status, technological moat, and future market direction:

  • First, Feldman stated that the IPO was "the right way to fund our growth," saying the company is mature and public markets can support massive growth opportunities. He emphasized it was the result of a decade of hard work, felt very proud, and noted the market "understood our story and responded positively."
  • Second, he repeatedly stressed that Cerebras is the only company in 70 years to successfully manufacture a "giant chip," with all other attempts failing, meaning the "technical moat is wide and deep." It was here he mentioned that Cerebras' chip is 58 times larger than competitors' like Nvidia's and runs 15-20 times faster, significantly accelerating AI inference and training.
  • Finally, addressing concerns about the sustainability of AI spending, Feldman indicated that demand is "huge and continuously growing." The company's chips cause a qualitative change in the AI experience (faster responses, real-time agents, etc.). He mentioned significant partnerships with OpenAI, AWS, and others, and is optimistic about the overall AI hardware environment.

On a side note, similar to Musk and Anthropic's bet on "space data centers" (recommended reading: "Musk and Anthropic Are Going to Space to Find Electricity"), Feldman boldly predicted that "within 15 years, data centers in space are very likely to become a reality," showing his clear confidence in the long-term construction and rapid expansion of AI infrastructure.

Thus, as a "speed geek" in the AI chip field, Cerebras has successfully broken through by focusing on the extreme performance of ultra-large-scale models, emerging as a strong challenger to Nvidia in areas like large model inference and ultra-large-scale training applications.

In this regard, OpenAI's $20 billion order gives Cerebras ample confidence in its development, and the cooperation between the two goes far beyond a simple "chipmaker" and "chip buyer" relationship.

The Complex Relationship Between Cerebras and OpenAI: Customer, Creditor, and Potential Major Shareholder

The connection between Cerebras and OpenAI goes way back. Beyond corporate-level cooperation, OpenAI founder Sam Altman, co-founder Greg Brockman, and others were early angel investors in Cerebras, holding small stakes. This may be a key reason for the deep, multi-faceted ties between the two companies today.

In December 2025, OpenAI provided Cerebras with a $1 billion working capital loan, establishing a creditor relationship between them.

In January of this year, the "750MW Inference Compute Procurement Agreement" between Cerebras and OpenAI was officially unveiled; an option to expand the cooperation to 2GW was emphasized later, and the news was confirmed again in April. According to media reports, OpenAI plans to spend over $20 billion over the next three years to purchase servers powered by Cerebras chips and will receive equity in the company as part of the deal. This made OpenAI by far Cerebras' largest customer.


Image Source: @Xingpt

Cerebras' subsequent S-1 prospectus and IPO filing documents show that OpenAI is expected to obtain approximately 33.44 million Cerebras warrants at an extremely low exercise price of $0.00001 per share. Some warrants come with vesting conditions, including compute delivery dates and milestone requirements like Cerebras' market cap exceeding $40 billion.

If all warrants are exercised and conditions are met, OpenAI could acquire approximately 10%-11% equity (the exact percentage depends on post-IPO total shares). Based on the ~$56 billion valuation at the time of the IPO pricing, this equity stake is worth about $5-6 billion; based on the current market cap (nearly $95 billion after the IPO's first day close), this stake is worth over $10.3 billion. Although not fully exercised yet, calling OpenAI a "potential major shareholder" of Cerebras is beyond doubt.
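The figures above follow from simple arithmetic on the disclosed numbers. The sketch below reproduces them, taking the stake percentage as the article's 10%-11% range and the valuations as quoted:

```python
# All inputs are from the article; outputs reproduce its quoted ranges.
warrants = 33_440_000
exercise_price = 0.00001       # USD per share, per the S-1

stake_low, stake_high = 0.10, 0.11

ipo_valuation = 56e9           # ~$56B at IPO pricing
day1_market_cap = 95e9         # ~$95B after the first-day close

ipo_value = (stake_low * ipo_valuation, stake_high * ipo_valuation)
day1_value = (stake_low * day1_market_cap, stake_high * day1_market_cap)

# At $0.00001 per share, exercising everything costs only a few hundred dollars.
cost_to_exercise = warrants * exercise_price

print(f"stake value at IPO pricing: ${ipo_value[0]/1e9:.1f}B-${ipo_value[1]/1e9:.1f}B")
print(f"stake value at day-1 cap:   ${day1_value[0]/1e9:.1f}B-${day1_value[1]/1e9:.1f}B")
print(f"cost to exercise all:       ${cost_to_exercise:,.2f}")
```

The asymmetry is the point: a stake worth billions at current prices against an exercise cost of a few hundred dollars, which is why the warrants function as equity compensation tied to compute-delivery and market-cap milestones rather than as a conventional investment.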


Image Source: @Xingpt

Whether Cerebras Can Become the Next Nvidia Remains Uncertain, But Stock Price May Continue to Rise in the Short Term

Returning to the third question posed at the outset: Can Cerebras become the next Nvidia?

From an industry landscape perspective, the answer is undoubtedly no. There are four main reasons:

  • First, a huge gap in ecosystem: As the absolute hegemon in chip manufacturing, Nvidia's CUDA software stack is the undisputed industry standard, with countless developers, frameworks, and toolchains built upon it. While Cerebras has its own software stack, it is far from CUDA's maturity and compatibility, making the switching cost prohibitively high for many developers and enterprises.
  • Second, differences in scale and diversification strategy: In 2025, Nvidia's revenue reached hundreds of billions of dollars, with GPUs covering every scenario: training, inference, graphics, automotive, and data centers. Jensen Huang even boldly predicted at CES 2026 that "the AI chip and infrastructure market could reach $1 trillion by 2027," with Nvidia holding the largest share. In contrast, Cerebras' 2025 revenue was only $510 million, with customers concentrated around a single giant like OpenAI, making it less resilient to risk.
  • Third, differences in chip manufacturing and cost control: Ultra-large AI chips bring not only faster speeds but also greater manufacturing difficulty and cost. Cerebras' wafer-scale chip consumes an entire wafer per chip, creating significant yield challenges at TSMC and high unit costs (a CS-3 system costs far more than a single GPU). Nvidia, by contrast, can cut dozens of GPUs from one wafer, achieving stronger economies of scale and better unit economics.
  • Fourth, different competitive pressures in the chip industry: Unlike Nvidia's dominant position, Cerebras faces direct competition from multiple industry players like Groq, AMD, Google TPU, and AWS Trainium. Despite its current strong momentum, constrained by time, capital, and resources, its current positioning is more like a "high-end niche player" rather than a "market dominator."
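The yield point above can be made concrete with the classic Poisson die-yield model, yield ≈ exp(-area × defect density). The defect density below is an assumed illustrative value, not a TSMC figure; the sketch shows why a completely defect-free wafer-scale die is essentially impossible, and hence why Cerebras designs in redundant cores to route around defects instead of relying on perfect wafers:

```python
import math

# Poisson die-yield model: probability a die of a given area has zero defects.
D0 = 0.001            # defects per mm^2 (0.1 per cm^2, an assumed value)

gpu_die_area = 814    # mm^2, roughly a large GPU die (assumption)
wse3_area = 46_225    # mm^2, from the article

gpu_yield = math.exp(-gpu_die_area * D0)  # a plausible ~44% of dies are clean
wse_yield = math.exp(-wse3_area * D0)     # effectively zero clean wafers

print(f"GPU-class die yield: {gpu_yield:.1%}")
print(f"wafer-scale yield:   {wse_yield:.2e}")
```

Under this toy model, a GPU-sized die comes out defect-free nearly half the time, while a defect-free wafer-scale die is a ~10⁻²⁰ event. Defect tolerance through spare cores is therefore not an optimization for Cerebras but a precondition, and that redundancy is part of the cost overhead the bullet above describes.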

Based on the above, Cerebras cannot grow into an industry giant like Nvidia in the short term, nor can it disrupt the existing competitive landscape. (Its per-share price has already surpassed Nvidia's, though that comparison says little about relative value.) Still, benefiting from the booming AI craze and the widening computing power gap, and ahead of the anticipated OpenAI and Anthropic IPOs later this year, Cerebras' stock price and market cap may retain some upside potential.

Over the next 2-3 years, if it can successfully convert orders from OpenAI, AWS, etc., into actual revenue, Cerebras' stock price could explore further highs. However, if order performance falls short of market expectations or if demand for AI model inference changes, its stock price will face significant downward pressure.

In summary, within 1-3 years, Cerebras is unlikely to replace Nvidia, but it can carve out a certain share in the AI infrastructure niche market, becoming the "King of AI Chip Speed." As for the longer-term competitive landscape, it still needs time to be proven.

Recommended Reading

A Decade-Long Bet on Cerebras: How the 'Wafer-Scale AI Chip' Made It to Nasdaq

Cerebras AI Chip Breaks Nvidia's Monopoly: A Deep Dive into Cerebras' Technical Design

technology
AI
Musk