Cerebras Just Raised Its IPO Price to $150. It Was $115 on Monday. This Is the AI Hardware Listing Every Investor Is Watching.

The price range for Cerebras Systems' initial public offering was $115 to $125 per share when the company filed its amended S‑1 on May 4. By Sunday, May 10, Reuters was reporting that the company was considering raising the range to $150 to $160 per share, increasing the number of shares offered from 28 million to 30 million, and targeting a raise of approximately $4.8 billion, up from $3.5 billion under the original terms.
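The revised terms can be sanity-checked with simple arithmetic. A back-of-the-envelope sketch using the figures reported above (actual proceeds depend on final pricing and any overallotment option, which are not covered here):

```python
# Gross proceeds implied by the reported deal terms (figures from the
# article; final proceeds depend on pricing and any greenshoe).
shares_revised = 30_000_000
for price in (150, 160):
    print(f"${price}/share x 30M shares -> ${shares_revised * price / 1e9:.1f}B")

# Original terms for comparison: 28M shares at $115-$125.
shares_original = 28_000_000
for price in (115, 125):
    print(f"${price}/share x 28M shares -> ${shares_original * price / 1e9:.2f}B")
```

The top of the revised range matches the reported $4.8 billion target, and the top of the original range matches the prior $3.5 billion figure.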
The reason is demand. The original offering was oversubscribed by more than 20 times, meaning institutional investors placed orders worth more than $20 for every $1 of shares available. When an IPO is 20 times oversubscribed, the lead underwriters have the leverage to reprice. Cerebras and its underwriters — Morgan Stanley, Citigroup, Barclays, and UBS — are exercising that leverage.
Shares are expected to begin trading on the Nasdaq Global Select Market under the ticker symbol CBRS on May 13 or 14, 2026. At the top of the new range, this would be the largest tech IPO globally so far this year, according to Dealogic.
What Cerebras Builds and Why It Matters
Cerebras Systems is a semiconductor company founded in 2016 by CEO Andrew Feldman and co‑founders Sean Lie, Gary Lauterbach, Michael James, and Jean‑Philippe Fricker. The company's core product is the Wafer‑Scale Engine, a chip that uses the entire silicon wafer as a single processing unit rather than cutting the wafer into smaller chips and assembling them into a multi‑chip module.
The WSE‑3, the third generation of this design, is approximately 57 times larger than the largest competing GPU die. The size advantage translates directly into performance metrics that matter for AI inference specifically:
- Far more on‑chip memory, allowing larger model KV‑caches to reside entirely on silicon without DRAM communication latency.
- More processing cores in direct communication without inter‑chip interconnect overhead.
- Higher interconnect bandwidth between cores than any multi‑chip GPU cluster can achieve.
- Lower latency for inference operations compared to GPU‑based systems at equivalent compute capacity.
This performance profile makes Cerebras specifically competitive for inference, the computations that allow AI models to respond to user queries, rather than for training, where Nvidia's GPU architecture has entrenched advantages due to the CUDA software ecosystem. As AI labs shift from training models to deploying them, the inference market is growing faster than training, which is precisely why Cerebras' customer base has accelerated.
The OpenAI Relationship and the Financial Picture
The financial transformation that made Cerebras IPO‑ready is largely traceable to a single commercial relationship.
In January 2026, Cerebras announced it would provide up to 750 megawatts of AI computing power to OpenAI through 2028 in a transaction worth more than $20 billion. The company's S‑1 also discloses that OpenAI loaned Cerebras $1 billion, secured by warrants that allow OpenAI to buy over 33 million shares. So while OpenAI is not a large shareholder today, it could become one.
Revenue grew 76 percent year‑over‑year to $510 million in 2025. The company posted $87.9 million in GAAP net income for 2025, a roughly 17 percent net margin that is rare for a hardware company at this stage. The comparison to 2024, when Cerebras posted a loss of approximately $485 million, reflects the step‑change impact of the OpenAI deal ramp.
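The margin and growth figures can be verified directly from the numbers above (the implied 2024 revenue is a derived estimate, not a figure from the filing):

```python
revenue_2025 = 510e6       # 2025 revenue, per the S-1
net_income_2025 = 87.9e6   # 2025 GAAP net income, per the S-1

net_margin = net_income_2025 / revenue_2025
print(f"Net margin: {net_margin:.1%}")  # ~17.2%

# 76% YoY growth implies 2024 revenue of roughly $290M (derived, not filed).
implied_revenue_2024 = revenue_2025 / 1.76
print(f"Implied 2024 revenue: ${implied_revenue_2024 / 1e6:.0f}M")
```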
The risks that the S‑1 discloses are specific and material. Customer concentration is the primary one: G42, the Abu Dhabi‑based AI company, accounted for approximately 62 percent of 2025 revenue, while OpenAI accounted for approximately 24 percent. The top two customers together represent 86 percent of revenue. CEO Andrew Feldman is not selling any shares in the IPO, which is a positive governance signal but does not change the concentration risk.
AWS is also a customer, with plans to deploy Cerebras CS‑3 systems in AWS data centers accessible through Amazon Bedrock. The AWS relationship is commercially significant because it provides a distribution channel beyond direct enterprise sales, reducing the concentration risk somewhat over time.
The company's $24.6 billion remaining performance obligations — essentially contracted backlog — is the most commercially important number in the filing for forward‑looking investors. It confirms that the OpenAI and AWS relationships generate predictable long‑term revenue rather than one‑off transactions.
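One way to read the backlog number is as a multiple of trailing revenue. This is a rough coverage sketch: RPO is recognized over multi‑year contract terms (the OpenAI deal runs through 2028), so the ratio is not a literal years‑of‑revenue figure.

```python
rpo = 24.6e9          # remaining performance obligations, per the filing
ttm_revenue = 510e6   # 2025 revenue, per the S-1

coverage = rpo / ttm_revenue
print(f"Backlog covers ~{coverage:.0f}x trailing annual revenue")
```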
What the Oversubscription Tells the Market
A 20x oversubscription on a $3.5 billion offering means approximately $70 billion of institutional investor orders were placed for an offering of $3.5 billion. That is not a rounding error. It is a market‑wide declaration that AI hardware IPOs at this stage of the AI buildout cycle are viewed as scarce access to a category that public market investors believe will compound significantly over the next five years.
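The arithmetic behind that claim, plus the allocation rate it would imply for the average institutional order (a simplification: real IPO allocations are discretionary, not pro rata):

```python
offering_size = 3.5e9   # original offering, per the article
oversubscription = 20   # reported oversubscription multiple

total_orders = offering_size * oversubscription
print(f"Implied order book: ${total_orders / 1e9:.0f}B")

# If allocations were pro rata (they aren't), each order would be ~5% filled.
fill_rate = 1 / oversubscription
print(f"Pro-rata fill rate: {fill_rate:.0%}")
```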
The revised $150 to $160 range puts Cerebras at roughly 51 to 53 times its 2025 trailing revenue of $510 million. That multiple is aggressive by traditional chip company standards but consistent with the valuation framework investors are applying to AI infrastructure companies more broadly. CoreWeave, which went public in March at a $23 billion valuation, was not yet profitable at a comparable revenue scale.
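The revenue multiple backs into an implied market capitalization (an inference from the article's 51‑to‑53x figure; the filing's total share count, which that range presumably reflects, is not given here):

```python
trailing_revenue = 510e6  # 2025 revenue, per the S-1
for multiple in (51, 53):
    implied_cap = multiple * trailing_revenue
    print(f"{multiple}x revenue -> ~${implied_cap / 1e9:.0f}B implied market cap")
```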
Whether Cerebras earns its post‑IPO valuation depends on three things: whether revenue growth from OpenAI and AWS accelerates beyond the 76 percent 2025 rate, whether a second and third major hyperscaler customer is added to reduce concentration risk, and whether the software and cloud services layer on top of Cerebras silicon can expand gross margins beyond the hardware business alone.
More at cerebras.net





