Is this Startup IPO an Opportunity to Own the Next Big AI Stock?

There is no shortage of AI startups. But Cerebras Systems is making rapid compute gains.
Neither the author, Tim Fries, nor this website, The Tokenist, provides financial advice. Please consult our website policy prior to making financial decisions.

One month after unveiling its AI inference tool, Cerebras Inference, at the end of August, Cerebras Systems filed its Form S-1 for an initial public offering (IPO) on Tuesday. The AI-focused startup plans to go public on Nasdaq under the ticker CBRS, putting a public price tag on its valuation.

However, the filing with the Securities and Exchange Commission (SEC) did not disclose the expected share price or the number of Class A common shares to be offered. Citigroup and Barclays, together with UBS Investment Bank, Wells Fargo Securities, Mizuho, and TD Cowen, are leading the IPO.

Investors are now wondering if Cerebras Systems is a mini-Nvidia in the making. What can the 8-year-old startup bring to the table to push the generative AI envelope?

Cerebras Systems AI Supercomputing

Focused on large-scale operations, Cerebras Systems aims to drastically speed up the training of AI models with its flagship third-generation CS-3 system. Powered by the Wafer Scale Engine 3 (WSE-3) chip, which packs 4 trillion transistors and 900,000 processor cores, the system can scale to a massive 1,200 TB of external memory, enough to train models with up to 24 trillion parameters.

For comparison, GPT-4 is estimated to have about 1.8 trillion parameters, roughly 10x more than GPT-3, the model that kick-started the current AI hype. Suffice it to say, Cerebras Systems' 24-trillion-parameter ceiling represents another major leap.

In partnership with Abu Dhabi-based holding group G42, the company is expected to launch the Condor Galaxy 3 supercomputer, a cluster built on CS-3 systems, in the second half of 2024.

“Condor Galaxy 3 and the follow-on supercomputers will together deliver tens of exaflops of AI compute. This marks a significant milestone in AI computing, providing unparalleled processing power and efficiency.”

Andrew Feldman, Cerebras Systems CEO

First announced in March, Condor Galaxy 3 is purported to bring 8 exaFLOPs of compute power (from 64 CS-3 systems) to the table. For comparison, Microsoft Azure’s Eagle cluster has a theoretical peak performance of 561 petaFLOPs. One exaFLOP represents a billion billion (10^18) calculations per second, or 1,000 petaFLOPs.
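As a quick back-of-the-envelope sketch using only the figures cited above (the two clusters' numbers may be quoted for different workloads and precisions, so the ratio is indicative at best), the unit conversion puts Condor Galaxy 3's headline figure in context:

```python
# Back-of-the-envelope comparison using only the figures cited above.
# Note: the clusters' numbers may be quoted for different workloads and
# precisions, so the resulting ratio is indicative at best.
CONDOR_GALAXY_3_EXAFLOPS = 8      # claimed AI compute from 64 CS-3 systems
AZURE_EAGLE_PETAFLOPS = 561       # theoretical peak of Microsoft Azure's Eagle

# 1 exaFLOP = 1,000 petaFLOPs = 10**18 calculations per second
condor_petaflops = CONDOR_GALAXY_3_EXAFLOPS * 1_000

ratio = condor_petaflops / AZURE_EAGLE_PETAFLOPS
print(f"Condor Galaxy 3: {condor_petaflops:,} petaFLOPs "
      f"(~{ratio:.0f}x Eagle's {AZURE_EAGLE_PETAFLOPS} petaFLOPs)")
```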

In practical terms, Cerebras claims the CS-3 delivers double the compute power of its predecessor, the CS-2, at the same power consumption and cost.

Cerebras Systems Financial Outlook

With this compute power in play, the Sunnyvale-based Cerebras Systems offers either a pay-per-model or a pay-per-hour business model for training large language models (LLMs). The customizable LLMs range from company-specific customer-service chatbots, AI coding assistants, and document summarization to content moderation and a yet-to-be-rolled-out image-based Q&A service.

The aforementioned Cerebras Inference tool, launched on August 27th, can deliver up to 1,800 tokens per second, purportedly 20x faster than Nvidia GPU-based solutions running Llama 3.1 8B (Llama 3.2 was released on September 25th). A token is the smallest unit of text an AI model processes.

For comparison, GPT-4’s median output speed is around 65.54 tokens per second.
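As a rough sketch of what those throughput figures imply (keeping in mind they come from different models and benchmarks, so this is not an apples-to-apples comparison):

```python
# Rough throughput comparison built only from the figures quoted above.
CEREBRAS_INFERENCE_TPS = 1_800   # claimed tokens/second on Llama 3.1 8B
GPT4_MEDIAN_TPS = 65.54          # cited median for GPT-4, a different, larger model

print(f"~{CEREBRAS_INFERENCE_TPS / GPT4_MEDIAN_TPS:.0f}x the cited GPT-4 median")

# The "20x faster than Nvidia GPU-based solutions" claim implies a baseline of roughly:
print(f"Implied GPU baseline: ~{CEREBRAS_INFERENCE_TPS / 20:.0f} tokens/second")
```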

However, due to the heavy capital expenditure required to push computing power to the next level, Cerebras Systems has yet to show profitability. Although the company tripled its revenue in 2023 to $78.7 million from $24.6 million in 2022, both years delivered net losses, at $127.2 million and $177.7 million respectively. In the six months ending June 2024, the trend continued with a net loss of $66.6 million, a 14% reduction from the $77.8 million loss in the year-ago period.
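The year-over-year math checks out, as this simple sketch using the figures above shows:

```python
# Sanity check of the cited S-1 figures (all values in $ millions).
revenue_2022, revenue_2023 = 24.6, 78.7
h1_2023_loss, h1_2024_loss = 77.8, 66.6

print(f"Revenue growth 2022 -> 2023: {revenue_2023 / revenue_2022:.1f}x")        # ~3.2x
print(f"H1 net loss reduction: {(1 - h1_2024_loss / h1_2023_loss) * 100:.0f}%")  # ~14%
```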

In its latest funding round, in November 2021, Cerebras Systems raised $250 million. Across six funding rounds since 2016, the company has raised $720 million in total, lifting its valuation to roughly $4 billion.

Cerebras Systems Competitors

During 2024, Cerebras also increased operating expenses as it hired more talent. As expected, the startup cites Nvidia (NASDAQ: NVDA), AMD (NASDAQ: AMD), Intel (NASDAQ: INTC), Microsoft (NASDAQ: MSFT), and Alphabet (NASDAQ: GOOG) as its main competitors. Additionally, Groq and Graphcore compete with Cerebras in the niche of scaling up AI inference.

Interestingly, Groq was founded by former Google engineers Jonathan Ross and Douglas Wightman in 2016, the same year as Cerebras Systems. The startup’s HPC cluster GroqRack, coupled with GroqCloud, can reach up to 300 tokens per second on Llama 3 70B. No doubt there will be a pricing war between these startups, benefiting end customers.

But if Cerebras Systems ends up attracting more IPO capital than expected, it could afford to get ahead of other startups and start competing directly against Big Tech companies.

Have you considered training your own AI model using these companies? Let us know in the comments below.

Disclaimer: The author does not hold or have a position in any securities discussed in the article.
