
Hugging Face Partners with Cerebras to Give Developers Access to Industry’s Fastest AI Inference for Open-Source Models

SUNNYVALE, Calif.--(BUSINESS WIRE)--Cerebras and Hugging Face today announced a new partnership to bring Cerebras Inference to the Hugging Face platform. Hugging Face has integrated Cerebras into the Hugging Face Hub, bringing the world’s fastest inference to the platform’s more than five million developers. Cerebras Inference runs the industry’s most popular models at more than 2,000 tokens/s – 70x faster than leading GPU solutions. Cerebras Inference models, including Llama 3.3 70B, will be available to Hugging Face developers, enabling seamless API access to AI models powered by the Cerebras CS-3.

Cerebras recently announced industry-leading speeds for Llama 3.3 70B, achieving over 2,200 tokens per second – 70 times faster than GPU-based solutions. Leading industry models like OpenAI o3-mini take minutes to generate reasoning answers – Cerebras Inference completes the same tasks at comparable accuracy in mere seconds.

“We’re excited to partner with Hugging Face to bring our industry-leading inference speeds to the global developer community,” said Andrew Feldman, CEO, Cerebras. “By making Cerebras Inference available through Hugging Face, we’re empowering developers to work faster and more efficiently with open-source AI models, unleashing the potential for even greater innovation across industries.”

For the five million Hugging Face developers already using the Inference API, this new integration makes it easier than ever to switch to a faster provider for these popular open-source models. Developers can simply select “Cerebras” as their Inference Provider of choice in the Hugging Face platform.

Why Fast and Accurate Open-Source AI Inference Matters

Fast and accurate AI inference is essential for a wide range of applications, particularly as test-time compute and agentic AI drive demand for ever-higher token throughput. Because these models are open source, Cerebras can optimize them for the CS-3, delivering inference speeds 10 to 70 times faster than GPUs.

“Cerebras has been a leader in inference speed and performance, and we’re thrilled to partner to bring this industry-leading inference on open-source models to our developer community,” said Julien Chaumond, CTO of Hugging Face.

Get Started Today

To try it out, visit any of the Hugging Face model cards already supported by Cerebras Cloud. For instance, you can explore Llama 3.3 70B, select Cerebras as your provider, and experience blazing-fast inference directly via Hugging Face.

About Cerebras Systems

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to accelerate generative AI by building from the ground up a new class of AI supercomputer. Our flagship product, the CS-3 system, is powered by the world’s largest and fastest commercially available AI processor, our Wafer-Scale Engine-3. CS-3s are quickly and easily clustered together to make the largest AI supercomputers in the world, and make placing models on the supercomputers dead simple by avoiding the complexity of distributed computing. Cerebras Inference delivers breakthrough inference speeds, empowering customers to create cutting-edge AI applications. Leading corporations, research institutions, and governments use Cerebras solutions to develop pathbreaking proprietary models and to train open-source models with millions of downloads. Cerebras solutions are available through the Cerebras Cloud and on-premises. For further information, visit cerebras.ai or follow us on LinkedIn or X.

Contacts

Cerebras

