
Cerebras Signs MoU to Help Accelerate the Deployment of AI

SUNNYVALE, Calif. & RIYADH, Saudi Arabia--(BUSINESS WIRE)--Cerebras Systems today announced the signing of a Memorandum of Understanding (MoU) with Aramco, under which the two companies aim to bring high-performance AI inference to industries, universities, and enterprises in Saudi Arabia. Aramco plans to build, train, and deploy world-class large language models (LLMs) using Cerebras’ industry-leading CS-3 systems to help accelerate AI innovation.

Aramco’s new high-performance AI computing infrastructure is expected to focus on advancing AI adoption and providing local industries, enterprises, and universities with access to Cerebras’ CS-3 AI systems. These organizations aim to use Cerebras’ industry-leading AI systems to develop cutting-edge LLMs, sized and tuned for optimal performance to meet local business requirements.

Andrew Feldman, Cerebras co-founder and CEO, said: “We are privileged to be working with Aramco to bring high-performance, low-latency compute and new AI applications to local industries, enterprises, and universities. Together, we plan to accelerate the possibilities of AI, helping to enhance capabilities and create new opportunities for local businesses to foster creativity, unlock value, and promote sustainability.”

Nabil Al Nuaim, Aramco SVP of Digital & Information Technology, said: “This MoU with Cerebras aims to accelerate our ability to develop an AI-powered digital innovation economy in Saudi Arabia by helping to support the integration of advanced AI solutions, unlocking new opportunities for the country and localizing cutting-edge technologies with regional expertise.”

Under the new MoU, Aramco plans to equip its cloud computing business with the new CS-3 systems to accelerate LLM and AI application development.

For more information, please visit https://cerebras.ai/.

About Cerebras Systems

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to accelerate generative AI by building a new class of AI supercomputer from the ground up. Our flagship product, the CS-3 system, is powered by the world’s largest and fastest AI processor, our Wafer-Scale Engine-3. CS-3s are quickly and easily clustered together to make the largest AI supercomputers in the world, and they make placing models on the supercomputers dead simple by avoiding the complexity of distributed computing. Leading corporations, research institutions, and governments use Cerebras solutions for the development of pathbreaking proprietary models and to train open-source models with millions of downloads. Cerebras solutions are available through the Cerebras Cloud and on-premises. For further information, visit www.cerebras.ai or follow us on LinkedIn or X.

Contacts

Cerebras Systems

