
AI2, the Allen Institute for AI, Today Releases OLMo: A State-of-the-Art, Truly Open-Source LLM and Framework

The OLMo framework will drive a critical shift in AI development by providing the industry with a large, accurate, and fully open language model framework, creating an alternative to current models that are restrictive and closed.

SEATTLE--(BUSINESS WIRE)--As the world races to deploy AI models that are effective and safe, the demand for open large language models (LLMs) has exploded. The massive adoption of both open and closed AI models means that AI capabilities have outpaced our ability to understand how they are created. Releasing the OLMo framework gives the industry an opportunity to understand what is going on inside AI models.

Today, the Allen Institute for AI (AI2) released OLMo 7B, a truly open-source, state-of-the-art large language model, made available alongside its pre-training data and training code, something no other open model of this scale offers today. This empowers researchers and developers to use the best open models to collectively advance the science of language models.

“Open foundation models have been critical in driving a burst of innovation and development around generative AI,” said Yann LeCun, Chief AI Scientist at Meta. “The vibrant community that comes from open source is the fastest and most effective way to build the future of AI.”

OLMo and its framework are designed to aid researchers in training and experimenting with large language models. They are available for direct download on Hugging Face and GitHub. This work was made possible, in part, through a collaboration with the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University and partners including AMD, CSC - IT Center for Science (Finland), the Paul G. Allen School of Computer Science & Engineering at the University of Washington, and Databricks.

The framework features a suite of completely open AI development tools, including:

  • Full pretraining data: The model is built on AI2’s Dolma dataset, a three-trillion-token open corpus for language model pretraining, and includes the code that produces the training data.
  • Training code and model weights: The OLMo framework includes full model weights for four model variants at the 7B scale, each trained to at least 2T tokens. Inference code, training metrics and training logs are all provided.
  • Evaluation: We’ve released the evaluation suite used in development, along with 500+ checkpoints per model, captured every 1,000 steps during training, and evaluation code under the umbrella of the Catwalk project.

“I’m enthusiastic about getting OLMo into the hands of AI researchers,” said Eric Horvitz, Microsoft’s Chief Scientific Officer and a founding member of the AI2 Scientific Advisory Board. “The new offering continues Allen AI's tradition of providing valuable open models, tools, and data, which have spurred numerous advancements in AI across the global community.”

A truly open model

By making OLMo and its training data fully available to the public, AI2 has taken a big step towards collaboratively building the best open language model in the world. In the coming months, AI2 will continue to iterate on OLMo and will bring different model sizes, modalities, datasets, and capabilities into the OLMo family.

“Many language models today are published with limited transparency. Without having access to training data, researchers cannot scientifically understand how a model is working. It’s the equivalent of drug discovery without clinical trials or studying the solar system without a telescope,” said Hanna Hajishirzi, OLMo project lead, a senior director of NLP Research at AI2, and a professor in the UW’s Allen School. “With our new framework, researchers will finally be able to study the science of LLMs, which is critical to building the next generation of safe and trustworthy AI.”

With OLMo, AI researchers and developers will experience:

  • More precision: With full insight into the training data behind the model, researchers can work faster, replacing qualitative assumptions about how the model is performing with scientific testing.
  • Less carbon: A single training run currently produces emissions equivalent to those of nine US homes over one year. Opening the full training and evaluation ecosystem radically reduces redundant development, which is critical to the decarbonization of AI.
  • Lasting results: Keeping models and their datasets in the open, rather than behind APIs, enables researchers to learn from and build on previous models and work.

“With OLMo, open actually means ‘open’ and everyone in the AI research community will have access to all aspects of model creation, including training code, evaluation methods, data, and so on," said Noah Smith, OLMo project lead, a senior director of NLP Research at AI2, and a professor in the UW’s Allen School. “AI was once an open field centered on an active research community, but as models grew, became more expensive, and started turning into commercial products, AI work started to happen behind closed doors. With OLMo we hope to work against this trend and empower the research community to come together to better understand and engage with language models in a scientific way, leading to more responsible AI technology that benefits everyone.”

“With AI2’s deep expertise in natural language processing combined with AMD high-performance computing engines, the OLMo models developed on the LUMI Supercomputer powered by AMD EPYC™ CPUs and AMD Instinct™ accelerators offer a unique opportunity to truly expand AI experimentation and innovation and advance the industry like never before. This new open framework will provide the AI research community across the world with trusted resources and a platform to contribute to and work directly on language models.” — Ian Ferreria, Senior Director, AI Solutions, AMD

"We are happy that we can contribute to this important initiative by providing the computing capacity from the LUMI supercomputer along with our expertise. Public supercomputers like LUMI play a vital role in the infrastructure for open and transparent AI.” — Dr. Pekka Manninen, Director of Science and Technology, CSC

The LUMI supercomputer in Finland is hosted by CSC and owned by the EuroHPC Joint Undertaking and 10 European countries. The fastest supercomputer in Europe, LUMI is known for its entirely carbon-free operations and was critical in supporting the pre-training work necessary to develop OLMo.

“We’re excited to be collaborating with the Allen Institute for AI on the release of their OLMo open-source model and framework. OLMo sets the standard for what it means to be open. Everyone in academia, industry, and the broader community will benefit enormously from access to not only the model but all of the training details, including the data, code, and intermediate checkpoints. I am especially proud that this model was developed on our Mosaic AI model training platform. As with all great open source releases, the best is yet to come now that these artifacts and tools are in the hands of the community.” — Jonathan Frankle, Chief Scientist (Neural Networks), Databricks

For more information on the OLMo framework and the Allen Institute for AI, visit allenai.org.

Contacts

Sophie Lebrecht
sophiel@allenai.org

The Allen Institute for AI

