Ai2 Launches OLMo 2, a Fully Open-Source Foundation Model

The Allen Institute for AI (Ai2) research team has introduced OLMo 2, a new family of open-source language models available in 7 billion (7B) and 13 billion (13B) parameter configurations. Trained on up to 5 trillion tokens, these models improve training stability, adopt a staged training process, and incorporate diverse datasets.
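As a minimal sketch of getting started, the OLMo 2 weights can be loaded through the Hugging Face Transformers library; the model ID "allenai/OLMo-2-1124-7B" reflects the Hub naming used at release, but readers should verify it against Ai2's current listing.

```python
# Minimal sketch: load OLMo 2 7B and generate text with Hugging Face Transformers.
# The model ID below is assumed from Ai2's Hugging Face Hub listing at release time.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-1124-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a prompt and sample a short continuation.
inputs = tokenizer("Open-source language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```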

By Daniel Dominguez