Cohere Releases Tiny Aya, Open Multilingual Models Built for On-Device Use

The launch targets broad local adoption by pairing a compact 3.35B-parameter model with transparent dataset releases.

Overview

  • The Tiny Aya family is publicly available now on HuggingFace, Kaggle, Ollama, and the Cohere Platform for local deployment.
  • TinyAya-Base has 3.35 billion parameters and supports more than 70 languages for everyday, offline-capable use cases.
  • Cohere offers TinyAya-Global for instruction following plus regional variants—Earth (Africa), Fire (South Asia), and Water (Asia Pacific, West Asia, Europe)—to improve linguistic grounding and cultural nuance.
  • The models were trained on a single cluster of 64 Nvidia H100 GPUs, reflecting a relatively modest compute footprint compared with larger industry efforts.
  • Cohere is releasing its training and evaluation datasets on HuggingFace and plans a technical report detailing the training methodology; it also highlights tokenizer efficiency that reduces the number of tokens needed per sentence across languages (a rough illustration follows this list).
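
The tokenizer-efficiency claim can be inspected locally once the weights are available. Below is a minimal sketch using the Hugging Face transformers library; the model ID is a placeholder assumption (check Cohere's HuggingFace page for the actual repository name), and the sample sentences are arbitrary, included only to show how per-sentence token counts could be compared across languages.

```python
# Minimal sketch: compare per-sentence token counts with the model's tokenizer.
# MODEL_ID is a hypothetical placeholder, not a confirmed repository name.
from transformers import AutoTokenizer

MODEL_ID = "CohereLabs/tiny-aya-base"  # assumption, for illustration only

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

sentences = {
    "English": "The weather is nice today.",
    "Hindi": "आज मौसम अच्छा है।",
    "Swahili": "Hali ya hewa ni nzuri leo.",
    "Japanese": "今日は天気がいいです。",
}

# Fewer tokens per sentence generally means lower memory use and faster
# generation, which matters for offline, on-device inference.
for language, sentence in sentences.items():
    token_count = len(tokenizer.encode(sentence, add_special_tokens=False))
    print(f"{language}: {token_count} tokens")
```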