Cohere Drops Tiny Aya: Multilingual AI That Runs Offline

Cohere launches 3.35B-parameter open-weight models supporting 70+ languages, built for offline use on modest hardware.

Cohere has released Tiny Aya, a new family of compact multilingual AI models designed to work without an internet connection. The models pack 3.35 billion parameters and support more than 70 languages — a serious play for global accessibility.

The enterprise AI company unveiled the lineup at India's AI Summit. What's notable here isn't just the language coverage. The entire model family was trained on a single cluster of 64 NVIDIA H100 GPUs. That's relatively lean infrastructure for a multilingual model of this scope.

Tiny Aya ships with open weights, meaning developers can freely download, inspect, and build on top of the models. The offline capability makes the family particularly useful in regions with spotty connectivity — which likely explains the India summit debut.

For companies chasing multilingual AI without cloud dependency, this is one to watch.