
Mistral Saba

Mistral Saba is a 24B-parameter language model designed specifically for the Middle East and South Asia, delivering accurate, contextually relevant responses with efficient performance. Trained on curated regional datasets, it supports Arabic alongside several languages of Indian origin, including Tamil and Malayalam, making it a versatile option for a range of regional and multilingual applications. Read more in the [announcement blog post](https://mistral.ai/en/news/mistral-saba).
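As a minimal sketch of calling the model, the snippet below builds a request body for Mistral's chat completions endpoint (`https://api.mistral.ai/v1/chat/completions`). The model identifier `mistral-saba-latest` is an assumption here; check Mistral's published model list for the exact id.

```python
import json

# Assumed model id; verify against Mistral's model list before use.
DEFAULT_MODEL = "mistral-saba-latest"

def build_chat_payload(user_message: str, model: str = DEFAULT_MODEL) -> dict:
    """Build a chat-completions request body for the Mistral API."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
    }

# Example: an Arabic prompt, one of the languages the model targets.
payload = build_chat_payload("عرّف نفسك في جملة واحدة.")
print(json.dumps(payload, ensure_ascii=False, indent=2))
# To send: POST this JSON to the endpoint with an
# "Authorization: Bearer <API_KEY>" header.
```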

| Spec | Value |
| --- | --- |
| Input / 1M tokens | $0.200 |
| Output / 1M tokens | $0.600 |
| Cached input / 1M tokens | $0.020 |
| Context window | 33K tokens |
| Knowledge cutoff | 2024-09-30 |
| Provider | Mistral |
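Using the per-million-token prices listed above, the cost of a single request can be estimated as follows (a small sketch; token counts come from the API's usage report):

```python
# Prices from the table above, in USD per 1M tokens.
INPUT_PER_M = 0.200
CACHED_INPUT_PER_M = 0.020
OUTPUT_PER_M = 0.600

def estimate_cost(input_tokens: int, output_tokens: int,
                  cached_input_tokens: int = 0) -> float:
    """Return the estimated USD cost of one request.

    Cached input tokens are billed at the discounted cached rate;
    the remaining input tokens are billed at the full input rate.
    """
    uncached = input_tokens - cached_input_tokens
    return (
        uncached * INPUT_PER_M
        + cached_input_tokens * CACHED_INPUT_PER_M
        + output_tokens * OUTPUT_PER_M
    ) / 1_000_000

# e.g. 10K input tokens (2K of them cached) and 1K output tokens:
print(f"${estimate_cost(10_000, 1_000, cached_input_tokens=2_000):.6f}")
# → $0.002240
```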

Performance

Median streaming throughput and first-token latency measured by Artificial Analysis.
