mistral-saba-24b
Mistral Saba 24B is a specialized model trained to excel in Arabic, Farsi, Urdu, Hebrew, and Indic languages. Designed for high-performance multilingual capabilities, it delivers exceptional results across a wide range of tasks in these languages while maintaining strong performance in English. With a 32K token context window and tool use capabilities, it's ideal for complex multilingual applications requiring deep language understanding and regional context.
Experience the exceptional multilingual capabilities of mistral-saba-24b with Groq speed:
```shell
pip install groq
```
```python
from groq import Groq

# The client reads the GROQ_API_KEY environment variable by default
client = Groq()

completion = client.chat.completions.create(
    model="mistral-saba-24b",
    messages=[
        {
            "role": "user",
            "content": "Explain why fast inference is critical for reasoning models",
        }
    ],
)

print(completion.choices[0].message.content)
```
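The model card above also mentions tool use. As a minimal sketch, assuming the Groq API accepts an OpenAI-compatible `tools` parameter on `chat.completions.create`, the pieces you supply are a JSON schema describing each tool plus local dispatch code that runs the function the model asks for. The tool name `get_exchange_rate` and its stub implementation below are hypothetical examples, not part of the API; the block defines and dispatches locally without making a network call:

```python
import json

# Hypothetical local function the model can request via tool use.
def get_exchange_rate(base: str, target: str) -> str:
    # Stubbed rate for illustration only; a real app would query a data source.
    rates = {("USD", "AED"): 3.6725}
    return json.dumps({"base": base, "target": target, "rate": rates.get((base, target))})

# OpenAI-compatible tool schema; you would pass this as tools=[...] to
# client.chat.completions.create alongside model and messages.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_exchange_rate",
            "description": "Get the exchange rate between two currencies",
            "parameters": {
                "type": "object",
                "properties": {
                    "base": {"type": "string"},
                    "target": {"type": "string"},
                },
                "required": ["base", "target"],
            },
        },
    }
]

# When the model responds with a tool call, its arguments arrive as a JSON
# string; parse them and dispatch to the matching local function.
def dispatch(name: str, arguments: str) -> str:
    args = json.loads(arguments)
    if name == "get_exchange_rate":
        return get_exchange_rate(**args)
    raise ValueError(f"Unknown tool: {name}")

print(dispatch("get_exchange_rate", '{"base": "USD", "target": "AED"}'))
```

The result string from `dispatch` would then be appended to `messages` as a `"tool"` role message so the model can compose its final answer.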