allam-2-7b
ALLaM-2-7B is a powerful bilingual language model designed to advance Arabic Language Technology (ALT), developed by the National Center for Artificial Intelligence (NCAI) at the Saudi Data and AI Authority (SDAIA). This instruction-tuned model is pretrained from scratch with a two-step recipe: 4T English tokens followed by 1.2T mixed Arabic/English tokens. This approach preserves English capabilities without catastrophic forgetting while effectively transferring knowledge between language distributions, making the model well suited to Arabic and English conversational applications.
Experience the capabilities of allam-2-7b with Groq speed:
```shell
pip install groq
```
```python
from groq import Groq

# The client reads the GROQ_API_KEY environment variable by default
client = Groq()

completion = client.chat.completions.create(
    model="allam-2-7b",
    messages=[
        {
            "role": "user",
            "content": "Explain why fast inference is critical for reasoning models"
        }
    ]
)

print(completion.choices[0].message.content)
```
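For conversational use, you may prefer to stream tokens as they are generated rather than wait for the full completion. The sketch below assumes the Groq SDK's OpenAI-style `stream=True` parameter and chunked `delta` responses; the Arabic prompt is a hypothetical example chosen to exercise the model's bilingual capability, and the API call is guarded so the snippet only contacts Groq when a `GROQ_API_KEY` is set.

```python
import os

# Hypothetical bilingual prompt ("Explain the importance of fast inference")
messages = [{"role": "user", "content": "اشرح أهمية الاستدلال السريع"}]

# Only call the API when credentials are available (assumption: the SDK
# follows the OpenAI-style streaming interface with stream=True).
if os.environ.get("GROQ_API_KEY"):
    from groq import Groq  # deferred import so the payload above is reusable standalone

    client = Groq()
    stream = client.chat.completions.create(
        model="allam-2-7b",
        messages=messages,
        stream=True,  # yields incremental chunks instead of one final response
    )
    for chunk in stream:
        # Each chunk carries a delta holding the next piece of generated text
        print(chunk.choices[0].delta.content or "", end="")
```

Streaming keeps time-to-first-token low, which matters most in chat interfaces where users perceive latency from the first visible character, not the full reply.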