Explore all available models on GroqCloud.
GPT-OSS 20B is OpenAI's compact open-weight language model with 20 billion parameters, featuring built-in browser search, code execution, and reasoning capabilities.
GPT-OSS 120B is OpenAI's flagship open-weight language model with 120 billion parameters, featuring built-in browser search, code execution, and reasoning capabilities.
Note: Production models are intended for use in your production environments. They meet or exceed our high standards for speed, quality, and reliability. Read more here.
| MODEL ID | DEVELOPER | CONTEXT WINDOW (TOKENS) | MAX COMPLETION TOKENS | MAX FILE SIZE |
|---|---|---|---|---|
| llama-3.1-8b-instant | Meta | 131,072 | 131,072 | - |
| llama-3.3-70b-versatile | Meta | 131,072 | 32,768 | - |
| meta-llama/llama-guard-4-12b | Meta | 131,072 | 1,024 | 20 MB |
| openai/gpt-oss-120b | OpenAI | 131,072 | 65,536 | - |
| openai/gpt-oss-20b | OpenAI | 131,072 | 65,536 | - |
| whisper-large-v3 | OpenAI | - | - | 100 MB |
| whisper-large-v3-turbo | OpenAI | - | - | 100 MB |
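Any production model ID from the table above can be passed to Groq's OpenAI-compatible chat completions endpoint. The sketch below is a minimal illustration, assuming the `requests` library and a `GROQ_API_KEY` environment variable; the `build_chat_request` helper is an illustrative name, not part of any Groq SDK:

```python
import os

import requests

# OpenAI-compatible chat completions endpoint on GroqCloud
API_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_chat_request(model_id, user_message):
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": model_id,
        "messages": [{"role": "user", "content": user_message}],
    }


def ask(model_id, user_message):
    """Send the request; requires a valid GROQ_API_KEY and network access."""
    headers = {"Authorization": f"Bearer {os.environ.get('GROQ_API_KEY')}"}
    resp = requests.post(
        API_URL, headers=headers, json=build_chat_request(model_id, user_message)
    )
    resp.raise_for_status()
    # OpenAI-style response: first choice carries the assistant message
    return resp.json()["choices"][0]["message"]["content"]
```

For example, `ask("llama-3.3-70b-versatile", "Hello!")` targets one of the production models listed above.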
Note: Preview models are intended for evaluation purposes only and should not be used in production environments as they may be discontinued at short notice. Read more about deprecations here.
| MODEL ID | DEVELOPER | CONTEXT WINDOW (TOKENS) | MAX COMPLETION TOKENS | MAX FILE SIZE |
|---|---|---|---|---|
| deepseek-r1-distill-llama-70b | DeepSeek / Meta | 131,072 | 131,072 | - |
| meta-llama/llama-4-maverick-17b-128e-instruct | Meta | 131,072 | 8,192 | 20 MB |
| meta-llama/llama-4-scout-17b-16e-instruct | Meta | 131,072 | 8,192 | 20 MB |
| meta-llama/llama-prompt-guard-2-22m | Meta | 512 | 512 | - |
| meta-llama/llama-prompt-guard-2-86m | Meta | 512 | 512 | - |
| moonshotai/kimi-k2-instruct | Moonshot AI | 131,072 | 16,384 | - |
| playai-tts | PlayAI | 8,192 | 8,192 | - |
| playai-tts-arabic | PlayAI | 8,192 | 8,192 | - |
| qwen/qwen3-32b | Alibaba Cloud | 131,072 | 40,960 | - |
Systems are a collection of models and tools that work together to answer a user query.
Note: Preview systems are intended for evaluation purposes only and should not be used in production environments as they may be discontinued at short notice. Read more about deprecations here.
Deprecated models are models that are no longer supported or will no longer be supported in the future. See our deprecation guidelines and deprecated models here.
Hosted models are directly accessible through the GroqCloud Models API endpoint using the model IDs mentioned above. You can query the https://api.groq.com/openai/v1/models endpoint to return a JSON list of all active models:
```python
import os

import requests

# Read the API key from the environment rather than hardcoding it
api_key = os.environ.get("GROQ_API_KEY")
url = "https://api.groq.com/openai/v1/models"

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

response = requests.get(url, headers=headers)

print(response.json())
```
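The endpoint returns an OpenAI-style list object whose `data` array holds one entry per model, each carrying the model's `id`. A small sketch for pulling out just the IDs; the sample payload below is illustrative, not a live response:

```python
def active_model_ids(models_payload):
    """Extract sorted model IDs from an OpenAI-style /models list object."""
    return sorted(entry["id"] for entry in models_payload.get("data", []))


# Illustrative payload shaped like the endpoint's response
sample = {
    "object": "list",
    "data": [
        {"id": "llama-3.1-8b-instant", "object": "model"},
        {"id": "openai/gpt-oss-120b", "object": "model"},
    ],
}

print(active_model_ids(sample))  # ['llama-3.1-8b-instant', 'openai/gpt-oss-120b']
```

In the script above you could apply the same helper to `response.json()` to list the model IDs currently active on your account.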