Groq's Responses API is fully compatible with OpenAI's Responses API, making it easy to integrate advanced conversational AI capabilities into your applications. The Responses API supports text and image inputs with text outputs, along with stateful conversations and function calling to connect with external systems.
The Responses API is currently in beta. Please let us know your feedback in our Community.
To use the Responses API with OpenAI's client libraries, configure your client with your Groq API key and set the base URL to `https://api.groq.com/openai/v1`:
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

const response = await client.responses.create({
  model: "llama-3.3-70b-versatile",
  input: "Tell me a fun fact about the moon in one sentence.",
});

console.log(response.output_text);
```
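The stateful conversations mentioned above follow OpenAI's Responses API convention of chaining turns via `previous_response_id`. Here is a minimal sketch; the response id is a hypothetical placeholder for the `id` field of an earlier `responses.create()` result, and we assume Groq's beta honors this parameter:

```javascript
// Hypothetical id returned by an earlier responses.create() call.
const firstResponseId = "resp_abc123";

// Follow-up request linked to the earlier turn so the model sees prior context.
const followUpRequest = {
  model: "llama-3.3-70b-versatile",
  previous_response_id: firstResponseId,
  input: "Now tell me one about Mars.",
};

// const followUp = await client.responses.create(followUpRequest);
console.log(followUpRequest.previous_response_id);
```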
You can find your API key here.
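Function calling, also mentioned above, lets the model request calls into your own code. A minimal sketch of one round trip, with a hypothetical `get_weather` tool: the tool definition uses the Responses API's flat function-tool shape, and the simulated `function_call` item stands in for what the model would emit in `response.output`:

```javascript
// Hypothetical local function the model can call. A real app would hit a
// weather service; the return value is hardcoded for illustration.
function getWeather({ city }) {
  return { city, temperature_c: 18, conditions: "foggy" };
}

// Function tool definition passed as `tools` to responses.create().
const tools = [
  {
    type: "function",
    name: "get_weather",
    description: "Get the current weather for a city.",
    parameters: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
];

// Simulated function_call output item, standing in for what the model emits
// after client.responses.create({ model, input, tools }).
const functionCall = {
  type: "function_call",
  name: "get_weather",
  arguments: '{"city":"San Francisco"}',
};

// Dispatch the call locally; the result would then be sent back to the model
// as a function_call_output input item on the next request.
const result = getWeather(JSON.parse(functionCall.arguments));
console.log(JSON.stringify(result));
```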
In addition to a model's regular tool use, the Responses API supports several built-in tools that extend what your model can do.
While all models support the Responses API, these built-in tools are only supported for the following models:
| Model ID | Browser Search | Code Execution |
|---|---|---|
| openai/gpt-oss-20b | ✅ | ✅ |
| openai/gpt-oss-120b | ✅ | ✅ |
Here are examples using code execution and browser search:
Enable your models to write and execute Python code for calculations, data analysis, and problem-solving - see our code execution documentation for more details.
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

const response = await client.responses.create({
  model: "openai/gpt-oss-20b",
  input: "What is 1312 x 3333? Output only the final answer.",
  tool_choice: "required",
  tools: [
    {
      type: "code_interpreter",
      container: { type: "auto" },
    },
  ],
});

console.log(response.output_text);
```
Give your models access to real-time web content and up-to-date information - see our browser search documentation for more details.
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

const response = await client.responses.create({
  model: "openai/gpt-oss-20b",
  input: "Analyze the current weather in San Francisco and provide a detailed forecast.",
  tool_choice: "required",
  tools: [{ type: "browser_search" }],
});

console.log(response.output_text);
```
Use structured outputs to ensure the model's response follows a specific JSON schema. This is useful for extracting structured data from text, ensuring consistent response formats, or integrating with downstream systems that expect specific data structures.
For a complete list of models that support structured outputs, see our structured outputs documentation.
```javascript
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

const response = await openai.responses.create({
  model: "moonshotai/kimi-k2-instruct",
  instructions: "Extract product review information from the text.",
  input:
    "I bought the UltraSound Headphones last week and I'm really impressed! The noise cancellation is amazing and the battery lasts all day. Sound quality is crisp and clear. I'd give it 4.5 out of 5 stars.",
  text: {
    format: {
      type: "json_schema",
      name: "product_review",
      schema: {
        type: "object",
        properties: {
          product_name: { type: "string" },
          rating: { type: "number" },
          sentiment: {
            type: "string",
            enum: ["positive", "negative", "neutral"],
          },
          key_features: {
            type: "array",
            items: { type: "string" },
          },
        },
        required: ["product_name", "rating", "sentiment", "key_features"],
        additionalProperties: false,
      },
    },
  },
});

console.log(response.output_text);
```
Example output:

```json
{
  "product_name": "UltraSound Headphones",
  "rating": 4.5,
  "sentiment": "positive",
  "key_features": [
    "noise cancellation",
    "long battery life",
    "crisp and clear sound quality"
  ]
}
```
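Because the `json_schema` format guarantees the response is valid JSON containing every required field, downstream code can consume `output_text` directly. A minimal sketch, using the sample output above as a stand-in for `response.output_text`:

```javascript
// Stand-in for response.output_text from the structured-output request above.
const outputText = JSON.stringify({
  product_name: "UltraSound Headphones",
  rating: 4.5,
  sentiment: "positive",
  key_features: ["noise cancellation", "long battery life", "crisp and clear sound quality"],
});

// With a json_schema response format, the text is guaranteed to match the
// schema, so no defensive parsing or field checks are needed.
const review = JSON.parse(outputText);
console.log(`${review.product_name}: ${review.rating}/5 (${review.sentiment})`);
// → UltraSound Headphones: 4.5/5 (positive)
```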
Explore more advanced use cases in our built-in browser search and code execution documentation.