Arize Phoenix, developed by Arize AI, is an open-source AI observability library that enables comprehensive tracing and monitoring for your AI applications. By integrating Arize's observability tools with your Groq-powered applications, you can gain deep insights into your LLM workflow's performance and behavior, with features including automatic tracing of LLM calls, latency and token usage metrics, and full visibility into prompts and responses.
Install the required packages:

```shell
pip install arize-phoenix-otel openinference-instrumentation-groq groq
```

Then set your API keys as environment variables:

```shell
export GROQ_API_KEY="your-groq-api-key"
export PHOENIX_API_KEY="your-phoenix-api-key"
```
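Since both keys must be present before anything will trace, it can help to fail fast when one is missing. The small helper below is illustrative only (it is not part of the Groq or Phoenix SDKs):

```python
import os

def missing_env_vars(required=("GROQ_API_KEY", "PHOENIX_API_KEY")):
    """Return the names of any required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]
```

Calling `missing_env_vars()` at startup and raising if the returned list is non-empty gives a clearer error than a failed API call deep inside your application.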
Set your desired `project_name` in the code below. In Arize Phoenix, traces capture the complete journey of an LLM request through your application, while spans represent individual operations within that trace. The instrumentation automatically captures important metrics and metadata.
```python
import os

from phoenix.otel import register
from openinference.instrumentation.groq import GroqInstrumentor
from groq import Groq

# Configure environment variables for Phoenix
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.getenv('PHOENIX_API_KEY')}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"

# Configure Phoenix tracer
tracer_provider = register(
    project_name="default",
    endpoint="https://app.phoenix.arize.com/v1/traces",
)

# Initialize Groq instrumentation
GroqInstrumentor().instrument(tracer_provider=tracer_provider)

# Create Groq client
client = Groq(api_key=os.getenv("GROQ_API_KEY"))

# Make an instrumented LLM call
chat_completion = client.chat.completions.create(
    messages=[{
        "role": "user",
        "content": "Explain the importance of AI observability",
    }],
    model="llama-3.3-70b-versatile",
)

print(chat_completion.choices[0].message.content)
```
Running the above code will create an automatically instrumented Groq application! The traces will be available in your Phoenix dashboard within the default project, showing detailed information about each LLM call, including its latency, token usage, and the full prompt and response.
Challenge: Update an existing Groq-powered application you've built to add Arize Phoenix tracing!
For more detailed documentation and resources on building observable LLM applications with Groq and Arize, see: