BrowserBase + Groq: Scalable Browser Automation with AI

BrowserBase provides cloud-based headless browser infrastructure that makes browser automation simple and scalable. Combined with Groq's fast inference via the Model Context Protocol (MCP), you can control browsers using natural language instructions.

Key Features:

  • Natural Language Control: Describe actions in plain English instead of writing selectors
  • Cloud Infrastructure: No browser instances or server resources to manage
  • Anti-Detection: Built-in stealth mode helps bypass common bot detection
  • Session Persistence: Maintain cookies and authentication across requests
  • Visual Documentation: Capture screenshots and recordings for debugging

Quick Start

1. Install the required packages:

bash
pip install openai python-dotenv

2. Set up your credentials:

Connect your BrowserBase credentials at Smithery and copy your MCP URL.

bash
export GROQ_API_KEY="your-groq-api-key"
export SMITHERY_MCP_URL="your-smithery-mcp-url"
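Since python-dotenv is part of the install, you can also keep these values in a .env file and call load_dotenv() before the check below. The missing_env helper is a local convenience sketch, not part of any SDK:

```python
import os

# Variables the examples on this page expect; load_dotenv() from
# python-dotenv can populate them from a .env file before this runs.
REQUIRED = ("GROQ_API_KEY", "SMITHERY_MCP_URL")

def missing_env(names=REQUIRED):
    """Return the names that are not set (or empty) in the environment."""
    return [n for n in names if not os.getenv(n)]

if missing_env():
    print("Missing environment variables:", ", ".join(missing_env()))
```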

3. Create your first browser automation agent:

python
import os
from openai import OpenAI

# Point the OpenAI SDK at Groq's OpenAI-compatible endpoint
client = OpenAI(
    base_url="https://api.groq.com/api/openai/v1",
    api_key=os.getenv("GROQ_API_KEY")
)

# Register the BrowserBase MCP server as a tool the model can call
tools = [{
    "type": "mcp",
    "server_url": os.getenv("SMITHERY_MCP_URL"),
    "server_label": "browserbase",
    "require_approval": "never"
}]

# Low temperature/top_p keep the browsing steps deterministic
response = client.responses.create(
    model="qwen/qwen3-32b",
    input="Navigate to https://news.ycombinator.com and extract the top 3 headlines",
    tools=tools,
    temperature=0.1,
    top_p=0.4
)

print(response.output_text)
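If you reuse the same MCP configuration across several scripts, a small helper keeps it in one place. make_browserbase_tool is a hypothetical local function, not part of the SDK:

```python
import os

def make_browserbase_tool(server_url=None, require_approval="never"):
    """Build the MCP tool entry used in the examples on this page."""
    return {
        "type": "mcp",
        "server_url": server_url or os.getenv("SMITHERY_MCP_URL"),
        "server_label": "browserbase",
        "require_approval": require_approval,
    }

tools = [make_browserbase_tool()]
```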

Advanced Examples

Multi-Step Workflows

Chain multiple browser actions together:

python
response = client.responses.create(
    model="qwen/qwen3-32b",
    input="""Navigate to https://example.com/login
    Fill in username: [email protected]
    Fill in password: demo123
    Click login button
    Wait for dashboard
    Extract all table data""",
    tools=tools,
    temperature=0.1
)

print(response.output_text)
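Multi-step browser workflows can fail transiently (timeouts, slow page loads), so it can help to wrap the request in a retry loop. This generic sketch retries on any exception, which you may want to narrow in real code:

```python
import time

def with_retries(fn, attempts=3, delay=2.0):
    """Call fn(), retrying up to `attempts` times with a fixed delay."""
    last_error = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:  # narrow to specific errors in real code
            last_error = exc
            time.sleep(delay)
    raise last_error

# Usage (hypothetical): wrap the responses.create call from above
# response = with_retries(lambda: client.responses.create(...))
```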

E-commerce Price Monitoring

Automate price tracking across retailers:

python
urls = [
    "https://amazon.com/product1",
    "https://walmart.com/product1",
    "https://target.com/product1"
]

for url in urls:
    response = client.responses.create(
        model="qwen/qwen3-32b",
        input=f"Navigate to {url} and extract product name, price, and availability",
        tools=tools,
        temperature=0.1
    )
    print(response.output_text)
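The model's answer comes back as free text, so you still need to pull structured fields out of it. A rough heuristic, assuming prices are reported as dollar amounts:

```python
import re

def extract_price(text):
    """Return the first $-prefixed amount in text as a float, or None."""
    match = re.search(r"\$\s?(\d+(?:,\d{3})*(?:\.\d{2})?)", text)
    return float(match.group(1).replace(",", "")) if match else None
```

A more robust approach is to ask the model to return JSON in its answer and parse that instead.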

Form Automation

Automate form filling:

python
response = client.responses.create(
    model="qwen/qwen3-32b",
    input="""Navigate to https://example.com/contact
    Fill form with:
    - Name: John Doe
    - Email: [email protected]
    - Message: Interested in your services
    Submit form and confirm submission""",
    tools=tools,
    temperature=0.1
)

print(response.output_text)
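For repeated submissions, the instruction itself can be a template. build_form_prompt below is a hypothetical helper that fills in per-contact details:

```python
# Template mirroring the instruction used in the example above
FORM_PROMPT = """Navigate to {url}
Fill form with:
- Name: {name}
- Email: {email}
- Message: {message}
Submit form and confirm submission"""

def build_form_prompt(url, name, email, message):
    """Render the form-filling instruction for one contact."""
    return FORM_PROMPT.format(url=url, name=name, email=email, message=message)
```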

Available BrowserBase Actions

  • browserbase_create_session: Start a new browser session
  • browserbase_navigate: Navigate to any URL
  • browserbase_click: Click on elements
  • browserbase_type: Type text into input fields
  • browserbase_screenshot: Capture page screenshots
  • browserbase_get_content: Extract page content
  • browserbase_wait: Wait for elements or page loads
  • browserbase_scroll: Scroll to load dynamic content
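To see which of these actions a run actually used, you can inspect the response's output items. This sketch assumes the OpenAI Responses convention where MCP invocations appear as items with type "mcp_call" and a name field; verify this against the response objects you actually receive:

```python
def actions_used(output_items):
    """List the MCP tool names invoked in a response's output items.

    Assumes items with type == "mcp_call" carry the tool name in `name`;
    check this against your actual response objects.
    """
    return [
        getattr(item, "name", None)
        for item in output_items
        if getattr(item, "type", None) == "mcp_call"
    ]

# Usage (hypothetical): actions_used(response.output)
```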

Challenge: Build an automated lead generation system that visits business directories, extracts contact information, validates emails, and stores results—all controlled by natural language!
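For the email-validation step of the challenge, a pragmatic starting point is a simple format check. This regex is deliberately loose; full RFC 5322 validation is far more involved:

```python
import re

# Simple format check: local part, "@", domain with at least one dot.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")

def is_plausible_email(address):
    """Return True if address looks like a valid email (format only)."""
    return bool(EMAIL_RE.match(address))
```

Passing this check only means the string is well-formed; confirming the mailbox exists requires an actual delivery attempt or a verification service.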
