
Getting Started

This page documents only the starter paths that were revalidated against Crazyrouter production on 2026-03-23. For the fastest start, first pick the integration style you need:
  • OpenAI-compatible: https://crazyrouter.com/v1
  • Anthropic native: https://crazyrouter.com
  • Gemini native: https://crazyrouter.com/v1beta/models/...

Shortest migration rule

When migrating from an OpenAI-compatible client, you usually only change two things:
  1. set base_url to https://crazyrouter.com/v1
  2. replace the key with your Crazyrouter sk-xxx
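The two steps above can be sketched as a plain config change, with everything else untouched. This is an illustrative sketch; "sk-xxx" and the original OpenAI key are placeholders:

```python
# Sketch of the two-step migration: only api_key and base_url change.
# "sk-xxx" stands in for your real Crazyrouter key.
openai_style_config = {
    "api_key": "sk-original-openai-key",
    "base_url": "https://api.openai.com/v1",
}

def point_at_crazyrouter(config: dict, crazyrouter_key: str) -> dict:
    """Return a copy of the client config routed through Crazyrouter."""
    return {
        **config,
        "api_key": crazyrouter_key,                 # step 1: swap the key
        "base_url": "https://crazyrouter.com/v1",   # step 2: swap the base URL
    }

migrated = point_at_crazyrouter(openai_style_config, "sk-xxx")
```

Every other field of the client config, and the request code itself, stays the same.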

Migrating from OpenAI

from openai import OpenAI

client = OpenAI(
    api_key="sk-xxx",
    base_url="https://crazyrouter.com/v1"
)

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=64
)

This production recheck confirmed that:
  • gpt-5.4 works through /v1/chat/completions
  • if you need reasoning summaries or OpenAI-style web search, you should prefer /v1/responses
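If you want to try the /v1/responses route, a request might be built as follows. This is a hedged, stdlib-only sketch: the "reasoning" summary field mirrors OpenAI's Responses API shape and is an assumption here, not something this page confirms.

```python
import json
from urllib import request

# Hypothetical sketch of a /v1/responses request. The "reasoning" field
# follows OpenAI's Responses API conventions and is an assumption here.
def build_responses_request(api_key: str) -> request.Request:
    body = {
        "model": "gpt-5.4",
        "input": "Hello",
        "reasoning": {"summary": "auto"},  # ask for a reasoning summary
    }
    return request.Request(
        "https://crazyrouter.com/v1/responses",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_responses_request("sk-xxx")
# Send with request.urlopen(req) when you have a real key.
```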

Migrating from Anthropic

Option A: stay on the OpenAI-compatible layer

from openai import OpenAI

client = OpenAI(
    api_key="sk-xxx",
    base_url="https://crazyrouter.com/v1"
)

response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=64
)

Option B: use native Anthropic Messages

import anthropic

client = anthropic.Anthropic(
    api_key="sk-xxx",
    base_url="https://crazyrouter.com"
)

response = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=128,
    messages=[{"role": "user", "content": "Hello"}]
)

If you need:
  • standard Claude chat: prefer claude-sonnet-4-6
  • an explicit thinking block: prefer claude-opus-4-6-thinking
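Native Messages responses return content as a list of typed blocks; with claude-opus-4-6-thinking, a thinking block arrives alongside the text block. A stdlib-only sketch of pulling the two apart (the sample response dict below is illustrative, not captured from the API):

```python
# Separate "thinking" and "text" blocks from a Messages-style response.
# The sample dict mimics the Anthropic Messages content-block shape.
def split_thinking(response: dict) -> tuple[str, str]:
    """Return (thinking, text), each joined from its content blocks."""
    thinking = [b["thinking"] for b in response["content"] if b["type"] == "thinking"]
    text = [b["text"] for b in response["content"] if b["type"] == "text"]
    return "\n".join(thinking), "\n".join(text)

sample = {
    "content": [
        {"type": "thinking", "thinking": "The user greeted me; reply briefly."},
        {"type": "text", "text": "Hello! How can I help?"},
    ]
}

thinking, text = split_thinking(sample)
```

With the SDK, the same logic applies to response.content from client.messages.create.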

Migrating from Gemini

Option A: OpenAI-compatible layer

from openai import OpenAI

client = OpenAI(
    api_key="sk-xxx",
    base_url="https://crazyrouter.com/v1"
)

response = client.chat.completions.create(
    model="gemini-3-pro-preview",
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=64
)

Option B: Gemini native

curl "https://crazyrouter.com/v1beta/models/gemini-3-pro-preview:generateContent?key=YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [
      {
        "role": "user",
        "parts": [
          {"text": "Hello"}
        ]
      }
    ]
  }'

If you need:
  • simple reuse of existing OpenAI SDK code: start with the compatibility layer
  • structured outputs, Google Search, or thinking: prefer Gemini native
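For structured outputs on the native route, the generateContent body carries a generationConfig with responseMimeType and responseSchema. A stdlib sketch of building such a body; the schema itself is an illustrative example, not one prescribed by this page:

```python
import json

# Build a generateContent body asking for JSON constrained by a schema.
# responseMimeType / responseSchema follow the Gemini API's
# generationConfig fields; the schema below is illustrative.
def build_generate_content_body(prompt: str) -> str:
    body = {
        "contents": [{"role": "user", "parts": [{"text": prompt}]}],
        "generationConfig": {
            "responseMimeType": "application/json",
            "responseSchema": {
                "type": "OBJECT",
                "properties": {"greeting": {"type": "STRING"}},
            },
        },
    }
    return json.dumps(body)

payload = build_generate_content_body("Hello")
```

POST this payload to the same :generateContent URL shown in the curl example above.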

Environment variables

If you use the OpenAI-compatible route, a good default is:

export OPENAI_API_KEY=sk-xxx
export OPENAI_BASE_URL=https://crazyrouter.com/v1

Then your code can be:

from openai import OpenAI

client = OpenAI()
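This works because the openai Python SDK reads both variables itself: OPENAI_API_KEY and OPENAI_BASE_URL are picked up when OpenAI() is constructed with no arguments. A stdlib sketch of the same resolution logic (the fallback is OpenAI's public endpoint):

```python
import os

# Mirror the environment lookup the OpenAI SDK performs at construction time.
def resolve_openai_env() -> dict:
    return {
        "api_key": os.environ.get("OPENAI_API_KEY"),
        "base_url": os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
    }

# Simulate the exports from the shell snippet above.
os.environ["OPENAI_API_KEY"] = "sk-xxx"
os.environ["OPENAI_BASE_URL"] = "https://crazyrouter.com/v1"
settings = resolve_openai_env()
```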

Starter recommendations

  • Fastest GPT integration: OpenAI-compatible + gpt-5.4
  • Fastest Claude integration: Anthropic native + claude-sonnet-4-6
  • Fastest Gemini integration: OpenAI-compatible or Gemini native + gemini-3-pro-preview
  • Need reasoning summaries: Responses API
  • Need a Claude thinking block: claude-opus-4-6-thinking
  • Need Gemini Google Search, thinking, or responseSchema: Gemini native

Crazyrouter serves the OpenAI, Anthropic, and Gemini protocols side by side. In practice, the more reliable approach is usually not to force every model through one protocol, but to choose the route that best matches the capability you need.
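The recommendations above can be sketched as a simple goal-to-route lookup (route labels are informal; model names are the ones this page uses):

```python
# Map each starter goal to a (route, model) pair per the recommendations above.
STARTER_ROUTES = {
    "gpt": ("openai-compatible", "gpt-5.4"),
    "claude": ("anthropic-native", "claude-sonnet-4-6"),
    "gemini": ("gemini-native", "gemini-3-pro-preview"),
    "reasoning-summaries": ("openai-compatible /v1/responses", "gpt-5.4"),
    "claude-thinking": ("anthropic-native", "claude-opus-4-6-thinking"),
}

def pick_route(goal: str) -> tuple[str, str]:
    """Return the recommended (route, model) pair for a starter goal."""
    return STARTER_ROUTES[goal]
```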