OrcaRouter speaks the OpenAI API. Point your existing OpenAI SDK at OrcaRouter’s base URL and your code continues to work — streaming, tools, JSON mode, vision, everything.

Python

from openai import OpenAI

client = OpenAI(
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
)

TypeScript / Node

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.orcarouter.ai/v1",
  apiKey: "sk-orca-...",
});

Async Python

from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
)

Using environment variables

The OpenAI SDK reads OPENAI_API_KEY and OPENAI_BASE_URL by default. Set them once and the SDK picks them up without per-call config:
export OPENAI_API_KEY="sk-orca-..."
export OPENAI_BASE_URL="https://api.orcarouter.ai/v1"

What changes in your code

Only the base URL and the API key. Model names, request parameters, response shape, streaming protocol, error handling — all unchanged. You can call gpt-4o, claude-3-5-sonnet-latest, gemini-2.5-pro, etc. through the same client object; OrcaRouter handles the cross-provider translation internally.
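At the wire level this means the request your client sends is the same chat-completions payload it would send to OpenAI; only the host and the model string vary. A stdlib sketch of the request being built (endpoint path and model names taken from above; payload fields follow the OpenAI chat API; the helper is illustrative):

```python
import json
import urllib.request

BASE_URL = "https://api.orcarouter.ai/v1"
API_KEY = "sk-orca-..."  # placeholder key, as in the snippets above

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build a chat-completions request; the body is identical to OpenAI's."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The same request shape works for any routed model -- only the string changes.
req = build_chat_request("claude-3-5-sonnet-latest", "Hello")
print(req.full_url)  # https://api.orcarouter.ai/v1/chat/completions
```

Swapping `"claude-3-5-sonnet-latest"` for `"gpt-4o"` or `"gemini-2.5-pro"` changes nothing else about the request.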

Anthropic SDK

If you use the official Anthropic Python or TypeScript SDK, point it at https://api.orcarouter.ai/v1 the same way (we expose /v1/messages). See the Anthropic Messages endpoint reference.
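For reference, an Anthropic-style Messages request pointed at OrcaRouter's /v1/messages looks like this stdlib sketch. The header names and required `max_tokens` field follow Anthropic's Messages API; whether OrcaRouter accepts the `x-api-key` header shown here or a Bearer token is an assumption — check the endpoint reference:

```python
import json
import urllib.request

BASE_URL = "https://api.orcarouter.ai/v1"

payload = {
    "model": "claude-3-5-sonnet-latest",
    "max_tokens": 256,  # required by the Messages API
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/messages",
    data=json.dumps(payload).encode(),
    headers={
        "x-api-key": "sk-orca-...",          # auth scheme assumed, not confirmed
        "anthropic-version": "2023-06-01",   # version header per Anthropic's API
        "content-type": "application/json",
    },
    method="POST",
)
print(req.full_url)  # https://api.orcarouter.ai/v1/messages
```

The Anthropic SDKs build exactly this request for you once their `base_url` is set to OrcaRouter.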