Anything that talks to the OpenAI API talks to OrcaRouter. The configuration pattern is always the same: set the base URL to https://api.orcarouter.ai/v1 and use an sk-orca-... API key.

LangChain

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
)

LangChain.js

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  model: "gpt-4o",
  openAIApiKey: "sk-orca-...",
  configuration: { baseURL: "https://api.orcarouter.ai/v1" },
});

Vercel AI SDK

import { createOpenAI } from "@ai-sdk/openai";

const openai = createOpenAI({
  baseURL: "https://api.orcarouter.ai/v1",
  apiKey: "sk-orca-...",
});

const model = openai("gpt-4o");

Mastra

import { openai } from "@ai-sdk/openai";

process.env.OPENAI_API_KEY = "sk-orca-...";
process.env.OPENAI_BASE_URL = "https://api.orcarouter.ai/v1";

// set these before the first model call, then use `openai("gpt-4o")` as usual

Claude Code (CLI)

export ANTHROPIC_API_KEY="sk-orca-..."
export ANTHROPIC_BASE_URL="https://api.orcarouter.ai/v1"
claude  # points at OrcaRouter's /v1/messages endpoint

Codex CLI / OpenAI CLI

export OPENAI_API_KEY="sk-orca-..."
export OPENAI_BASE_URL="https://api.orcarouter.ai/v1"
export OPENAI_API_BASE="https://api.orcarouter.ai/v1"  # older clients read this name instead
codex

Model Context Protocol (MCP) servers

Any MCP server that calls an OpenAI-compatible API (for example mcp-server-openai or mcp-server-anthropic) accepts the same base_url / api_key pair, typically passed as environment variables in the server's launch configuration.
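A sketch of what that looks like in an MCP client configuration (the claude_desktop_config.json format is shown here; the server name, launch command, and the exact environment variable names the server reads are assumptions — check the specific server's README):

```json
{
  "mcpServers": {
    "openai": {
      "command": "npx",
      "args": ["-y", "mcp-server-openai"],
      "env": {
        "OPENAI_API_KEY": "sk-orca-...",
        "OPENAI_BASE_URL": "https://api.orcarouter.ai/v1"
      }
    }
  }
}
```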

PydanticAI

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    "gpt-4o",
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
)
agent = Agent(model)

Anything else

If the framework lets you override the OpenAI base URL or Anthropic base URL, it works with OrcaRouter. If the framework hardcodes the base URL, you can usually patch the client instance or set OPENAI_BASE_URL / ANTHROPIC_BASE_URL environment variables before import.
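The environment-variable escape hatch can be sketched like this — the key detail is ordering: the variables must be set before the framework module is imported, since many clients read them once at import or construction time (the framework import at the end is a hypothetical placeholder):

```python
import os

# Route both OpenAI- and Anthropic-style clients through OrcaRouter.
# setdefault leaves any values you exported in the shell untouched.
os.environ.setdefault("OPENAI_API_KEY", "sk-orca-...")
os.environ.setdefault("OPENAI_BASE_URL", "https://api.orcarouter.ai/v1")
os.environ.setdefault("ANTHROPIC_BASE_URL", "https://api.orcarouter.ai/v1")

# Only now import the framework, so its client picks up the overrides:
# import some_framework  # hypothetical placeholder
```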