Quick Start
Get started in 3 steps:

Portkey also supports Anthropic’s native /messages endpoint. See Using with Anthropic SDK below.

Add Provider in Model Catalog

Before making requests, add a provider:
- Go to Model Catalog → Add Provider
- Select your provider (OpenAI, Anthropic, etc.)
- Choose existing credentials or enter your API key
- Name your provider (e.g., openai-prod)

Your provider is then referenced in requests as @openai-prod (the name you chose, with an @ prefix).
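With the provider added, a first request can be sketched as below. This is a minimal standard-library sketch: the gateway URL, the x-portkey-api-key header, and the @provider/model string follow Portkey’s documented pattern, but verify the exact values against your own dashboard.

```python
import json
import urllib.request

PORTKEY_API_KEY = "YOUR_PORTKEY_API_KEY"  # from the Portkey dashboard

# Model string: "@<provider-slug>/<model>" — the slug is the name you
# chose in Model Catalog, prefixed with "@".
payload = {
    "model": "@openai-prod/gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    "https://api.portkey.ai/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "x-portkey-api-key": PORTKEY_API_KEY,
    },
)

# Uncomment to send the request (requires a valid Portkey API key):
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```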
Complete Model Catalog Guide →
Set up budgets, rate limits, and manage credentials
Switch Between Providers
Change the model string to use different providers: same code, different models.

Examples
Vision
Function Calling
Image Generation
Embeddings
Audio Transcription
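Provider switching, as described above, touches only the model string. A sketch (the provider slugs @openai-prod and @anthropic-prod are illustrative — use the names you chose in Model Catalog):

```python
def chat_payload(model: str, prompt: str) -> dict:
    """Build a chat-completion request body for the given model string."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same code, different providers: only the model string changes.
openai_req = chat_payload("@openai-prod/gpt-4o", "Hello!")
anthropic_req = chat_payload("@anthropic-prod/claude-sonnet-4-20250514", "Hello!")

# Everything except the model string is identical.
assert {k: v for k, v in openai_req.items() if k != "model"} == \
       {k: v for k, v in anthropic_req.items() if k != "model"}
```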
Using with OpenAI SDK
Use your existing OpenAI code with Portkey: just change two parameters.

Using with Anthropic SDK
Portkey fully supports Anthropic’s native /messages endpoint. Use the Anthropic SDK directly with Portkey; Portkey settings are passed via x-portkey-* headers.
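The OpenAI SDK route above — changing only two client parameters — can be sketched as follows. The gateway URL follows Portkey’s documented pattern; the helper name is hypothetical, and the import is lazy so the sketch loads even without the openai package installed.

```python
# The only two parameters that change from stock OpenAI SDK usage are
# base_url and api_key.  Requires `pip install openai` to actually run.
PORTKEY_GATEWAY_URL = "https://api.portkey.ai/v1"

def make_portkey_client(portkey_api_key: str):
    """OpenAI SDK client pointed at the Portkey gateway (hypothetical helper)."""
    from openai import OpenAI  # lazy import keeps the sketch importable without the SDK
    return OpenAI(
        api_key=portkey_api_key,       # your Portkey key, not the provider's
        base_url=PORTKEY_GATEWAY_URL,  # the Portkey gateway, not api.openai.com
    )

# Usage (sends a real request; needs a valid Portkey API key):
# client = make_portkey_client("YOUR_PORTKEY_API_KEY")
# response = client.chat.completions.create(
#     model="@openai-prod/gpt-4o",
#     messages=[{"role": "user", "content": "Hello!"}],
# )
# print(response.choices[0].message.content)
```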
Anthropic Integration Guide →
Learn about prompt caching, extended thinking, and more Anthropic features
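The native /messages route with the Anthropic SDK can be sketched as below. The x-portkey-* header names follow Portkey’s documented convention but should be verified against your dashboard; the helper name is hypothetical, and the import is lazy so the sketch loads even without the anthropic package installed.

```python
PORTKEY_BASE_URL = "https://api.portkey.ai"  # the Anthropic SDK appends /v1/messages itself

def make_anthropic_client(portkey_api_key: str, provider_slug: str):
    """Anthropic SDK client routed through Portkey (hypothetical helper).

    Portkey settings travel as x-portkey-* headers.
    Requires `pip install anthropic` to actually run.
    """
    from anthropic import Anthropic  # lazy import keeps the sketch importable without the SDK
    return Anthropic(
        api_key="placeholder",  # auth happens via the Portkey headers below
        base_url=PORTKEY_BASE_URL,
        default_headers={
            "x-portkey-api-key": portkey_api_key,
            "x-portkey-provider": provider_slug,  # e.g. "@anthropic-prod"
        },
    )

# Usage (sends a real request; needs a valid Portkey API key):
# client = make_anthropic_client("YOUR_PORTKEY_API_KEY", "@anthropic-prod")
# message = client.messages.create(
#     model="claude-sonnet-4-20250514",
#     max_tokens=1024,
#     messages=[{"role": "user", "content": "Hello!"}],
# )
# print(message.content[0].text)
```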
Gateway Features
Add production features through configs:

Automatic Retries
Retry failed requests with exponential backoff
Fallbacks
Automatically switch to backup providers
Caching
Cache responses to reduce costs and latency
Load Balancing
Distribute requests across multiple providers
Gateway Configs Guide →
Learn how to create and use configs
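A single config can combine several of these features. A hedged sketch of the shape, expressed as a Python dict: the field names follow Portkey’s config schema as documented (retry, cache, strategy, targets), the provider slugs are illustrative, and the x-portkey-config header is one documented way to attach a config to a request — verify both against the Gateway Configs guide before use.

```python
import json

# Retry with backoff, simple caching, and a fallback from a primary
# provider to a backup.  Provider slugs are illustrative examples.
config = {
    "retry": {"attempts": 3},
    "cache": {"mode": "simple"},
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "@openai-prod"},     # tried first
        {"provider": "@anthropic-prod"},  # used if the primary fails
    ],
}

# Attach the config to a request via the x-portkey-config header:
headers = {"x-portkey-config": json.dumps(config)}
```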
Supported Integrations
Portkey integrates with the entire AI ecosystem:

LLM Providers
1,600+ models from OpenAI, Anthropic, Google, Mistral, Cohere, and 30+ providers
Agent Frameworks
LangChain, CrewAI, AutoGen, OpenAI Agents, Strands, and more
Libraries
LangChain, LlamaIndex, Vercel AI SDK, and popular frameworks
Guardrails
Aporia, Pillar, Patronus, and content safety providers
Vector Databases
Pinecone, Weaviate, and vector store integrations
MCP Servers
Model Context Protocol servers and tools

