## Overview
The Anthropic Format Converter allows you to call Gemini and OpenAI models using the Anthropic SDK and API format. This is useful when you have existing code written against the Anthropic SDK but want to access other providers' models without rewriting it.
## Configuration

### SDK Configuration

Set your base URL to:

```
https://api.pipellm.ai/anthropic
```

The SDK automatically appends `/v1/messages`, so you only need to set the base URL to `/anthropic`.
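To make the relationship concrete, the endpoint the SDK ultimately calls is simply the base URL with the messages path appended:

```python
# The base URL you configure in the SDK:
base_url = "https://api.pipellm.ai/anthropic"

# The SDK appends the messages path automatically:
endpoint = base_url + "/v1/messages"

print(endpoint)  # https://api.pipellm.ai/anthropic/v1/messages
```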
### cURL / Direct API

For direct API calls, use the full endpoint:

```
https://api.pipellm.ai/anthropic/v1/messages
```
## Usage Examples
### Python SDK

```python
import anthropic

client = anthropic.Anthropic(
    api_key="your-pipellm-api-key",
    base_url="https://api.pipellm.ai/anthropic",
)

message = client.messages.create(
    model="gemini-2.0-flash",  # or "gpt-4o"
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
)

print(message.content[0].text)
```

### TypeScript SDK

```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: "your-pipellm-api-key",
  baseURL: "https://api.pipellm.ai/anthropic",
});

const message = await client.messages.create({
  model: "gemini-2.0-flash", // or "gpt-4o"
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello, how are you?" }],
});

console.log(message.content[0].text);
```

### cURL

```bash
curl https://api.pipellm.ai/anthropic/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: your-pipellm-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "gemini-2.0-flash",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```
## Supported Models
You can use any Gemini or OpenAI model available on PipeLLM with the Anthropic format:
| Provider | Example Models |
|---|---|
| Gemini | `gemini-2.0-flash`, `gemini-1.5-pro` |
| OpenAI | `gpt-4o`, `gpt-4o-mini`, `o1` |
## Feature Support
| Feature | Status |
|---|---|
| Streaming | ✅ Supported |
| Tool Use | ✅ Supported |
| Vision | ✅ Supported |
| System Prompts | ✅ Supported |
| Thinking | ✅ Supported |