## Overview
The OpenAI Format Converter allows you to call Gemini and Anthropic models using the OpenAI SDK and API format. This is useful when you have existing code using the OpenAI SDK but want to access other providers’ models.
## Configuration

### SDK Configuration
Set your base URL to:

```
https://api.pipellm.ai/openai/v1
```

The SDK automatically appends `/chat/completions` to the base URL, so the base URL should end at `/openai/v1` and must not include the endpoint path itself.
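To make the path-joining behavior concrete, here is a minimal illustration (not SDK code, just the resulting URL arithmetic) of why the base URL must stop at `/openai/v1`:

```python
# Illustration only: the SDK builds the final request URL by appending the
# endpoint path to the configured base URL, roughly like this.
BASE_URL = "https://api.pipellm.ai/openai/v1"  # what you configure
ENDPOINT_PATH = "/chat/completions"            # appended by the SDK

full_url = BASE_URL + ENDPOINT_PATH
print(full_url)  # https://api.pipellm.ai/openai/v1/chat/completions
```

If you set the base URL to the full endpoint instead, the SDK would append the path a second time and requests would fail.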
### cURL / Direct API

For direct API calls, use the full endpoint:

```
https://api.pipellm.ai/openai/v1/chat/completions
```
## Usage Examples

### Python SDK

```python
from openai import OpenAI

client = OpenAI(
    api_key="your-pipellm-api-key",
    base_url="https://api.pipellm.ai/openai/v1",
)

response = client.chat.completions.create(
    model="gemini-2.0-flash",  # or "claude-sonnet-4-20250514"
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
)

print(response.choices[0].message.content)
```
### TypeScript SDK

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "your-pipellm-api-key",
  baseURL: "https://api.pipellm.ai/openai/v1",
});

const response = await client.chat.completions.create({
  model: "gemini-2.0-flash", // or "claude-sonnet-4-20250514"
  messages: [{ role: "user", content: "Hello, how are you?" }],
});

console.log(response.choices[0].message.content);
```
### cURL

```bash
curl https://api.pipellm.ai/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-pipellm-api-key" \
  -d '{
    "model": "gemini-2.0-flash",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```
## Supported Models

You can use any Gemini or Anthropic model available on PipeLLM with the OpenAI format:

| Provider | Example Models |
|---|---|
| Gemini | `gemini-2.0-flash`, `gemini-1.5-pro` |
| Anthropic | `claude-sonnet-4-20250514`, `claude-3-5-haiku-latest` |
## Feature Support

| Feature | Status |
|---|---|
| Streaming | ✅ Supported |
| Tool Use | ✅ Supported |
| Vision | ✅ Supported |
| System Prompts | ✅ Supported |
| Thinking | ✅ Supported |
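Since streaming is listed as supported, here is a sketch of consuming a streamed response, assuming PipeLLM follows the standard OpenAI streaming interface (`stream=True` yields chunks whose `choices[0].delta.content` carries incremental text). The `collect_stream` helper is ours, not part of any SDK, and the demo below feeds it stand-in chunk objects rather than a live API call:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Concatenate the incremental text deltas from a chat-completion stream."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk's delta content is typically None
            parts.append(delta)
    return "".join(parts)

# Demo with stand-in objects shaped like OpenAI stream events:
fake_chunks = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hel", "lo", None]
]
assert collect_stream(fake_chunks) == "Hello"
```

With a real client, you would pass `stream=True` to `client.chat.completions.create(...)` and hand the returned iterator to `collect_stream` (or print each delta as it arrives).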