Overview

The OpenAI Format Converter allows you to call Gemini and Anthropic models using the OpenAI SDK and API format. This is useful when you have existing code using the OpenAI SDK but want to access other providers’ models.

Configuration

SDK Configuration

Set your base URL to:
https://api.pipellm.ai/openai/v1
The OpenAI SDK appends /chat/completions automatically, so the base URL should end at /openai/v1 — do not include the full path.

cURL / Direct API

For direct API calls, use the full endpoint:
https://api.pipellm.ai/openai/v1/chat/completions
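For example, a minimal request with cURL (this sketch assumes the standard OpenAI-style Bearer authorization header; the API key is a placeholder):

```shell
curl https://api.pipellm.ai/openai/v1/chat/completions \
  -H "Authorization: Bearer your-pipellm-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'
```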

Usage Examples

from openai import OpenAI

client = OpenAI(
    api_key="your-pipellm-api-key",
    base_url="https://api.pipellm.ai/openai/v1"
)

response = client.chat.completions.create(
    model="gemini-2.0-flash",  # or "claude-sonnet-4-20250514"
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(response.choices[0].message.content)

Supported Models

You can use any Gemini or Anthropic model available on PipeLLM with the OpenAI format:
| Provider  | Example Models |
|-----------|----------------|
| Gemini    | gemini-2.0-flash, gemini-1.5-pro |
| Anthropic | claude-sonnet-4-20250514, claude-3-5-haiku-latest |

Feature Support

| Feature        | Status |
|----------------|--------|
| Streaming      | ✅ Supported |
| Tool Use       | ✅ Supported |
| Vision         | ✅ Supported |
| System Prompts | ✅ Supported |
| Thinking       | ✅ Supported |
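Streamed responses follow the OpenAI chunk format: pass stream=True to client.chat.completions.create and each chunk carries an incremental content delta. A minimal sketch of assembling the streamed text, using illustrative chunk dicts in place of a live call (the chunk contents are hypothetical):

```python
# Illustrative chunks in the OpenAI streaming format (contents are made up;
# a real stream yields objects with the same choices[0].delta.content shape).
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}}]},  # the final chunk may carry no content
]

# Concatenate the deltas, skipping chunks without a content field.
text = "".join(
    c["choices"][0]["delta"].get("content", "") for c in chunks
)
print(text)  # Hello!
```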