Overview

The Anthropic Format Converter allows you to call Gemini and OpenAI models using the Anthropic SDK and API format. This is useful when you have existing code using the Anthropic SDK but want to access other providers’ models.

Configuration

SDK Configuration

Set your base URL to:
https://api.pipellm.ai/anthropic
The SDK automatically appends /v1/messages, so you only need to set the base URL to /anthropic.

cURL / Direct API

For direct API calls, use the full endpoint:
https://api.pipellm.ai/anthropic/v1/messages
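A minimal cURL sketch against that endpoint, assuming the converter accepts the standard Anthropic request headers (`x-api-key`, `anthropic-version`) — the API key and model name are placeholders:

```shell
curl https://api.pipellm.ai/anthropic/v1/messages \
  -H "x-api-key: your-pipellm-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "gemini-2.0-flash",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'
```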

Usage Examples

import anthropic

client = anthropic.Anthropic(
    api_key="your-pipellm-api-key",
    base_url="https://api.pipellm.ai/anthropic"
)

message = client.messages.create(
    model="gemini-2.0-flash",  # or "gpt-4o"
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(message.content[0].text)

Supported Models

You can use any Gemini or OpenAI model available on PipeLLM with the Anthropic format:
| Provider | Example Models |
|----------|----------------|
| Gemini   | gemini-2.0-flash, gemini-1.5-pro |
| OpenAI   | gpt-4o, gpt-4o-mini, o1 |

Feature Support

| Feature        | Status       |
|----------------|--------------|
| Streaming      | ✅ Supported |
| Tool Use       | ✅ Supported |
| Vision         | ✅ Supported |
| System Prompts | ✅ Supported |
| Thinking       | ✅ Supported |
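Since streaming is supported, the Anthropic SDK's streaming helper should work through the same client. A minimal sketch (the API key and prompt are illustrative, and this assumes the converter emits standard Anthropic streaming events):

```python
import anthropic

client = anthropic.Anthropic(
    api_key="your-pipellm-api-key",
    base_url="https://api.pipellm.ai/anthropic",
)

# Stream text deltas as they arrive instead of waiting for the full reply.
with client.messages.stream(
    model="gemini-2.0-flash",  # or "gpt-4o"
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about rivers."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```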