Overview

The Gemini Format Converter allows you to call OpenAI and Anthropic models using the Gemini SDK and API format. This is useful when you have existing code using the Gemini SDK but want to access other providers’ models.

Configuration

SDK Configuration

Set your base URL to:
https://api.pipellm.ai/gemini
The SDK appends the model path automatically, so the base URL only needs to include the /gemini prefix.

cURL / Direct API

For direct API calls, use the full endpoint:
https://api.pipellm.ai/gemini/v1beta/models/{model}:generateContent
https://api.pipellm.ai/gemini/v1beta/models/{model}:streamGenerateContent
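
The request body for these endpoints follows the standard Gemini generateContent schema. A minimal sketch that builds (but does not send) such a request with the Python standard library; the API key and model are placeholders, and x-goog-api-key is the usual Gemini auth header, assumed to apply here as well:

```python
import json

# Placeholder values -- substitute your real PipeLLM key and chosen model.
API_KEY = "your-pipellm-api-key"
MODEL = "gpt-4o"

url = f"https://api.pipellm.ai/gemini/v1beta/models/{MODEL}:generateContent"

# Standard Gemini generateContent request body: a list of contents,
# each with a role and parts.
payload = {
    "contents": [
        {"role": "user", "parts": [{"text": "Hello, how are you?"}]}
    ]
}

headers = {
    "Content-Type": "application/json",
    # x-goog-api-key is the usual Gemini auth header; assumed to apply here.
    "x-goog-api-key": API_KEY,
}

body = json.dumps(payload)
```

You would POST this body to the URL above; swap :generateContent for :streamGenerateContent to receive a streamed response instead.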

Usage Examples

import google.generativeai as genai

# Configure with PipeLLM
genai.configure(
    api_key="your-pipellm-api-key",
    transport="rest",
    client_options={"api_endpoint": "api.pipellm.ai/gemini"}
)

# Use OpenAI or Anthropic models
model = genai.GenerativeModel("gpt-4o")  # or "claude-sonnet-4-20250514"

response = model.generate_content("Hello, how are you?")
print(response.text)

Supported Models

You can use any OpenAI or Anthropic model available on PipeLLM with the Gemini format:
| Provider  | Example Models |
|-----------|----------------|
| OpenAI    | gpt-4o, gpt-4o-mini, o1 |
| Anthropic | claude-sonnet-4-20250514, claude-3-5-haiku-latest |

Feature Support

| Feature        | Status       |
|----------------|--------------|
| Streaming      | ✅ Supported |
| Tool Use       | ✅ Supported |
| Vision         | ✅ Supported |
| System Prompts | ✅ Supported |
| Thinking       | ✅ Supported |
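
Of these, streaming changes the calling code the most: in the Gemini Python SDK you pass stream=True to generate_content and iterate over partial chunks. The sketch below keeps the network call in comments (it needs a valid PipeLLM key) and uses a small local helper, join_chunks (a name introduced here only for illustration), to show how chunk texts accumulate into the full response:

```python
# Illustrative local helper: concatenate the .text of streamed chunks,
# matching the chunk shape the Gemini SDK yields when stream=True.
def join_chunks(chunks):
    return "".join(chunk.text for chunk in chunks)

# With a model configured as in the SDK example above, streaming would be:
#   for chunk in model.generate_content("Tell me a story.", stream=True):
#       print(chunk.text, end="", flush=True)
```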