PipeLLM is the infrastructure layer for production AI systems. Start with the model gateway, add managed tools like WebSearch, and connect the agent frameworks or coding assistants your team already uses.

Platform Surfaces

Gateway API

Use one gateway layer for provider routing, protocol compatibility, and model access.

Managed Tools

Add WebSearch and retrieval surfaces that work well inside agent workflows.

Agent Integrations

Connect Claude Code, OpenClaw, OpenCode, and LangChain to PipeLLM.

The Problem

Every AI provider has its own SDK, API format, and authentication method. Switching between OpenAI, Anthropic, and Gemini means rewriting integration code, managing multiple API keys, and risking vendor lock-in.

Gateway Compatibility

PipeLLM provides native protocol conversion at the gateway level. Your existing code works as-is — just change the base URL.

Use OpenAI SDK

Call Claude, Gemini, DeepSeek, Llama and more — all through the OpenAI SDK you already know.

Use Anthropic SDK

Call GPT, Gemini, Grok, DeepSeek and more — all through the Anthropic SDK you already know.

Use Gemini SDK

Call GPT, Claude, DeepSeek and more — all through the Gemini SDK you already know.
# Call Claude using OpenAI SDK
from openai import OpenAI

client = OpenAI(
    api_key="your-pipellm-key",
    base_url="https://api.pipellm.ai/openai/v1"  # ← just change this
)

response = client.chat.completions.create(
    model="claude-sonnet-4-6",  # ← use any model
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
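Because the gateway speaks the standard OpenAI wire format, any plain HTTP client works as well. A minimal stdlib sketch of the same call (the `/chat/completions` path and Bearer auth follow the OpenAI convention; the key is a placeholder, and the request is built but not sent):

```python
import json
import urllib.request

GATEWAY_URL = "https://api.pipellm.ai/openai/v1/chat/completions"

def build_gateway_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format chat request aimed at the PipeLLM gateway.

    Returned unsent so it can be inspected; pass it to urllib.request.urlopen
    with a real key to execute the call.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_gateway_request("your-pipellm-key", "claude-sonnet-4-6", "Hello!")
```

This is the same request the OpenAI SDK sends under the hood, which is why changing only the base URL is enough.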

Who Is PipeLLM For

Startups & Dev Teams

Evaluate and switch models in minutes, not weeks. Test GPT, Claude, Gemini, DeepSeek side-by-side with your existing codebase. No SDK migrations, no new auth flows — just swap the model name.

Enterprises

Add the latest AI capabilities to legacy systems without rewriting code. Your production integrations built on the OpenAI SDK can instantly access Claude or Gemini. Zero migration cost, full protocol compatibility.

What You Get

  • Universal SDK Compatibility — Use OpenAI, Anthropic, or Gemini SDK interchangeably to access 50+ models
  • Native Protocol Conversion — Not a simple proxy; full request/response format translation with streaming, tool use, vision, and thinking support
  • Enterprise-Grade Routing — High-concurrency gateway with automatic failover and load balancing
  • Managed Tool Surfaces — Add WebSearch and retrieval routes without building a separate tool service
  • Agent-Ready Integrations — Plug existing coding agents and frameworks into PipeLLM-compatible routes
  • Zero Vendor Lock-In — Switch providers by changing one line; your integration code stays the same
  • Pay As You Go — No subscriptions, no minimums, only pay for what you use
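The protocol conversion is a genuine format translation, not a pass-through. As a rough illustration of one direction (a hand-rolled sketch, not PipeLLM's actual code), an OpenAI-style payload maps onto Anthropic's Messages format roughly like this — Anthropic requires a top-level `max_tokens` and takes the system prompt as a top-level field rather than a message:

```python
def openai_to_anthropic(body: dict) -> dict:
    """Translate an OpenAI chat.completions payload into Anthropic Messages format.

    Illustrative sketch only: the real gateway also translates streaming chunks,
    tool calls, vision content, and thinking blocks.
    """
    system_parts = [m["content"] for m in body["messages"] if m["role"] == "system"]
    translated = {
        "model": body["model"],
        # Anthropic's API makes max_tokens mandatory; OpenAI's does not.
        "max_tokens": body.get("max_tokens", 1024),
        "messages": [m for m in body["messages"] if m["role"] != "system"],
    }
    if system_parts:
        # Anthropic takes the system prompt as a top-level field, not a message.
        translated["system"] = "\n".join(system_parts)
    return translated

openai_payload = {
    "model": "claude-sonnet-4-6",
    "messages": [
        {"role": "system", "content": "Be concise."},
        {"role": "user", "content": "Hello!"},
    ],
}
anthropic_payload = openai_to_anthropic(openai_payload)
```

Response bodies, error shapes, and streaming events are translated back the other way, which is what lets one SDK talk to every provider.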

Get Started

Quick Start

Get your API key and make your first request in 2 minutes

Converters

Learn when to use native routes versus protocol conversion

Supported Providers

OpenAI Compatible

GPT, Grok, GLM, DeepSeek, Groq

Anthropic

Opus, Sonnet, Haiku

Gemini

Gemini 3, 2.5, 2.0