opencode.json. The cleanest PipeLLM
setup is to use PipeLLM’s OpenAI converter route so one provider configuration
can access Claude, GPT, and Gemini models.
## Recommended Route

Use PipeLLM's OpenAI converter route when you want a single provider entry with cross-protocol access to Claude, GPT, and Gemini. Use `https://api.pipellm.ai/v1` only if you want native OpenAI-compatible routing and do not need cross-protocol access.
## Config Locations

OpenCode loads configuration from:

- `~/.config/opencode/opencode.json` for global user config
- `opencode.json` in the project root for project-specific config
## Quick Setup
- Export your PipeLLM key:
- Add a custom provider to
opencode.json:
- Start OpenCode:
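The three steps above can be sketched as follows. The environment variable name `PIPELLM_API_KEY`, the provider id `pipellm`, and the model id shown are assumptions for illustration; substitute whatever your PipeLLM dashboard and model catalog actually use. The provider block follows OpenCode's custom-provider config shape (`npm`, `name`, `options.baseURL`, `models`).

```shell
# 1. Export your PipeLLM key (variable name is an assumption;
#    match it to the env reference used in opencode.json below).
export PIPELLM_API_KEY="sk-..."

# 3. Start OpenCode from your project root; it picks up opencode.json.
opencode
```

For step 2, a minimal `opencode.json` provider entry might look like this (the `{env:...}` substitution and the example model id are assumptions to verify against the current OpenCode docs):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "pipellm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "PipeLLM",
      "options": {
        "baseURL": "https://api.pipellm.ai/v1",
        "apiKey": "{env:PIPELLM_API_KEY}"
      },
      "models": {
        "claude-sonnet-4": { "name": "Claude Sonnet 4" }
      }
    }
  }
}
```

Project-root config takes precedence for that project, so you can keep a global default in `~/.config/opencode/opencode.json` and override models per repository.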
## Notes
- OpenCode custom providers accept any provider id; `pipellm` is just an example.
- `@ai-sdk/openai-compatible` is the right package when you are targeting Chat Completions-style endpoints.
- If a model or workflow specifically needs the OpenAI Responses API, that is a separate integration path and should not reuse this Chat Completions setup.
## Related Docs

- Developer Tools: compare supported coding assistants
- OpenAI Converter: keep OpenAI-compatible format across providers
- OpenCode Providers Docs: official OpenCode custom provider examples