OpenCode providers and configuration

OpenCode supports 75+ LLM providers through the AI SDK and Models.dev, and can also run local models. This guide explains how to add credentials with opencode auth login and how to customize providers in opencode.json.

OpenCode integrates with the AI SDK and Models.dev to support more than 75 large language model providers as well as local model runtimes. To use a provider, you add credentials with opencode auth login and then configure the provider in your project's opencode.json. Credentials entered with opencode auth login are stored locally at ~/.local/share/opencode/auth.json. The provider section of opencode.json lets you set display names, map model IDs, and override options such as baseURL for proxies or custom endpoints.
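As a minimal sketch of such an override, a provider block pointing an existing provider at a proxy might look like the following (the proxy URL and display name are illustrative placeholders):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "name": "Anthropic (via proxy)",
      "options": {
        "baseURL": "https://llm-proxy.example.com/v1"
      }
    }
  }
}
```

The provider id ("anthropic" here) must match the id used when you ran opencode auth login, or OpenCode will not associate the stored credentials with this block.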

The documentation gives step-by-step instructions for many providers. The typical flow is: obtain an API key or the appropriate environment credentials from the provider console, run opencode auth login and select the provider, paste the key or complete the browser-based authorization, then run the /models command to pick a model. Examples include Amazon Bedrock (environment variables or a bearer token), Anthropic, Azure OpenAI (resource name plus API key), Google Vertex AI (service account or gcloud auth), Hugging Face inference providers, and managed providers such as DeepSeek, Deep Infra, Groq, Moonshot AI, Together AI, and OpenRouter. Local runtimes are supported via LM Studio and Ollama by configuring a provider entry that points at a local baseURL and uses the @ai-sdk/openai-compatible package.
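As a sketch of the local-runtime setup described above, an Ollama entry might look like this (the port is Ollama's common default and the model id is illustrative; adjust both to your installation):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3.2": {
          "name": "Llama 3.2"
        }
      }
    }
  }
}
```

Because the model runs locally, no credentials are needed; the npm field tells OpenCode to talk to the endpoint using the OpenAI-compatible wire format that Ollama and LM Studio expose.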

For custom or OpenAI-compatible providers, the docs show how to register an "Other" provider id via opencode auth login, store credentials, and add a provider block in opencode.json. Advanced options include embedding an apiKey via env syntax, adding custom headers, mapping model IDs to display names, and setting model limits (limit.context and limit.output) so OpenCode can track remaining context. Troubleshooting notes advise verifying credentials with opencode auth list, matching provider ids between auth and config, using the appropriate npm package for each provider, and ensuring options.baseURL points to the correct endpoint.
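Putting those advanced options together, a custom OpenAI-compatible provider block could look like the following sketch (the provider id, endpoint, environment variable, header, model id, and limit values are all placeholders, not real services):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My Provider",
      "options": {
        "baseURL": "https://api.myprovider.example/v1",
        "apiKey": "{env:MYPROVIDER_API_KEY}",
        "headers": {
          "X-Custom-Header": "value"
        }
      },
      "models": {
        "my-model": {
          "name": "My Model",
          "limit": {
            "context": 128000,
            "output": 8192
          }
        }
      }
    }
  }
}
```

The env syntax keeps the key out of the config file, and the limit block lets OpenCode warn you before a conversation exceeds the model's context window.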
