---
title: Models
description: Configuring an LLM provider and model.
---

opencode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, and it also supports running local models.

---

## Providers

Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start opencode.
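
As a quick, minimal example, adding credentials might look like this (the command is interactive, so the exact prompts and the credentials it asks for depend on the provider you pick):

```bash frame="none"
opencode auth login
```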

Learn more about [providers](/docs/providers).

---

## Select a model

Once you've configured your provider you can select the model you want by typing in:

```bash frame="none"
/models
```

---

## Recommended models

There are a lot of models out there, with new models coming out every week.

:::tip
Consider using one of the models we recommend.
:::

However, only a few of them are good at both generating code and tool calling.

Here are the ones we recommend with opencode:

- Claude Sonnet 4
- Claude Opus 4
- Kimi K2
- Qwen3 Coder
- GPT 4.1
- Gemini 2.5 Pro

---

## Set a default

To set one of these as the default model, you can set the `model` key in your opencode config.

```json title="opencode.json" {3}
{
  "$schema": "https://opencode.ai/config.json",
  "model": "lmstudio/google/gemma-3n-e4b"
}
```

Here the full ID is `provider_id/model_id`.

If you've configured a [custom provider](/docs/providers#custom), the `provider_id` is the key from the `provider` part of your config, and the `model_id` is the key from `provider.models`.
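
For example, here's a minimal sketch of how the two keys line up, assuming a hypothetical custom provider registered as `myprovider` with a model entry `my-model` (the rest of the provider definition is omitted; see the custom provider docs for the fields a real entry needs):

```jsonc title="opencode.jsonc"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    // "myprovider" and "my-model" are placeholder names for illustration;
    // a real custom provider entry needs additional fields
    "myprovider": {
      "models": {
        "my-model": {}
      }
    }
  },
  "model": "myprovider/my-model"
}
```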

---

## Configure models

You can globally configure a model's options through the config.

```jsonc title="opencode.jsonc" {7-12,19-24}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "models": {
        "gpt-5": {
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"],
          },
        },
      },
    },
    "anthropic": {
      "models": {
        "claude-sonnet-4-20250514": {
          "options": {
            "thinking": {
              "type": "enabled",
              "budgetTokens": 16000,
            },
          },
        },
      },
    },
  },
}
```

Here we're configuring global settings for two models: `gpt-5` when accessed via the `openai` provider, and `claude-sonnet-4-20250514` when accessed via the `anthropic` provider.

You can also configure these options for any agents that you are using. The agent config overrides any global options set here. [Learn more](/docs/agents/#additional).

---

## Loading models

When opencode starts up, it checks for models in the following priority order:

1. The `--model` or `-m` command line flag. The format is the same as in the config file: `provider_id/model_id`.
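
   For example, a rough sketch of passing the flag when launching opencode (the exact invocation beyond the flag itself may differ for your setup):

   ```bash frame="none"
   opencode --model anthropic/claude-sonnet-4-20250514
   ```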

2. The model list in the opencode config.

   ```json title="opencode.json"
   {
     "$schema": "https://opencode.ai/config.json",
     "model": "anthropic/claude-sonnet-4-20250514"
   }
   ```

   The format here is `provider/model`.

3. The last used model.

4. The first model using an internal priority.