docs: include "limit" example (#3925)

Co-authored-by: opencode-agent[bot] <opencode-agent[bot]@users.noreply.github.com>
Co-authored-by: rekram1-node <rekram1-node@users.noreply.github.com>
Matthew Fitzpatrick authored 2025-11-05 07:47:11 -08:00, committed by GitHub
parent 03f7f18260
commit c9dfe6d964


@@ -482,7 +482,6 @@ To use Google Vertex AI with OpenCode:
4. Run the `/models` command to select a model like _Kimi-K2-Instruct_ or _GLM-4.6_.
---
### LM Studio
@@ -935,9 +934,9 @@ You can use any OpenAI-compatible provider with opencode. Most modern AI provide
##### Example
-Here's an example setting the `apiKey` and `headers` options.
+Here's an example setting the `apiKey`, `headers`, and model `limit` options.
-```json title="opencode.json" {9,11}
+```json title="opencode.json" {9,11,17-20}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
@@ -953,7 +952,11 @@ Here's an example setting the `apiKey` and `headers` options.
      },
      "models": {
        "my-model-name": {
-          "name": "My Model Display Name"
+          "name": "My Model Display Name",
+          "limit": {
+            "context": 200000,
+            "output": 65536
+          }
        }
      }
    }
@@ -961,7 +964,14 @@ Here's an example setting the `apiKey` and `headers` options.
}
```
-We are setting the `apiKey` using the `env` variable syntax, [learn more](/docs/config#env-vars).
+Configuration details:
+- **apiKey**: Set using `env` variable syntax, [learn more](/docs/config#env-vars).
+- **headers**: Custom headers sent with each request.
+- **limit.context**: Maximum input tokens the model accepts.
+- **limit.output**: Maximum tokens the model can generate.
+The `limit` fields allow OpenCode to understand how much context you have left. Standard providers pull these from models.dev automatically.
---
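
For reference, the hunks above all edit the same fenced `opencode.json` example in the OpenAI-compatible-provider section. Assembled, the full block would look roughly like the sketch below. Everything outside the diffed lines (the `myprovider` key, the `@ai-sdk/openai-compatible` adapter, `baseURL`, the header value, and the `{env:...}` API-key reference) is an assumption drawn from the surrounding docs, not part of this commit; in this sketch the `{9,11,17-20}` highlight from the updated fence lands on the `apiKey`, `Authorization`, and `limit` lines.

```json title="opencode.json" {9,11,17-20}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:MY_PROVIDER_API_KEY}",
        "headers": {
          "Authorization": "Bearer custom-token"
        }
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name",
          "limit": {
            "context": 200000,
            "output": 65536
          }
        }
      }
    }
  }
}
```

As the added note says, only custom models need the explicit `limit` override; for providers listed on models.dev, the context and output limits are pulled automatically.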