docs: add new providers doc, reorg sidebar, edits

This commit is contained in:
Jay V
2025-07-30 18:16:11 -04:00
parent c38b091895
commit 160923dcf0
5 changed files with 536 additions and 174 deletions


@@ -62,21 +62,33 @@ export default defineConfig({
},
sidebar: [
"docs",
"docs/cli",
"docs/ide",
"docs/share",
"docs/modes",
"docs/agents",
"docs/rules",
"docs/github",
"docs/config",
"docs/models",
"docs/themes",
"docs/keybinds",
// "docs/providers",
"docs/providers",
"docs/enterprise",
"docs/mcp-servers",
"docs/troubleshooting",
{
label: "Usage",
items: [
"docs/cli",
"docs/ide",
"docs/share",
"docs/github",
]
},
{
label: "Configure",
items: [
"docs/modes",
"docs/rules",
"docs/agents",
"docs/models",
"docs/themes",
"docs/keybinds",
"docs/mcp-servers",
]
}
],
components: {
Hero: "./src/components/Hero.astro",


@@ -1,23 +1,23 @@
---
title: GitHub
description: Use opencode in GitHub Issues and Pull-Requests
description: Use opencode in GitHub issues and pull-requests
---
opencode integrates directly into your GitHub workflow. Mention `/opencode` or `/oc` in your comment, and opencode will execute tasks within your GitHub Actions runner.
opencode integrates with your GitHub workflow. Mention `/opencode` or `/oc` in your comment, and opencode will execute tasks within your GitHub Actions runner.
---
## Features
- **Triage Issues**: Ask opencode to look into an issue and explain it to you.
- **Fix and Implement**: Ask opencode to fix an issue or implement a feature. And it will work in a new branch and submits a PR with all the changes.
- **Triage issues**: Ask opencode to look into an issue and explain it to you.
- **Fix and implement**: Ask opencode to fix an issue or implement a feature. And it will work in a new branch and submit a PR with all the changes.
- **Secure**: opencode runs inside your own GitHub runners.
---
## Installation
Run the following command in the terminal from your GitHub repo:
Run the following command in a project that is in a GitHub repo:
```bash
opencode github install
@@ -29,10 +29,17 @@ This will walk you through installing the GitHub app, creating the workflow, and
### Manual Setup
1. Install the GitHub app https://github.com/apps/opencode-agent. Make sure it is installed on the target repository.
2. Add the following workflow file to `.github/workflows/opencode.yml` in your repo. Set the appropriate `model` and required API keys in `env`.
Or you can set it up manually.
```yml
1. **Install the GitHub app**
Head over to [**github.com/apps/opencode-agent**](https://github.com/apps/opencode-agent). Make sure it's installed on the target repository.
2. **Add the workflow**
Add the following workflow file to `.github/workflows/opencode.yml` in your repo. Make sure to set the appropriate `model` and required API keys in `env`.
```yml title=".github/workflows/opencode.yml" {24,26}
name: opencode
on:
@@ -62,39 +69,49 @@ This will walk you through installing the GitHub app, creating the workflow, and
# share: true
```
3. Store the API keys in secrets. In your organization or project **settings**, expand **Secrets and variables** on the left and select **Actions**. Add the required API keys.
3. **Store the API keys in secrets**
In your organization or project **settings**, expand **Secrets and variables** on the left and select **Actions**. And add the required API keys.
---
## Configuration
- `model`: The model used by opencode. Takes the format of `provider/model` (**required**)
- `model`: The model used by opencode. Takes the format of `provider/model`. This is **required**.
- `share`: Share the session. Sessions are shared by default for public repos.
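Putting these together, the relevant part of the workflow step might look like the following. This is a sketch; the exact step layout comes from the generated workflow above, and the model ID and secret name here are illustrative.

```yml
env:
  # API key for the provider referenced in `model`, stored as an Actions secret
  ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
with:
  # takes the format provider/model
  model: anthropic/claude-sonnet-4
  # sessions are shared by default for public repos
  share: false
```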
---
## Usage Examples
## Examples
- Explain an issue
Here are some examples of how you can use opencode in GitHub.
Leave the following comment on a GitHub issue. `opencode` will read the entire thread, including all comments, and reply with a clear explanation.
- **Explain an issue**
Add this comment in a GitHub issue.
```
/opencode explain this issue
```
- Fix an issue
opencode will read the entire thread, including all comments, and reply with a clear explanation.
Leave the following comment on a GitHub issue. opencode will create a new branch, implement the changes, and open a PR with the changes.
- **Fix an issue**
In a GitHub issue, say:
```
/opencode fix this
```
And opencode will create a new branch, implement the changes, and open a PR with the changes.
- Review PRs and make changes
Leave the following comment on a GitHub PR. opencode will implement the requested change and commit it to the same PR.
Leave the following comment on a GitHub PR.
```
Delete the attachment from S3 when the note is removed /oc
```
opencode will implement the requested change and commit it to the same PR.


@@ -105,7 +105,7 @@ $ opencode auth login
```
Alternatively, you can select one of the other providers and add their API keys.
Alternatively, you can select one of the other providers. [Learn more](/docs/providers#directory).
---


@@ -9,157 +9,47 @@ opencode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.
## Providers
You can configure providers in your opencode config under the `provider` section.
---
### Defaults
Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start opencode.
Learn more about [providers](/docs/providers).
---
### Custom
## Select a model
You can add custom providers by specifying the npm package for the provider and the models you want to use.
Once you've configured your provider, you can select the model you want by typing in:
```json title="opencode.json" {5,9-11}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"moonshot": {
"npm": "@ai-sdk/openai-compatible",
"options": {
"baseURL": "https://api.moonshot.ai/v1"
},
"models": {
"kimi-k2-0711-preview": {}
}
}
}
}
```
```bash frame="none"
/models
```
---
### Base URL
## Recommended models
You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.
There are a lot of models out there, with new models coming out every week.
```json title="opencode.json" {6}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"anthropic": {
"options": {
"baseURL": "https://api.anthropic.com/v1"
}
}
}
}
```
:::tip
Consider using one of the models we recommend.
:::
However, only a few of them are good at both generating code and tool calling.
Here are the ones we recommend with opencode:
- Claude Sonnet 4
- Claude Opus 4
- Kimi K2
- Qwen3 Coder
- GPT 4.1
- Gemini 2.5 Pro
---
### OpenRouter
## Set a default
Many OpenRouter models are preloaded by default - you can customize these or add your own.
Here's an example of specifying a provider
```json title="opencode.json"
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"openrouter": {
"models": {
"moonshotai/kimi-k2": {
"options": {
"provider": {
"order": ["baseten"],
"allow_fallbacks": false
}
}
}
}
}
}
}
```
You can also add additional models
```json title="opencode.json"
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"openrouter": {
"models": {
"somecoolnewmodel": {},
}
}
}
}
```
---
### Local
You can configure local models like ones served through LM Studio or Ollama. To
do so, you'll need to specify a couple of things.
Here's an example of configuring a local model from LM Studio:
```json title="opencode.json" {4-15}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"lmstudio": {
"npm": "@ai-sdk/openai-compatible",
"name": "LM Studio (local)",
"options": {
"baseURL": "http://127.0.0.1:1234/v1"
},
"models": {
"google/gemma-3n-e4b": {
"name": "Gemma 3n-e4b (local)"
}
}
}
}
}
```
In this example:
- `lmstudio` is the custom provider ID. We'll use this later.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
Similarly, to configure a local model from Ollama:
```json title="opencode.json" {5,7}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"ollama": {
"npm": "@ai-sdk/openai-compatible",
"options": {
"baseURL": "http://localhost:11434/v1"
},
"models": {
"llama2": {}
}
}
}
}
```
To set one of these as the default model, you can set the `model` key at the
root.
To set one of these as the default model, you can set the `model` key in your
opencode config.
```json title="opencode.json" {3}
{
@@ -168,17 +58,9 @@ root.
}
```
Here the full ID is `provider_id/model_id`, where `provider_id` is the key in the `provider` list we set above and `model_id` is the key from the `provider.models` list.
Here the full ID is `provider_id/model_id`.
---
## Select a model
If you have multiple models, you can select the model you want by typing in:
```bash frame="none"
/models
```
If you've configured a [custom provider](/docs/providers#custom), the `provider_id` is the key from the `provider` part of your config, and the `model_id` is the key from `provider.models`.
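For instance, assuming you've set up OpenRouter and want Kimi K2 as the default, the IDs here following the examples above:

```json title="opencode.json" {3}
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openrouter/moonshotai/kimi-k2"
}
```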
---


@@ -5,3 +5,454 @@ description: Using any LLM provider in opencode.
opencode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, and it supports running local models.
To add a provider you need to:
1. Add the API keys for the provider using `opencode auth login`.
2. Configure the provider in your opencode config.
---
### Credentials
When you add a provider's API keys with `opencode auth login`, they are stored
in `~/.local/share/opencode/auth.json`.
---
### Config
You can customize the providers through the `provider` section in your opencode
config.
---
#### Base URL
You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.
```json title="opencode.json" {6}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"anthropic": {
"options": {
"baseURL": "https://api.anthropic.com/v1"
}
}
}
}
```
---
## Directory
Let's look at some of the providers in detail. If you'd like to add a provider to the
list, feel free to open a PR.
---
### Amazon Bedrock
To use Amazon Bedrock with opencode:
1. Head over to the **Model catalog** in the Amazon Bedrock console and request
access to the models you want.
:::tip
You need to have access to the model you want in Amazon Bedrock.
:::
2. You'll need to set one of the following environment variables:
- `AWS_ACCESS_KEY_ID`: You can get this by creating an IAM user and generating
an access key for it.
- `AWS_PROFILE`: First login through AWS IAM Identity Center (or AWS SSO) using
`aws sso login`. Then get the name of the profile you want to use.
- `AWS_BEARER_TOKEN_BEDROCK`: You can generate a long-term API key from the
Amazon Bedrock console.
Once you have one of the above, set it while running opencode.
```bash
AWS_ACCESS_KEY_ID=XXX opencode
```
Or add it to a `.env` file in the project root.
```bash title=".env"
AWS_ACCESS_KEY_ID=XXX
```
Or add it to your bash profile.
```bash title="~/.bash_profile"
export AWS_ACCESS_KEY_ID=XXX
```
3. Run the `/models` command to select the model you want.
---
### Anthropic
We recommend signing up for [Claude Pro](https://www.anthropic.com/news/claude-pro) or [Max](https://www.anthropic.com/max); it's the most cost-effective way to use opencode.
Once you've signed up, run `opencode auth login` and select Anthropic.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Anthropic (recommended)
│ ○ OpenAI
│ ○ Google
│ ...
```
This will ask you to log in with your Anthropic account in your browser. Now all
the Anthropic models should be available when you use the `/models` command.
---
### GitHub Copilot
To use your GitHub Copilot subscription with opencode:
:::note
Some models might require a [Pro+
subscription](https://github.com/features/copilot/plans).
:::
1. Run `opencode auth login` and select GitHub Copilot.
```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ GitHub Copilot
◇ ──────────────────────────────────────────────╮
│ │
│ Please visit: https://github.com/login/device │
│ Enter code: 8F43-6FCF │
│ │
├─────────────────────────────────────────────────╯
◓ Waiting for authorization...
```
2. Navigate to [github.com/login/device](https://github.com/login/device) and enter the code.
3. Now run the `/models` command to select the model you want.
---
### Groq
1. Head over to the [Groq console](https://console.groq.com/), click **Create API Key**, and copy the key.
2. Run `opencode auth login` and select Groq.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Groq
│ ...
```
3. Enter the API key for the provider.
```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Groq
◇ Enter your API key
│ _
```
4. Run the `/models` command to select the one you want.
---
### LM Studio
You can configure opencode to use local models through LM Studio.
```json title="opencode.json" "lmstudio" {5, 6, 8, 10-14}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"lmstudio": {
"npm": "@ai-sdk/openai-compatible",
"name": "LM Studio (local)",
"options": {
"baseURL": "http://127.0.0.1:1234/v1"
},
"models": {
"google/gemma-3n-e4b": {
"name": "Gemma 3n-e4b (local)"
}
}
}
}
}
```
In this example:
- `lmstudio` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
---
### Ollama
You can configure opencode to use local models through Ollama.
```json title="opencode.json" "ollama" {5, 6, 8, 10-14}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"ollama": {
"npm": "@ai-sdk/openai-compatible",
"name": "Ollama (local)",
"options": {
"baseURL": "http://localhost:11434/v1"
},
"models": {
"llama2": {
"name": "Llama 2"
}
}
}
}
}
```
In this example:
- `ollama` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
---
### OpenAI
1. Head over to the [OpenAI Platform console](https://platform.openai.com/api-keys), click **Create new secret key**, and copy the key.
2. Run `opencode auth login` and select OpenAI.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenAI
│ ...
```
3. Enter the API key for the provider.
```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenAI
◇ Enter your API key
│ _
```
4. Run the `/models` command to select the one you want.
---
### OpenRouter
1. Head over to the [OpenRouter dashboard](https://openrouter.ai/settings/keys), click **Create API Key**, and copy the key.
2. Run `opencode auth login` and select OpenRouter.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenRouter
│ ○ Anthropic
│ ○ Google
│ ...
```
3. Enter the API key for the provider.
```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenRouter
◇ Enter your API key
│ _
```
4. Many OpenRouter models are preloaded by default. Run the `/models` command to select the one you want.
You can also add additional models through your opencode config.
```json title="opencode.json" {6}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"openrouter": {
"models": {
"somecoolnewmodel": {},
}
}
}
}
```
5. You can also customize them through your opencode config. Here's an example of specifying a provider:
```json title="opencode.json"
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"openrouter": {
"models": {
"moonshotai/kimi-k2": {
"options": {
"provider": {
"order": ["baseten"],
"allow_fallbacks": false
}
}
}
}
}
}
}
```
---
### Custom
To add any **OpenAI-compatible** provider that's not listed in `opencode auth login`:
:::tip
You can use any OpenAI-compatible provider with opencode.
:::
1. Scroll down to **Other**.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Other
```
2. Enter the ID for the provider.
```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Other
◇ Enter provider id
│ _
```
You can use any ID you want; we'll use it later in the opencode config.
3. Add the API keys for the provider.
```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Other
◇ Enter provider id
│ coolnewprovider
▲ This only stores a credential for coolnewprovider - you will need configure it in opencode.json, check the docs for examples.
◆ Enter your API key
│ _
```
4. Configure the provider in your [opencode config](/docs/config).
```json title="opencode.json" "coolnewprovider" {7,9-11}
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"coolnewprovider": {
"npm": "@ai-sdk/openai-compatible",
"options": {
"baseURL": "https://api.newaicompany.com/v1"
},
"models": {
"newmodel-m1-0711-preview": {}
}
}
}
}
```
A couple of things to note here:
- We are using the provider ID we entered earlier, `coolnewprovider` in
this example.
- The `baseURL` is the OpenAI-compatible endpoint for the provider.
- And we are listing the models we want to use. This will show up when we run
the `/models` command.