docs: more info on configuring Ollama (#804)
@@ -15,13 +15,14 @@ Goose is compatible with a wide range of LLM providers, allowing you to choose a
| Provider | Description | Parameters |
|-----------------------------------------------|---------------------------------------------------|---------------------------------------|
| [Anthropic](https://www.anthropic.com/) | Offers Claude, an advanced AI model for natural language tasks. | `ANTHROPIC_API_KEY` |
| [Databricks](https://www.databricks.com/) | Unified data analytics and AI platform for building and deploying models. | `DATABRICKS_HOST`, `DATABRICKS_TOKEN` |
| [Gemini](https://ai.google.dev/gemini-api/docs) | Advanced LLMs by Google with multimodal capabilities (text, images). | `GOOGLE_API_KEY` |
| [Groq](https://groq.com/) | High-performance inference hardware and tools for LLMs. | `GROQ_API_KEY` |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms-ollama).** | N/A |
| [OpenAI](https://platform.openai.com/api-keys) | Provides GPT-4, GPT-3.5-turbo, and other advanced language models. | `OPENAI_API_KEY` |
| [OpenRouter](https://openrouter.ai/) | API gateway for unified access to various models with features like rate-limiting management. | `OPENROUTER_API_KEY` |
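
For hosted providers, the Parameters column lists the credentials Goose expects. As an illustrative sketch only (the key value is a placeholder, and it assumes you export the variable in the same shell where you run Goose), supplying a credential before configuration looks like:

```sh
# Placeholder value for illustration only; use your real credential.
export OPENAI_API_KEY="sk-..."

# Then run the interactive configuration described below.
goose configure
```
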
## Configure Provider
@@ -89,3 +90,76 @@ To configure your chosen provider or see available options, run `goose configure
</TabItem>
</Tabs>

## Local LLMs (Ollama)

Ollama provides local LLMs, which require a bit more setup before you can use them with Goose.

:::warning Extensions Not Supported with DeepSeek
DeepSeek models do not support tool calling, so if you use one of these models, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions).
:::

1. [Download Ollama](https://ollama.com/download).
2. Run the [model of your choice](https://ollama.com/search):

   Example:

   ```sh
   ollama run qwen2.5
   ```

3. In a separate terminal window, configure with Goose (a quick connectivity check is sketched after these steps):

   ```sh
   goose configure
   ```

4. Choose `Configure Providers`:

   ```
   ┌ goose-configure
   │
   ◆ What would you like to configure?
   │ ● Configure Providers (Change provider or update credentials)
   │ ○ Toggle Extensions
   │ ○ Add Extension
   └
   ```

5. Choose `Ollama` as the model provider:

   ```
   ┌ goose-configure
   │
   ◇ What would you like to configure?
   │ Configure Providers
   │
   ◆ Which model provider should we use?
   │ ○ Anthropic
   │ ○ Databricks
   │ ○ Google Gemini
   │ ○ Groq
   │ ● Ollama (Local open source models)
   │ ○ OpenAI
   │ ○ OpenRouter
   └
   ```

6. Enter the model you have running:

   ```
   ┌ goose-configure
   │
   ◇ What would you like to configure?
   │ Configure Providers
   │
   ◇ Which model provider should we use?
   │ Ollama
   │
   ◇ Enter a model from that provider:
   │ qwen2.5
   │
   ◇ Welcome! You're all set to explore and utilize my capabilities. Let's get started on solving your problems together!
   │
   └ Configuration saved successfully
   ```
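
If `goose configure` cannot reach your model, here is a quick connectivity check, as a sketch assuming the default local port `11434` and the `qwen2.5` example above:

```sh
# Send a test request to Ollama's OpenAI-compatible endpoint
# (the same URL that appears in the troubleshooting guide's error message).
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen2.5", "messages": [{"role": "user", "content": "Hello"}]}'
```

If this request fails, make sure the model from step 2 is still running before re-running `goose configure`.
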
@@ -3,6 +3,7 @@ title: Troubleshooting
---

# Troubleshooting

Goose, like any system, may run into occasional issues. This guide provides solutions for common problems.

### Goose Edits Files

@@ -28,12 +29,24 @@ For particularly large or complex tasks, consider breaking them into smaller sessions.
:::

---

### Context Length Exceeded Error

This error occurs when the input provided to Goose exceeds the maximum token limit of the LLM being used. To resolve this, try breaking your input into smaller parts. You can also use `.goosehints` to provide Goose with detailed context. Refer to the [Using Goosehints Guide][goosehints] for more information.
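
As an illustration only, `.goosehints` is a plain-text file you can create in your project directory; the contents below are hypothetical and should be tailored to your repository (see the Using Goosehints Guide for the exact conventions):

```sh
# Hypothetical hints for illustration only.
cat > .goosehints <<'EOF'
This repository is a Python web service.
Tests live in tests/ and run with pytest.
Prefer small, focused changes over large rewrites.
EOF
```
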

---

### Using Ollama Provider

Ollama provides local LLMs, which means you must first [download Ollama and run a model](/docs/getting-started/providers#local-llms-ollama) before attempting to use this provider with Goose. If you do not have the model downloaded, you'll run into the following error:

> ExecutionError("error sending request for url (http://localhost:11434/v1/chat/completions)")
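
To check whether the model is actually available locally, here is a sketch using the `qwen2.5` example from the providers guide:

```sh
# See which models Ollama has already downloaded.
ollama list

# Download the model if it is missing and start a session with it.
ollama run qwen2.5
```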

Another thing to note is that DeepSeek models do not support tool calling, so all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions) to use one of these models. Without tools, there is not much Goose can do autonomously when using DeepSeek. However, other Ollama models such as `qwen2.5` do support tool calling and can be used with Goose extensions.

---

### Handling Rate Limit Errors

Goose may encounter a `429` error (rate limit exceeded) when interacting with LLM providers. The recommended solution is to use OpenRouter. See [Handling LLM Rate Limits][handling-rate-limits] for more info.
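
If you switch to OpenRouter, a minimal sketch of supplying its credential (the `OPENROUTER_API_KEY` parameter from the providers table; the value shown is a placeholder) before reconfiguring:

```sh
# Placeholder key for illustration only; OpenRouter issues the real one.
export OPENROUTER_API_KEY="sk-or-..."

# Re-run configuration and select OpenRouter from the provider list.
goose configure
```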