moving ollama host note (#1004)

This commit is contained in:
Angie Jones
2025-02-03 08:14:44 -06:00
committed by GitHub
parent 6c15f42e8e
commit e7e03061e8


@@ -164,12 +164,6 @@ Ollama provides local LLMs, which requires a bit more set up before you can use
1. [Download Ollama](https://ollama.com/download).
2. Run any [model supporting tool-calling](https://ollama.com/search?c=tools) (a sample command follows the notes below):
:::info Ollama Endpoint Construction
For Ollama, we set the default host to `localhost` and the default port to `11434` if you don't provide them. When constructing the URL, we prepend `http://` if the scheme is not `http` or `https`.
If you're running Ollama on port 80 or 443, you have to set `OLLAMA_HOST=http://host:port`.
:::
:::warning Limited support for models without tool calling
Goose extensively uses tool calling, so models without it (e.g. `DeepSeek-r1`) can only do chat completion. If using models without tool calling, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). As an alternative, you can use a [custom DeepSeek-r1 model](/docs/getting-started/providers#deepseek-r1) we've made specifically for Goose.
:::
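As a quick illustration of step 2, here is one way to pull and start a model from the tool-calling list; `qwen2.5` is just an example, and any model from the linked search works:

```sh
# Pull (if needed) and run a model that supports tool calling
ollama run qwen2.5
```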
@@ -219,6 +213,10 @@ goose configure
5. Enter the host where your model is running
:::info Endpoint
For Ollama, if you don't provide a host, we set it to `localhost:11434`. When constructing the URL, we prepend `http://` if the scheme is not `http` or `https`. If you're running Ollama on port 80 or 443, you'll have to set `OLLAMA_HOST=http://host:{port}`.
:::
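For instance, if Ollama is exposed on port 80 of another machine, the scheme must be given explicitly so the default prefix and port are not applied; the host name below is hypothetical:

```sh
# Hypothetical host; the scheme is required when using port 80 or 443
export OLLAMA_HOST=http://my-ollama-server:80
```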
```
┌ goose-configure