chore: update docs for ollama endpoint (#992)

This commit is contained in:
Salman Mohammed
2025-01-31 13:47:41 -05:00
committed by GitHub
parent 7a427c6d46
commit 16df22f817
2 changed files with 7 additions and 0 deletions


@@ -164,6 +164,12 @@ Ollama provides local LLMs, which requires a bit more set up before you can use
1. [Download Ollama](https://ollama.com/download).
2. Run any [model supporting tool-calling](https://ollama.com/search?c=tools):
:::info Ollama Endpoint Construction
For Ollama, we default the host to `localhost` and the port to `11434` if you don't provide them. When constructing the URL, we prepend `"http://"` if the scheme is not `http` or `https`.
If you're running Ollama on port 80 or 443, you must set `OLLAMA_HOST=http://host:port`.
:::
:::warning Limited Support for models without tool calling
Goose relies extensively on tool calling, so models without it (e.g. `DeepSeek-r1`) can only do chat completion. When using a model without tool calling, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). As an alternative, you can use a [custom DeepSeek-r1 model](/docs/getting-started/providers#deepseek-r1) we've made specifically for Goose.
:::