From e7e03061e8bb689e2c67a21d7004f73a0ac1ecaf Mon Sep 17 00:00:00 2001
From: Angie Jones
Date: Mon, 3 Feb 2025 08:14:44 -0600
Subject: [PATCH] moving ollama host note (#1004)

---
 documentation/docs/getting-started/providers.md | 10 ++++------
 1 file changed, 4 insertions(+), 6 deletions(-)

diff --git a/documentation/docs/getting-started/providers.md b/documentation/docs/getting-started/providers.md
index 8691b37c..7db69d03 100644
--- a/documentation/docs/getting-started/providers.md
+++ b/documentation/docs/getting-started/providers.md
@@ -164,12 +164,6 @@ Ollama provides local LLMs, which requires a bit more set up before you can use
 1. [Download Ollama](https://ollama.com/download).
 2. Run any [model supporting tool-calling](https://ollama.com/search?c=tools):
 
-
-:::info Ollama Endpoint Construction
-For Ollama, we set default host to `localhost` and port to `11434` if you don't provide it. When constructing the URL, we preprend `"http://"` if the scheme is not http or https.
-If you're running Ollama on port 80 or 443, you have to set `OLLMA_HOST=http://host:port`
-:::
-
 :::warning Limited Support for models without tool calling
 Goose extensively uses tool calling, so models without it (e.g. `DeepSeek-r1`) can only do chat completion. If using models without tool calling, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). As an alternative, you can use a [custom DeepSeek-r1 model](/docs/getting-started/providers#deepseek-r1) we've made specifically for Goose.
 :::
@@ -219,6 +213,10 @@ goose configure
 
 5. Enter the host where your model is running
 
+:::info Endpoint
+For Ollama, if you don't provide a host, we set it to `localhost:11434`. When constructing the URL, we prepend `http://` if the scheme is not `http` or `https`. If you're running Ollama on port 80 or 443, you'll have to set `OLLAMA_HOST=http://host:port`.
+:::
+
 ```
 ┌ goose-configure
 │
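
To make the relocated note concrete, here is a minimal Rust sketch of the URL-construction rule it describes: default to host `localhost` and port `11434` when none is given, and prepend `http://` when the value has no `http`/`https` scheme. This is an illustrative assumption, not Goose's actual implementation; the function name `ollama_base_url` and the naive port check are hypothetical.

```rust
/// Hypothetical sketch of the endpoint rule in the note above;
/// NOT Goose's real code.
fn ollama_base_url(configured_host: Option<&str>) -> String {
    // Default host when the user provides none.
    let host = configured_host.unwrap_or("localhost");

    // Prepend "http://" unless an http/https scheme is already present.
    let url = if host.starts_with("http://") || host.starts_with("https://") {
        host.to_string()
    } else {
        format!("http://{host}")
    };

    // Append the default Ollama port when the part after the scheme
    // has no explicit port (naive check: any ':' after "://").
    let after_scheme = &url[url.find("://").unwrap() + 3..];
    if after_scheme.contains(':') {
        url
    } else {
        format!("{url}:11434")
    }
}

fn main() {
    assert_eq!(ollama_base_url(None), "http://localhost:11434");
    assert_eq!(ollama_base_url(Some("myhost")), "http://myhost:11434");
    // Ports 80/443 work when the scheme is spelled out, as the note
    // advises via OLLAMA_HOST=http://host:port.
    assert_eq!(ollama_base_url(Some("http://myhost:80")), "http://myhost:80");
}
```

The assertions mirror the note's three cases under these assumptions: no host at all, a bare hostname that picks up the default port, and an explicit scheme-plus-port value of the kind the `OLLAMA_HOST` caveat calls for.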