docs: Add deepseek to free use guide (#809)

Co-authored-by: angiejones <jones.angie@gmail.com>
This commit is contained in:
Adewale Abati
2025-01-27 20:22:18 +01:00
committed by GitHub
parent ff13a4efd8
commit 71d3850c59
2 changed files with 89 additions and 8 deletions


@@ -95,8 +95,8 @@ To configure your chosen provider or see available options, run `goose configure
Ollama provides local LLMs, which requires a bit more set up before you can use it with Goose.
:::warning Extensions Not Supported with DeepSeek
DeepSeek models do not support tool calling. So if using one of these models, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions).
:::warning Custom DeepSeek Model
The native `DeepSeek-r1` model does not support tool calling. If using this model, all Goose [extensions must be disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions). As an alternative, you can use a [custom DeepSeek-r1 model](/docs/getting-started/using-goose-free#deepseek-r1) we've made specifically for Goose.
:::
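If you do need to turn extensions off, the `Toggle Extensions` entry in `goose configure` is where that happens. Below is a rough sketch of the menu, mirroring the interactive prompts shown elsewhere in these docs; the exact wording may differ between Goose versions:
```
┌ goose-configure
◆ What would you like to configure?
│ ○ Configure Providers (Change provider or update credentials)
│ ● Toggle Extensions
│ ○ Add Extension
```
From there you should be able to deselect any enabled extensions before starting a session.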
1. [Download Ollama](https://ollama.com/download).


@@ -8,17 +8,13 @@ import TabItem from '@theme/TabItem';
# Using Goose for Free
:::info Supported Environments
Goose currently works only on **macOS** and **Linux** systems, and supports both **ARM** and **x86** architectures. If you'd like to request support for additional operating systems, please [open an issue on GitHub](https://github.com/block/goose/issues/new?template=Blank+issue).
:::
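If you're not sure which architecture your machine uses, `uname -m` will print it (typically `arm64` or `aarch64` on ARM machines and `x86_64` on Intel/AMD):
```sh
uname -m
```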
Goose is a free and open source developer AI agent that you can start using right away, but not all supported [LLM Providers][providers] offer a free tier.
Below, we outline a couple of free options and how to get started with them.
## Google Gemini
Google Gemini provides free access to its AI capabilities with some limitations. To start using the Gemini API with Goose, you need an API Key from [Google AI studio](https://aistudio.google.com/app/apikey).
Google Gemini provides a free tier. To start using the Gemini API with Goose, you need an API key from [Google AI Studio](https://aistudio.google.com/app/apikey).
To set up Google Gemini with Goose, follow these steps:
@@ -29,7 +25,9 @@ To set up Google Gemini with Goose, follow these steps:
goose configure
```
2. Select `Configure Providers` from the menu.
3. Follow the prompts to choose `Google Gemini` as the provider and enter your API key.
3. Follow the prompts to choose `Google Gemini` as the provider.
4. Enter your API key when prompted.
5. Enter the Gemini model of your choice.
```
┌ goose-configure
@@ -63,6 +61,89 @@ To set up Google Gemini with Goose, follow these steps:
</TabItem>
</Tabs>
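With the provider configured, a quick sanity check is to start a session and send a prompt. The `session` subcommand below assumes a current Goose CLI; adjust it if your version differs:
```sh
goose session
```
If the key was saved correctly, the session should open using the Gemini model you selected.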
## DeepSeek-R1
:::warning
Depending on the model's size, you'll need a relatively powerful device to run local LLMs smoothly.
:::
Ollama provides open source LLMs, such as `DeepSeek-r1`, that you can install and run locally.
Note that the native `DeepSeek-r1` model doesn't support tool calling; however, we have a [custom model](https://ollama.com/michaelneale/deepseek-r1-goose) you can use with Goose.
1. Download and install Ollama from [ollama.com](https://ollama.com/download).
2. In a terminal window, run the following command to install the custom DeepSeek-r1 model:
```sh
ollama run michaelneale/deepseek-r1-goose
```
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
3. In a separate terminal window, configure Goose:
```sh
goose configure
```
4. Choose `Configure Providers`
```
┌ goose-configure
◆ What would you like to configure?
│ ● Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ○ Add Extension
```
5. Choose `Ollama` as the model provider
```
┌ goose-configure
◇ What would you like to configure?
│ Configure Providers
◆ Which model provider should we use?
│ ○ Anthropic
│ ○ Databricks
│ ○ Google Gemini
│ ○ Groq
│ ● Ollama (Local open source models)
│ ○ OpenAI
│ ○ OpenRouter
```
6. Enter the custom DeepSeek-r1 model installed above
```
┌ goose-configure
◇ What would you like to configure?
│ Configure Providers
◇ Which model provider should we use?
│ Ollama
◇ Enter a model from that provider:
│ michaelneale/deepseek-r1-goose
◇ Welcome! You're all set to explore and utilize my capabilities. Let's get started on solving your problems together!
└ Configuration saved successfully
```
</TabItem>
<TabItem value="ui" label="Goose Desktop">
3. Click `...` in the top-right corner.
4. Navigate to `Settings` -> `Browse Models`, and select `Ollama` from the list.
5. Enter `michaelneale/deepseek-r1-goose` for the model name.
</TabItem>
</Tabs>
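Optionally, you can confirm that the custom model was pulled and is available locally with a standard Ollama command (output columns may vary by Ollama version):
```sh
ollama list
```
`michaelneale/deepseek-r1-goose` should appear in the output.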
## Limitations
These free options are a great way to get started with Goose and explore its capabilities. However, if you need more advanced features or higher usage limits, you can upgrade to a paid plan with your LLM provider.