feat: V1.0 (#734)

Co-authored-by: Michael Neale <michael.neale@gmail.com>
Co-authored-by: Wendy Tang <wendytang@squareup.com>
Co-authored-by: Jarrod Sibbison <72240382+jsibbison-square@users.noreply.github.com>
Co-authored-by: Alex Hancock <alex.hancock@example.com>
Co-authored-by: Alex Hancock <alexhancock@block.xyz>
Co-authored-by: Lifei Zhou <lifei@squareup.com>
Co-authored-by: Wes <141185334+wesrblock@users.noreply.github.com>
Co-authored-by: Max Novich <maksymstepanenko1990@gmail.com>
Co-authored-by: Zaki Ali <zaki@squareup.com>
Co-authored-by: Salman Mohammed <smohammed@squareup.com>
Co-authored-by: Kalvin C <kalvinnchau@users.noreply.github.com>
Co-authored-by: Alec Thomas <alec@swapoff.org>
Co-authored-by: lily-de <119957291+lily-de@users.noreply.github.com>
Co-authored-by: kalvinnchau <kalvin@block.xyz>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Rizel Scarlett <rizel@squareup.com>
Co-authored-by: bwrage <bwrage@squareup.com>
Co-authored-by: Kalvin Chau <kalvin@squareup.com>
Co-authored-by: Alice Hau <110418948+ahau-square@users.noreply.github.com>
Co-authored-by: Alistair Gray <ajgray@stripe.com>
Co-authored-by: Nahiyan Khan <nahiyan.khan@gmail.com>
Co-authored-by: Alex Hancock <alexhancock@squareup.com>
Co-authored-by: Nahiyan Khan <nahiyan@squareup.com>
Co-authored-by: marcelle <1852848+laanak08@users.noreply.github.com>
Co-authored-by: Yingjie He <yingjiehe@block.xyz>
Co-authored-by: Yingjie He <yingjiehe@squareup.com>
Co-authored-by: Lily Delalande <ldelalande@block.xyz>
Co-authored-by: Adewale Abati <acekyd01@gmail.com>
Co-authored-by: Ebony Louis <ebony774@gmail.com>
Co-authored-by: Angie Jones <jones.angie@gmail.com>
Co-authored-by: Ebony Louis <55366651+EbonyLouis@users.noreply.github.com>
Bradley Axen authored on 2025-01-24 13:04:43 -08:00, committed by GitHub
parent eccb1b2261, commit 1c9a7c0b05
688 changed files with 71147 additions and 19132 deletions

View File

@@ -0,0 +1,8 @@
{
  "label": "Configuration",
  "position": 5,
  "link": {
    "type": "generated-index",
    "description": "Extend goose functionalities with extensions and custom configurations"
  }
}

View File

@@ -0,0 +1,172 @@
---
sidebar_position: 2
title: Managing Goose Extensions
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Extensions are add-ons that extend the functionality of Goose and connect it with applications and tools you already use in your workflow. They can be used to add new features, automate tasks, or integrate with other systems.
Extensions are based on the [Model Context Protocol (MCP)](https://github.com/modelcontextprotocol), so you can connect
Goose to a wide ecosystem of capabilities.
## Adding Extensions
You can add extensions to Goose through the CLI or the UI.
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
1. After the initial Goose setup, you can add extensions through the configuration system:
```sh
goose configure
```
2. Choose `Add Extension` to see the options.
You can also edit the config file directly, which is stored in `~/.config/goose/config.yaml`.
</TabItem>
<TabItem value="ui" label="Goose UI">
1. Locate the menu (...) in the top right corner of the Goose UI.
2. Select `Settings` from the menu.
3. Under `Extensions`, you can add (+) a new extension manually, or [`Browse Extensions`][extensions] to find curated extensions.
4. Click `Install` on the extension you'd like to add, and it will install directly in the Goose app.
</TabItem>
</Tabs>
## Removing Extensions
You can remove extensions installed on Goose.
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
At the moment, you can remove extensions by editing the config file directly, which is stored in `~/.config/goose/config.yaml`.
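For example, assuming an entry in the format shown under MCP Servers below, you can disable an extension by flipping its `enabled` flag, or remove it by deleting the whole block:
```yaml
extensions:
  fetch:
    name: fetch
    cmd: uvx
    args: [mcp-server-fetch]
    enabled: false   # set to false to disable, or delete this entire block to remove it
    envs: {}
    type: stdio
```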
</TabItem>
<TabItem value="ui" label="Goose UI">
1. Locate the menu (...) in the top right corner of the Goose UI.
2. Select `Settings` from the menu.
3. Under `Extensions`, find the extension you'd like to remove and click on the settings icon beside it.
4. In the dialog that appears, click `Remove Extension`.
</TabItem>
</Tabs>
## Built-in Extensions
Out of the box, Goose comes with a few extensions installed, but only the `Developer` extension is enabled by default.
Here are the default extensions:
1. **Developer**: The `Developer` extension provides a set of general development tools that are useful for software development.
2. **Non-Developer**: The `Non-Developer` extension provides general computer control tools that don't require you to be a developer or engineer.
3. **Memory**: The `Memory` extension teaches Goose to remember your preferences as you use it.
4. **JetBrains**: The `JetBrains` extension provides an integration for working with JetBrains IDEs.
5. **Google Drive**: The `Google Drive` extension provides an integration for working with Google Drive for file management and access.
#### Toggling Built-in Extensions
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
1. Run Goose configuration:
```bash
goose configure
```
2. Choose `Add Extension`
3. Choose `Built-in Extension`
Alternatively, you can enable a built-in extension by specifying its name in this command:
```sh
goose mcp {name}
```
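For instance, to run the built-in Developer extension this way (assuming its identifier is the lowercase `developer`):
```sh
goose mcp developer
```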
</TabItem>
<TabItem value="ui" label="Goose UI">
1. Locate the menu (...) in the top right corner of the Goose UI.
2. Select `Settings` from the menu.
3. Under `Extensions`, you can toggle the built-in extensions on or off.
</TabItem>
</Tabs>
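If you prefer editing `~/.config/goose/config.yaml` directly, a built-in extension entry plausibly uses a `builtin` type instead of a command; the exact keys below are an assumption based on the stdio entry format shown later, so check your own config file for the precise shape:
```yaml
extensions:
  developer:
    name: developer   # assumed identifier for the built-in Developer extension
    enabled: true
    type: builtin     # assumed type for built-ins, in place of cmd/args
```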
:::tip
All of Goose's built-in extensions are MCP servers in their own right. If you'd like, you can use the MCP servers that ship with Goose with any other MCP-compatible agent.
:::
## MCP Servers
You can run any MCP server as a Goose extension.
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
1. Run `goose configure`
2. Choose `Add Extension`
3. Choose `Command-line Extension`
You'll then be prompted to enter a command and any environment variables needed. For example, to connect to the [Fetch Server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch), enter `uvx mcp-server-fetch` as the command.
You can also edit the resulting config entry directly, which would look like this:
```yaml
extensions:
  fetch:
    name: fetch
    cmd: uvx
    args: [mcp-server-fetch]
    enabled: true
    envs: {}
    type: stdio
```
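If a server needs credentials, populate `envs` with the variables it expects. As an illustration only, a hypothetical entry for the community GitHub MCP server might look like this (the package name and variable are assumptions about that server, not part of Goose):
```yaml
extensions:
  github:
    name: github
    cmd: npx
    args: ["-y", "@modelcontextprotocol/server-github"]
    enabled: true
    envs:
      GITHUB_PERSONAL_ACCESS_TOKEN: "<your-token>"   # placeholder; supply your own token
    type: stdio
```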
</TabItem>
<TabItem value="ui" label="Goose UI">
1. Locate the menu (...) in the top right corner of the Goose UI.
2. Select `Settings` from the menu.
3. Under `Extensions`, you can add an MCP server as an extension manually by clicking on the (+) button to the right.
4. In the dialog that appears, enter the details of the MCP server including any environment variables needed.
</TabItem>
</Tabs>
## Discovering Extensions
Goose comes with a [central directory][extensions] of extensions that you can install and use. You can install extensions from the Goose CLI or the Goose UI. The directory page gives you a test command to try out each extension, and if you want to keep one, you can add it through `goose configure`.
You can test out an extension for a single session with
```sh
goose session --with-extension "command to run"
```
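For example, to try the Fetch server mentioned earlier for just one session:
```sh
goose session --with-extension "uvx mcp-server-fetch"
```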
## Starting a Session with Extensions
You can start a tailored Goose session with specific extensions directly from the CLI. To do this, run the following command:
```bash
goose session --with-extension "{extension command}"
```
:::note
You may need to set environment variables for the extension to work correctly. You can pass them inline before the command:
```bash
goose session --with-extension "VAR=value command arg1 arg2"
```
:::
## Developing Extensions
Goose extensions are implemented with MCP - a system that allows AI models and agents to securely connect with local or remote resources using standard protocols. Learn how to build your own [extension as an MCP server](https://modelcontextprotocol.io/quickstart/server).
[extensions]: https://block.github.io/goose/v1/extensions

View File

@@ -0,0 +1,116 @@
---
sidebar_position: 1
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Supported LLM Providers
You can use Goose with your preferred LLM. Goose supports a variety of LLM providers. To configure your chosen provider or see available options, run `goose configure` in the CLI or visit the `Provider Settings` page in the Goose UI.
<Tabs groupId="interface">
<TabItem value="cli" label="Goose CLI" default>
1. Run the following command:
```sh
goose configure
```
2. Select `Configure Providers` from the menu.
![Provider Config](../assets/guides/goose-providers-cli.png)
</TabItem>
<TabItem value="ui" label="Goose UI">
**To update your LLM provider and API key:**
1. Click on the three dots in the top-right corner.
2. Select `Provider Settings` from the menu.
3. Click `Edit`, enter your API key, and click `Set as Active`.
</TabItem>
</Tabs>
## Available Providers
- OpenAI
- Databricks
- Ollama
- Anthropic
- Google Gemini
- Groq
- OpenRouter
### OpenAI
OpenAI offers powerful language models that include GPT-4, GPT-3.5-turbo, and more.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Select `OpenAI` from the list of available providers.
3. Enter your `OPENAI_API_KEY` when prompted, which you can obtain by registering at [OpenAI's platform](https://platform.openai.com/api-keys).
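If you'd rather not paste the key interactively, and assuming your setup lets Goose read it from the environment (an assumption, not something this step requires), you could export it before configuring:
```sh
export OPENAI_API_KEY="sk-..."   # placeholder; use your real key
goose configure                  # then choose Configure Providers > OpenAI
```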
### Databricks
Databricks is a data analytics and AI platform that provides access to various AI models and tools. They offer integration with popular models and custom model deployment.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Select `Databricks` as your provider.
3. Enter your `DATABRICKS_HOST` and `DATABRICKS_TOKEN`, which can be generated in your [Databricks Account Settings](https://www.databricks.com/).
### Ollama
Ollama is an open-source project that allows running large language models locally. It supports various open-source models and provides an API for integration.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Select `Ollama` and follow the steps to download and set up your models as detailed on [Ollama's site](https://ollama.com/). Goose requires `OLLAMA_HOST` to be set so it can reach your local Ollama instance.
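A minimal local setup might look like the following; the model name is just an example, and the default Ollama port shown is an assumption about your local install:
```sh
ollama pull mistral                   # download an example model with Ollama
export OLLAMA_HOST=localhost:11434    # assumed default host:port for a local Ollama server
goose configure                       # then choose Configure Providers > Ollama
```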
### Anthropic
Anthropic is an AI research company that offers advanced language models through its API. Their primary model is Claude, which comes in various versions.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Choose `Anthropic` and provide the `ANTHROPIC_API_KEY`, obtainable via [Anthropic's platform](https://www.anthropic.com/).
### Google Gemini
Google Gemini is a suite of large language models developed by Google. It offers multimodal capabilities and can be accessed through the [Google AI Studio](https://ai.google.dev/gemini-api/docs).
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Pick `Google Gemini` from the list of providers and input your `GOOGLE_API_KEY`.
### Groq
Groq is an AI company that offers high-performance inference for large language models. They provide access to various models through their API.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Select `Groq` from the list of providers and input your `GROQ_API_KEY`, set up via the [Groq Console](https://groq.com/).
### OpenRouter
OpenRouter is a platform that provides access to multiple AI models from various providers through a single API. It simplifies the process of using different AI models in applications.
1. Run the following command and choose "Configure Providers":
```sh
goose configure
```
2. Select `OpenRouter` from the list of providers and input your `OPENROUTER_API_KEY`, set up via the [OpenRouter Console](https://openrouter.ai/).