Reorganizing tutorials (#3028)

This commit is contained in:
Angie Jones
2025-06-22 17:24:55 -05:00
committed by GitHub
parent 6dbb4737f8
commit 0c81198ac4
65 changed files with 231 additions and 79 deletions

View File

@@ -25,11 +25,11 @@ Goose relies heavily on tool calling capabilities and currently works best with
| [Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/) | Access Azure-hosted OpenAI models, including GPT-4 and GPT-3.5. Supports both API key and Azure credential chain authentication. | `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_DEPLOYMENT_NAME`, `AZURE_OPENAI_API_KEY` (optional) |
| [Databricks](https://www.databricks.com/) | Unified data analytics and AI platform for building and deploying models. | `DATABRICKS_HOST`, `DATABRICKS_TOKEN` |
| [Gemini](https://ai.google.dev/gemini-api/docs) | Advanced LLMs by Google with multimodal capabilities (text, images). | `GOOGLE_API_KEY` |
| [GCP Vertex AI](https://cloud.google.com/vertex-ai) | Google Cloud's Vertex AI platform, supporting Gemini and Claude models. **Credentials must be configured in advance. Follow the instructions at https://cloud.google.com/vertex-ai/docs/authentication.** | `GCP_PROJECT_ID`, `GCP_LOCATION` and optional `GCP_MAX_RETRIES` (6), `GCP_INITIAL_RETRY_INTERVAL_MS` (5000), `GCP_BACKOFF_MULTIPLIER` (2.0), `GCP_MAX_RETRY_INTERVAL_MS` (320_000). |
| [GCP Vertex AI](https://cloud.google.com/vertex-ai) | Google Cloud's Vertex AI platform, supporting Gemini and Claude models. **Credentials must be [configured in advance](https://cloud.google.com/vertex-ai/docs/authentication).** | `GCP_PROJECT_ID`, `GCP_LOCATION` and optional `GCP_MAX_RETRIES` (6), `GCP_INITIAL_RETRY_INTERVAL_MS` (5000), `GCP_BACKOFF_MULTIPLIER` (2.0), `GCP_MAX_RETRY_INTERVAL_MS` (320_000). |
| [GitHub Copilot](https://docs.github.com/en/copilot/using-github-copilot/ai-models) | Access to GitHub Copilot's chat models including gpt-4o, o1, o3-mini, and Claude models. Uses device code authentication flow for secure access. | Uses GitHub device code authentication flow (no API key needed) |
| [Groq](https://groq.com/) | High-performance inference hardware and tools for LLMs. | `GROQ_API_KEY` |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms-ollama).** | `OLLAMA_HOST` |
| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. Ramalama's API is compatible with Ollama's, so it can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms-ollama).** | `OLLAMA_HOST` |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms).** | `OLLAMA_HOST` |
| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. Ramalama's API is compatible with Ollama's, so it can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms).** | `OLLAMA_HOST` |
| [OpenAI](https://platform.openai.com/api-keys) | Provides gpt-4o, o1, and other advanced language models. Also supports OpenAI-compatible endpoints (e.g., self-hosted LLaMA, vLLM, KServe). **o1-mini and o1-preview are not supported because Goose uses tool calling.** | `OPENAI_API_KEY`, `OPENAI_HOST` (optional), `OPENAI_ORGANIZATION` (optional), `OPENAI_PROJECT` (optional), `OPENAI_CUSTOM_HEADERS` (optional) |
| [OpenRouter](https://openrouter.ai/) | API gateway for unified access to various models with features like rate-limiting management. | `OPENROUTER_API_KEY` |
| [Snowflake](https://docs.snowflake.com/user-guide/snowflake-cortex/aisql#choosing-a-model) | Access the latest models using Snowflake Cortex services, including Claude models. **Requires a Snowflake account and programmatic access token (PAT)**. | `SNOWFLAKE_HOST`, `SNOWFLAKE_TOKEN` |
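As a concrete sketch of how the variables in the table are typically supplied, you can export them in your shell before launching Goose. The host, project id, and location values below are hypothetical placeholders, not defaults Goose ships with:

```shell
# Point Goose at a local Ollama instance (hypothetical host value;
# 11434 is Ollama's usual port).
export OLLAMA_HOST="http://localhost:11434"

# GCP Vertex AI needs a project and location; the retry knobs are
# optional and default to the values shown in the table.
export GCP_PROJECT_ID="my-example-project"   # hypothetical project id
export GCP_LOCATION="us-central1"
export GCP_MAX_RETRIES=6

echo "Ollama host: $OLLAMA_HOST"
echo "Vertex AI project: $GCP_PROJECT_ID ($GCP_LOCATION)"
```

Set these in the same shell session (or your shell profile) that you start Goose from, so the provider can read them at startup.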
@@ -278,7 +278,7 @@ To set up Google Gemini with Goose, follow these steps:
</Tabs>
### Local LLMs (Ollama or Ramalama)
### Local LLMs
Ollama and Ramalama are both options for running local LLMs, each of which requires a bit more setup before you can use it with Goose.
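Before pointing Goose at a local provider, it helps to confirm the server is up and a model has been downloaded. A minimal sketch, assuming an Ollama-compatible server exposing Ollama's documented `/api/tags` endpoint (the helper is illustrative and not part of Goose):

```python
import json
import os
import urllib.error
import urllib.request


def ollama_models(host=None):
    """Return locally downloaded model names, or None if the server
    is unreachable. Illustrative helper, not part of Goose."""
    host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    try:
        # /api/tags lists the models the local server has downloaded.
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, ValueError):
        return None


if __name__ == "__main__":
    models = ollama_models()
    if models is None:
        print("Server not reachable - start it and pull a model first")
    else:
        print(f"Available models: {models}")
```

If the check returns no models, pull one first (for Ollama, via its `pull`/`run` commands) before configuring the provider in Goose.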

View File

@@ -12,11 +12,7 @@ Extensions are based on the [Model Context Protocol (MCP)](https://github.com/mo
Goose to a wide ecosystem of capabilities.
:::tip Tutorials
Check out the [step-by-step tutorials](/docs/category/tutorials) for adding and using several Goose Extensions
:::
:::tip Featured Extension
Looking for isolated development environments? Check out our guide on [Isolated Development Environments](/docs/guides/isolated-development-environments) using the new container-use extension.
Check out the [step-by-step tutorials](/docs/category/mcp-servers) for adding and using several Goose Extensions
:::

View File

@@ -1,6 +1,6 @@
{
"label": "Architecture Overview",
"position": 5,
"position": 6,
"link": {
"type": "generated-index",
"description": "Extend Goose functionalities with extensions and custom configurations"

View File

@@ -27,7 +27,7 @@ LLMs have context windows, which are limits on how much conversation history the
Turning on too many extensions can degrade performance. Enable only essential [extensions and tools](/docs/guides/tool-permissions) to improve tool selection accuracy, save context window space, and stay within provider tool limits.
### Teach Goose your preferences
Help Goose remember how you like to work by using [`.goosehints`](/docs/guides/using-goosehints/) for permanent project preferences and the [Memory extension](/docs/tutorials/memory-mcp) for things you want Goose to dynamically recall later. Both can help save valuable context window space while keeping your preferences available.
Help Goose remember how you like to work by using [`.goosehints`](/docs/guides/using-goosehints/) for permanent project preferences and the [Memory extension](/docs/mcp/memory-mcp) for things you want Goose to dynamically recall later. Both can help save valuable context window space while keeping your preferences available.
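As a sketch of the kind of preferences a `.goosehints` file can hold (the contents below are entirely hypothetical; the file is free-form text placed in your project):

```
This project uses TypeScript with pnpm, not npm.
Run `pnpm test` before considering a change complete.
Prefer functional React components and keep files under 200 lines.
```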
### Protect sensitive files
Goose is often eager to make changes. You can stop it from changing specific files by creating a [.gooseignore](/docs/guides/using-gooseignore) file. In this file, you can list all the file paths you want it to avoid.
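For example, a `.gooseignore` at the project root might look like the following (hypothetical patterns, written in the gitignore-style syntax the file accepts):

```
# Secrets and environment files
.env
secrets/

# Generated artifacts Goose should not edit
dist/
*.lock
```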

View File

@@ -8,7 +8,7 @@ sidebar_position: 14
`.gooseignore` is a text file that defines patterns for files and directories that Goose will not access. This means Goose cannot read, modify, delete, or run shell commands on these files when using the Developer extension's tools.
:::info Developer extension only
The .gooseignore feature currently only affects tools in the [Developer](/docs/tutorials/developer-mcp) extension. Other extensions are not restricted by these rules.
The .gooseignore feature currently only affects tools in the [Developer](/docs/mcp/developer-mcp) extension. Other extensions are not restricted by these rules.
:::
This guide will show you how to use `.gooseignore` files to prevent Goose from changing specific files and directories.

View File

@@ -0,0 +1,8 @@
{
"label": "MCP Servers",
"position": 5,
"link": {
"type": "generated-index",
"description": "How to integrate and use MCP servers as Goose extensions"
}
}

View File

@@ -10,7 +10,7 @@ import YouTubeShortEmbed from '@site/src/components/YouTubeShortEmbed';
<YouTubeShortEmbed videoUrl="https://www.youtube.com/embed/PF6hpDaI9Mc" />
This tutorial covers how to add the [Knowledge Graph Memory MCP Server](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) as a Goose extension. This enables Goose to analyze relationships, detect patterns, and gain a deeper understanding of your data. The knowledge graph builds on the [memory extension](/docs/tutorials/memory-mcp) by mapping complex relationships between concepts and providing persistent memory across Goose sessions.
This tutorial covers how to add the [Knowledge Graph Memory MCP Server](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) as a Goose extension. This enables Goose to analyze relationships, detect patterns, and gain a deeper understanding of your data. The knowledge graph builds on the [memory extension](/docs/mcp/memory-mcp) by mapping complex relationships between concepts and providing persistent memory across Goose sessions.
:::tip TLDR
<Tabs groupId="interface">

View File

@@ -404,9 +404,9 @@ This audit reveals several critical accessibility issues that should be addresse
### Further Automation with GitHub Extension
You can take this accessibility audit a step further by combining the [GitHub Extension](/docs/tutorials/github-mcp) with the Puppeteer Extension. With this setup, Goose doesn't just find issues and apply fixes; it can also handle the entire Git workflow for you. The GitHub Extension allows Goose to commit changes, create a pull request, and even generate a PR description, so all you have to do is review and merge.
You can take this accessibility audit a step further by combining the [GitHub Extension](/docs/mcp/github-mcp) with the Puppeteer Extension. With this setup, Goose doesn't just find issues and apply fixes; it can also handle the entire Git workflow for you. The GitHub Extension allows Goose to commit changes, create a pull request, and even generate a PR description, so all you have to do is review and merge.
1. Enable the GitHub extension by following the steps in the **[GitHub Extension Tutorial](/docs/tutorials/github-mcp#configuration)**.
1. Enable the GitHub extension by following the steps in the **[GitHub Extension Tutorial](/docs/mcp/github-mcp#configuration)**.
:::tip

View File

@@ -14,8 +14,16 @@ import LinuxDesktopInstallButtons from '@site/src/components/LinuxDesktopInstall
# Goose in 5 minutes
Goose is an open source AI agent that supercharges your software development by automating coding tasks. This quick tutorial will guide you through getting started with Goose!
Goose is an extensible open source AI agent that enhances your software development by automating coding tasks.
This quick tutorial will guide you through:
- ✅ Installing Goose
- ✅ Configuring your LLM
- ✅ Building a small app
- ✅ Adding an MCP server
Let's begin 🚀
## Install Goose
@@ -139,6 +147,11 @@ Goose relies heavily on tool calling capabilities and currently works best with
Sessions are single, continuous conversations between you and Goose. Let's start one.
<Tabs groupId="interface">
<TabItem value="ui" label="Goose Desktop" default>
After choosing an LLM provider, you'll see the session interface ready for use.
Type your questions, tasks, or instructions directly into the input field, and Goose will immediately get to work.
</TabItem>
<TabItem value="cli" label="Goose CLI">
1. Make an empty directory (e.g. `goose-demo`) and navigate to that directory from the terminal.
2. To start a new session, run:
@@ -154,11 +167,6 @@ Sessions are single, continuous conversations between you and Goose. Let's start
:::
</TabItem>
<TabItem value="ui" label="Goose Desktop" default>
After choosing an LLM provider, you'll see the session interface ready for use.
Type your questions, tasks, or instructions directly into the input field, and Goose will immediately get to work.
</TabItem>
</Tabs>
## Write Prompt
@@ -179,6 +187,14 @@ Goose will create a plan and then get right to work on implementing it. Once don
While you're able to manually navigate to your working directory and open the HTML file in a browser, wouldn't it be better if Goose did that for you? Let's give Goose the ability to open a web browser by enabling the `Computer Controller` extension.
<Tabs groupId="interface">
<TabItem value="ui" label="Goose Desktop" default>
1. Locate the menu (`...`) in the top right corner of the Goose Desktop.
2. Select `Advanced settings` from the menu.
3. Under the `Extensions` section, toggle the `Computer Controller` extension to enable it. This [extension](https://block.github.io/goose/v1/extensions/detail/nondeveloper) enables web scraping, file caching, and automations.
4. Scroll back to the top and click `<- Back` in the upper left corner to return to your session.
5. Now that Goose has browser capabilities, let's ask it to launch your game in a browser:
</TabItem>
<TabItem value="cli" label="Goose CLI">
1. End the current session by entering `Ctrl+C` so that you can return to the terminal's command prompt.
2. Run the configuration command
@@ -213,13 +229,6 @@ While you're able to manually navigate to your working directory and open the HT
```
5. Ask Goose to launch your game in a browser:
</TabItem>
<TabItem value="ui" label="Goose Desktop" default>
1. Locate the menu (`...`) in the top right corner of the Goose Desktop.
2. Select `Advanced settings` from the menu.
3. Under the `Extensions` section, toggle the `Computer Controller` extension to enable it. This [extension](https://block.github.io/goose/v1/extensions/detail/nondeveloper) enables web scraping, file caching, and automations.
4. Scroll back to the top and click `<- Back` in the upper left corner to return to your session.
5. Now that Goose has browser capabilities, let's ask it to launch your game in a browser:
</TabItem>
</Tabs>
```

View File

@@ -38,7 +38,7 @@ This error occurs when the input provided to Goose exceeds the maximum token lim
### Using Ollama Provider
Ollama provides local LLMs, which means you must first [download Ollama and run a model](/docs/getting-started/providers#local-llms-ollama) before attempting to use this provider with Goose. If you do not have the model downloaded, you'll run into the following error:
Ollama provides local LLMs, which means you must first [download Ollama and run a model](/docs/getting-started/providers#local-llms) before attempting to use this provider with Goose. If you do not have the model downloaded, you'll run into the following error:
> ExecutionError("error sending request for url (http://localhost:11434/v1/chat/completions)")

View File

@@ -3,6 +3,6 @@
"position": 4,
"link": {
"type": "generated-index",
"description": "How to integrate and use MCP servers as Goose extensions"
"description": "How to use Goose in various ways"
}
}

View File

@@ -1,6 +1,5 @@
---
title: Building Custom Extensions
sidebar_position: 1
description: Create your own custom MCP Server to use as a Goose extension
---

View File

@@ -1,6 +1,5 @@
---
title: Isolated Development Environments
sidebar_position: 25
---
import Tabs from '@theme/Tabs';
@@ -29,7 +28,7 @@ The **[Container Use MCP](https://github.com/dagger/container-use)** server prov
### Install Container Use
Head on over to the [Container Use README](https://github.com/dagger/container-use/blob/main/README.md) for up to date install instructions for this fast moving project.
Head on over to the [Container Use README](https://github.com/dagger/container-use/blob/main/README.md) for up-to-date install instructions for this fast-moving project.
## Adding to Goose
@@ -185,4 +184,4 @@ If you encounter issues:
With container-use enabled in Goose, you're ready to develop with confidence. Try starting a conversation about a project you've been hesitant to experiment with, and let Goose set up a safe, isolated environment for your exploration.
Remember: with isolated environments, there's no such thing as a failed experiment—only learning opportunities that don't affect your main codebase.
Remember: with isolated environments, there's no such thing as a failed experiment - only learning opportunities that don't affect your main codebase.