mirror of https://github.com/aljazceru/goose.git (synced 2025-12-17 22:24:21 +01:00)

Commit: Reorganizing tutorials (#3028)
@@ -116,7 +116,7 @@ Without tool calling, LLMs would only guess answers based on their training data

It's worth noting that not all agents are the same when it comes to tool access. Most proprietary agents are tightly scoped to a specific LLM and a predefined set of tools, as companies build agents tailored for their own applications.

-Other agents, like Goose, are more extensible, allowing users to configure it with the LLM of their choice, as well as add tools for various APIs, databases, and even [local environments like IDEs](/docs/tutorials/jetbrains-mcp). However, for agents to scale across different tools and systems without requiring custom integrations for each one, they need a standardized way to discover, call, and manage tools. This is exactly what the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) provides.
+Other agents, like Goose, are more extensible, allowing users to configure it with the LLM of their choice, as well as add tools for various APIs, databases, and even [local environments like IDEs](/docs/mcp/jetbrains-mcp). However, for agents to scale across different tools and systems without requiring custom integrations for each one, they need a standardized way to discover, call, and manage tools. This is exactly what the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) provides.

## MCP Ecosystem
@@ -132,9 +132,9 @@ At the time of this writing, there are more than [1000 MCP servers](https://www.

For example, let's say I want Goose to develop a new web app for me in my WebStorm IDE based on a Figma design and then commit the code to a new repo in GitHub. I can add the following MCP Servers as Goose extensions to give it all of these capabilities:

-* [Figma](/docs/tutorials/figma-mcp)
-* [JetBrains](/docs/tutorials/jetbrains-mcp)
-* [GitHub](/docs/tutorials/github-mcp)
+* [Figma](/docs/mcp/figma-mcp)
+* [JetBrains](/docs/mcp/jetbrains-mcp)
+* [GitHub](/docs/mcp/github-mcp)

With this, I can prompt my AI agent in natural language and it'll take care of the work:
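To make that "standardized way to discover, call, and manage tools" concrete, here is a minimal sketch of the discovery handshake. The `tools/list` method name follows the MCP specification, but the server response below is simulated and the tool names are hypothetical:

```python
import json

# Build the JSON-RPC request an MCP client sends to discover tools.
# The method name "tools/list" is from the MCP spec; the transport
# (stdio or HTTP) is omitted here for brevity.
def make_tools_list_request(request_id: int) -> str:
    return json.dumps({"jsonrpc": "2.0", "id": request_id, "method": "tools/list"})

# Simulated server response with the kind of tool metadata an agent
# like Goose uses to decide which tool to call. Tool names are made up.
sample_response = json.dumps({
    "jsonrpc": "2.0", "id": 1,
    "result": {"tools": [
        {"name": "create_repository", "description": "Create a new GitHub repo"},
        {"name": "get_file", "description": "Read a file from a repo"},
    ]},
})

def discover_tool_names(response: str) -> list[str]:
    """Extract the advertised tool names from a tools/list response."""
    return [tool["name"] for tool in json.loads(response)["result"]["tools"]]

print(discover_tool_names(sample_response))  # ['create_repository', 'get_file']
```

Because every server answers the same `tools/list` request with the same metadata shape, the agent never needs a custom integration per tool; that uniformity is the whole point of the protocol.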
@@ -18,7 +18,7 @@ One of the most common mistakes users make is trying to accomplish too much in a

Every message adds to the context window, which is the amount of conversation history Goose can retain at any given time. This history is made up of tokens, the individual pieces of text (words or even parts of words) that Goose processes to generate responses. More tokens don’t just increase processing time, they also contribute to LLM usage costs. And once the context window fills up, older messages get pushed out, which can lead to loss of important details or unexpected behavior.

-Think of it like keeping too many browser tabs open. Eventually, it impacts performance. Instead, start fresh sessions for distinct tasks. Don't worry about losing context; that's exactly what the [Memory extension](/docs/tutorials/memory-mcp) is for. Keeping sessions focused and concise ensures more accurate, relevant responses while also keeping your LLM costs under control.
+Think of it like keeping too many browser tabs open. Eventually, it impacts performance. Instead, start fresh sessions for distinct tasks. Don't worry about losing context; that's exactly what the [Memory extension](/docs/mcp/memory-mcp) is for. Keeping sessions focused and concise ensures more accurate, relevant responses while also keeping your LLM costs under control.
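The context-window mechanics described above can be sketched with a rough heuristic. The 4-characters-per-token ratio is a common rule of thumb, not a real tokenizer, and the oldest-first eviction is a simplification of what providers actually do:

```python
# Rough sketch of why long sessions lose context. Real LLM tokenizers
# count differently; 4 chars per token is only a ballpark heuristic.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def trim_to_window(messages: list[str], window_tokens: int) -> list[str]:
    """Drop the oldest messages until the history fits the window."""
    kept = list(messages)
    while kept and sum(estimate_tokens(m) for m in kept) > window_tokens:
        kept.pop(0)  # oldest message gets pushed out first
    return kept

history = ["first task details " * 50, "second task " * 20, "current question"]
# The long first message no longer fits a 200-token window, so it is
# the one that falls out, taking its details with it.
print(len(trim_to_window(history, window_tokens=200)))  # 2
```

Starting a fresh session is effectively choosing what falls out of the window yourself, instead of letting the oldest (possibly still important) messages get evicted silently.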
@@ -27,7 +27,7 @@ When it comes to Goose extensions, less is often more. It's tempting to enable [

Consider this: if you're cooking in a kitchen, having every possible utensil and appliance out on the counter doesn't make you a better chef. It just creates clutter and confusion. The same principle applies here.

-Go ahead and install any extensions that interest you, but [keep them disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions) until you need them. Start with the built-in [Developer extension](/docs/tutorials/developer-mcp) enabled, which is surprisingly powerful on its own, and only enable others when you need their specific capabilities. This leads to faster responses, lower token usage, and often more focused solutions.
+Go ahead and install any extensions that interest you, but [keep them disabled](/docs/getting-started/using-extensions#enablingdisabling-extensions) until you need them. Start with the built-in [Developer extension](/docs/mcp/developer-mcp) enabled, which is surprisingly powerful on its own, and only enable others when you need their specific capabilities. This leads to faster responses, lower token usage, and often more focused solutions.

:::tip Bonus Tip
Before starting a complex task, ask Goose about its current capabilities. A simple prompt like "Do you have tools available to work with [specific technology/service]?" can save time and prevent false starts. Goose can tell you whether it has the necessary tools for your task, and if not, suggest which extensions you might need to enable. This quick check ensures you have the right tools ready before diving in too deep.
@@ -29,7 +29,7 @@ Throughout the stream, Adewale shared valuable tips to prepare your design for G

# Getting Started with Goose and Figma

Whether you're a designer wanting to rapidly turn concepts into working code or a developer curious about streamlining design implementation, you can download Goose with its built-in [Developer extension](https://block.github.io/goose/docs/getting-started/installation) and add the [Figma extension](https://block.github.io/goose/v1/extensions/).

-For step-by-step instructions, check out the [Figma tutorial](https://block.github.io/goose/docs/tutorials/figma-mcp).
+For step-by-step instructions, check out the [Figma tutorial](/docs/mcp/figma-mcp).

<head>
  <meta property="og:title" content="Goose Flight School: Turn Figma Designs Into Code With Goose" />
@@ -46,7 +46,7 @@ These powerful debugging and analysis capabilities are only the beginning. This

To keep up with the exciting developments as they release, you can check out both the [Goose](https://github.com/block/goose) and [Langfuse](https://github.com/langfuse/langfuse) repositories on GitHub.

-You can also watch the [livestream discussing the Goose and Langfuse integration](https://www.youtube.com/live/W39BQjsTS9E?feature=shared), and follow the [tutorial showing you how to integrate Langfuse with Goose](https://block.github.io/goose/docs/tutorials/langfuse).
+You can also watch the [livestream discussing the Goose and Langfuse integration](https://www.youtube.com/live/W39BQjsTS9E?feature=shared), and follow the [tutorial showing you how to integrate Langfuse with Goose](/docs/tutorials/langfuse).

Also, be sure to subscribe to our [events calendar](https://calget.com/c/t7jszrie) to catch upcoming events.
@@ -40,7 +40,7 @@ If you want Goose to remember your preferences, you can say,

>_**Goose, remember I’m not a developer. Explain things at a high level unless I ask for technical details**_

-If you have the [Memory Extension](/docs/tutorials/memory-mcp) enabled, Goose will save this preference so you won’t have to remind it every time.
+If you have the [Memory Extension](/docs/mcp/memory-mcp) enabled, Goose will save this preference so you won’t have to remind it every time.

:::

## Chain-of-Thought Prompting
@@ -20,7 +20,7 @@ There's so many amazing MCP servers out there to make my work life better, inclu

I used GPT-4o for this task.
:::

-With this prompt, Goose reviews my uncompleted tasks in Asana (note that I have my workspace, project, and user IDs stored in [memory](/docs/tutorials/memory-mcp)).
+With this prompt, Goose reviews my uncompleted tasks in Asana (note that I have my workspace, project, and user IDs stored in [memory](/docs/mcp/memory-mcp)).

Rather than bouncing between different types of work, which is a productivity killer, Goose sorts my tasks into categories based on context. For example:
@@ -53,7 +53,7 @@ The features don't end here. The team is actively exploring several exciting fea

# Community and Contributing

The project is open source and welcomes contributions from the community. If you'd like to support the project or directly contribute to it, you can check out [the VSCode MCP repo on GitHub](https://github.com/block/vscode-mcp), or [join the Block Open Source Discord](https://discord.gg/block-opensource) if you'd like to ask the team any questions or start discussions.

-You can also follow the [tutorial showing you how to integrate VS Code with Goose](https://block.github.io/goose/docs/tutorials/vscode-mcp).
+You can also follow the [tutorial showing you how to integrate VS Code with Goose](/docs/mcp/vscode-mcp).

<head>
  <meta property="og:title" content="Cracking the Code in VS Code" />
@@ -7,7 +7,7 @@ authors:



-Imagine creating an app just by describing what you want out loud, like you’re talking to a friend. That’s the magic of vibe coding: turning natural language into working code with the help of an AI agent. And while typing a prompt gets the job done, saying it out loud hits different 🔥 The new [Speech MCP server](https://block.github.io/goose/docs/tutorials/speech-mcp) has quite literally entered the chat.
+Imagine creating an app just by describing what you want out loud, like you’re talking to a friend. That’s the magic of vibe coding: turning natural language into working code with the help of an AI agent. And while typing a prompt gets the job done, saying it out loud hits different 🔥 The new [Speech MCP server](/docs/mcp/speech-mcp) has quite literally entered the chat.

<!--truncate-->
@@ -24,18 +24,18 @@ That's exactly where MCP comes in. Best part is, you don't need to be a develope

## MCP Servers You Should Try Right Now

So what can you connect your AI agent to? MCP Servers! MCP servers give your agent access to your tools. With [over 3000 MCP servers](https://glama.ai/mcp/servers) you can connect to, here is your top list of popular MCP servers you should try:

-- **[Google Drive](/docs/tutorials/google-drive-mcp)**: File access and search capabilities for Google Drive
-- **[YouTube Transcript](/docs/tutorials/youtube-transcript)**: Grab and work with YouTube video transcripts
-- **[Google Maps](/docs/tutorials/google-maps-mcp)**: Location services, directions, and place details
-- **[Tavily Web Search](/docs/tutorials/tavily-mcp)**: Web and local search using Tavily's Search API
-- **[Asana](/docs/tutorials/asana-mcp)**: View asana tasks, projects, workspaces, and/or comments
-- **[Speech](/docs/tutorials/speech-mcp)**: Real-time voice interaction, audio/video transcription, text-to-speech conversion and more
-- **[GitHub](/docs/tutorials/github-mcp)**: Tools to read, search, and manage Git repositories
-- **[Fetch](/docs/tutorials/fetch-mcp)**: Web content fetching and conversion for efficient LLM usage
+- **[Google Drive](/docs/mcp/google-drive-mcp)**: File access and search capabilities for Google Drive
+- **[YouTube Transcript](/docs/mcp/youtube-transcript-mcp)**: Grab and work with YouTube video transcripts
+- **[Google Maps](/docs/mcp/google-maps-mcp)**: Location services, directions, and place details
+- **[Tavily Web Search](/docs/mcp/tavily-mcp)**: Web and local search using Tavily's Search API
+- **[Asana](/docs/mcp/asana-mcp)**: View Asana tasks, projects, workspaces, and/or comments
+- **[Speech](/docs/mcp/speech-mcp)**: Real-time voice interaction, audio/video transcription, text-to-speech conversion, and more
+- **[GitHub](/docs/mcp/github-mcp)**: Tools to read, search, and manage Git repositories
+- **[Fetch](/docs/mcp/fetch-mcp)**: Web content fetching and conversion for efficient LLM usage

This quick list should give you an idea of all the ways you can now use AI agents with your workflow. You can also explore community favorites in [handy MCP directories](https://dev.to/techgirl1908/my-favorite-mcp-directories-573n), and learn [how to check MCP servers are safe](/blog/2025/03/26/mcp-security) before installing.

-You can also check out these [Goose tutorials](/docs/category/tutorials), showing you exactly how you can use some of these popular MCP servers with Goose, or use [Goose's Tutorial extension](/docs/tutorials/tutorial-extension) to get extra help walking you through using or building extensions.
+You can also check out these [Goose tutorials](/docs/category/mcp-servers), showing you exactly how you can use some of these popular MCP servers with Goose, or use [Goose's Tutorial extension](/docs/mcp/tutorial-mcp) to get extra help walking you through using or building extensions.

## Example MCP Prompts

Now that you've caught a glimpse of some of the MCP servers that are out there, how do you make sure you're using MCPs with AI agents the best you can? This is where prompts come in.
@@ -22,7 +22,7 @@ You can ask Goose what you can do with an extension to get a list of all the fea

## GitHub MCP Server: Everything GitHub

-The [GitHub MCP Server](/docs/tutorials/github-mcp) comes with quite a lot of functionality. It can help you create issues, pull requests, repositories, and branches. My most frequent use case for the GitHub MCP is reviewing and understanding pull requests.
+The [GitHub MCP Server](/docs/mcp/github-mcp) comes with quite a lot of functionality. It can help you create issues, pull requests, repositories, and branches. My most frequent use case for the GitHub MCP is reviewing and understanding pull requests.

For cases when it's a large pull request, or I don't understand what is going on, I can pass the PR to Goose, giving it the right context to help me understand and then act on the pull request. I'm even able to create a documentation update or changelog update from the file changes in the PR. This is definitely one of my favorite things.
@@ -34,7 +34,7 @@ Hey Goose, this pull request https://github.com/block/goose/pull/1949, has a lot

## Knowledge Graph Memory: Context on Steroids

-The [Knowledge Graph Memory](/docs/tutorials/knowledge-graph-mcp) extension is like giving Goose a photographic memory of your project or data. Like the name implies, it creates a graph of any information fed into it, connecting the dots between different pieces of information or as I like to use it for - documentation.
+The [Knowledge Graph Memory](/docs/mcp/knowledge-graph-mcp) extension is like giving Goose a photographic memory of your project or data. Like the name implies, it creates a graph of any information fed into it, connecting the dots between different pieces of information or, as I like to use it for, documentation.

If I'm working on a specific project or library and I don't want any hallucinations, I am able to feed Goose the right context and it will be able to answer questions about the project or library accurately.
@@ -48,7 +48,7 @@ I'm currently in a project called Goose, read through the documentation in `docu

## Fetch Extension: The Web in our Hands

-I had a slightly hard time deciding between the [Tavily Web Search Extension](/docs/tutorials/tavily-mcp) and The [Fetch Extension](/docs/tutorials/fetch-mcp) because while I do use them both to access the web, the Fetch extension works more like default for me. With the example above using the Knowledge graph, I'm able to get information from the internet to give Goose additional context to work with.
+I had a slightly hard time deciding between the [Tavily Web Search Extension](/docs/mcp/tavily-mcp) and the [Fetch Extension](/docs/mcp/fetch-mcp) because while I do use them both to access the web, the Fetch extension works more like a default for me. With the example above using the Knowledge Graph, I'm able to get information from the internet to give Goose additional context to work with.

:::note
The Tavily Web Search Extension has deep research capabilities and is great for finding specific information, while the Fetch Extension is more about general web access and data retrieval.
@@ -56,14 +56,14 @@ The Tavily Web Search Extension has deep research capabilities and is great for

## Memory Extension: My Habits and Preferences

-I use the [Memory Extension](/docs/tutorials/memory-mcp) to remind Goose about my general preferences as I work - to default to JavaScript or Node when trying out new prototypes, if I prefer one naming convention or the other - maybe even how I like my coffee :D.
+I use the [Memory Extension](/docs/mcp/memory-mcp) to remind Goose about my general preferences as I work - to default to JavaScript or Node when trying out new prototypes, if I prefer one naming convention or the other - maybe even how I like my coffee :D.

This works differently from the Knowledge Graph extension even though they both store information locally. When combined with the Knowledge Graph, it can also help maintain a clear trail of technical decisions and their rationale. For example, I got stuck on a code migration and asked Goose to remember where we stopped, what we've tried so far, and what we want to do next for when I start a new session.

## VS Code Extension: Your Favorite Editor, Connected

-One of the biggest points in conversations with people especially around vibe coding, is finding ways to track what changes are being made. While version control is always recommended, sometimes I want to be able to stop or change direction before going too far. The [VS Code Extension](/docs/tutorials/vscode-mcp) alongside other features, allows me to preview the diff of my code changes before I commit them.
+One of the biggest points in conversations with people, especially around vibe coding, is finding ways to track what changes are being made. While version control is always recommended, sometimes I want to be able to stop or change direction before going too far. The [VS Code Extension](/docs/mcp/vscode-mcp), alongside other features, allows me to preview the diff of my code changes before I commit them.

I can choose to accept or refuse these changes, or tell Goose to try something else before any actual changes are made.
@@ -108,7 +108,7 @@ Here are a few examples:

### 7. Integrate Goose into Your CI/CD

-Before issues hit production, add [Goose to your CI/CD pipeline](https://block.github.io/goose/docs/tutorials/cicd) to:
+Before issues hit production, add [Goose to your CI/CD pipeline](/docs/tutorials/cicd) to:

- Automate code reviews
- Validate documentation
- Run security checks
@@ -119,7 +119,7 @@ Some MCP servers can introduce security risks, especially if compromised.

Use the Goose [allowlist](https://github.com/block/goose/blob/main/crates/goose-server/ALLOWLIST.md) feature to prevent Goose from calling unsafe or untrusted tools.

-Here's how the team at Block is thinking about [securing the MCP](https://block.github.io/goose/blog/2025/03/31/securing-mcp).
+Here's how the team at Block is thinking about [securing the MCP](/blog/2025/03/31/securing-mcp).

### 9. Pick a High-Performing LLM
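As an illustration of what an allowlist buys you, here is a conceptual sketch. This is not Goose's actual mechanism or file format (see the linked ALLOWLIST.md for that), and the command strings below are examples, not an official list:

```python
# Hypothetical sketch of the allowlist idea: an extension may only be
# launched if its exact command is pre-approved. Illustrative only.
ALLOWED_COMMANDS = {
    "uvx mcp-server-fetch",                        # example entries;
    "npx -y @modelcontextprotocol/server-github",  # command names are illustrative
}

def is_allowed(command: str) -> bool:
    """Reject any extension command that isn't explicitly approved."""
    return command.strip() in ALLOWED_COMMANDS

print(is_allowed("uvx mcp-server-fetch"))      # True
print(is_allowed("curl http://evil.example"))  # False
```

The key property is deny-by-default: anything not on the list, including a compromised server that changes its launch command, is refused rather than run.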
@@ -30,7 +30,7 @@ Sidenote: I met an AI enthusiast at a meetup who said he sometimes gets coding i

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/speech-mcp)
+1. Follow [this tutorial](/docs/mcp/speech-mcp)
2. Enable the [`Speech`](https://github.com/Kvadratni/speech-mcp) and [`Developer`](/extensions/detail?id=developer) extensions
3. Prompt Goose:

> I'd like to speak instead of typing.
@@ -49,7 +49,7 @@ I want guests to feel like I actually know their work, even if I don't have hour

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/youtube-transcript)
+1. Follow [this tutorial](/docs/mcp/youtube-transcript-mcp)
2. Enable the [`YouTube Transcript`](https://github.com/jkawamoto/mcp-youtube-transcript) and [`Developer`](/extensions/detail?id=developer) extensions
3. Prompt Goose:

> Generate a transcript for this video https://www.youtube.com/watch?v=dQw4w9WgXcQ, then create relevant interview questions based on the content
@@ -86,7 +86,7 @@ I'm not currently looking for a job, but I like to stay prepared. My strategy in

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/pdf-mcp)
+1. Follow [this tutorial](/docs/mcp/pdf-mcp)
2. Enable the [`PDF Reader`](https://github.com/michaelneale/mcp-read-pdf) extension
3. Prompt Goose:

> Read the resume at ~/Downloads/resume.pdf and evaluate how well this candidate aligns with the following role requirements:
@@ -131,7 +131,7 @@ SQL can get complex with joins, stored procedures, and subqueries. Goose helps m

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/postgres-mcp)
+1. Follow [this tutorial](/docs/mcp/postgres-mcp)
2. Enable the [`PostgreSQL`](https://github.com/modelcontextprotocol/servers/tree/HEAD/src/postgres) and [`Developer`](/extensions/detail?id=developer) extensions
3. Prompt Goose:

> Find my top 3 blog posts by average weekly views over the past 90 days. Include title, URL, average weekly views, and whether they were promoted on social.
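If you're curious what a prompt like that might translate to under the hood, here is a sketch using SQLite and a hypothetical schema (`posts`, `weekly_views`); the real PostgreSQL extension would run against your own tables, and the 90-day date filter is omitted so the sample data stays static:

```python
import sqlite3

# Hypothetical schema and data standing in for a real blog database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT, url TEXT, promoted INTEGER);
CREATE TABLE weekly_views (post_id INTEGER, week_start TEXT, views INTEGER);
INSERT INTO posts VALUES (1,'A','/a',1),(2,'B','/b',0),(3,'C','/c',1),(4,'D','/d',0);
INSERT INTO weekly_views VALUES
 (1,'2025-01-06',100),(1,'2025-01-13',300),
 (2,'2025-01-06',500),
 (3,'2025-01-06',50),(3,'2025-01-13',70),
 (4,'2025-01-06',10);
""")

# The kind of query Goose might generate: average weekly views per
# post, top 3, with title, URL, and the promoted flag.
top3 = conn.execute("""
    SELECT p.title, p.url, AVG(w.views) AS avg_weekly, p.promoted
    FROM posts p JOIN weekly_views w ON w.post_id = p.id
    GROUP BY p.id
    ORDER BY avg_weekly DESC
    LIMIT 3
""").fetchall()
print(top3)  # B (500.0), then A (200.0), then C (60.0)
```

The value of handing this to Goose is that you describe the result in English and it writes the join, aggregation, and ordering for you.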
@@ -151,7 +151,7 @@ I tend to overbook myself or get anxious that I won't get accepted, so I apply t

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/agentql-mcp)
+1. Follow [this tutorial](/docs/mcp/agentql-mcp)
2. Enable the [`AgentQL`](https://github.com/tinyfish-io/agentql-mcp) extension
3. Prompt Goose:

> I'm a tech conference speaker planning my 2025-2026 submissions.
@@ -201,7 +201,7 @@ In addition to generating code, AI agents can help you learn how to code. Goose

### How to Try It

-1. Follow [this tutorial](/docs/tutorials/tutorial-extension)
+1. Follow [this tutorial](/docs/mcp/tutorial-mcp)
2. Prompt Goose:

> I'd like to learn how to build an extension or MCP server for Goose
@@ -135,7 +135,7 @@ Q: **How will MCP help with APIs?**<br/>

A: Start with [this post by Angie Jones](/blog/2025/02/17/agentic-ai-mcp/#mcp-ecosystem). MCP provides context about your API, giving AI Agents more awareness of the capabilities of your API endpoints and responses. This can help the Agent understand the intent of the request, dynamically invoke (or "call") the underlying API endpoint, handle data transformation, and return a response. No more manually writing the code, response validators, error handlers, and so on!

Q: **What are some initial steps I can take as a developer to explore AI agents and MCP?**<br/>
-A: Start by researching the fundamental concepts, and use other existing MCP servers. We recommend starting with [Goose](https://block.github.io/goose) to integrate an existing MCP server. We have a growing [listof tutorials](https://block.github.io/docs/category/tutorials/) to help you find some technologies like GitHub, PostgreSQL, Google Maps, and more. Once you feel comfortable with using MCP, you can start building your own MCP server for your own APIs.
+A: Start by researching the fundamental concepts, and use other existing MCP servers. We recommend starting with [Goose](/) to integrate an existing MCP server. We have a growing [list of tutorials](/docs/category/mcp-servers) to help you find some technologies like GitHub, PostgreSQL, Google Maps, and more. Once you feel comfortable with using MCP, you can start building your own MCP server for your own APIs.

Q: **What about AI and MCP security?**<br/>
A: AI agents can enhance security through better context awareness in interactions, but MCP is still relatively new, and requires [careful security evaluations](/blog/2025/03/26/mcp-security/). Your business and dev teams should thoroughly investigate MCP's capabilities to ensure you're building appropriate access control, and managing data privacy.
@@ -38,7 +38,7 @@ As teams adopt AI tools like Goose, the ability to define and share these automa

Goose can help standardize and automate these tasks by [creating recipes](/docs/guides/session-recipes). As a developer on your team uses Goose, they can create a recipe that describes how to perform a task, and then share that with the rest of the team. These recipes can be shared, reused, and improved over time, just like a sports team’s playbook.

-Recipes are built with an understanding of the workflow you want Goose to help with, and these may involve one or more MCP servers, such as [GitHub](/docs/tutorials/github-mcp/) or [PostgreSQL](/docs/tutorials/postgres-mcp/). The recipes are designed to be reusable and adaptable, allowing developers to create a library that can be used across different projects.
+Recipes are built with an understanding of the workflow you want Goose to help with, and these may involve one or more MCP servers, such as [GitHub](/docs/mcp/github-mcp/) or [PostgreSQL](/docs/mcp/postgres-mcp/). The recipes are designed to be reusable and adaptable, allowing developers to create a library that can be used across different projects.

A shared playbook of AI plays helps everyone on the team perform tasks consistently. It can also reduce the time spent on repetitive work.
@@ -110,7 +110,7 @@ This starts up a local web server (the command line output will tell you which p

## Goose learns how to drive!

-Following our [mbot MCP tutorial](/docs/tutorials/mbot-mcp/) we can set up our MCP extension just like we ran our Java JAR file with the environment variables.
+Following our [mbot MCP tutorial](/docs/mcp/mbot-mcp/) we can set up our MCP extension just like we ran our Java JAR file with the environment variables.

Now we can give Goose commands like "drive in a square pattern by making left turns and moving forward, and beeping before you turn" and it will send the commands to the mbot2 rover via MQTT.
@@ -40,7 +40,7 @@ Because I don't do this often, I often end up searching for the commands, which

To avoid that interruption, I started offloading the task to [Goose](/), an open source AI agent.

-Goose uses its built-in [Developer](/docs/tutorials/developer-mcp) MCP server to handle coding-related tasks on my machine. Here's what the interaction looks like:
+Goose uses its built-in [Developer](/docs/mcp/developer-mcp) MCP server to handle coding-related tasks on my machine. Here's what the interaction looks like:

**Prompt to Goose:**
@@ -10,7 +10,7 @@ authors:

# What's in my .goosehints file (and why it probably shouldn't be)

-As Goose users, we have two main ways to provide persistent context to our AI assistant: the `.goosehints` file and the [Memory Extension](/docs/tutorials/memory-mcp) MCP server. Today, I'll share what's in my `.goosehints` file, why some of it should probably move to the Memory Extension, and how you can make that choice.
+As Goose users, we have two main ways to provide persistent context to our AI assistant: the `.goosehints` file and the [Memory Extension](/docs/mcp/memory-mcp) MCP server. Today, I'll share what's in my `.goosehints` file, why some of it should probably move to the Memory Extension, and how you can make that choice.

<!-- truncate -->
@@ -22,7 +22,7 @@ That stored knowledge – your preferences, quirks, and routine – makes the wh

This is exactly the challenge we face with AI assistants. By default, they start each conversation (aka, "context window") fresh – no memory of your coding standards, documentation preferences, or how you like your pull requests structured. The same way you'd get tired of reciting your detailed coffee order every morning, it's inefficient to repeatedly explain to your AI assistant that you prefer Python's Black formatter, want detailed commit messages, or how you want to construct a briefing going to everyone in the company.

-This is where persistent context comes in. Through tools like `.goosehints` and the [Memory Extension](/docs/tutorials/memory-mcp) MCP server, we can give our AI assistants the equivalent of a barista's "regular customer" knowledge. But just as you wouldn't want your barista memorizing your entire life story just to make your coffee, we need to be thoughtful about what context we make persistent. The key is finding the right balance between having enough context to work efficiently and not overwhelming our systems with unnecessary information.
+This is where persistent context comes in. Through tools like `.goosehints` and the [Memory Extension](/docs/mcp/memory-mcp) MCP server, we can give our AI assistants the equivalent of a barista's "regular customer" knowledge. But just as you wouldn't want your barista memorizing your entire life story just to make your coffee, we need to be thoughtful about what context we make persistent. The key is finding the right balance between having enough context to work efficiently and not overwhelming our systems with unnecessary information.

Let's explore how to strike that balance.
@@ -34,7 +34,7 @@ You can read more about `.goosehints` in the [Goose documentation](/docs/guides/

### What is the Memory Extension?

-The [Memory Extension](/docs/tutorials/memory-mcp) is a dynamic storage system using the Model Context Protocol that allows you to store and retrieve context on-demand using tags or keywords. It lives in your `~/.goose/memory` directory (local) or `~/.config/goose/memory` (global).
+The [Memory Extension](/docs/mcp/memory-mcp) is a dynamic storage system using the Model Context Protocol that allows you to store and retrieve context on-demand using tags or keywords. It lives in your `~/.goose/memory` directory (local) or `~/.config/goose/memory` (global).

Unlike `.goosehints`, which is static and loaded entirely with every request, the Memory Extension can be updated and accessed as needed, allowing for more flexible and user-specific configurations.
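The two storage locations can be captured in a small helper. The paths are exactly the ones described above; the function itself is illustrative, not part of the extension's API:

```python
from pathlib import Path

# Resolve where Memory extension data lives, per the post:
# local memories under ~/.goose/memory, global ones under
# ~/.config/goose/memory.
def memory_dir(scope: str) -> Path:
    if scope == "local":
        return Path.home() / ".goose" / "memory"
    if scope == "global":
        return Path.home() / ".config" / "goose" / "memory"
    raise ValueError(f"unknown scope: {scope!r}")

print(memory_dir("global"))
```

Knowing which directory a memory landed in matters when you later want to prune it, version it, or deliberately keep a preference out of your global profile.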
@@ -98,7 +98,7 @@ extensions:

## Guide

-**[Get started with the full guide →](/docs/guides/isolated-development-environments)**
+**[Get started with the full guide →](/docs/tutorials/isolated-development-environments)**

---
@@ -25,11 +25,11 @@ Goose relies heavily on tool calling capabilities and currently works best with
| [Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/) | Access Azure-hosted OpenAI models, including GPT-4 and GPT-3.5. Supports both API key and Azure credential chain authentication. | `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_DEPLOYMENT_NAME`, `AZURE_OPENAI_API_KEY` (optional) |
| [Databricks](https://www.databricks.com/) | Unified data analytics and AI platform for building and deploying models. | `DATABRICKS_HOST`, `DATABRICKS_TOKEN` |
| [Gemini](https://ai.google.dev/gemini-api/docs) | Advanced LLMs by Google with multimodal capabilities (text, images). | `GOOGLE_API_KEY` |
| [GCP Vertex AI](https://cloud.google.com/vertex-ai) | Google Cloud's Vertex AI platform, supporting Gemini and Claude models. **Credentials must be configured in advance. Follow the instructions at https://cloud.google.com/vertex-ai/docs/authentication.** | `GCP_PROJECT_ID`, `GCP_LOCATION` and optional `GCP_MAX_RETRIES` (6), `GCP_INITIAL_RETRY_INTERVAL_MS` (5000), `GCP_BACKOFF_MULTIPLIER` (2.0), `GCP_MAX_RETRY_INTERVAL_MS` (320_000). |
| [GCP Vertex AI](https://cloud.google.com/vertex-ai) | Google Cloud's Vertex AI platform, supporting Gemini and Claude models. **Credentials must be [configured in advance](https://cloud.google.com/vertex-ai/docs/authentication).** | `GCP_PROJECT_ID`, `GCP_LOCATION` and optional `GCP_MAX_RETRIES` (6), `GCP_INITIAL_RETRY_INTERVAL_MS` (5000), `GCP_BACKOFF_MULTIPLIER` (2.0), `GCP_MAX_RETRY_INTERVAL_MS` (320_000). |
| [GitHub Copilot](https://docs.github.com/en/copilot/using-github-copilot/ai-models) | Access to GitHub Copilot's chat models including gpt-4o, o1, o3-mini, and Claude models. Uses device code authentication flow for secure access. | Uses GitHub device code authentication flow (no API key needed) |
| [Groq](https://groq.com/) | High-performance inference hardware and tools for LLMs. | `GROQ_API_KEY` |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms-ollama).** | `OLLAMA_HOST` |
| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. Ramalama's API is compatible with Ollama's, so it can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms-ollama).** | `OLLAMA_HOST` |
| [Ollama](https://ollama.com/) | Local model runner supporting Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms).** | `OLLAMA_HOST` |
| [Ramalama](https://ramalama.ai/) | Local model runner using native [OCI](https://opencontainers.org/) container runtimes, [CNCF](https://www.cncf.io/) tools, and supporting models as OCI artifacts. Ramalama's API is compatible with Ollama's, so it can be used with the Goose Ollama provider. Supports Qwen, Llama, DeepSeek, and other open-source models. **Because this provider runs locally, you must first [download and run a model](/docs/getting-started/providers#local-llms).** | `OLLAMA_HOST` |
| [OpenAI](https://platform.openai.com/api-keys) | Provides gpt-4o, o1, and other advanced language models. Also supports OpenAI-compatible endpoints (e.g., self-hosted LLaMA, vLLM, KServe). **o1-mini and o1-preview are not supported because Goose uses tool calling.** | `OPENAI_API_KEY`, `OPENAI_HOST` (optional), `OPENAI_ORGANIZATION` (optional), `OPENAI_PROJECT` (optional), `OPENAI_CUSTOM_HEADERS` (optional) |
| [OpenRouter](https://openrouter.ai/) | API gateway for unified access to various models with features like rate-limiting management. | `OPENROUTER_API_KEY` |
| [Snowflake](https://docs.snowflake.com/user-guide/snowflake-cortex/aisql#choosing-a-model) | Access the latest models using Snowflake Cortex services, including Claude models. **Requires a Snowflake account and programmatic access token (PAT)**. | `SNOWFLAKE_HOST`, `SNOWFLAKE_TOKEN` |

@@ -278,7 +278,7 @@ To set up Google Gemini with Goose, follow these steps:

</Tabs>

### Local LLMs (Ollama or Ramalama)
### Local LLMs

Ollama and Ramalama both provide local LLMs; each requires a bit more setup before you can use it with Goose.
@@ -12,11 +12,7 @@ Extensions are based on the [Model Context Protocol (MCP)](https://github.com/mo
Goose to a wide ecosystem of capabilities.

:::tip Tutorials
Check out the [step-by-step tutorials](/docs/category/tutorials) for adding and using several Goose Extensions
:::

:::tip Featured Extension
Looking for isolated development environments? Check out our guide on [Isolated Development Environments](/docs/guides/isolated-development-environments) using the new container-use extension.
Check out the [step-by-step tutorials](/docs/category/mcp-servers) for adding and using several Goose Extensions
:::
@@ -1,6 +1,6 @@
{
  "label": "Architecture Overview",
  "position": 5,
  "position": 6,
  "link": {
    "type": "generated-index",
    "description": "Extend Goose functionalities with extensions and custom configurations"
@@ -27,7 +27,7 @@ LLMs have context windows, which are limits on how much conversation history the
Turning on too many extensions can degrade performance. Enable only essential [extensions and tools](/docs/guides/tool-permissions) to improve tool selection accuracy, save context window space, and stay within provider tool limits.

### Teach Goose your preferences
Help Goose remember how you like to work by using [`.goosehints`](/docs/guides/using-goosehints/) for permanent project preferences and the [Memory extension](/docs/tutorials/memory-mcp) for things you want Goose to dynamically recall later. Both can help save valuable context window space while keeping your preferences available.
Help Goose remember how you like to work by using [`.goosehints`](/docs/guides/using-goosehints/) for permanent project preferences and the [Memory extension](/docs/mcp/memory-mcp) for things you want Goose to dynamically recall later. Both can help save valuable context window space while keeping your preferences available.
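To make this concrete, a `.goosehints` file holds freeform natural-language guidance that is loaded with every request. The contents below are a hypothetical illustration, not taken from any Goose project:

```text
This project is a TypeScript monorepo managed with pnpm.
Run `pnpm test` before suggesting a commit.
Prefer functional React components; avoid class components.
```

Because the whole file is sent each time, short and specific hints like these conserve context window space.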
### Protect sensitive files
Goose is often eager to make changes. You can stop it from changing specific files by creating a [.gooseignore](/docs/guides/using-gooseignore) file. In this file, you can list all the file paths you want it to avoid.

@@ -8,7 +8,7 @@ sidebar_position: 14
`.gooseignore` is a text file that defines patterns for files and directories that Goose will not access. This means Goose cannot read, modify, delete, or run shell commands on these files when using the Developer extension's tools.

:::info Developer extension only
The .gooseignore feature currently only affects tools in the [Developer](/docs/tutorials/developer-mcp) extension. Other extensions are not restricted by these rules.
The .gooseignore feature currently only affects tools in the [Developer](/docs/mcp/developer-mcp) extension. Other extensions are not restricted by these rules.
:::

This guide will show you how to use `.gooseignore` files to prevent Goose from changing specific files and directories.
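As a sketch of what such a file might contain (the entries below are hypothetical, assuming gitignore-style patterns as described above):

```text
# Keep Goose away from secrets and generated output
.env
secrets/
dist/
*.pem
```

Each pattern blocks the Developer extension's tools from reading, editing, or running shell commands against matching paths.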
8
documentation/docs/mcp/_category_.json
Normal file
@@ -0,0 +1,8 @@
{
  "label": "MCP Servers",
  "position": 5,
  "link": {
    "type": "generated-index",
    "description": "How to integrate and use MCP servers as Goose extensions"
  }
}
@@ -10,7 +10,7 @@ import YouTubeShortEmbed from '@site/src/components/YouTubeShortEmbed';
<YouTubeShortEmbed videoUrl="https://www.youtube.com/embed/PF6hpDaI9Mc" />

This tutorial covers how to add the [Knowledge Graph Memory MCP Server](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) as a Goose extension. This enables Goose to analyze relationships, detect patterns, and gain a deeper understanding of your data. The knowledge graph builds on the [memory extension](/docs/tutorials/memory-mcp) by mapping complex relationships between concepts and providing persistent memory across Goose sessions.
This tutorial covers how to add the [Knowledge Graph Memory MCP Server](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) as a Goose extension. This enables Goose to analyze relationships, detect patterns, and gain a deeper understanding of your data. The knowledge graph builds on the [memory extension](/docs/mcp/memory-mcp) by mapping complex relationships between concepts and providing persistent memory across Goose sessions.

:::tip TLDR
<Tabs groupId="interface">

@@ -404,9 +404,9 @@ This audit reveals several critical accessibility issues that should be addresse

### Further Automation with GitHub Extension

You can take this accessibility audit a step further by combining the [GitHub Extension](/docs/tutorials/github-mcp) with the Puppeteer Extension. With this setup, Goose doesn’t just find issues and apply fixes, it can also handle the entire Git workflow for you. The GitHub Extension allows Goose to commit changes, create a pull request, and even generate a PR description, so all you have to do is review and merge.
You can take this accessibility audit a step further by combining the [GitHub Extension](/docs/mcp/github-mcp) with the Puppeteer Extension. With this setup, Goose doesn’t just find issues and apply fixes, it can also handle the entire Git workflow for you. The GitHub Extension allows Goose to commit changes, create a pull request, and even generate a PR description, so all you have to do is review and merge.

1. Enable the GitHub extension by following the steps in the **[GitHub Extension Tutorial](/docs/tutorials/github-mcp#configuration)**.
1. Enable the GitHub extension by following the steps in the **[GitHub Extension Tutorial](/docs/mcp/github-mcp#configuration)**.

:::tip
@@ -14,8 +14,16 @@ import LinuxDesktopInstallButtons from '@site/src/components/LinuxDesktopInstall

# Goose in 5 minutes

Goose is an open source AI agent that supercharges your software development by automating coding tasks. This quick tutorial will guide you through getting started with Goose!
Goose is an extensible open source AI agent that enhances your software development by automating coding tasks.

This quick tutorial will guide you through:

- ✅ Installing Goose
- ✅ Configuring your LLM
- ✅ Building a small app
- ✅ Adding an MCP server

Let's begin 🚀

## Install Goose
@@ -139,6 +147,11 @@ Goose relies heavily on tool calling capabilities and currently works best with
Sessions are single, continuous conversations between you and Goose. Let's start one.

<Tabs groupId="interface">
  <TabItem value="ui" label="Goose Desktop" default>
After choosing an LLM provider, you’ll see the session interface ready for use.

Type your questions, tasks, or instructions directly into the input field, and Goose will immediately get to work.
  </TabItem>
  <TabItem value="cli" label="Goose CLI">
1. Make an empty directory (e.g. `goose-demo`) and navigate to that directory from the terminal.
2. To start a new session, run:

@@ -154,11 +167,6 @@ Sessions are single, continuous conversations between you and Goose. Let's start
:::

  </TabItem>
  <TabItem value="ui" label="Goose Desktop" default>
After choosing an LLM provider, you’ll see the session interface ready for use.

Type your questions, tasks, or instructions directly into the input field, and Goose will immediately get to work.
  </TabItem>
</Tabs>

## Write Prompt
@@ -179,6 +187,14 @@ Goose will create a plan and then get right to work on implementing it. Once don
While you're able to manually navigate to your working directory and open the HTML file in a browser, wouldn't it be better if Goose did that for you? Let's give Goose the ability to open a web browser by enabling the `Computer Controller` extension.

<Tabs groupId="interface">
  <TabItem value="ui" label="Goose Desktop" default>
1. Locate the menu (`...`) in the top right corner of the Goose Desktop.
2. Select `Advanced settings` from the menu.
3. Under the `Extensions` section, toggle the `Computer Controller` extension to enable it. This [extension](https://block.github.io/goose/v1/extensions/detail/nondeveloper) enables webscraping, file caching, and automations.
4. Scroll back to the top and click `<- Back` in the upper left corner to return to your session.
5. Now that Goose has browser capabilities, let's ask it to launch your game in a browser:
  </TabItem>
  <TabItem value="cli" label="Goose CLI">
1. End the current session by entering `Ctrl+C` so that you can return to the terminal's command prompt.
2. Run the configuration command

@@ -213,13 +229,6 @@ While you're able to manually navigate to your working directory and open the HT
```
5. Ask Goose to launch your game in a browser:
  </TabItem>
  <TabItem value="ui" label="Goose Desktop" default>
1. Locate the menu (`...`) in the top right corner of the Goose Desktop.
2. Select `Advanced settings` from the menu.
3. Under the `Extensions` section, toggle the `Computer Controller` extension to enable it. This [extension](https://block.github.io/goose/v1/extensions/detail/nondeveloper) enables webscraping, file caching, and automations.
4. Scroll back to the top and click `<- Back` in the upper left corner to return to your session.
5. Now that Goose has browser capabilities, let's ask it to launch your game in a browser:
  </TabItem>
</Tabs>

```
@@ -38,7 +38,7 @@ This error occurs when the input provided to Goose exceeds the maximum token lim

### Using Ollama Provider

Ollama provides local LLMs, which means you must first [download Ollama and run a model](/docs/getting-started/providers#local-llms-ollama) before attempting to use this provider with Goose. If you do not have the model downloaded, you'll run into the following error:
Ollama provides local LLMs, which means you must first [download Ollama and run a model](/docs/getting-started/providers#local-llms) before attempting to use this provider with Goose. If you do not have the model downloaded, you'll run into the following error:

> ExecutionError("error sending request for url (http://localhost:11434/v1/chat/completions)")
@@ -3,6 +3,6 @@
  "position": 4,
  "link": {
    "type": "generated-index",
    "description": "How to integrate and use MCP servers as Goose extensions"
    "description": "How to use Goose in various ways"
  }
}
@@ -1,6 +1,5 @@
---
title: Building Custom Extensions
sidebar_position: 1
description: Create your own custom MCP Server to use as a Goose extension
---
@@ -1,6 +1,5 @@
---
title: Isolated Development Environments
sidebar_position: 25
---

import Tabs from '@theme/Tabs';

@@ -29,7 +28,7 @@ The **[Container Use MCP](https://github.com/dagger/container-use)** server prov

### Install Container Use

Head on over to the [Container Use README](https://github.com/dagger/container-use/blob/main/README.md) for up to date install instructions for this fast moving project.
Head on over to the [Container Use README](https://github.com/dagger/container-use/blob/main/README.md) for up-to-date install instructions for this fast moving project.

## Adding to Goose

@@ -185,4 +184,4 @@ If you encounter issues:

With container-use enabled in Goose, you're ready to develop with confidence. Try starting a conversation about a project you've been hesitant to experiment with, and let Goose set up a safe, isolated environment for your exploration.

Remember: with isolated environments, there's no such thing as a failed experiment—only learning opportunities that don't affect your main codebase.
Remember: with isolated environments, there's no such thing as a failed experiment - only learning opportunities that don't affect your main codebase.
@@ -108,7 +108,148 @@ const config: Config = {
{
  from: '/docs/guides/share-goose-sessions',
  to: '/docs/guides/session-recipes'
}
},
// MCP tutorial redirects - moved from /docs/tutorials/ to /docs/mcp/
{
  from: '/docs/tutorials/agentql-mcp',
  to: '/docs/mcp/agentql-mcp'
},
{
  from: '/docs/tutorials/asana-mcp',
  to: '/docs/mcp/asana-mcp'
},
{
  from: '/docs/tutorials/blender-mcp',
  to: '/docs/mcp/blender-mcp'
},
{
  from: '/docs/tutorials/brave-mcp',
  to: '/docs/mcp/brave-mcp'
},
{
  from: '/docs/tutorials/browserbase-mcp',
  to: '/docs/mcp/browserbase-mcp'
},
{
  from: '/docs/tutorials/computer-controller-mcp',
  to: '/docs/mcp/computer-controller-mcp'
},
{
  from: '/docs/tutorials/context7-mcp',
  to: '/docs/mcp/context7-mcp'
},
{
  from: '/docs/tutorials/developer-mcp',
  to: '/docs/mcp/developer-mcp'
},
{
  from: '/docs/tutorials/elevenlabs-mcp',
  to: '/docs/mcp/elevenlabs-mcp'
},
{
  from: '/docs/tutorials/fetch-mcp',
  to: '/docs/mcp/fetch-mcp'
},
{
  from: '/docs/tutorials/figma-mcp',
  to: '/docs/mcp/figma-mcp'
},
{
  from: '/docs/tutorials/filesystem-mcp',
  to: '/docs/mcp/filesystem-mcp'
},
{
  from: '/docs/tutorials/github-mcp',
  to: '/docs/mcp/github-mcp'
},
{
  from: '/docs/tutorials/google-drive-mcp',
  to: '/docs/mcp/google-drive-mcp'
},
{
  from: '/docs/tutorials/google-maps-mcp',
  to: '/docs/mcp/google-maps-mcp'
},
{
  from: '/docs/tutorials/jetbrains-mcp',
  to: '/docs/mcp/jetbrains-mcp'
},
{
  from: '/docs/tutorials/knowledge-graph-mcp',
  to: '/docs/mcp/knowledge-graph-mcp'
},
{
  from: '/docs/tutorials/mbot-mcp',
  to: '/docs/mcp/mbot-mcp'
},
{
  from: '/docs/tutorials/memory-mcp',
  to: '/docs/mcp/memory-mcp'
},
{
  from: '/docs/tutorials/nostrbook-mcp',
  to: '/docs/mcp/nostrbook-mcp'
},
{
  from: '/docs/tutorials/pdf-mcp',
  to: '/docs/mcp/pdf-mcp'
},
{
  from: '/docs/tutorials/pieces-mcp',
  to: '/docs/mcp/pieces-mcp'
},
{
  from: '/docs/tutorials/playwright-mcp',
  to: '/docs/mcp/playwright-mcp'
},
{
  from: '/docs/tutorials/postgres-mcp',
  to: '/docs/mcp/postgres-mcp'
},
{
  from: '/docs/tutorials/puppeteer-mcp',
  to: '/docs/mcp/puppeteer-mcp'
},
{
  from: '/docs/tutorials/reddit-mcp',
  to: '/docs/mcp/reddit-mcp'
},
{
  from: '/docs/tutorials/repomix-mcp',
  to: '/docs/mcp/repomix-mcp'
},
{
  from: '/docs/tutorials/selenium-mcp',
  to: '/docs/mcp/selenium-mcp'
},
{
  from: '/docs/tutorials/speech-mcp',
  to: '/docs/mcp/speech-mcp'
},
{
  from: '/docs/tutorials/square-mcp',
  to: '/docs/mcp/square-mcp'
},
{
  from: '/docs/tutorials/tavily-mcp',
  to: '/docs/mcp/tavily-mcp'
},
{
  from: '/docs/tutorials/tutorial-extension',
  to: '/docs/mcp/tutorial-mcp'
},
{
  from: '/docs/tutorials/vscode-mcp',
  to: '/docs/mcp/vscode-mcp'
},
{
  from: '/docs/tutorials/youtube-transcript',
  to: '/docs/mcp/youtube-transcript-mcp'
},
{
  from: '/docs/guides/isolated-development-environments',
  to: '/docs/tutorials/isolated-development-environments'
}
],
},
],
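The redirect list above is mostly mechanical: almost every entry maps `/docs/tutorials/<slug>` to `/docs/mcp/<slug>`, with a couple of renames. As a sketch of how such a list could be generated instead of hand-written (the helper name, slug list, and override map below are assumptions for illustration, not part of the actual `docusaurus.config.ts`):

```typescript
// Hypothetical helper to build the tutorials -> mcp redirect entries.
type Redirect = { from: string; to: string };

// Slugs that kept their name when moving from /docs/tutorials/ to /docs/mcp/.
const movedSlugs = [
  "agentql-mcp", "asana-mcp", "blender-mcp", "github-mcp", "memory-mcp",
  // ...the rest of the unchanged slugs
];

// Pages that were also renamed as part of the move.
const renamed: Record<string, string> = {
  "tutorial-extension": "tutorial-mcp",
  "youtube-transcript": "youtube-transcript-mcp",
};

function mcpRedirects(
  slugs: string[],
  overrides: Record<string, string>,
): Redirect[] {
  return [
    // Straightforward moves: same slug, new prefix.
    ...slugs.map((s) => ({ from: `/docs/tutorials/${s}`, to: `/docs/mcp/${s}` })),
    // Renamed pages: old slug on the left, new slug on the right.
    ...Object.entries(overrides).map(([oldSlug, newSlug]) => ({
      from: `/docs/tutorials/${oldSlug}`,
      to: `/docs/mcp/${newSlug}`,
    })),
  ];
}
```

The result could then be spread into the plugin's `redirects` array, e.g. `redirects: [...otherRedirects, ...mcpRedirects(movedSlugs, renamed)]`, keeping the config short as more MCP pages move.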
@@ -132,7 +273,7 @@ const config: Config = {
  position: "left",
},
{
  to: "/docs/category/getting-started",
  to: "/docs/category/guides",
  position: "left",
  label: "Docs",
},
@@ -19,7 +19,7 @@ const LinuxDesktopInstallButtons = () => {
  <IconDownload /> Linux ARM64
</Link>
<Link
  className="button button--secondary button--lg"
  className="button button--primary button--lg"
  to="https://github.com/block/goose/releases/download/v1.0.29/Goose-1.0.29-1.x86_64.rpm"
>
  <IconDownload /> RPM Package