From 8531777aa0ffa75130a82aabc7009cc985e9a708 Mon Sep 17 00:00:00 2001
From: Adewale Abati
Date: Thu, 5 Dec 2024 13:39:51 +0100
Subject: [PATCH] docs: Update docs for better installation and provider setups
 (#408)

Co-authored-by: Angie Jones
---
 docs/configuration.md     |  81 ++++-------------------------
 docs/installation.md      |  49 ++++++++++++++++--
 docs/plugins/providers.md | 104 ++++++++++++++++++++++++++++++++++----
 docs/quickstart.md        |  19 +++----
 4 files changed, 161 insertions(+), 92 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index 92e7c1b1..a322ec12 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -4,7 +4,7 @@
 
 If you need to customize goose, one way is via editing: `~/.config/goose/profiles.yaml`.
 
-It will look by default something like (and when you run `goose session start` without the `--profile` flag it will use the `default` profile):
+By default, it looks like this:
 
 ```yaml
 default:
@@ -17,17 +17,14 @@ default:
       requires: {}
 ```
 
+If you run `goose session start` without the `--profile` flag, it will use the `default` profile automatically.
+
 ### Fields
 
 #### provider
 
-Provider of LLM. LLM providers that are currently supported by Goose:
+`provider` specifies the LLM provider chosen by the user. You can set up multiple profiles with different providers; Goose uses the provider specified in the active profile to interact with the LLM.
Here is the list of [supported LLM providers][providers].
 
-| Provider   | Required environment variable(s) to access provider |
-| ---------- | --------------------------------------------------- |
-| openai     | `OPENAI_API_KEY`                                    |
-| anthropic  | `ANTHROPIC_API_KEY`                                 |
-| databricks | `DATABRICKS_HOST` and `DATABRICKS_TOKEN`            |
 
 #### processor
 
@@ -54,68 +51,6 @@ goose toolkit list
 ```
 
-### Example `profiles.yaml` files
-
-#### provider as `anthropic`
-
-```yaml
-
-default:
-  provider: anthropic
-  processor: claude-3-5-sonnet-20241022
-  accelerator: claude-3-5-sonnet-20241022
-```
-
-#### provider as `databricks`
-
-```yaml
-default:
-  provider: databricks
-  processor: databricks-meta-llama-3-1-70b-instruct
-  accelerator: databricks-meta-llama-3-1-70b-instruct
-  moderator: passive
-  toolkits:
-    - name: developer
-      requires: {}
-```
-
-You can tell it to use another provider for example for Anthropic:
-
-```yaml
-default:
-  provider: anthropic
-  processor: claude-3-5-sonnet-20241022
-  accelerator: claude-3-5-sonnet-20241022
-  moderator: passive
-  toolkits:
-    - name: developer
-      requires: {}
-```
-
-this will then use the claude-sonnet model, you will need to set the `ANTHROPIC_API_KEY` to your anthropic API key.
-
-You can also customize Goose's behavior through toolkits. These are set up automatically for you in the same `~/.config/goose/profiles.yaml` file, but you can include or remove toolkits as you see fit.
-
-For example, Goose's `unit-test-gen` command sets up a new profile in this file for you:
-
-```yaml
-unit-test-gen:
-  provider: openai
-  processor: gpt-4o
-  accelerator: gpt-4o-mini
-  moderator: passive
-  toolkits:
-    - name: developer
-      requires: {}
-    - name: unit-test-gen
-      requires: {}
-    - name: java
-      requires: {}
-```
-
-[jinja-guide]: https://jinja.palletsprojects.com/en/3.1.x/
-[using-goosehints]: https://block.github.com/goose/guidance/using-goosehints.html
 
 ## Adding a toolkit
 
 To make a toolkit available to Goose, add it to your project's pyproject.toml. For example in the Goose pyproject.toml file:
@@ -150,6 +85,7 @@ Or, if you're developing a new toolkit and want to test it:
 uv run goose session start --profile my-profile
 ```
 
+
 ## Tuning Goose to your repo
 
 Goose ships with the ability to read in the contents of a file named `.goosehints` from your repo. If you find yourself repeating the same information across sessions to Goose, this file is the right place to add this information.
@@ -159,4 +95,9 @@ This file will be read into the Goose system prompt if it is present in the current working directory.
 
 Check out the [guide on using .goosehints][using-goosehints] for more tips.
 
 > [!NOTE]
-> `.goosehints` follows [jinja templating rules][jinja-guide] in case you want to leverage templating to insert file contents or variables.
\ No newline at end of file
+> `.goosehints` follows [jinja templating rules][jinja-guide] in case you want to leverage templating to insert file contents or variables.
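+
+As a small illustrative sketch of that templating (the included file name is hypothetical), a `.goosehints` file might look like:
+
+```
+Use four-space indentation and run the tests with pytest before finishing.
+{% include 'docs/style-guide.md' %}
+```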
+
+
+[providers]: https://block.github.io/goose/plugins/providers.html
+[jinja-guide]: https://jinja.palletsprojects.com/en/3.1.x/
+[using-goosehints]: https://block.github.com/goose/guidance/using-goosehints.html
diff --git a/docs/installation.md b/docs/installation.md
index 641f0e09..b2f8f0b7 100644
--- a/docs/installation.md
+++ b/docs/installation.md
@@ -1,8 +1,10 @@
 # Installation
 
-To install Goose, use `pipx`.First ensure [pipx][pipx] is installed:
+To install Goose, use `pipx` on macOS, Linux, or Windows.
 
-``` sh
+First, ensure [pipx][pipx] is installed:
+
+```sh
 brew install pipx
 pipx ensurepath
 ```
@@ -15,4 +17,45 @@
 pipx install goose-ai
 ```
 
 [pipx]: https://github.com/pypa/pipx?tab=readme-ov-file#install-pipx
 
-You can then run `goose` from the command line with `goose session start`.
\ No newline at end of file
+### Configuration
+
+#### Set up a provider
+Goose works with a set of [supported LLM providers][providers]. If you don't already have an API key for your chosen provider, you can obtain one from that provider. When you run Goose, it will prompt you to set an API key if you haven't set one previously.
+
+>[!TIP]
+> **Billing:**
+>
+> Where applicable, you will need credits in your LLM provider account to make requests successfully.
+>
+
+#### Profiles
+
+After installation, you can configure Goose at any time by editing your profile file at `~/.config/goose/profiles.yaml`. You can define multiple profiles, use different LLM providers, and enable toolkits that customize Goose's functionality:
+
+```yaml
+default:
+  provider: openai
+  processor: gpt-4o
+  accelerator: gpt-4o-mini
+  moderator: passive
+  toolkits:
+    - name: developer
+      requires: {}
+```
+
+
+## Running Goose
+
+You can run `goose` from the command line using:
+
+```sh
+goose session start
+```
+
+
+## Additional Resources
+
+Visit the [Configuration Guide][configuration-guide] for detailed instructions on configuring Goose.
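+
+For example, alongside `default` you can define an additional named profile in the same file and select it with `goose session start --profile my-profile` (the `my-profile` name is illustrative):
+
+```yaml
+my-profile:
+  provider: openai
+  processor: gpt-4o
+  accelerator: gpt-4o-mini
+  moderator: passive
+  toolkits:
+    - name: developer
+      requires: {}
+```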
+
+[configuration-guide]: https://block.github.io/goose/configuration.html
+[providers]: https://block.github.io/goose/plugins/providers.html
\ No newline at end of file
diff --git a/docs/plugins/providers.md b/docs/plugins/providers.md
index 6e1f0311..719ef084 100644
--- a/docs/plugins/providers.md
+++ b/docs/plugins/providers.md
@@ -1,15 +1,99 @@
 # Providers
 
-Providers in Goose mean "LLM providers" that Goose can interact with. Providers are defined in the [Exchange library][exchange-providers] for the most part, but you can define your own.
+Providers in Goose are the LLM providers that Goose can interact with. Most providers are defined in the [Exchange library][exchange-providers], but you can define your own.
 
-**Currently available providers:**
+As you configure your chosen provider, add the models you want to use to the `~/.config/goose/profiles.yaml` file and set any necessary environment variables or API keys in your terminal. For example:
+
+```sh
+export PROVIDER_API_KEY="your_api_key_here"
+```
 
-* Anthropic
-* Azure
-* Bedrock
-* Databricks
-* Google
-* Ollama
-* OpenAI
+## Currently Available Providers
 
-[exchange-providers]: https://github.com/block/goose/tree/main/packages/exchange/src/exchange/providers
+### Anthropic
+
+To use Anthropic, you need an API key, which you can obtain by signing up or logging into [Anthropic's platform](https://www.anthropic.com/). Once you have your API key and have updated your `profiles.yaml` file for this provider, set the `ANTHROPIC_API_KEY` environment variable in your shell:
+
+```sh
+export ANTHROPIC_API_KEY="your_api_key_here"
+```
+
+```yaml title="profiles.yaml"
+default:
+  provider: anthropic
+  processor: claude-3-5-sonnet-20241022
+  accelerator: claude-3-5-sonnet-20241022
+```
+
+### Azure
+
+Azure AI services provide API keys through the Azure Portal. Visit the [Azure Portal](https://portal.azure.com/) to create a resource and obtain your key.
You will need to configure Goose by updating your profile and setting the appropriate environment variables.
+
+```yaml title="profiles.yaml"
+default:
+  provider: azure
+  processor: azure-gpt-4
+  accelerator: azure-gpt-3
+```
+
+### Bedrock
+
+See [AWS Bedrock](https://aws.amazon.com/bedrock/) for more information. You need to set up your AWS credentials and configure Bedrock access accordingly in your Goose profile.
+
+
+```yaml title="profiles.yaml"
+default:
+  provider: bedrock
+  processor: titan-llm
+  accelerator: titan-llm-lite
+```
+
+### Databricks
+
+To use Databricks, sign up or log into [Databricks](https://www.databricks.com/) and generate a personal access token via the user settings. Configure Goose by setting the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variables.
+
+```yaml title="profiles.yaml"
+default:
+  provider: databricks
+  processor: databricks-meta-llama-3-1-70b-instruct
+  accelerator: databricks-meta-llama-3-1-70b-instruct
+```
+
+### Google
+
+Google Cloud AI services require you to set up a project in the [Google Cloud Console](https://console.cloud.google.com/). After enabling the relevant APIs, generate an API key or set up a service account, and make sure your application can access these credentials.
+
+```yaml title="profiles.yaml"
+default:
+  provider: google
+  processor: gemini-1.5-flash
+  accelerator: gemini-1.5-flash
+```
+
+### Ollama
+
+Ollama runs models locally, so no API key is usually required; see [Ollama's site](https://ollama.com/) for installation and model setup.
+
+```yaml title="profiles.yaml"
+default:
+  provider: ollama
+  processor: ollama-pro
+  accelerator: ollama-lite
+```
+
+### OpenAI
+
+Register at [OpenAI's platform](https://platform.openai.com/api-keys) to obtain an API key.
Configure Goose by updating your `profiles.yaml` file and setting the `OPENAI_API_KEY` environment variable in your terminal:
+
+```sh
+export OPENAI_API_KEY="your_api_key_here"
+```
+
+```yaml title="profiles.yaml"
+default:
+  provider: openai
+  processor: gpt-4
+  accelerator: gpt-3.5-turbo
+```
+
+[exchange-providers]: https://github.com/block/goose/tree/main/packages/exchange/src/exchange/providers
\ No newline at end of file
diff --git a/docs/quickstart.md b/docs/quickstart.md
index 1b53f121..bf9921bc 100644
--- a/docs/quickstart.md
+++ b/docs/quickstart.md
@@ -21,6 +21,16 @@
 pipx install goose-ai
 ```
 
 ### Running Goose
 
+#### Set up a provider
+Goose works with a set of [supported LLM providers][providers]. If you don't already have an API key for your chosen provider, you can obtain one from that provider. When you run Goose, it will prompt you to set an API key if you haven't set one previously.
+
+>[!TIP]
+> **Billing:**
+>
+> Where applicable, you will need credits in your LLM provider account to make requests successfully.
+>
+
+
 #### Start a session
 
 From your terminal, navigate to the directory you'd like to start from and run:
 
@@ -29,15 +39,6 @@
 goose session start
 ```
 
-#### Set up a provider
-Goose works with your [preferred LLM][providers]. By default, it uses `openai` as the LLM provider. You'll be prompted to set an [API key][openai-key] if you haven't set one previously.
-
->[!TIP]
-> **Billing:**
->
-> You will need to have credits in your LLM Provider account to be able to successfully make requests.
->
-
 #### Make Goose do the work for you
 
 You will see the Goose prompt `G❯`: