From 10b7d8e376b168f76d176f357fb90f31255d0b88 Mon Sep 17 00:00:00 2001
From: crimson-knight
Date: Tue, 11 Apr 2023 07:46:25 -0400
Subject: [PATCH] Add information on how to use the other cache methods available

---
 .env.template | 1 +
 README.md     | 9 +++++++++
 2 files changed, 10 insertions(+)

diff --git a/.env.template b/.env.template
index 01735615..1ff98a6f 100644
--- a/.env.template
+++ b/.env.template
@@ -13,3 +13,4 @@ OPENAI_AZURE_DEPLOYMENT_ID=deployment-id-for-azure
 IMAGE_PROVIDER=dalle
 HUGGINGFACE_API_TOKEN=
 USE_MAC_OS_TTS=False
+MEMORY_BACKEND=local
diff --git a/README.md b/README.md
index 749c8791..aa083689 100644
--- a/README.md
+++ b/README.md
@@ -204,6 +204,15 @@ export PINECONE_ENV="Your pinecone region" # something like: us-east4-gcp
 ```
 
+## Setting Your Cache Type
+
+By default, Auto-GPT uses LocalCache instead of Redis or Pinecone.
+
+To switch to another backend, set the `MEMORY_BACKEND` environment variable to the value you want:
+
+* `local` (default) uses a local JSON cache file
+* `pinecone` uses the Pinecone.io account you configured in your ENV settings
+* `redis` uses the Redis cache that you configured
 ## View Memory Usage
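The backend selection the README section describes could be sketched as follows. This is a minimal, hypothetical illustration of reading `MEMORY_BACKEND` and falling back to the default, not Auto-GPT's actual implementation; the helper name and the fallback behavior for unknown values are assumptions.

```python
import os

# Hypothetical sketch: the three backend values named in the README section.
SUPPORTED_BACKENDS = {"local", "pinecone", "redis"}


def get_memory_backend() -> str:
    """Return the configured memory backend, defaulting to the local JSON cache.

    Reads the MEMORY_BACKEND environment variable (as added to .env.template
    in this patch) and falls back to "local" when it is unset or unrecognized.
    """
    backend = os.getenv("MEMORY_BACKEND", "local").lower()
    if backend not in SUPPORTED_BACKENDS:
        backend = "local"  # default, per the README section in this patch
    return backend
```

With `MEMORY_BACKEND=redis` in the environment this returns `"redis"`; with the variable unset it returns the `"local"` default.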