From 8b5838675c3da3bffcca03b9b3554779875059ba Mon Sep 17 00:00:00 2001
From: Daniel <88821366+ddltn@users.noreply.github.com>
Date: Tue, 13 May 2025 14:00:01 +0100
Subject: [PATCH] docs: clarify LLM API configuration in mcp_simple_chatbot
 README (#487)

Co-authored-by: Danny <88821366+ddaltn@users.noreply.github.com>
---
 examples/clients/simple-chatbot/README.MD | 1 +
 1 file changed, 1 insertion(+)

diff --git a/examples/clients/simple-chatbot/README.MD b/examples/clients/simple-chatbot/README.MD
index 683e4f3..22996d9 100644
--- a/examples/clients/simple-chatbot/README.MD
+++ b/examples/clients/simple-chatbot/README.MD
@@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
    ```plaintext
    LLM_API_KEY=your_api_key_here
    ```
+   **Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.

 3. **Configure servers:**