docs: clarify LLM API configuration in mcp_simple_chatbot README (#487)

Co-authored-by: Danny <88821366+ddaltn@users.noreply.github.com>
This commit is contained in:
Daniel
2025-05-13 14:00:01 +01:00
committed by GitHub
parent 173e0ee3b8
commit 8b5838675c


@@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
```plaintext
LLM_API_KEY=your_api_key_here
```
**Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.
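To illustrate the kind of change the note describes, here is a minimal, hypothetical sketch of how an `LLMClient`-style class could expose the endpoint URL and model as constructor parameters so they can be swapped for another OpenAI-compatible provider. The class shape, method names, and the alternative endpoint/model values below are illustrative assumptions, not the actual implementation in `main.py`.

```python
import os


class LLMClient:
    """Hypothetical sketch; the real LLMClient in main.py may differ.

    Pass a different endpoint_url and model to target another
    OpenAI-compatible chat-completions provider.
    """

    def __init__(
        self,
        api_key: str,
        endpoint_url: str = "https://api.groq.com/openai/v1/chat/completions",
        model: str = "llama-3.2-90b-vision-preview",
    ) -> None:
        self.api_key = api_key
        self.endpoint_url = endpoint_url
        self.model = model

    def build_request(self, messages: list[dict]) -> tuple[dict, dict]:
        """Build the HTTP headers and JSON payload for a chat-completion call."""
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }
        payload = {"model": self.model, "messages": messages}
        return headers, payload


# Example: retarget to a different provider (endpoint and model are placeholders)
client = LLMClient(
    api_key=os.environ.get("LLM_API_KEY", ""),
    endpoint_url="https://api.openai.com/v1/chat/completions",
    model="gpt-4o-mini",
)
headers, payload = client.build_request([{"role": "user", "content": "Hello"}])
```

Because both providers follow the OpenAI chat-completions request shape, only the endpoint URL and model name need to change; the request-building logic stays the same.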
3. **Configure servers:**