mirror of
https://github.com/aljazceru/mcp-python-sdk.git
synced 2025-12-19 14:54:24 +01:00
docs: clarify LLM API configuration in mcp_simple_chatbot README (#487)
Co-authored-by: Danny <88821366+ddaltn@users.noreply.github.com>
@@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
 ```plaintext
 LLM_API_KEY=your_api_key_here
 ```
 
+**Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.
 
 3. **Configure servers:**
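The note added above describes an OpenAI-compatible chat-completions call. As a rough sketch of what pointing a client at that endpoint involves (this is not the repository's actual `LLMClient`; the `build_request` helper and its parameters are illustrative only), a request to the Groq URL with the `LLM_API_KEY` from `.env` could be assembled like this:

```python
import json
import os
import urllib.request

# Assumption: an OpenAI-compatible chat-completions endpoint, as the note
# describes for Groq. Swap URL and model to target a different provider.
API_URL = "https://api.groq.com/openai/v1/chat/completions"
MODEL = "llama-3.2-90b-vision-preview"


def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build the HTTP request an OpenAI-compatible client would POST."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# LLM_API_KEY comes from the .env file shown in the diff above.
req = build_request("Hello", os.environ.get("LLM_API_KEY", "your_api_key_here"))
```

Sending `req` via `urllib.request.urlopen` (or switching to an HTTP library of your choice) and changing `API_URL`/`MODEL` is the kind of edit the note asks for when using a different provider.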