docs: Update README with FastMCP-style structure and examples

David Soria Parra
2024-12-19 21:37:29 +00:00
parent f2128a7977
commit 45cea71d90

README.md

- [Overview](#overview)
- [Installation](#installation)
- [Quickstart](#quickstart)
- [What is MCP?](#what-is-mcp)
- [Core Concepts](#core-concepts)
  - [Server](#server)
  - [Resources](#resources)
  - [Tools](#tools)
  - [Prompts](#prompts)
  - [Images](#images)
  - [Context](#context)
- [Running Your Server](#running-your-server)
  - [Development Mode](#development-mode)
  - [Claude Desktop Integration](#claude-desktop-integration)
  - [Direct Execution](#direct-execution)
- [Examples](#examples)
  - [Echo Server](#echo-server)
  - [SQLite Explorer](#sqlite-explorer)
- [Documentation](#documentation)
- [Contributing](#contributing)
- [License](#license)
[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions

## Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:
## Installation

We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects:

```bash
uv add mcp
```

Alternatively:

```bash
pip install mcp
# or add to requirements.txt
pip install -r requirements.txt
```
## Quickstart

Let's create a simple MCP server that exposes a calculator tool and some data:

```python
# server.py
from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("Demo")

# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
```

You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:

```bash
mcp install server.py
```

Alternatively, you can test it with the MCP Inspector:

```bash
mcp dev server.py
```
## What is MCP?

The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but one designed specifically for LLM interactions. MCP servers can:

- Expose data through **Resources** (think of these somewhat like GET endpoints; they are used to load information into the LLM's context)
- Provide functionality through **Tools** (somewhat like POST endpoints; they are used to execute code or otherwise produce a side effect)
- Define interaction patterns through **Prompts** (reusable templates for LLM interactions)
- And more!
## Core Concepts

### Server

The FastMCP server is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:

```python
from mcp.server.fastmcp import FastMCP

# Create a named server
mcp = FastMCP("My App")

# Specify dependencies for deployment and development
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
```
### Resources
Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:
```python
@mcp.resource("config://app")
def get_config() -> str:
    """Static configuration data"""
    return "App configuration here"

@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: str) -> str:
    """Dynamic user data"""
    return f"Profile data for user {user_id}"
```
### Tools
Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:
```python
import httpx

@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Calculate BMI given weight in kg and height in meters"""
    return weight_kg / (height_m ** 2)

@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Fetch current weather for a city"""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text
```
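Because FastMCP decorators wrap ordinary functions, the logic of a tool can be sanity-checked without a running server. A quick check of the BMI formula above (the input values are arbitrary):

```python
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Same formula as the tool above: weight divided by height squared"""
    return weight_kg / (height_m ** 2)

# 70 kg at 1.75 m is a BMI of about 22.86
print(round(calculate_bmi(70, 1.75), 2))
```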
### Prompts
Prompts are reusable templates that help LLMs interact with your server effectively:
```python
from mcp.server.fastmcp.prompts.base import AssistantMessage, Message, UserMessage

@mcp.prompt()
def review_code(code: str) -> str:
    return f"Please review this code:\n\n{code}"

@mcp.prompt()
def debug_error(error: str) -> list[Message]:
    return [
        UserMessage("I'm seeing this error:"),
        UserMessage(error),
        AssistantMessage("I'll help debug that. What have you tried so far?")
    ]
```
### Images

FastMCP provides an `Image` class that automatically handles image data:

```python
from io import BytesIO

from mcp.server.fastmcp import FastMCP, Image
from PIL import Image as PILImage

@mcp.tool()
def create_thumbnail(image_path: str) -> Image:
    """Create a thumbnail from an image"""
    img = PILImage.open(image_path)
    img.thumbnail((100, 100))
    # Encode as PNG; tobytes() would return raw pixel data, not PNG bytes
    buffer = BytesIO()
    img.save(buffer, format="PNG")
    return Image(data=buffer.getvalue(), format="png")
```
### Context

The Context object gives your tools and resources access to MCP capabilities:

```python
from mcp.server.fastmcp import FastMCP, Context

@mcp.tool()
async def long_task(files: list[str], ctx: Context) -> str:
    """Process multiple files with progress tracking"""
    for i, file in enumerate(files):
        await ctx.info(f"Processing {file}")
        await ctx.report_progress(i, len(files))
        data = await ctx.read_resource(f"file://{file}")
    return "Processing complete"
```
## Running Your Server

### Development Mode

The fastest way to test and debug your server is with the MCP Inspector:

```bash
mcp dev server.py

# Add dependencies
mcp dev server.py --with pandas --with numpy

# Mount local code
mcp dev server.py --with-editable .
```
### Claude Desktop Integration
Once your server is ready, install it in Claude Desktop:
```bash
mcp install server.py
# Custom name
mcp install server.py --name "My Analytics Server"
# Environment variables
mcp install server.py -e API_KEY=abc123 -e DB_URL=postgres://...
mcp install server.py -f .env
```
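The `-f .env` form reads a plain `KEY=VALUE` file. An illustrative `.env` matching the flags above (all values here are placeholders, not real credentials):

```
API_KEY=abc123
DB_URL=postgres://user:pass@localhost:5432/mydb
```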
### Direct Execution
For advanced scenarios like custom deployments:
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

if __name__ == "__main__":
    mcp.run()
```
Run it with:
```bash
python server.py
# or
mcp run server.py
```
## Examples
### Echo Server
A simple server demonstrating resources, tools, and prompts:
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Echo")

@mcp.resource("echo://{message}")
def echo_resource(message: str) -> str:
    """Echo a message as a resource"""
    return f"Resource echo: {message}"

@mcp.tool()
def echo_tool(message: str) -> str:
    """Echo a message as a tool"""
    return f"Tool echo: {message}"

@mcp.prompt()
def echo_prompt(message: str) -> str:
    """Create an echo prompt"""
    return f"Please process this message: {message}"
```
### SQLite Explorer
A more complex example showing database integration:
```python
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("SQLite Explorer")

@mcp.resource("schema://main")
def get_schema() -> str:
    """Provide the database schema as a resource"""
    conn = sqlite3.connect("database.db")
    schema = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return "\n".join(sql[0] for sql in schema if sql[0])

@mcp.tool()
def query_data(sql: str) -> str:
    """Execute SQL queries safely"""
    conn = sqlite3.connect("database.db")
    try:
        result = conn.execute(sql).fetchall()
        return "\n".join(str(row) for row in result)
    except Exception as e:
        return f"Error: {str(e)}"
```
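The explorer above assumes a `database.db` file already exists. A minimal seeding script (the `users` table and its columns are made up for illustration) makes the example runnable end to end:

```python
import sqlite3

# Create the database file the server expects, with one sample table
conn = sqlite3.connect("database.db")
conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("Alice",), ("Bob",)])
conn.commit()
conn.close()
```

With this in place, the `schema://main` resource returns the `CREATE TABLE` statement, and calling the `query_data` tool with `SELECT * FROM users` lists the sample rows.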
## Documentation