OpenMCP wraps the reference MCP SDK with ergonomic registration, automatic JSON Schema inference, and operational essentials. Every feature cites the MCP clause it implements. You ship production servers in minutes.

Install

pip install openmcp
Or with uv:
uv add openmcp

Your first server

from openmcp import MCPServer, tool

@tool(description="Add two numbers")
def add(a: int, b: int) -> int:
    return a + b

@tool(description="Multiply two numbers")
def multiply(a: int, b: int) -> int:
    return a * b

server = MCPServer("calculator")
server.collect(add, multiply)

if __name__ == "__main__":
    import asyncio
    asyncio.run(server.serve())  # Streamable HTTP on :8000
Decorators attach metadata. collect() registers them. No globals, no hidden state.

Using with Dedalus SDK

The fastest path to production is through the Dedalus Agents SDK:
from dedalus import Agent
from openmcp import MCPServer, tool

@tool(description="Search knowledge base")
def search(query: str) -> list[dict]:
    # Your search logic
    return [{"title": "Result", "content": "..."}]

server = MCPServer("knowledge")
server.collect(search)

agent = Agent(
    model="claude-sonnet-4-20250514",
    mcp_servers=[server],  # OpenMCP servers plug in directly
)

# agent.run() is a coroutine, so call it from inside an async function
response = await agent.run("Find information about quantum computing")
The SDK handles transport, authentication, and tool orchestration. Your MCP server becomes an agent capability.

Connect a client

Script-style (no context manager required):
from openmcp.client import MCPClient
import asyncio

async def main():
    client = await MCPClient.connect("http://127.0.0.1:8000/mcp")
    
    tools = await client.list_tools()
    print([t.name for t in tools.tools])
    
    result = await client.call_tool("add", {"a": 2, "b": 3})
    print(result.content[0].text)  # "5"
    
    await client.close()

asyncio.run(main())
Or use a context manager for guaranteed cleanup:
async with await MCPClient.connect("http://127.0.0.1:8000/mcp") as client:
    result = await client.call_tool("add", {"a": 2, "b": 3})

Registration model

OpenMCP separates declaration from registration. The @tool decorator only attaches metadata. Registration happens when you call collect().
import time

# Decoration: metadata only
@tool(description="Shared utility")
def timestamp() -> int:
    return int(time.time())

# Registration: explicit
server_a = MCPServer("service-a")
server_b = MCPServer("service-b")

server_a.collect(timestamp)
server_b.collect(timestamp)
Same function, multiple servers. No state conflicts. Tests stay isolated. For functions organized in modules:
from tools import math, text

server = MCPServer("multi-module")
server.collect_from(math, text)  # Registers all decorated functions
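For illustration, a module of tools needs nothing beyond the @tool decorator; the module name and functions below are hypothetical:
# tools/math.py
from openmcp import tool

@tool(description="Subtract b from a")
def subtract(a: int, b: int) -> int:
    return a - b

@tool(description="Largest of the given numbers")
def largest(values: list[int]) -> int:
    return max(values)
collect_from() registers every decorated function it finds, so adding a tool is just adding a function.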
See Registration for the full design rationale.

Capabilities

Capability: What you get
Tools: Sync/async functions, JSON Schema inference (sketched below), tags, allow-lists
Resources: Static URIs, templates, binary/text payloads, subscriptions
Prompts: Reusable chat templates with typed arguments
Completions: Argument completion for prompts and resource templates
Logging: Runtime log levels, structured messages
Progress: Token-based trackers with coalescing
Cancellation: Cooperative cancellation via context
Roots: Filesystem boundaries with path guards
Sampling: Request LLM completions from the client
Elicitation: Request structured user input mid-call
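As a sketch of the schema inference in the Tools row, type hints on the signature drive the generated input schema. Passing a Pydantic model as a parameter is an assumption here, suggested by the Typed tools + Pydantic example; Invoice and summarize_invoice are made-up names:
from pydantic import BaseModel
from openmcp import tool

class Invoice(BaseModel):
    customer: str
    amount_cents: int

@tool(description="Summarize an invoice")
def summarize_invoice(invoice: Invoice) -> str:
    # The input schema is inferred from the type hints on this signature.
    return f"{invoice.customer}: ${invoice.amount_cents / 100:.2f}"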

Context

Inside a tool, get_context() returns a request-scoped helper:
from openmcp import tool, get_context

@tool(description="Process items with progress")
async def process(items: list[str]) -> dict:
    ctx = get_context()
    
    async with ctx.progress(total=len(items)) as tracker:
        for item in items:
            await do_work(item)  # your processing logic
            await tracker.advance(1, message=f"Processed {item}")
            await ctx.info("Item done", data={"item": item})
    
    return {"count": len(items)}
Methods: ctx.debug(), ctx.info(), ctx.warning(), ctx.error(), ctx.progress(). Outside a request context, get_context() raises. You catch lifecycle bugs early.
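A short sketch of the logging methods in an error path, assuming ctx.debug() and ctx.error() take the same data keyword shown for ctx.info() above; the in-memory record store is made up for illustration:
from openmcp import tool, get_context

RECORDS = {"r1": {"id": "r1", "status": "open"}}  # illustrative data

@tool(description="Fetch a record, reporting failures to the client")
async def fetch_record(record_id: str) -> dict:
    ctx = get_context()
    await ctx.debug("Looking up record", data={"id": record_id})
    if record_id not in RECORDS:
        # Surface the failure to the client's log stream, then fail the call.
        await ctx.error("Record not found", data={"id": record_id})
        raise KeyError(record_id)
    await ctx.info("Record loaded", data={"id": record_id})
    return RECORDS[record_id]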

Transports

Streamable HTTP (default):
await server.serve()  # http://127.0.0.1:8000/mcp
STDIO for CLI tools:
await server.serve(transport="stdio")
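A complete entry point for a CLI tool mirrors the first server; only the transport argument changes:
if __name__ == "__main__":
    import asyncio
    # stdin/stdout transport lets an MCP client launch this script as a subprocess.
    asyncio.run(server.serve(transport="stdio"))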

Authentication

Bearer token:
from openmcp.client.auth import BearerAuth

client = await MCPClient.connect(
    "http://127.0.0.1:8000/mcp",
    auth=BearerAuth(access_token="your-token")
)
DPoP (RFC 9449) for zero-trust environments:
from openmcp.client.auth import DPoPAuth

dpop_key = DPoPAuth.generate_key()
client = await MCPClient.connect(
    "http://127.0.0.1:8000/mcp",
    auth=DPoPAuth(access_token="your-token", dpop_key=dpop_key)
)

Production checklist

  • Set NotificationFlags for dynamic list changes
  • Configure TransportSecuritySettings for allowed hosts/origins
  • Add authorization via AuthorizationConfig and a provider
  • Use allow_dynamic_tools=True if you add tools at runtime
  • Call notify_tools_list_changed() after dynamic mutations (see the sketch below)
  • Test with real MCP clients (Claude Desktop, Cursor)
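A minimal sketch of the dynamic-tools items above, assuming allow_dynamic_tools is an MCPServer constructor argument and notify_tools_list_changed() is awaitable; check the API reference for the exact signatures:
from openmcp import MCPServer, tool

server = MCPServer("dynamic", allow_dynamic_tools=True)  # assumed constructor kwarg

@tool(description="Tool added after startup")
def late_tool() -> str:
    return "registered at runtime"

async def register_late_tool() -> None:
    server.collect(late_tool)
    # Assumed awaitable; drop the await if it turns out to be synchronous.
    await server.notify_tools_list_changed()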

Examples

Scenario: Path
Minimal server: examples/showcase/01_minimal.py
Script-style client: examples/showcase/01_client.py
Bidirectional (sampling): examples/showcase/02_bidirectional_*
Real-time tool updates: examples/showcase/03_realtime_*
Typed tools + Pydantic: examples/capabilities/tools/01_typed_tools.py
Tags and filtering: examples/capabilities/tools/02_tags_and_filtering.py
Testing patterns: examples/patterns/testing.py
Multi-server setup: examples/patterns/multi_server.py
MCP server chaining: examples/advanced/llm_chain.py
Run any example:
uv run python examples/showcase/01_minimal.py

Next steps