The Dedalus SDK is a full MCP client. Connect your agents to any server that implements the Model Context Protocol, hosted by you, us, or anyone else. Local tools handle your custom logic; MCP servers add hosted capabilities (search, databases, SaaS APIs, etc.).

Connect an MCP server in one line

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="What's the weather forecast for San Francisco this week?",
        model="anthropic/claude-opus-4-5",
        mcp_servers=["windsornguyen/open-meteo-mcp"],  # Weather forecasts via Open-Meteo
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
The agent discovers the server’s tools and uses them when relevant.
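Because mcp_servers is a list, a single run can draw on several servers at once. A minimal sketch, assuming a marketplace slug and an external URL (both shown elsewhere on this page) can be mixed in the same list:

```python
import asyncio

# One run, several MCP servers. Mixing a marketplace slug with an
# external URL in one list is an assumption for illustration.
MCP_SERVERS = [
    "windsornguyen/open-meteo-mcp",  # weather forecasts
    "https://mcp.deepwiki.com/mcp",  # repo Q&A via external URL
]

async def main():
    # Imported here so the sketch stays import-free at module level.
    from dedalus_labs import AsyncDedalus, DedalusRunner

    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input=(
            "What's this week's San Francisco forecast, and what does "
            "the Dedalus Python SDK repo do?"
        ),
        model="anthropic/claude-opus-4-5",
        mcp_servers=MCP_SERVERS,
    )
    print(result.final_output)

# Run with: asyncio.run(main())
```

The model sees the tools from every listed server and picks whichever fits each step of the request.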

Combine with local tools

MCP servers and local tools work together. Pass both to runner.run().

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

def as_bullets(items: list[str]) -> str:
    """Format items as a bulleted list."""
    return "\n".join(f"• {item}" for item in items)

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input=(
            "Get the 7-day weather forecast for San Francisco "
            "and format the daily conditions as bullets using as_bullets."
        ),
        model="anthropic/claude-opus-4-5",
        mcp_servers=["windsornguyen/open-meteo-mcp"],
        tools=[as_bullets],
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
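Local tools are plain Python functions, so you can exercise them directly before handing them to the runner. For example, calling as_bullets from the example above:

```python
def as_bullets(items: list[str]) -> str:
    """Format items as a bulleted list."""
    return "\n".join(f"• {item}" for item in items)

print(as_bullets(["Mon: sunny, 68°F", "Tue: cloudy, 61°F"]))
# → • Mon: sunny, 68°F
#   • Tue: cloudy, 61°F
```

A quick unit test on the function is all it takes to trust the formatting before the model ever calls it.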

External MCP URL

You can connect directly to any external MCP server URL (Streamable HTTP). This is useful when:
  • You’re testing a server without registering it
  • You’re connecting to a self-hosted MCP deployment
  • You’re using an MCP server that isn’t in the marketplace

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="Use your tools to summarize the Dedalus Python SDK repo in 5 bullet points.",
        model="openai/gpt-5.2",
        # External MCP URL!
        mcp_servers=["https://mcp.deepwiki.com/mcp"],
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
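External servers you don't operate can be slow or unreachable. Nothing Dedalus-specific is needed to guard against that: since runner.run(...) is awaited, plain asyncio.wait_for can bound it. A sketch (the helper name and the 60-second budget are illustrative choices, not SDK API):

```python
import asyncio

async def run_with_timeout(coro, seconds: float):
    """Await ``coro``, returning None if it exceeds ``seconds``."""
    try:
        return await asyncio.wait_for(coro, timeout=seconds)
    except asyncio.TimeoutError:
        return None  # caller decides whether to retry or report

# Usage (inside an async function):
#     result = await run_with_timeout(runner.run(...), 60)
```

On timeout the wrapped coroutine is cancelled by wait_for, so the run doesn't keep consuming resources in the background.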

Next steps

  • Return typed data: Structured Outputs — Validate and parse JSON into schemas
  • Stream the workflow: Streaming — Watch tool use + output in real time
  • See examples: Use Cases — End-to-end MCP agent patterns
Last modified on February 28, 2026