The Dedalus SDK is a full MCP client. Connect your agents to any server that implements the Model Context Protocol, hosted by you, us, or anyone else.

Quickstart

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()  # load environment variables (e.g. your Dedalus API key) from a local .env file

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="Use your tools to tell me "
        "some cool facts about dedalus-labs/dedalus-sdk-python",
        model="openai/gpt-5-nano",

        # Any public MCP URL!
        mcp_servers=["https://mcp.deepwiki.com/mcp"]
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

The agent discovers the server’s tools and uses them when relevant. Public MCP endpoints work out of the box. Self-hosted servers work the same way. If it speaks MCP over streamable HTTP, your agent can use it.
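
For example, you can mix a public endpoint with your own deployment. The sketch below assumes mcp_servers accepts more than one URL at a time (the Quickstart passes a list) and uses a hypothetical self-hosted endpoint at https://mcp.example.com/mcp as a stand-in for a server you run:

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="Summarize dedalus-labs/dedalus-sdk-python, "
        "then use my server's tools to check on my deployment.",
        model="openai/gpt-5-nano",

        # A public endpoint alongside a self-hosted one.
        # https://mcp.example.com/mcp is a placeholder for a server you run.
        mcp_servers=[
            "https://mcp.deepwiki.com/mcp",
            "https://mcp.example.com/mcp",
        ],
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())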

Next Steps

See Tools to combine MCP servers with local functions, or browse our Examples for more patterns.
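
As a preview of what the Tools page covers, here is a minimal sketch of mixing a local function with an MCP server. It assumes runner.run accepts a tools parameter that takes plain Python callables; see the Tools page for the exact signature.

import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()

def add(a: int, b: int) -> int:
    """Add two numbers locally, with no MCP round trip."""
    return a + b

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="What is 2 + 2, and what does the deepwiki server "
        "say about dedalus-labs/dedalus-sdk-python?",
        model="openai/gpt-5-nano",

        # Local functions and MCP servers side by side (assumed signature).
        tools=[add],
        mcp_servers=["https://mcp.deepwiki.com/mcp"],
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())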