This example demonstrates connecting to external tools and services through a remote MCP (Model Context Protocol) server using the Dedalus SDK.
```python
import asyncio

from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()


async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="Who won Wimbledon 2025?",
        model="openai/gpt-5-mini",
        mcp_servers=["simon-liang/brave-search-mcp"],
        stream=False,
    )

    print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```
## Using Local MCP Servers
To use a locally running MCP server, you'll need to expose it via a public tunnel, since Dedalus must reach your server over the internet.
Install Cloudflare Tunnel:
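The exact install command depends on your platform; a sketch for common setups (assuming Homebrew on macOS — see Cloudflare's downloads page for other platforms):

```shell
# macOS, via Homebrew
brew install cloudflared

# Linux: Cloudflare also distributes cloudflared as .deb/.rpm packages
# and standalone binaries; grab the one matching your distribution
# from Cloudflare's downloads page.
```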
Start the tunnel:
```shell
# Replace the port with wherever your MCP server is running
cloudflared tunnel --url http://localhost:8000
```
This outputs a public URL (e.g., `https://random-name.trycloudflare.com`) that you can pass to your Dedalus agent to connect it to your local MCP server:
```python
import asyncio

from dedalus_labs import AsyncDedalus, DedalusRunner
from dotenv import load_dotenv

load_dotenv()


async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="Who won Wimbledon 2025?",
        model="openai/gpt-5-mini",
        mcp_servers=["https://random-name.trycloudflare.com"],
        stream=False,
    )

    print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```