DedalusRunner is the core of the Dedalus SDK. It orchestrates local tools, hosted MCP servers, streaming, and any model from any provider into a single agentic loop. Five lines of code, any agent you want.

Quick Example

import asyncio

from dedalus_labs import AsyncDedalus, DedalusRunner

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = await runner.run(
        input="What's the weather in Tokyo?",
        model="anthropic/claude-sonnet-4-20250514",
        mcp_servers=["windsornguyen/open-meteo-mcp"],
        max_steps=5,
    )

    print(result.final_output)

asyncio.run(main())

Parameters


Return Value

RunResult (object): the response object returned by runner.run().
Multi-turn Chat
import asyncio
from dedalus_labs import AsyncDedalus, DedalusRunner

async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)
    messages: list[dict] = []

    while True:
        user_input = input("You: ").strip()
        if not user_input:
            break

        messages.append({"role": "user", "content": user_input})

        result = await runner.run(
            model="openai/gpt-4o",
            messages=messages,
        )

        messages = result.to_input_list()
        print(f"Assistant: {result.final_output}\n")

asyncio.run(main())
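The history-passing pattern above does not depend on the SDK itself: each turn appends the user message, and result.to_input_list() hands back the full conversation so far. A minimal sketch with plain dicts makes the shape explicit (fake_reply is a hypothetical stand-in for runner.run plus to_input_list, not part of the SDK):

```python
# Sketch of the multi-turn pattern above, without the SDK.
# fake_reply is a hypothetical stand-in for runner.run + to_input_list:
# it returns the history with the assistant's reply appended, so the
# next turn sees the full conversation.

def fake_reply(messages: list[dict]) -> list[dict]:
    # A real run would call the model; here we just echo the last user turn.
    last_user = messages[-1]["content"]
    return messages + [{"role": "assistant", "content": f"You said: {last_user}"}]

messages: list[dict] = []
for turn in ["Hello", "What's the weather?"]:
    messages.append({"role": "user", "content": turn})
    messages = fake_reply(messages)

# After two turns the list alternates user/assistant entries.
print(len(messages))        # 4
print(messages[1]["role"])  # assistant
```

The key point is that the reassignment `messages = result.to_input_list()` replaces the local history with the canonical one from the run, so tool calls and intermediate steps are preserved between turns.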
Example Response
{
  "final_output": "The weather in Tokyo is currently 18°C with clear skies.",
  "tool_results": [],
  "mcp_results": [
    {
      "name": "get_current_weather",
      "result": {"temperature": 18, "conditions": "clear"},
      "server": "windsornguyen/open-meteo-mcp"
    }
  ],
  "tools_called": ["get_current_weather"],
  "steps_used": 2,
  "messages": [...]
}
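Reading the response is plain field access. A sketch against the example payload above, using a dict here for illustration (the SDK returns a RunResult object with the same field names):

```python
# The example response above, as a plain dict standing in for RunResult.
response = {
    "final_output": "The weather in Tokyo is currently 18°C with clear skies.",
    "tool_results": [],
    "mcp_results": [
        {
            "name": "get_current_weather",
            "result": {"temperature": 18, "conditions": "clear"},
            "server": "windsornguyen/open-meteo-mcp",
        }
    ],
    "tools_called": ["get_current_weather"],
    "steps_used": 2,
}

# Pull the temperature out of the first MCP tool result.
temp = response["mcp_results"][0]["result"]["temperature"]
print(temp)                    # 18
print(response["steps_used"])  # 2
```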

Next Steps

Tools

Define local functions the model can call.

MCP Servers

Connect to hosted MCP servers.

Structured Outputs

Validate responses against schemas.

Streaming

Stream responses as they generate.
Last modified on March 10, 2026