Most data analysis requires two distinct capabilities: finding information and processing it. An agent that can search the web for current data, then write and execute Python code to analyze it, bridges this gap.
This example combines a web search MCP server with a local code execution tool. The agent finds real-time information, extracts the relevant data, writes analysis code, executes it, and reports the findings.
import asyncio

from dedalus_labs import AsyncDedalus, DedalusRunner
from dedalus_labs.utils.stream import stream_async
from dotenv import load_dotenv

load_dotenv()


def execute_python_code(code: str) -> str:
    """Execute Python code and return the result."""
    try:
        namespace = {}
        exec(code, {"__builtins__": __builtins__}, namespace)
        if 'result' in namespace:
            return str(namespace['result'])
        results = {k: v for k, v in namespace.items() if not k.startswith('_')}
        return str(results) if results else "Code executed successfully"
    except Exception as e:
        return f"Error: {str(e)}"


async def main():
    client = AsyncDedalus()
    runner = DedalusRunner(client)

    result = runner.run(
        input="""Research the current stock prices of Tesla (TSLA) and Apple (AAPL).
        Then write Python code to:
        1. Compare their prices
        2. Calculate the percentage difference
        3. Provide a brief analysis""",
        model="openai/gpt-4o-mini",
        tools=[execute_python_code],
        mcp_servers=["tsion/brave-search-mcp"],
        stream=True,
    )

    await stream_async(result)


if __name__ == "__main__":
    asyncio.run(main())
The agent searches for stock prices, extracts the numbers, writes comparison code, runs it, and explains the results—all in one request.
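The execution tool itself can be exercised in isolation before wiring it into the agent. The sketch below repeats the tool from the listing above and feeds it a comparison snippet by hand; the prices are placeholders, not real quotes:

```python
def execute_python_code(code: str) -> str:
    """Execute Python code and return the result."""
    try:
        namespace = {}
        exec(code, {"__builtins__": __builtins__}, namespace)
        if 'result' in namespace:
            return str(namespace['result'])
        results = {k: v for k, v in namespace.items() if not k.startswith('_')}
        return str(results) if results else "Code executed successfully"
    except Exception as e:
        return f"Error: {str(e)}"


# Hypothetical prices standing in for what the agent would extract from search.
snippet = "result = round((250.0 - 180.0) / 180.0 * 100, 2)"
print(execute_python_code(snippet))
```

Because the tool reports a `result` variable when one is set, the model can be prompted to assign its final answer to `result` for a clean return value.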
The execute_python_code tool runs arbitrary model-generated code via exec, with full access to the host process. In production, use a sandboxed execution environment.
Why This Pattern
Traditional approaches require separate steps: query an API, parse the response, write analysis code, run it manually. This agent handles the entire workflow, adapting its code to whatever data it finds.
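For contrast, a hand-rolled version of the same analysis hardcodes every step. The quote function below is a stand-in for whatever market-data API you would call, with placeholder prices:

```python
def fetch_quote(symbol: str) -> float:
    """Stand-in for a real market-data API call (placeholder prices)."""
    quotes = {"TSLA": 250.0, "AAPL": 180.0}
    return quotes[symbol]


def compare(a: str, b: str) -> str:
    """Fixed comparison logic: price difference of `a` relative to `b`."""
    pa, pb = fetch_quote(a), fetch_quote(b)
    pct = (pa - pb) / pb * 100
    return f"{a} trades {pct:+.1f}% relative to {b}"


print(compare("TSLA", "AAPL"))
```

Every step here is fixed in advance: which API, which fields, which formula. The agent version writes the equivalent of `compare` on the fly, shaped to whatever data the search actually returned.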
The pattern extends to any research-then-analyze task: market research, competitive analysis, data journalism, or exploratory data science.