- Provider-agnostic: Use OpenAI, Anthropic, Google, xAI, DeepSeek, and more with one API.
- Tool- and MCP-native: Let models call local functions and hosted MCP servers.
- Production-ready: Streaming, structured outputs, routing/handoffs, and runtime policies.
What are you trying to build?
Chat with a model
Send a prompt and get a response from any provider/model.
Equip a model with tools
Let the model call typed Python/TS functions that you implement.
Stream agent output
Print responses as they’re generated (great for UIs/CLIs).
Add MCP servers
Connect to hosted MCP servers with one line.
Get reliable JSON
Validate model output against schemas (Pydantic/Zod).
Route across models
Provide multiple models; the agent can route/handoff by phase.
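The routing idea can be sketched independently of the SDK. This is a dependency-free illustration, not the SDK's API; the model names and phase labels are assumptions for the example:

```python
# Minimal model-routing sketch (illustrative only; model names and phase
# labels are assumptions, not tied to any specific SDK API).

PHASE_ROUTES = {
    "plan": "gpt-4o",          # stronger reasoning for planning
    "draft": "claude-sonnet",  # fast generation for drafting
    "cheap": "deepseek-chat",  # low-cost fallback work
}

def route(phase: str, default: str = "gpt-4o-mini") -> str:
    """Pick a model for the given phase, falling back to a default."""
    return PHASE_ROUTES.get(phase, default)

print(route("plan"))     # -> gpt-4o
print(route("unknown"))  # -> gpt-4o-mini
```

In practice the SDK makes this decision per phase or per handoff; the point is that routing is just a mapping from task context to model identifier.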
Installation
Set Your API Key
Get your API key from the dashboard and set it as an environment variable, or add it to a `.env` file:
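For example, with OpenAI (the variable name depends on which provider you use):

```shell
# In your shell (variable name depends on your provider):
export OPENAI_API_KEY="sk-..."

# Or in a .env file at the project root:
# OPENAI_API_KEY=sk-...
```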
Your First Request
Let’s build this incrementally.
1) Chat with a model
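The SDK's own chat call is a single line; underneath, every provider request reduces to roughly the same shape. As a dependency-free sketch of that shape (field names follow the common chat-completions wire format, an assumption here, not the SDK's API):

```python
# Sketch of the request a chat call reduces to. Field names follow the
# common chat-completions format; the SDK hides this behind one API.

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a minimal chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_chat_request("gpt-4o-mini", "Say hello in one sentence.")
print(req["messages"][0]["content"])  # -> Say hello in one sentence.
```

Swapping providers means swapping the model identifier; the message structure stays the same, which is what makes a provider-agnostic API possible.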
2) Add an MCP server
Here we connect a well-known MCP server and let the model use it.
3) Add a local tool
Define a function with type hints and a docstring, then pass it to `runner.run()`. The SDK extracts the schema automatically and handles execution when the model decides to use it.
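The schema-extraction step can be sketched with the standard library alone. This is not the SDK's implementation, just the idea: signature and type hints give the parameters, the docstring gives the description:

```python
import inspect
from typing import get_type_hints

def get_weather(city: str, unit: str = "celsius") -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city} (20 degrees {unit})"

def extract_schema(fn) -> dict:
    """Derive a tool schema from a function's signature and docstring."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {
            name: {
                "type": hints.get(name, str).__name__,
                # No default value means the parameter is required.
                "required": sig.parameters[name].default is inspect.Parameter.empty,
            }
            for name in sig.parameters
        },
    }

schema = extract_schema(get_weather)
print(schema["name"])  # -> get_weather
```

This is why the type hints and docstring matter: they are the only source the SDK has for telling the model what the tool does and what arguments it takes.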
4) Stream output
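The streaming call itself is SDK-specific, but the consumption pattern is the same everywhere: iterate over chunks and flush each one as it arrives. A simulated sketch, where the generator stands in for a real model stream:

```python
import sys
from typing import Iterator

def fake_stream(text: str, chunk_size: int = 4) -> Iterator[str]:
    """Stand-in for a model's token stream: yields text in small chunks."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

collected = []
for chunk in fake_stream("Hello from a streaming agent!"):
    sys.stdout.write(chunk)  # print incrementally
    sys.stdout.flush()       # flush so partial output reaches the UI/CLI
    collected.append(chunk)
print()
```

The flush is the part that matters for UIs and CLIs: without it, buffered output arrives all at once and the stream looks like a blocking call.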
Next steps
Use Cases
Start from common agent patterns and templates.
Cookbook
End-to-end implementations and working recipes.