A tool is a function you expose to a language model. Concretely, a tool definition specifies an input and output schema describing what the function accepts and what it returns. Tool calling is useful because language models cannot execute code themselves; they can only specify which function should be invoked and with what arguments.

The manual way

Under the hood, tool calling is a three-step process.
  1. Describe the structure of the tool as a JSON schema and pass it to the model
  2. The model fills in the JSON schema and outputs it for the application to parse
  3. The application executes the tool call and sends the result back to the model

1. Describe the tool schema

TypeScript
import Dedalus from "dedalus-labs";

const client = new Dedalus();

function getWeather(city: string, units = "celsius") {
	return { temp: 22, conditions: "sunny" }; // Toy example
}

const tools = [
	{
		type: "function",
		function: {
			name: "getWeather",
			description: "Get current weather for a city.",
			parameters: {
				type: "object",
				properties: {
					city: { type: "string" },
					units: { type: "string", enum: ["celsius", "fahrenheit"] },
				},
				required: ["city"],
			},
		},
	},
];

2. Give tool call schema to model

TypeScript
const response = await client.chat.completions.create({
	model: "openai/gpt-4.1",
	messages: [{ role: "user", content: "Weather in Paris?" }],
	tools, // From step 1
});

// Process the model's response. tool_calls is only present when the
// model chose to call a tool rather than answer directly.
const msg = response.choices[0].message;
const toolCall = msg.tool_calls[0];
const args = JSON.parse(toolCall.function.arguments);

3. Execute the tool call

TypeScript
const result = getWeather(args.city, args.units);

const final = await client.chat.completions.create({
	model: "openai/gpt-4.1",
	messages: [
		{ role: "user", content: "Weather in Paris?" },
		msg,
		{ role: "tool", tool_call_id: toolCall.id, content: JSON.stringify(result) },
	],
	tools,
});

console.log(final.choices[0].message.content);
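With more than one tool, the execute step usually becomes a small name-to-function registry so the application does not hard-code each dispatch. A sketch, reusing the tool-call shape from above (the registry pattern itself is illustrative, not a Dedalus API):

```typescript
// The shape of a single tool call as it appears in the response.
type ToolCall = { id: string; function: { name: string; arguments: string } };

// Map tool names to their implementations.
const toolRegistry: Record<string, (args: any) => unknown> = {
	getWeather: (args) => ({ temp: 22, conditions: "sunny" }), // Toy example
};

// Turn one tool call into the `role: "tool"` message the model expects back.
function dispatch(call: ToolCall) {
	const fn = toolRegistry[call.function.name];
	if (!fn) throw new Error(`Unknown tool: ${call.function.name}`);
	const result = fn(JSON.parse(call.function.arguments));
	return { role: "tool", tool_call_id: call.id, content: JSON.stringify(result) };
}
```

Looking up the function by name keeps the request/response loop generic: the loop appends whatever `dispatch` returns to the message list and calls the model again.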
Put all together, a simple tool call looks like this:
TypeScript
import Dedalus from "dedalus-labs";

const client = new Dedalus();

function getWeather(city: string, units = "celsius") {
	return { temp: 22, conditions: "sunny" };
}

const tools = [
	{
		type: "function",
		function: {
			name: "getWeather",
			description: "Get current weather for a city.",
			parameters: {
				type: "object",
				properties: {
					city: { type: "string" },
					units: { type: "string", enum: ["celsius", "fahrenheit"] },
				},
				required: ["city"],
			},
		},
	},
];

const response = await client.chat.completions.create({
	model: "openai/gpt-4.1",
	messages: [{ role: "user", content: "Weather in Paris?" }],
	tools,
});

const msg = response.choices[0].message;
const toolCall = msg.tool_calls[0];
const args = JSON.parse(toolCall.function.arguments);
const result = getWeather(args.city, args.units);

const final = await client.chat.completions.create({
	model: "openai/gpt-4.1",
	messages: [
		{ role: "user", content: "Weather in Paris?" },
		msg,
		{ role: "tool", tool_call_id: toolCall.id, content: JSON.stringify(result) },
	],
	tools,
});

console.log(final.choices[0].message.content);
That’s a lot of work! You are hand-writing schemas, parsing args, dispatching tool calls, and maintaining the request/response loop yourself.

The Dedalus Way

The DedalusRunner automates tool calling: it handles schema extraction, tool dispatch, conversation looping, and final response handling. All you have to do is pass your functions into the tools parameter!
TypeScript
import Dedalus, { DedalusRunner } from "dedalus-labs";

function getWeather(city: string, units: string = "celsius") {
	return { temp: 22, conditions: "sunny" };
}

const client = new Dedalus();
const runner = new DedalusRunner(client);

const result = await runner.run({
	input: "What's the weather in Paris?",
	model: "openai/gpt-4.1",
	tools: [getWeather],
});

console.log(result.finalOutput);
See Response Schemas for the full ChatCompletion and RunResult shapes, including tool_calls fields.

Writing good tools

  - Type your functions. Annotations become the JSON schema the model sees: city: string becomes {"type": "string"}. The more specific the types, the better the model fills them in.
  - Write docstrings. The Runner uses your docstring as the tool's description. The model reads it to decide when to call the function.
  - Use descriptive names. The model picks which tool to call by name. getWeather beats doStuff.
  - Keep tool counts low. Tool schemas take up space in the context window. Minimize the tools you pass for a given task.
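Putting those tips together, a tool the model can use well might look like the following sketch (getForecast and its data are made up for illustration):

```typescript
/**
 * Get the three-day forecast of daily high temperatures for a city.
 * Useful when the user asks about upcoming weather.
 */
function getForecast(city: string, units: "celsius" | "fahrenheit" = "celsius") {
	// Toy data; a real implementation would call a weather API.
	const highsCelsius = [22, 24, 19];
	const highs =
		units === "fahrenheit" ? highsCelsius.map((t) => (t * 9) / 5 + 32) : highsCelsius;
	return highs.map((high, day) => ({ city, day, high, units }));
}
```

The string-literal union on units narrows the generated schema to an enum, and the descriptive name plus the doc comment tell the model exactly when the tool applies.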

Read more

Dedalus Runner

Learn more about the DedalusRunner

MCP Servers

Learn how to connect MCP servers to your Dedalus models

Structured Outputs

Guarantee that your model outputs the desired schema every time

Use Cases

End-to-end examples for inspiration
Last modified on April 9, 2026