BYOK lets you send requests through Dedalus using your own provider API key. The request still flows through our unified API (routing, tool calling, streaming, format normalization), but the LLM call is billed to your account with the provider.

When to use BYOK

  • You have negotiated pricing or credits with a provider.
  • You want to use a model tier or region not available on our shared keys.
  • Your compliance policy requires that API keys stay under your control.

Quick start

Pass three headers (or SDK options) alongside your normal Dedalus API key:
Header            SDK option       Description
X-Provider        provider         Provider name (openai, anthropic, google, etc.)
X-Provider-Key    provider_key     Your API key for that provider
X-Provider-Model  provider_model   Model identifier at the provider (optional)
Only X-Provider-Key is strictly required. If you omit X-Provider, it is inferred from the model name. If you omit X-Provider-Model, the model from the request body is used.
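The resolution rules above can be sketched as a small helper. This is illustrative only: the actual resolution happens server-side, and `byok_headers` is a hypothetical name, not part of the SDK.

```python
def byok_headers(provider_key, provider=None, provider_model=None, body_model=None):
    """Build BYOK headers per the documented rules (illustrative sketch)."""
    # X-Provider-Key is the only strictly required header
    headers = {"X-Provider-Key": provider_key}
    # An omitted provider is inferred from a "vendor/model" body model name
    if provider is None and body_model and "/" in body_model:
        provider = body_model.split("/", 1)[0]
    if provider:
        headers["X-Provider"] = provider
    # When X-Provider-Model is omitted, the model from the request body is used
    if provider_model:
        headers["X-Provider-Model"] = provider_model
    return headers
```

For example, `byok_headers("sk-...", body_model="openai/gpt-4o")` yields only `X-Provider-Key` and an inferred `X-Provider: openai`.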

Examples

curl

curl https://api.dedaluslabs.ai/v1/chat/completions \
  -H "Authorization: Bearer $DEDALUS_API_KEY" \
  -H "X-Provider: openai" \
  -H "X-Provider-Key: $OPENAI_API_KEY" \
  -H "X-Provider-Model: gpt-4o" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'

Python SDK

from dedalus_sdk import AsyncDedalus

client = AsyncDedalus(
    provider="openai",
    provider_key="sk-your-openai-key",
    provider_model="gpt-4o",
)

# await must run inside an async function (e.g. driven by asyncio.run)
response = await client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

TypeScript SDK

import Dedalus from "dedalus";

const client = new Dedalus({
	provider: "openai",
	providerKey: "sk-your-openai-key",
	providerModel: "gpt-4o",
});

const response = await client.chat.completions.create({
	model: "openai/gpt-4o",
	messages: [{ role: "user", content: "Hello" }],
});

Environment variables

You can also set BYOK options via environment variables instead of passing them in code:
export DEDALUS_PROVIDER="anthropic"
export DEDALUS_PROVIDER_KEY="sk-ant-your-key"
export DEDALUS_PROVIDER_MODEL="claude-sonnet-4-5-20250929"
The SDK picks these up automatically. No code changes needed.
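The lookup amounts to reading three variables. A minimal sketch of that behavior, assuming only the variable names documented above (the real SDK does this internally, so you would not write this yourself):

```python
import os

def byok_from_env():
    """Read BYOK options from the documented environment variables (illustrative)."""
    return {
        "provider": os.environ.get("DEDALUS_PROVIDER"),
        "provider_key": os.environ.get("DEDALUS_PROVIDER_KEY"),
        "provider_model": os.environ.get("DEDALUS_PROVIDER_MODEL"),
    }
```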

Per-request overrides

The SDK options set defaults for every request. You can also override per-request by setting the headers directly:
response = await client.chat.completions.create(
    model="google/gemini-2.5-pro",
    messages=[{"role": "user", "content": "Hello"}],
    extra_headers={
        "X-Provider": "google",
        "X-Provider-Key": "your-google-key",
    },
)
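The precedence is standard header merging: per-request `extra_headers` win over client-level defaults. A sketch of that merge (illustrative; the SDK handles this for you, and `merge_headers` is a hypothetical name):

```python
def merge_headers(client_defaults, extra_headers=None):
    """Per-request extra_headers override client-level BYOK defaults (illustrative)."""
    merged = dict(client_defaults)
    merged.update(extra_headers or {})
    return merged
```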

Supported providers

Any provider in our model list works with BYOK:

  • OpenAI (openai)
  • Anthropic (anthropic)
  • Google (google)
  • xAI (xai)
  • Mistral (mistral)
  • DeepSeek (deepseek)
  • Groq (groq)
  • Cohere (cohere)
  • Perplexity (perplexity)
  • Cerebras (cerebras)
  • Together AI (together_ai)
  • Fireworks AI (fireworks_ai)
  • Moonshot (moonshot)

How it works

Your request still goes through Dedalus. We handle routing, format normalization, streaming, and tool calling. The only difference is which API key is used for the upstream LLM call.
You → Dedalus API (your Dedalus key) → Provider (your provider key) → Response → You
BYOK keys are sent over HTTPS and are never stored. They are used for the duration of the request and discarded. If you need Dedalus to manage keys on your behalf, contact us at support@dedaluslabs.ai.

Error handling

Scenario                          What happens
Invalid provider name             HTTP 400 with supported provider list
Missing or invalid provider key   Provider returns its own auth error (usually 401)
Model not available on provider   Provider returns its own model error (usually 404)
The error response always includes the upstream provider’s error message so you can debug directly.
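Given those scenarios, client code can branch on the status code. A sketch of that handling (hypothetical helper, assuming the status codes and pass-through error messages described above):

```python
def classify_byok_error(status_code, error_message):
    """Map a BYOK error response to the scenarios in the table above (illustrative)."""
    if status_code == 400:
        # Dedalus rejects the request; message includes the supported provider list
        return f"Invalid provider name: {error_message}"
    if status_code == 401:
        # The upstream provider rejected your key
        return f"Provider auth error: {error_message}"
    if status_code == 404:
        # The upstream provider does not serve the requested model
        return f"Model not available on provider: {error_message}"
    return f"Unexpected error ({status_code}): {error_message}"
```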
Last modified on March 5, 2026