Azure AI Foundry
Azure AI Foundry API Keys
To use Azure AI Services with OneRouter, you’ll need to provide your Azure API key configuration in JSON format.
Each key configuration requires the following fields:
{
  "endpoint_url": "https://<resource>.services.ai.azure.com/deployments/<model-id>/chat/completions?api-version=<api-version>",
  "api_key": "your-azure-api-key",
  "model_id": "the-azure-model-id"
}

You can find these values in your Azure AI Services resource:
endpoint_url: Navigate to your Azure AI Services resource in the Azure portal. In the “Overview” section, you’ll find your endpoint URL. Make sure to append /chat/completions to the base URL. You can read more in the Azure Foundry documentation.
api_key: In the same “Overview” section of your Azure AI Services resource, you can find your API key under “Keys and Endpoint”.
model_id: This is the name of your model deployment in Azure AI Services.
Make sure to replace the URL with your own project URL. The URL should end with /chat/completions followed by the API version you want to use.
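For illustration, here is a minimal sketch of how the endpoint URL is assembled from its parts. The resource name, deployment name, and API version below are hypothetical placeholders; substitute your own values.

# Minimal sketch: build the endpoint_url from its parts.
# The resource name, model deployment name, and API version below are
# hypothetical placeholders -- replace them with your own values.
resource = "my-ai-resource"
model_id = "gpt-5.1-chat"
api_version = "2024-05-01-preview"

endpoint_url = (
    f"https://{resource}.services.ai.azure.com"
    f"/deployments/{model_id}/chat/completions"
    f"?api-version={api_version}"
)
print(endpoint_url)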
Using the OpenAI SDK
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.onerouter.pro/v1",
    api_key="<API_KEY>",
)
completion = client.chat.completions.create(
    byok_conf={
        "provider": "azure-ai-foundry",
        "fallback": True,
        "credentials": {
            "endpoint_url": "https://<resource>.services.ai.azure.com/deployments/<model-id>/chat/completions?api-version=<api-version>",
            "api_key": "your-azure-api-key",
            "model_id": "the-azure-model-id"
        }
    },
    model="gpt-5.1-chat",
    messages=[
        {
            "role": "user",
            "content": "What is the meaning of life?"
        }
    ]
)
print(completion.choices[0].message.content)

byok_conf: When this parameter is included in the input, OneRouter will route requests according to the BYOK configuration information specified. When this parameter is null or not provided, OneRouter defaults to routing to the shared provider pool.
provider: Specifies the name of the provider where the request should be routed.
fallback: OneRouter always prioritizes using your provider keys when available. By default, if your key encounters a rate limit or failure, OneRouter will fall back to using shared OneRouter credits. The default value for fallback is True. If fallback is set to False, OneRouter will only use your key for requests to that provider, as shown in the sketch after this list.
credentials: The keys and credentials used for authentication with the specified provider.
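For example, to route exclusively through your own Azure key and never fall back to the shared pool, pass fallback as False in the same call. This is a minimal sketch that reuses the client and placeholder credentials from the example above.

# Minimal sketch: the same request as above, but with fallback disabled so
# OneRouter only uses the supplied Azure key and never shared OneRouter credits.
completion = client.chat.completions.create(
    byok_conf={
        "provider": "azure-ai-foundry",
        "fallback": False,
        "credentials": {
            "endpoint_url": "https://<resource>.services.ai.azure.com/deployments/<model-id>/chat/completions?api-version=<api-version>",
            "api_key": "your-azure-api-key",
            "model_id": "the-azure-model-id"
        }
    },
    model="gpt-5.1-chat",
    messages=[{"role": "user", "content": "What is the meaning of life?"}]
)
print(completion.choices[0].message.content)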
Using the OneRouter API directly
import requests
import json

response = requests.post(
    url="https://llm.onerouter.pro/v1/chat/completions",
    headers={
        "Authorization": "Bearer <API_KEY>",
        "Content-Type": "application/json"
    },
    data=json.dumps({
        "byok_conf": {
            "provider": "azure-ai-foundry",
            "fallback": True,
            "credentials": {
                "endpoint_url": "https://<resource>.services.ai.azure.com/deployments/<model-id>/chat/completions?api-version=<api-version>",
                "api_key": "your-azure-api-key",
                "model_id": "the-azure-model-id"
            }
        },
        "model": "gpt-5.1-chat",
        "messages": [
            {
                "role": "user",
                "content": "What is the meaning of life?"
            }
        ]
    })
)
print(response.json()["choices"][0]["message"]["content"])
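In practice you may also want to keep keys out of source code and surface HTTP errors explicitly. The sketch below is one way to do that, assuming hypothetical ONEROUTER_API_KEY and AZURE_API_KEY environment variables; the API does not require this, it is just a common pattern.

import os
import requests

# Minimal sketch, assuming the OneRouter key and Azure key are exported as
# the hypothetical environment variables ONEROUTER_API_KEY and AZURE_API_KEY.
onerouter_key = os.environ["ONEROUTER_API_KEY"]
azure_key = os.environ["AZURE_API_KEY"]

response = requests.post(
    url="https://llm.onerouter.pro/v1/chat/completions",
    headers={"Authorization": f"Bearer {onerouter_key}"},
    json={  # requests sets the Content-Type: application/json header for us
        "byok_conf": {
            "provider": "azure-ai-foundry",
            "fallback": True,
            "credentials": {
                "endpoint_url": "https://<resource>.services.ai.azure.com/deployments/<model-id>/chat/completions?api-version=<api-version>",
                "api_key": azure_key,
                "model_id": "the-azure-model-id"
            }
        },
        "model": "gpt-5.1-chat",
        "messages": [{"role": "user", "content": "What is the meaning of life?"}]
    }
)
response.raise_for_status()  # raise if OneRouter returned an error status
print(response.json()["choices"][0]["message"]["content"])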