Getting Started

Introduction

Auraon is an AI routing layer that gives you access to 200+ LLMs through a single OpenAI-compatible API. Use any model — GPT-4o, Claude Opus, Gemini, Llama, DeepSeek — without changing your existing code.

How it works

Auraon acts as a proxy between your application and any LLM provider. Send requests to https://api.auraon.ai/v1 with your Auraon API key, and Auraon routes them to the requested model's provider.
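Because the proxy speaks the standard chat-completions protocol, every request is a plain HTTP POST. A minimal sketch of what the SDK sends under the hood (the key is a placeholder, and the exact wire format follows the OpenAI spec rather than anything Auraon-specific):

```python
import json

# The chat-completions endpoint under the Auraon base URL.
url = "https://api.auraon.ai/v1/chat/completions"

# Standard bearer-token auth; the key below is a placeholder.
headers = {
    "Authorization": "Bearer br-your-api-key",
    "Content-Type": "application/json",
}

# The body is an ordinary OpenAI-style chat-completions payload.
payload = {
    "model": "gpt-4o",  # any supported model ID
    "messages": [{"role": "user", "content": "Hello!"}],
}

body = json.dumps(payload)
# Sending it with any HTTP client, e.g.:
#   requests.post(url, headers=headers, data=body)
print(body)
```

Any HTTP client works; the SDK shown below is just a convenience over this request.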

Base URL

https://api.auraon.ai/v1

OpenAI compatibility

Auraon is fully compatible with the OpenAI API spec. Any code that uses the OpenAI SDK works with Auraon unchanged: just change the base_url and the API key.

Python
from openai import OpenAI

client = OpenAI(
    api_key="br-your-api-key",        # Auraon key
    base_url="https://api.auraon.ai/v1"  # only change needed
)

response = client.chat.completions.create(
    model="claude-opus-4",   # or "auto" for smart routing
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

Supported models

Auraon supports 200+ models from all major providers. Use exact model IDs (e.g. gpt-4o, claude-opus-4), or pass model="auto" to let Auraon pick the best model for each request.
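Model choice is just the model field of the request; nothing else in the payload changes. A small sketch of the two styles (the helper name build_request is illustrative, not part of any SDK, and the model IDs are examples from the list above):

```python
def build_request(prompt: str, model: str = "auto") -> dict:
    """Build a chat-completions payload; "auto" defers model choice to Auraon."""
    return {
        "model": model,  # exact ID like "gpt-4o", or "auto" for smart routing
        "messages": [{"role": "user", "content": prompt}],
    }

# Pin an exact model:
pinned = build_request("Summarize this doc.", model="claude-opus-4")

# Or let Auraon route:
routed = build_request("Summarize this doc.")

print(pinned["model"], routed["model"])  # claude-opus-4 auto
```

Since the API follows the OpenAI spec, the standard /v1/models endpoint (client.models.list() in the SDK) should enumerate the available IDs, though that is an assumption based on the compatibility claim above.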