# Azure OpenAI Twin — aoai.twins.la

A high-fidelity digital twin of the Azure OpenAI Service data plane. The twin emulates chat completions, legacy completions, and embeddings, all backed by deterministic synthetic responses (no real model is ever called).

## URL shape

Path-prefixed, single host:

    https://aoai.twins.la/<resource_id>/openai/deployments/<deployment_id>/chat/completions
    https://aoai.twins.la/<resource_id>/openai/deployments/<deployment_id>/embeddings
    https://aoai.twins.la/<resource_id>/openai/deployments/<deployment_id>/completions

There is no subdomain per resource. The Azure ARM control plane is NOT emulated — operators provision resources, deployments, and API keys through the Twin Plane (`/_twin/`).

## Authentication (data plane)

Both auth paths are accepted on every data-plane endpoint, and either is sufficient:

* `api-key: <api_key>` — the primary AOAI auth header.
* `Authorization: Bearer <token>` — an AAD-shaped JWT, RS256-signed by the twin's per-resource keypair, obtained from `POST /<resource_id>/oauth2/v2.0/token` with `grant_type=client_credentials`.

Tenant isolation is enforced on both paths.
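The bearer flow above (fetch a `client_credentials` token from the per-resource endpoint, then call the data plane with it) can be sketched with only the Python standard library. The URL shapes and form fields come from this README; the `access_token` response field is assumed per OAuth2 convention, and `RID`, `CLIENT_ID`, and `CLIENT_SECRET` are placeholder values.

```python
"""Sketch: data-plane call via the AAD-shaped bearer path (stdlib only)."""
import json
import urllib.parse
import urllib.request

BASE = "https://aoai.twins.la"


def token_url(resource_id: str) -> str:
    # Per-resource token endpoint, as listed under "Per-resource AAD endpoints".
    return f"{BASE}/{resource_id}/oauth2/v2.0/token"


def chat_url(resource_id: str, deployment_id: str,
             api_version: str = "2024-10-21") -> str:
    # Path-prefixed data-plane URL shape (no subdomain per resource).
    return (f"{BASE}/{resource_id}/openai/deployments/"
            f"{deployment_id}/chat/completions?api-version={api_version}")


def fetch_token(resource_id: str, client_id: str, client_secret: str) -> str:
    # client_credentials grant; "access_token" field assumed per OAuth2.
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    req = urllib.request.Request(token_url(resource_id), data=form)
    with urllib.request.urlopen(req) as r:
        return json.load(r)["access_token"]


def chat(resource_id: str, deployment_id: str, bearer: str, content: str) -> dict:
    # Data-plane chat completion using the Bearer auth path.
    req = urllib.request.Request(
        chat_url(resource_id, deployment_id),
        data=json.dumps({"messages": [{"role": "user", "content": content}]}).encode(),
        headers={"Authorization": f"Bearer {bearer}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as r:
        return json.load(r)


# Usage (requires a provisioned resource and credentials):
#   token = fetch_token("RID", "CLIENT_ID", "CLIENT_SECRET")
#   resp = chat("RID", "chat", token, "hello")
```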
## Twin Plane authentication

The Twin Plane (`/_twin/`) uses the standard tenant Basic + admin Bearer scheme:

* Bootstrap a tenant: `POST /_twin/tenants` -> `{tenant_id, tenant_secret}`
* Tenant calls: HTTP Basic `tenant_id:tenant_secret`
* Admin calls: `Authorization: Bearer <admin_token>` (or `X-Twin-Admin-Token`)

## Key endpoints

Twin Plane (no auth):

    GET  /_twin/health
    GET  /_twin/scenarios
    GET  /_twin/settings
    GET  /_twin/references
    POST /_twin/tenants

Twin Plane (Basic tenant_id:tenant_secret):

    POST   /_twin/resources                                  -> {resource_id, base_url}
    GET    /_twin/resources
    DELETE /_twin/resources/<resource_id>
    POST   /_twin/resources/<resource_id>/api_keys           -> {key_id, api_key (shown ONCE)}
    GET    /_twin/resources/<resource_id>/api_keys
    POST   /_twin/resources/<resource_id>/deployments        body: {model, deployment_id?}
    GET    /_twin/resources/<resource_id>/deployments
    DELETE /_twin/resources/<resource_id>/deployments/<deployment_id>
    GET    /_twin/logs                                       (or admin Bearer for cross-tenant)
    POST   /_twin/feedback

Per-resource AAD endpoints (no auth):

    GET  /<resource_id>/.well-known/openid-configuration
    GET  /<resource_id>/.well-known/jwks.json
    POST /<resource_id>/oauth2/v2.0/token
         form: grant_type=client_credentials, client_id=<client_id>, client_secret=<client_secret>

Data plane (api-key OR AAD bearer):

    POST /<resource_id>/openai/deployments/<deployment_id>/chat/completions
    POST /<resource_id>/openai/deployments/<deployment_id>/completions
    POST /<resource_id>/openai/deployments/<deployment_id>/embeddings

## Quick start (cloud)

    curl -X POST https://aoai.twins.la/_twin/tenants \
      -H "Content-Type: application/json" \
      -d '{"friendly_name":"Dev"}'
    # -> { tenant_id, tenant_secret }

    curl -X POST https://aoai.twins.la/_twin/resources \
      -u "TENANT_ID:TENANT_SECRET" \
      -H "Content-Type: application/json" \
      -d '{"friendly_name":"my-aoai"}'
    # -> { resource_id, base_url }

    curl -X POST https://aoai.twins.la/_twin/resources/RID/api_keys \
      -u "TENANT_ID:TENANT_SECRET" -d '{}'
    # -> { key_id, api_key }

    curl -X POST https://aoai.twins.la/_twin/resources/RID/deployments \
      -u "TENANT_ID:TENANT_SECRET" \
      -H "Content-Type: application/json" \
      -d '{"model":"gpt-4o-mini","deployment_id":"chat"}'

    curl -X POST \
      'https://aoai.twins.la/RID/openai/deployments/chat/chat/completions?api-version=2024-10-21' \
      -H 'api-key: RAW_API_KEY' \
      -H 'Content-Type: application/json' \
      -d '{"messages":[{"role":"user","content":"hello"}]}'

## SDK example

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="RAW_API_KEY",
    api_version="2024-10-21",
    azure_endpoint="https://aoai.twins.la/RID",
)

resp = client.chat.completions.create(
    model="chat",  # the deployment name
    messages=[{"role": "user", "content": "hello"}],
)
```

## Local

    pip install twins-aoai twins-aoai-local
    python -m twins_aoai_local

## Reference

* GitHub: https://github.com/twins-la/aoai
* Project overview: https://twins.la
* All twins: https://github.com/twins-la/twins-la
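For completeness, the cloud quick start can also be driven from Python instead of curl. This is a minimal stdlib-only sketch of the same four steps; the response field names (`tenant_id`, `tenant_secret`, `resource_id`, `api_key`) are the ones this README documents, while the `bootstrap`/`twin_post` helper names are our own.

```python
"""Sketch: the cloud quick start (tenant -> resource -> key -> deployment)."""
import base64
import json
import urllib.request

BASE = "https://aoai.twins.la"


def basic_auth(tenant_id: str, tenant_secret: str) -> str:
    # HTTP Basic header value for tenant-scoped Twin Plane calls.
    raw = f"{tenant_id}:{tenant_secret}".encode()
    return "Basic " + base64.b64encode(raw).decode()


def twin_post(path: str, body: dict, auth: str = "") -> dict:
    # POST a JSON body to the Twin Plane, optionally with Basic auth.
    headers = {"Content-Type": "application/json"}
    if auth:
        headers["Authorization"] = auth
    req = urllib.request.Request(BASE + path,
                                 data=json.dumps(body).encode(),
                                 headers=headers)
    with urllib.request.urlopen(req) as r:
        return json.load(r)


def bootstrap() -> dict:
    # 1. Bootstrap a tenant (no auth).
    tenant = twin_post("/_twin/tenants", {"friendly_name": "Dev"})
    auth = basic_auth(tenant["tenant_id"], tenant["tenant_secret"])
    # 2. Create a resource.
    resource = twin_post("/_twin/resources", {"friendly_name": "my-aoai"}, auth)
    rid = resource["resource_id"]
    # 3. Mint an api-key (the raw api_key is shown ONCE).
    key = twin_post(f"/_twin/resources/{rid}/api_keys", {}, auth)
    # 4. Create a deployment named "chat".
    twin_post(f"/_twin/resources/{rid}/deployments",
              {"model": "gpt-4o-mini", "deployment_id": "chat"}, auth)
    return {"resource_id": rid, "api_key": key["api_key"]}


# Usage: creds = bootstrap()  # then call the data plane with creds["api_key"]
```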