API Docs

OpenAI-compatible API with routing.

lmchat lets you keep your OpenAI-style client integration while gaining provider flexibility, reliability primitives, and policy controls. This section is a deep reference, organized by topic.

Base URL

All endpoints live under:

https://lmchat.net/api/v1

Health/discovery endpoint:

GET /api/v1
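A minimal sketch of probing the discovery endpoint with Python's standard library. The JSON response shape is an assumption here; consult the live endpoint for the actual payload:

```python
import json
import urllib.request

BASE_URL = "https://lmchat.net/api/v1"

def check_api(base_url: str = BASE_URL) -> dict:
    """GET the discovery endpoint and return its JSON body."""
    req = urllib.request.Request(base_url, method="GET")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```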
Authentication

Requests require a bearer token (API key):

Authorization: Bearer LMCHAT_API_KEY

Recommended (optional) headers for attribution/analytics:

HTTP-Referer: https://yourapp.com
X-Title: YourAppName

What is OpenAI-compatible?

Compatibility goal
“OpenAI-compatible” means the request/response envelopes follow the common OpenAI API conventions so many SDKs can work by changing only the base URL and API key. Where providers differ, lmchat normalizes behavior as much as possible, or exposes provider-specific capability differences in model metadata.
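In practice, compatibility means posting an OpenAI-style chat completion envelope to the lmchat base URL. A sketch using only the standard library, assuming the conventional /chat/completions path and an illustrative model name:

```python
import json
import urllib.request

def chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style POST /chat/completions request against lmchat."""
    payload = {"model": model, "messages": messages}  # standard OpenAI envelope
    return urllib.request.Request(
        "https://lmchat.net/api/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

With the official OpenAI SDKs the same swap is a constructor argument, e.g. `OpenAI(base_url="https://lmchat.net/api/v1", api_key=...)` in Python; the rest of the client code is unchanged.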

Where to start

Quickstart
Minimal examples for curl, the OpenAI SDK, and the Python and Node runtimes.
Chat Completions
The primary endpoint for modern apps: messages, tools, streaming, and structured outputs.
Routing
Provider preferences, fallbacks, and cost/latency-aware selection.
Data policies
Logging, retention, and how prompts/outputs are handled across providers.

Looking for pricing?

Pricing is documented on the Pricing page.