OpenAI API Proxy · OpenAI Compatible API

Use one aliapi.me Base URL for OpenAI-compatible requests. Keep the SDK you already use while adding project keys, request logs, usage analytics and future multi-model routing.

This page targets real developer intent: where to put the Base URL, how to update SDK setup and how to debug failed requests.

1. Create an API key

Create a separate key for every project so usage can be limited and audited later.

```
Authorization: Bearer YOUR_API_KEY
```

2. Set the Base URL

Set your SDK or tool to the aliapi.me OpenAI-compatible endpoint.

```
https://api.aliapi.me/v1
```

3. Keep the OpenAI request format

Continue using Chat Completions, models and messages in the familiar OpenAI format.

```
/v1/chat/completions
```

| Capability | Direct integration | With aliapi.me |
|---|---|---|
| Base URL | Configured per provider | One entry point for future routing |
| Keys | Spread across projects and env vars | Create, disable, limit and audit per project |
| Debugging | Implemented by each app | Central logs for status, latency, tokens and errors |
| Model switching | Requires code or config changes | Can be moved into gateway policy |
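The three setup values above (API key, Base URL, endpoint path) can be combined into a request even without an SDK. A minimal stdlib sketch of assembling the pieces — the model name and message are placeholders, and no request is actually sent:

```python
import json

BASE_URL = "https://api.aliapi.me/v1"  # gateway Base URL from step 2
API_KEY = "YOUR_API_KEY"               # per-project key (placeholder)

def build_chat_request(model, messages):
    """Assemble the URL, headers and JSON body for a Chat Completions call."""
    url = BASE_URL + "/chat/completions"
    headers = {
        "Authorization": "Bearer " + API_KEY,  # key goes in this header
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "gpt-4o-mini", [{"role": "user", "content": "Hello"}]
)
print(url)  # https://api.aliapi.me/v1/chat/completions
```

Because the Base URL already ends in `/v1`, the full path resolves to the `/v1/chat/completions` endpoint shown above.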
FAQ

Do I have to rewrite my integration code?

No. The common path is to keep the SDK and replace only the Base URL and API key.

Can third-party tools connect through the gateway?

Most such tools support a custom OpenAI-compatible Base URL, so they can connect to different models through one gateway.

How should the service be described?

Use clear terms such as API relay, unified access, API gateway and OpenAI-compatible API. Avoid unclear or exaggerated claims.

Where can I find supported models and setup help?

Browse Claude, Gemini, DeepSeek, Qwen and other model groups in the model directory, or use the FAQ library for Base URL, SDK and tool setup questions.
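Where a tool reads the standard OpenAI environment variables instead of exposing a settings field, the redirect can often be done without touching code. A sketch assuming the tool honors `OPENAI_BASE_URL` and `OPENAI_API_KEY` (both are read by the official OpenAI SDKs, but support is not guaranteed for every tool):

```python
import os

# Point any OPENAI_BASE_URL-aware tool at the gateway.
# The key value below is a placeholder for a real per-project key.
os.environ["OPENAI_BASE_URL"] = "https://api.aliapi.me/v1"
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

print(os.environ["OPENAI_BASE_URL"])  # https://api.aliapi.me/v1
```

Setting these in the shell profile or the tool's launch environment achieves the same effect.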
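The central logs record status, latency, tokens and errors; when debugging a failed request locally, OpenAI-compatible responses carry an `error` object in the body. A minimal sketch of summarizing one — the field names assume the standard OpenAI error shape, which a given upstream provider may not follow exactly:

```python
import json

def summarize_error(status, body_text):
    """Extract the OpenAI-style error type and message from a failed response."""
    try:
        err = json.loads(body_text).get("error", {})
    except json.JSONDecodeError:
        # Gateways sometimes return HTML error pages (e.g. 502); note that too.
        return f"HTTP {status}: non-JSON body"
    return f"HTTP {status}: {err.get('type', 'unknown')} - {err.get('message', '')}"

print(summarize_error(
    401,
    '{"error": {"type": "invalid_request_error", "message": "Invalid API key"}}',
))  # HTTP 401: invalid_request_error - Invalid API key
```

A 401 like the one above usually means the `Authorization: Bearer` header is missing or carries a revoked key.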