Introducing v2 — Now Even Faster

One API for All AI Models.

TheOldAPI provides a unified API for all your AI needs. No pay-as-you-go, just a simple $5/mo subscription.

100+ Models · 99.9% Uptime · <50ms Latency
OpenAI · Anthropic · Google Gemini · DeepSeek · Meta Llama · Mistral · xAI Grok · Cohere · Qwen · Perplexity · HuggingFace · Together AI
Top Models

Explore and use powerful models

We provide API-based inference across leading models, including text generation, image generation, and more.

Why Us

Why choose TheOldAPI?

There are many reasons to choose us, but here are some of the most important ones.

OpenAI Compatible

Drop-in replacement. Works with any OpenAI SDK, LangChain, or HTTP client. Just change your base URL.

Blazing Fast

Direct provider connections with connection pooling. Sub-100ms routing overhead.

Always Available

Multi-provider failover. If one backend goes down, we route to the next automatically.

Full Tool Support

Function calling, streaming, vision, embeddings — every feature works out of the box.

Hundreds of Models

GPT-5.3, Claude Opus, Gemini Pro, Grok, DeepSeek, and many more — all from one API.

Anthropic Format

Native support for both OpenAI and Anthropic message formats via /v1/messages.

Live Model Explorer

Model Explorer

Browse all available models. Filter by provider or search by name.


Simple pricing, powerful models

No pay-as-you-go. No hidden fees. Just a simple subscription.

Monthly
$5/month

Full access to every model

  • All 100+ models
  • Unlimited requests
  • 30 req/min rate limit
  • Streaming support
  • Priority routing & failover
  • Full tool/function calling
  • Vision & image generation
  • Anthropic /v1/messages
Enterprise
Custom

For teams with high-volume needs

  • Everything in Unlimited
  • Custom rate limits
  • Dedicated support
  • SLA guarantees
  • Usage analytics
  • Invoice billing
7-Day Money Back
No Hidden Fees
Cancel Anytime

Get Your API Key

Subscribe on Ko-fi with the same email you use on this site. Your API key will appear in your Dashboard automatically.

Why TheOldAPI?

Direct API Access
$50-200+ per 1M tokens
  • Pay per token
  • Multiple API keys needed
  • No failover
  • Manage each provider separately
TheOldAPI
$5/mo
  • Unlimited tokens
  • One API key for all
  • Automatic failover
  • 100+ models, one dashboard

Frequently Asked Questions

What does "unlimited" mean?

No per-token or per-request charges. The only limit is 30 requests per minute.

Is this OpenAI-compatible?

Yes! Point your OpenAI SDK base URL to our API and it works instantly.

Can I cancel anytime?

Absolutely. Cancel anytime and retain access until the end of your billing period.

How do I get my API key?

Subscribe on Ko-fi with the same email you use here. Sign in, go to your Dashboard, and your key appears automatically.

Welcome Back

Sign in to access your dashboard

Don't have an account?

Dashboard

Welcome back, user

All Systems Operational
Models 100+
Latency <50ms
Uptime 99.9%

Membership

Your Plan
Free
$0/mo
  • No API access
  • No models
  • No streaming
Upgrade to
Pro
$5/mo
  • Unlimited requests
  • 100+ AI models
  • Full streaming support

Subscribe on Ko-fi using the same email as your account here. Your key will appear automatically.

Your API Key

Subscribe to a paid plan to get your API key.

Quick Start

from openai import OpenAI

client = OpenAI(
    base_url="https://your-api.com/v1",
    api_key="YOUR_API_KEY"
)

resp = client.chat.completions.create(
    model="gpt-5.3-codex",
    messages=[{"role": "user",
               "content": "Hello!"}]
)
print(resp.choices[0].message.content)

Endpoints

POST /v1/chat/completions
POST /v1/messages
GET /v1/models
POST /v1/messages/count_tokens
Developer Docs

Documentation

Everything you need to integrate TheOldAPI into your application.

Quick Start

TheOldAPI is fully compatible with the OpenAI SDK. Just change your base_url and you're ready to go.

1. Install the SDK

pip install openai

2. Make your first request

from openai import OpenAI

client = OpenAI(
    base_url="https://your-api.com/v1",
    api_key="your-api-key"
)

response = client.chat.completions.create(
    model="gpt-5.3-codex",
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
    stream=True
)

for chunk in response:
    # the final chunk's delta.content is None, so guard before printing
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Authentication

All API requests require an API key passed via the Authorization header:

Authorization: Bearer YOUR_API_KEY

Get your API key from the Dashboard after signing up.
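As a minimal sketch of the header format above (using only the standard library; `https://your-api.com` is the same placeholder base URL as in the Quick Start, not a real host):

```python
def auth_headers(api_key: str) -> dict:
    """Build the headers every TheOldAPI request needs."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

# Usage with urllib (uncomment and substitute a real key and base URL):
# import json, urllib.request
# req = urllib.request.Request("https://your-api.com/v1/models",
#                              headers=auth_headers("YOUR_API_KEY"))
# print(json.load(urllib.request.urlopen(req)))
```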

API Endpoints

POST /v1/chat/completions

Create a chat completion. Supports all OpenAI-compatible parameters including stream, tools, temperature, max_tokens, etc.
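For example, a `tools` entry in the standard OpenAI function-calling format can be passed straight through (the `get_weather` function here is hypothetical, purely for illustration):

```python
# One tool definition in the OpenAI function-calling schema.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical function name
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Then pass it as the tools parameter:
# client.chat.completions.create(model="gpt-5.3-codex",
#                                messages=[...], tools=[weather_tool])
```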

GET /v1/models

List all available models. Returns model IDs and provider information.
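Assuming the response follows the usual OpenAI list shape (`{"data": [{"id": ...}, ...]}`), filtering the returned IDs locally is a one-liner; the sample payload below is illustrative, not a real response:

```python
def model_ids(models_response: dict, prefix: str = "") -> list:
    """Extract model IDs from a GET /v1/models response, optionally by prefix."""
    return [m["id"] for m in models_response["data"]
            if m["id"].startswith(prefix)]

sample = {"data": [{"id": "gpt-5.3-codex"},
                   {"id": "claude-sonnet-4-20250514"}]}
print(model_ids(sample, "claude"))  # ['claude-sonnet-4-20250514']
```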

POST /v1/messages

Anthropic-compatible messages endpoint. Supports Claude-format requests natively.
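A minimal request body for this endpoint, assuming it mirrors Anthropic's Messages API shape (where `max_tokens` is required alongside `model` and `messages`):

```python
def messages_payload(model: str, prompt: str, max_tokens: int = 1024) -> dict:
    """Build an Anthropic-format body for POST /v1/messages."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

body = messages_payload("claude-sonnet-4-20250514", "Hello!")
# POST this body to https://your-api.com/v1/messages (placeholder URL)
# with your Bearer key in the Authorization header.
```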

POST /v1/messages/count_tokens

Count tokens for an Anthropic-format messages request before sending.

Rate Limits

Plan                 Requests/min   Models
Unlimited ($5/mo)    30             All 100+
Enterprise           Custom         All 100+

Rate limit headers are included in every response: X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset.
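These headers make client-side backoff straightforward. A sketch, assuming `X-RateLimit-Reset` is a Unix timestamp (verify against a real response before relying on this):

```python
import time

def wait_seconds(headers, now=None):
    """Seconds to sleep before retrying, given rate-limit response headers."""
    if int(headers.get("X-RateLimit-Remaining", 1)) > 0:
        return 0.0  # budget left, no need to wait
    now = time.time() if now is None else now
    return max(0.0, float(headers["X-RateLimit-Reset"]) - now)

print(wait_seconds({"X-RateLimit-Remaining": "0",
                    "X-RateLimit-Reset": "110"}, now=100.0))  # 10.0
```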

Streaming

Set "stream": true in your request body to receive Server-Sent Events (SSE). Each event contains a chunk of the response.

curl https://your-api.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-5.3-codex","stream":true,
       "messages":[{"role":"user","content":"Hi"}]}'
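If you consume the stream without an SDK, each SSE event is a `data: {...}` line, terminated by `data: [DONE]`, in the OpenAI streaming format. A sketch of extracting the text deltas (the sample lines are illustrative):

```python
import json

def deltas(sse_lines):
    """Yield text pieces from raw OpenAI-format SSE lines."""
    for line in sse_lines:
        if not line.startswith("data: ") or line == "data: [DONE]":
            continue
        chunk = json.loads(line[len("data: "):])
        piece = chunk["choices"][0]["delta"].get("content")
        if piece:
            yield piece

lines = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(deltas(lines)))  # Hello
```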

Node.js Example

import OpenAI from "openai";

const client = new OpenAI({
    baseURL: "https://your-api.com/v1",
    apiKey: "your-api-key",
});

const completion = await client.chat.completions.create({
    model: "claude-sonnet-4-20250514",
    messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);

Join our Discord community

Join our Discord community to connect with other developers, get help, and stay updated on the latest TheOldAPI news.

Join Discord
theoldapi