4 min read · python, sdk, tutorial, multi-ai

Python SDK Guide: One Key, Five AI Providers

Use Python with aiapi.cheap to call Claude, GPT, Gemini, Grok, or DeepSeek. Pick the SDK you already know — Anthropic or OpenAI — and just swap the base URL.

The Real-Talk Version

You already know one Python SDK. Maybe anthropic, maybe openai. You do not want to learn five.

Good news — you do not have to. aiapi.cheap speaks both shapes. One key, one base URL, and either SDK calls Claude, GPT, Gemini, Grok, or DeepSeek depending on the model name you pass.

This guide shows the two paths. Pick the one that matches the SDK already in your requirements.txt.

Path 1: Anthropic SDK (Native Claude API)

Use this if your code already imports anthropic. The /v1/messages endpoint matches Anthropic's spec exactly. Tool use, streaming, prompt caching, extended thinking — all work unchanged.

This path is Claude-only. To call GPT or DeepSeek from the same code, jump to Path 2.

Path 2: OpenAI SDK (Universal — Hits All 5 Vendors)

Use this if you already import openai, or if you want one SDK for everything. The /v1/chat/completions endpoint dispatches by model name:

  • model="claude-sonnet-4-6" → Anthropic
  • model="gpt-4o" → OpenAI
  • model="gemini-3-pro-preview" → Google
  • model="grok-4.2" → xAI
  • model="deepseek-v3.2" → DeepSeek

Same request shape, same response shape. The proxy handles vendor differences for you.
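The dispatch above can be sketched in plain Python. This prefix-to-vendor table is purely illustrative — it is not the proxy's actual implementation, just a way to picture the routing:

```python
# Illustrative sketch of prefix-based model dispatch (not the proxy's real code).
PREFIX_TO_VENDOR = {
    "claude": "anthropic",
    "gpt": "openai",
    "gemini": "google",
    "grok": "xai",
    "deepseek": "deepseek",
}

def route(model: str) -> str:
    """Return the vendor a model name would be dispatched to."""
    for prefix, vendor in PREFIX_TO_VENDOR.items():
        if model.startswith(prefix):
            return vendor
    raise ValueError(f"unknown model: {model}")

print(route("claude-sonnet-4-6"))  # anthropic
print(route("deepseek-v3.2"))      # deepseek
```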

    Setup In 60 Seconds

    Before the code, you need three things:

    1. An aiapi.cheap account (signup).

    2. Some balance topped up via crypto (Oxapay — USDT, BTC, ETH).

    3. A customer key from your dashboard. It starts with sk-aic-.

    The base URL you point both SDKs at is always:

    https://aiapi.cheap/api/proxy

    That is it. No region routing, no separate endpoints per vendor.

    Code Sample 1 — Anthropic SDK Calling Claude

    Install:

    pip install anthropic

    import os
    from anthropic import Anthropic
    
    client = Anthropic(
        base_url="https://aiapi.cheap/api/proxy",
        api_key=os.environ["AIAPI_KEY"],  # sk-aic-...
    )
    
    response = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "Write a haiku about cheap APIs."}
        ],
    )
    
    print(response.content[0].text)

    If you set ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY as env vars, you can skip passing them — the SDK picks them up automatically. Same for tools like Claude Code.
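    For reference, that env-var setup might look like this in your shell (the variable names are the Anthropic SDK's standard ones; the key value is a placeholder):

```shell
# The Anthropic SDK reads these automatically when the client is
# constructed without explicit base_url / api_key arguments.
export ANTHROPIC_BASE_URL="https://aiapi.cheap/api/proxy"
export ANTHROPIC_API_KEY="sk-aic-..."   # placeholder — use your own key
```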

    Code Sample 2 — OpenAI SDK Calling GPT

    Install:

    pip install openai

    import os
    from openai import OpenAI
    
    client = OpenAI(
        base_url="https://aiapi.cheap/api/proxy",
        api_key=os.environ["AIAPI_KEY"],  # sk-aic-...
    )
    
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": "Write a haiku about cheap APIs."}
        ],
    )
    
    print(response.choices[0].message.content)

    The key insight: this exact same code calls Gemini, Grok, or DeepSeek. Only the model field changes.

    # Switch to Gemini
    response = client.chat.completions.create(
        model="gemini-3-pro-preview",
        messages=[{"role": "user", "content": "Same prompt, different brain."}],
    )
    
    # Switch to DeepSeek (cheapest of the bunch)
    response = client.chat.completions.create(
        model="deepseek-v3.2",
        messages=[{"role": "user", "content": "Same prompt, different brain."}],
    )

    No if vendor == "openai" branching in your app code. The proxy does that work.
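    Because only the model field varies, fanning one prompt out to several vendors is just a loop. A minimal sketch — it builds the request kwargs only, so swap in client.chat.completions.create(**payload) to actually send each one:

```python
MODELS = [
    "claude-sonnet-4-6",
    "gpt-4o",
    "gemini-3-pro-preview",
    "grok-4.2",
    "deepseek-v3.2",
]

def build_request(model: str, prompt: str) -> dict:
    # Identical shape for every vendor — only the model name differs.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payloads = [build_request(m, "Write a haiku about cheap APIs.") for m in MODELS]
# Each payload is ready for client.chat.completions.create(**payload).
```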

    Streaming Works The Same

    Both SDKs support streaming. SSE chunks come through unchanged.

    Anthropic SDK:

    with client.messages.stream(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Stream me a poem."}],
    ) as stream:
        for text in stream.text_stream:
            print(text, end="", flush=True)

    OpenAI SDK:

    stream = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Stream me a poem."}],
        stream=True,
    )
    for chunk in stream:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

    LangChain, Claude Code, Cursor, Raw Fetch

    Anything that talks to Anthropic or OpenAI talks to us. Examples:

  • LangChain — pass base_url to ChatAnthropic or ChatOpenAI.
  • Claude Code — set ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY env vars in your shell.
  • Cursor — point custom OpenAI base URL at our proxy.
  • Raw `requests` / `httpx` — POST to /v1/messages or /v1/chat/completions with bearer auth.
  • The SDKs themselves are the same code you would run against the official API. See the official Python SDK source if you want to dig in: github.com/anthropics/anthropic-sdk-python.
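    For the raw-HTTP route, here is a stdlib-only sketch with urllib. Nothing is sent — the Request object is just constructed so you can see the endpoint, payload shape, and bearer auth header; the key is a placeholder. Uncomment the urlopen line to actually call the API:

```python
import json
import urllib.request

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Write a haiku about cheap APIs."}],
}

req = urllib.request.Request(
    "https://aiapi.cheap/api/proxy/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer sk-aic-...",  # placeholder — use your own key
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.get_method())  # POST
print(req.full_url)
# To send for real: urllib.request.urlopen(req)
```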

    Common Mistakes (Real-Talk)

  • Forgetting to set base URL. SDK quietly hits the official endpoint, which 401s with your sk-aic-* key. Double-check the URL.
  • Mixing keys. Your sk-aic-* key only works on aiapi.cheap. The proxy does not need your Anthropic or OpenAI keys — those live on our side.
  • Hardcoding the key. Use env vars. Always.
  • Using the wrong endpoint for the SDK. Anthropic SDK hits /v1/messages, OpenAI SDK hits /v1/chat/completions. Both are mounted under the same base URL — the SDKs handle the path themselves.
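    A cheap guard against the first three mistakes is a fail-fast check at startup. This is just a suggested pattern — the AIAPI_KEY env var name follows the samples above, and the prefix check mirrors the sk-aic- key format:

```python
import os

BASE_URL = "https://aiapi.cheap/api/proxy"

def load_key(env_var: str = "AIAPI_KEY") -> str:
    """Read the API key from the environment and sanity-check its prefix."""
    key = os.environ.get(env_var, "")
    if not key.startswith("sk-aic-"):
        raise RuntimeError(
            f"{env_var} is missing or not an aiapi.cheap key "
            "(expected the sk-aic- prefix)"
        )
    return key

# Usage: client = OpenAI(base_url=BASE_URL, api_key=load_key())
```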
    Cost-Wise, What This Buys You

    Same models, 70-80% off official prices. A typical Sonnet 4.6 workload that runs $300/month at official rates drops to $60/month on the Pro plan. See the pricing comparison post for the full per-vendor breakdown.

    Next Steps

  • Welcome post — what aiapi.cheap is, in 5 minutes
  • Docs — full endpoint reference, error codes, rate limits
  • Pricing comparison — see the dollar savings per vendor
  • Grab a key, swap one line of config, ship something.

    Start at aiapi.cheap