INTEGRATIONS

Plug in. Ship faster.

One base URL, https://api.seekapi.ai/v1, works anywhere the OpenAI SDK runs.

SeekAPI is a stable, globally routed gateway for DeepSeek models.

Framework hubs: market view

Pick a stack. One base URL. Same savings story everywhere.

Cursor / VS Code

Verified
  1. Open your project env or settings where the OpenAI client is configured.
  2. Set base URL (or OPENAI_BASE_URL) to https://api.seekapi.ai/v1.
  3. Replace the API key with your SeekAPI key from the console.
  4. Keep model names compatible with your SeekAPI routing (e.g. deepseek-chat).
# .env.local
OPENAI_API_KEY=sk_seekapi_...
OPENAI_BASE_URL=https://api.seekapi.ai/v1
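Cursor tasks and standalone scripts don't always auto-load .env.local. A minimal loader sketch, pure stdlib, assuming the KEY=VALUE format shown above:

```python
import os

def load_env_file(path: str = ".env.local") -> None:
    """Read KEY=VALUE lines from an env file into os.environ.

    Skips blank lines and comments; variables already set in the
    environment take precedence over the file.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

After load_env_file() runs, the stock OpenAI client picks up OPENAI_API_KEY and OPENAI_BASE_URL from the environment with no code changes.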

Obsidian

Verified

For community plugins that call OpenAI-compatible endpoints, use these adapter fields:

  • API base: https://api.seekapi.ai/v1
  • Path style: OpenAI /chat/completions
  • Auth header: Authorization: Bearer <SEEKAPI_KEY>
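For plugins without an OpenAI preset, the three adapter fields above combine into a single HTTP request. A stdlib sketch of building it (the key and model are placeholders; send the result with urllib.request.urlopen):

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str, content: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions POST against the SeekAPI base URL."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode()
    return urllib.request.Request(
        "https://api.seekapi.ai/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Any HTTP client works; the only requirements are the /chat/completions path and the Bearer auth header.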

Raycast

Verified

In AI extensions or scripts that expose "OpenAI API URL" and "API Key":

  • Base URL: https://api.seekapi.ai/v1
  • Model: your enabled SeekAPI model id
  • Key: SeekAPI token (never commit it to a public extension)

Next.js

Verified

Server-side only: keep keys in env, never in client bundles.

// app/api/chat/route.ts (or server action)
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.SEEKAPI_API_KEY!,
  baseURL: "https://api.seekapi.ai/v1",
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const completion = await client.chat.completions.create({
    model: "deepseek-chat",
    messages,
  });
  return Response.json(completion);
}

Python SDK

Verified
from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ["SEEKAPI_API_KEY"],
    base_url="https://api.seekapi.ai/v1",
)

resp = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello from SeekAPI"}],
)
print(resp.choices[0].message.content)

Read full docs
Open console