INTEGRATIONS
Plug in. Ship faster.
One base URL, https://api.seekapi.ai/v1, works anywhere the OpenAI SDK runs.
SeekAPI is a stable, globally available DeepSeek gateway.
Framework hubs: market view
Pick a stack. One base URL. Same savings story everywhere.
LangChain
Save 90% on LangChain
Vercel AI SDK
Save 90% on Vercel AI SDK
AutoGPT
Save 90% on AutoGPT
Next.js AI
Save 90% on Next.js AI
LlamaIndex
Save 90% on LlamaIndex
CrewAI
Save 90% on CrewAI
LiteLLM
Save 90% on LiteLLM
Haystack
Save 90% on Haystack
DSPy
Save 90% on DSPy
PydanticAI
Save 90% on PydanticAI
Cursor / VS Code
Verified
- Open your project env or settings where the OpenAI client is configured.
- Set the base URL (or OPENAI_BASE_URL) to https://api.seekapi.ai/v1.
- Replace the API key with your SeekAPI key from the console.
- Keep model names compatible with your SeekAPI routing (e.g. DeepSeek).
# .env.local
OPENAI_API_KEY=sk_seekapi_...
OPENAI_BASE_URL=https://api.seekapi.ai/v1

Obsidian
Verified
For community plugins that call OpenAI-compatible endpoints, use these adapter fields:
- API base: https://api.seekapi.ai/v1
- Path style: OpenAI /chat/completions
- Auth header: Authorization: Bearer <SEEKAPI_KEY>
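Those three adapter fields map directly onto a raw HTTP request. Here is a minimal stdlib sketch of the request shape, assuming the standard OpenAI chat-completions wire format; nothing is actually sent, and the key value is a placeholder:

```python
import json

BASE_URL = "https://api.seekapi.ai/v1"
ENDPOINT = f"{BASE_URL}/chat/completions"  # OpenAI path style

def build_request(key: str, model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for an OpenAI-compatible call."""
    headers = {
        "Authorization": f"Bearer {key}",   # the auth header from the fields above
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return ENDPOINT, headers, body

url, headers, body = build_request("sk_seekapi_...", "deepseek-chat", "ping")
print(url)  # https://api.seekapi.ai/v1/chat/completions
```

POST that body to the URL with any HTTP client (urllib, requests, or the plugin's own fetcher) and you have the same call the adapter makes.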
Raycast
Verified
In AI extensions or scripts that expose "OpenAI API URL" and "API Key":
- Base URL: https://api.seekapi.ai/v1
- Model: your enabled SeekAPI model id
- Key: SeekAPI token (never commit it to a public extension)
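For script commands, the same rule applies: read the token from the environment rather than hard-coding it in the extension source. A minimal sketch, assuming a SEEKAPI_API_KEY environment variable (the variable name is illustrative, not required by SeekAPI):

```python
import os

BASE_URL = "https://api.seekapi.ai/v1"

def get_key() -> str:
    """Read the SeekAPI token from the environment; fail loudly if it is missing."""
    key = os.environ.get("SEEKAPI_API_KEY", "")
    if not key:
        raise RuntimeError("Set SEEKAPI_API_KEY in your shell, not in the script")
    return key

# Demo value only, so the snippet runs standalone; remove this line in real use.
os.environ.setdefault("SEEKAPI_API_KEY", "sk_seekapi_demo")
print(get_key().startswith("sk_"))  # True
```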
Next.js
Verified
Server-side only: keep keys in env, never in client bundles.
// app/api/chat/route.ts (or server action)
import OpenAI from "openai";
const client = new OpenAI({
apiKey: process.env.SEEKAPI_API_KEY!,
baseURL: "https://api.seekapi.ai/v1",
});
export async function POST(req: Request) {
const { messages } = await req.json();
const completion = await client.chat.completions.create({
model: "deepseek-chat",
messages,
});
return Response.json(completion);
}

Python SDK
Verified
from openai import OpenAI
import os
client = OpenAI(
api_key=os.environ["SEEKAPI_API_KEY"],
base_url="https://api.seekapi.ai/v1",
)
resp = client.chat.completions.create(
model="deepseek-chat",
messages=[{"role": "user", "content": "Hello from SeekAPI"}],
)
print(resp.choices[0].message.content)
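The response object above follows the OpenAI chat-completions shape, and the same structure comes back when you call the endpoint over raw HTTP. A sketch of pulling the reply out of that structure; the payload below is a hand-written illustration, not real API output:

```python
# Illustrative response in the OpenAI chat-completions format (not real output).
sample = {
    "model": "deepseek-chat",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I help?"},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 8, "total_tokens": 13},
}

# Same access path as resp.choices[0].message.content in the SDK example above.
reply = sample["choices"][0]["message"]["content"]
print(reply)  # Hello! How can I help?
```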