Client‑agnostic by design. If your client can call an OpenAI‑style HTTP API, it works with OpenStack. In most cases you only change the base URL and the API key; no new SDK integration or code changes are required.

Works with: the official OpenAI SDK, the Vercel AI SDK, the OpenRouter SDK, community clients, and plain HTTP requests. Edge and serverless runtimes are supported.

Compatibility: Chat Completions (including streaming), tool/function calling, and JSON/structured outputs.
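A minimal sketch of the plain‑HTTP path: an OpenAI‑style Chat Completions request built with `fetch`. The base URL below is a placeholder assumption, not the real endpoint; substitute the one from your OpenStack dashboard, and the model id is likewise illustrative.

```typescript
// Assumption: placeholder base URL — replace with your real OpenStack endpoint.
const OPENSTACK_BASE_URL = "https://api.openstack.example/v1";
// In real code, load this from the OPENSTACK_API_KEY environment variable.
const apiKey = "your_openstack_api_key_here";

// OpenAI-style request body: the same shape the official SDKs produce.
const body = {
  model: "gpt-4o-mini", // illustrative model id; use any model your gateway routes
  messages: [{ role: "user", content: "Hello!" }],
  stream: false,
};

const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${apiKey}`,
};

// Send the request to the Chat Completions route and parse the JSON reply.
async function chat(): Promise<unknown> {
  const res = await fetch(`${OPENSTACK_BASE_URL}/chat/completions`, {
    method: "POST",
    headers,
    body: JSON.stringify(body),
  });
  return res.json();
}
```

Because only the URL and the `Authorization` header are OpenStack‑specific, any client that can emit this request shape works unchanged.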

OpenStack SDK

The official OpenStack SDK is coming soon.

OpenAI SDK

Use OpenStack with the official OpenAI SDK for seamless integration.

AI SDK

Connect OpenStack to Vercel’s AI SDK for easy model access.

OpenRouter SDK

Leverage OpenStack with the OpenRouter SDK for access to 500+ models.

Google ADK

Integrate OpenStack with the Google Agent Development Kit (ADK) via LiteLLM.

Together AI SDK

Use OpenStack with the Together AI SDK for custom model integration.

Environment Setup

Ensure you have your OpenStack API key set in your environment variables:
OPENSTACK_API_KEY="your_openstack_api_key_here"
Always send a stable, pseudonymous user id with each request. For browser‑only apps, do not expose your API key in client code; route requests through a server or edge function instead.