Supported Model Providers
Any provider that exposes an OpenAI-compatible API can be integrated with OpenStack. This includes, but is not limited to:
- OpenAI: Access models such as GPT-5, GPT-5-Codex, and more.
- Anthropic: Utilize models like Claude Haiku and Claude Sonnet.
- Together AI: Connect to models hosted on Together AI’s platform.
- OpenRouter: Leverage 500+ models available through OpenRouter.
- Gemini: Use Google’s Gemini models via OpenAI-compatible APIs.
- Custom Models: Bring your own models hosted on any platform that offers OpenAI-compatible APIs.
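Because every provider above speaks the same OpenAI-compatible chat-completions format, switching providers usually comes down to changing the base URL, API key, and model name. A minimal sketch of that idea follows; the base URL, key, and model name in the usage line are illustrative placeholders, not values OpenStack requires:

```python
def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build the URL, headers, and JSON body for an OpenAI-compatible
    chat-completions call. Only the base URL, key, and model differ
    between providers; the request shape stays the same."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

# Illustrative values only; substitute your provider's endpoint and key.
url, headers, payload = build_chat_request(
    "https://api.openai.com/v1", "sk-example", "gpt-5", "Hello!"
)
```

The same helper works unchanged for any of the providers listed, which is exactly why OpenStack can treat them interchangeably.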
Getting Started
To connect a model to OpenStack, follow these general steps:
Set Up Your Provider Account
Ensure you have an account with the model provider and obtain the necessary API keys.
Configure OpenStack
In the OpenStack dashboard, navigate to the model integration section and add your provider’s API key and endpoint.
Test the Integration
Use the OpenStack playground to send test requests to the model and verify that everything is working as expected.
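The test step can also be scripted outside the playground. Below is a hedged sketch using only the Python standard library; the environment-variable names and model name are assumptions you would adapt to match the endpoint and key you configured in the dashboard:

```python
import json
import os
import urllib.request


def make_test_request(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Prepare a one-message test call against an OpenAI-compatible endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "Reply with the word: ok"}],
    }).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # PROVIDER_BASE_URL and PROVIDER_API_KEY are hypothetical variable
    # names; set them to the endpoint and key you entered in OpenStack.
    req = make_test_request(
        os.environ["PROVIDER_BASE_URL"],
        os.environ["PROVIDER_API_KEY"],
        "gpt-5",  # illustrative model name
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

A non-error JSON response containing a `choices` array indicates the provider is reachable and the key is valid, mirroring what the playground check verifies.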