Turbine On-Prem 26.0.4 Release
Swimlane announces that Turbine now supports bring-your-own-model (BYOM) architectures. Turbine customers can route all Hero AI features, including Hero AI native actions in automation, Turbine Deep Agents, and Hero AI Companion, through their own enterprise AI subscriptions and infrastructure. This enables organizations to use Turbine alongside existing AI stacks, retain full data sovereignty, and satisfy strict regulatory, compliance, and security requirements. To support this, Turbine introduces two new ways to connect models.

LiteLLM AI Gateway Integration

Turbine now integrates with the LiteLLM AI Gateway, an open-source AI gateway that simplifies multi-model management through a unified, OpenAI-compatible interface for more than 100 LLM APIs, including Anthropic, Azure, Bedrock, and Vertex AI. For details on LiteLLM and its architecture, see https://www.litellm.ai/.

In this release, Turbine uses the AWS Bedrock-compatible Converse endpoint to connect to the LiteLLM AI Gateway. This integration is currently officially supported and tested with the Anthropic family of models. Turbine will extend official LiteLLM support to additional models available within the Bedrock ecosystem in upcoming releases.

Availability: Available for Turbine on-premises deployments in this release. Turbine Cloud support for the LiteLLM AI Gateway is planned for a future release.

Direct Amazon Bedrock Integration (Bring Your Own Key)

Turbine also introduces a direct bring-your-own-Bedrock-key option. Customers can provide their own AWS Bedrock credentials to route Hero AI features directly through their existing AWS infrastructure.

Availability: Available for Turbine on-premises deployments in this release.
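To illustrate what a Bedrock-compatible Converse connection looks like, here is a minimal sketch of a client that posts to the Converse route, which both AWS Bedrock and a LiteLLM gateway exposing a Bedrock-compatible endpoint can serve. This is not Turbine code: the gateway URL, model ID, and API key below are placeholder assumptions you would replace with your own deployment's values, and the request body follows the publicly documented shape of the Bedrock Converse API.

```python
import json
import urllib.request

# Placeholder values -- substitute your own gateway address (for example,
# a LiteLLM AI Gateway instance) and an Anthropic-family model identifier.
GATEWAY_URL = "http://localhost:4000"
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"


def build_converse_request(user_text, max_tokens=512, temperature=0.2):
    """Build a request body in the shape of the Bedrock Converse API."""
    return {
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {
            "maxTokens": max_tokens,
            "temperature": temperature,
        },
    }


def converse(user_text, api_key="sk-example"):
    """POST to the Bedrock-style /model/{id}/converse route.

    Requires a reachable gateway; authentication and routing details
    are deployment-specific, so the bearer-token header here is only
    an assumption.
    """
    body = json.dumps(build_converse_request(user_text)).encode("utf-8")
    req = urllib.request.Request(
        f"{GATEWAY_URL}/model/{MODEL_ID}/converse",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Routing through a gateway this way means the application only ever speaks one wire format (Converse) while the gateway handles per-provider credentials and translation, which is what allows the same Turbine features to run against either a LiteLLM deployment or a direct Bedrock key.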