DEPLOYMENT
Deploy Locally
To deploy Buster locally on your machine, follow these steps:
Prerequisites
- You need to have Docker installed (a quick check is sketched after this list).
- You need an API key from OpenAI.
- You need a reranking API key from a supported reranking provider (ensure it matches the Cohere API format).
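To confirm the Docker prerequisite before you start, a quick check from the terminal (these are standard Docker commands, not Buster-specific):

```bash
# Confirm the Docker CLI and Compose plugin are installed,
# and that the Docker daemon is running and reachable.
docker --version
docker compose version
docker info
```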
Deployment
- Install the Buster CLI. On macOS this can be done with Homebrew, as sketched below; for other operating systems or manual installation, see the Install the CLI guide.
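A minimal sketch of the Homebrew install; the tap and formula names here are assumptions, so confirm the exact command against the Install the CLI guide:

```bash
# Assumed tap/formula names -- verify against the Install the CLI guide.
brew tap buster-so/buster
brew install buster
```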
- Start Buster: run the CLI's start command in your terminal and follow the prompts, as sketched below.
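A minimal sketch, assuming the subcommand is named `start` (check the CLI's help output if your version differs):

```bash
# Assumed subcommand name -- consult the CLI's help output if it differs.
buster start
```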
Additional Information
When using the `buster auth` command, you'll want to use the `--local` flag to point to your local Buster instance.
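For example, after your local instance is running:

```bash
# Authenticate the CLI against the locally running Buster instance.
buster auth --local
```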
Roadmap
Currently, we only support OpenAI as an LLM provider for local deployments. We use LiteLLM as a proxy server and will add support for more providers in the future.