Overview of the buster init Command

The buster init command streamlines setting up a new Buster project. It creates a buster.yml configuration file and helps you connect your data warehouse in a single guided workflow.

Basic Usage

To initialize a new Buster project:

buster init

Running this command will start an interactive setup process that:

  1. Creates a buster.yml file in your current directory
  2. Helps you connect to your data warehouse
  3. Configures model paths and database settings
  4. Optionally generates semantic models from your dbt catalog

Connecting a Data Source

When you run buster init, you’ll be guided through connecting your data warehouse:

  1. You’ll first be asked to select your data warehouse type from the supported options:

    • Postgres
    • BigQuery
    • Snowflake
    • Redshift
    • MySQL
    • SQL Server
    • Databricks
    • ClickHouse
    • StarRocks
    • Supabase
  2. Based on your selection, you’ll be prompted for the required connection details:

    • Host/endpoint
    • Port
    • Database name
    • Schema
    • Username and password (or other authentication details)
  3. The command tests the connection to verify the details you entered
  4. Once the connection succeeds, your credentials are stored securely

Configuring Your Project

After connecting your data source, you’ll be asked to configure your project:

  1. Data Source Name: A unique identifier for this connection
  2. Database: The database to connect to
  3. Schema: The schema containing your models
  4. Model Paths: Directories containing your dbt models or SQL files
  5. Semantic Model Paths (Optional): Custom directories for your semantic model YAML files
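
Each of these answers maps onto a key in the generated buster.yml (shown in full in the Configuration File section below). For reference, a sketch using the values from the example flow later on this page:

projects:
  - path: .
    data_source_name: my_analytics      # Data Source Name
    schema: public                      # Schema
    database: mydatabase                # Database
    model_paths:                        # Model Paths
      - models
    semantic_model_paths:               # Semantic Model Paths (optional)
      - semantic_models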

Discovering dbt Projects

The init command automatically detects and integrates with dbt projects:

  1. It looks for common dbt configuration files (dbt_project.yml, etc.)
  2. It identifies model paths based on dbt project settings
  3. It finds your dbt catalog files for semantic model generation
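
As context, this detection relies on standard dbt conventions. A minimal dbt_project.yml excerpt, shown purely for illustration (the project and profile names are placeholders):

# dbt_project.yml (standard dbt layout, illustrative only)
name: my_dbt_project
profile: my_profile
model-paths: ["models"]   # these directories are identified as model paths

The dbt catalog used for semantic model generation is the one dbt produces when you run dbt docs generate (typically target/catalog.json).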

For more details on working with dbt, see our complete guide to Integrating with dbt.

Generating Semantic Models

The init command integrates with the generate command, letting you:

  1. Automatically discover your dbt catalog
  2. Generate initial semantic models based on your database schema and dbt models
  3. Place these models alongside your SQL files or in a dedicated directory
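
If you decline generation during init, or want to regenerate later, the same step can typically be run on its own. A minimal sketch, assuming buster.yml has already been created by init:

# Generate semantic models from the dbt catalog, using the paths configured in buster.yml
buster generate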

Example Flow

Here’s an example of initializing a project with a Postgres database:

$ buster init

✨ Creating a new Buster project ✨

Let's connect to your data warehouse.
Which database are you using? [postgres/bigquery/snowflake/...]: postgres

Enter connection details for Postgres:
Host: localhost
Port: 5432
Database: mydatabase
Username: myuser
Password: ********

Testing connection... Success!

Enter project configuration:
Data Source Name: my_analytics
Schema: public
Model Paths (comma-separated): models
Generate semantic models from dbt catalog? [y/N]: y

Searching for dbt catalog...
Found catalog with 15 models and 120 columns.
Generating semantic models... Done!

Project configured successfully! Your buster.yml file has been created.

Configuration File

The init command creates a buster.yml file with your configuration:

projects:
  - path: .
    data_source_name: my_analytics
    schema: public
    database: mydatabase
    model_paths:
      - models
    semantic_model_paths:
      - semantic_models
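
Because projects is a YAML list, the format allows more than one entry in a single buster.yml. A sketch extending the generated file with a second, hypothetical project (the marketing names below are placeholders):

projects:
  - path: .
    data_source_name: my_analytics
    schema: public
    database: mydatabase
    model_paths:
      - models
    semantic_model_paths:
      - semantic_models
  - path: ./marketing                    # hypothetical second project
    data_source_name: marketing_analytics
    schema: marketing
    database: mydatabase
    model_paths:
      - models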

Additional Options

The init command supports several flags to customize behavior:

  • --no-interactive: Run in non-interactive mode (useful for scripts)
  • --project-file FILE: Specify a custom location for the buster.yml file
  • --data-source NAME: Pre-select a data source name
  • --database-type TYPE: Pre-select a database type
  • --generate-models: Automatically generate semantic models
  • --skip-connection-test: Skip testing the database connection
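
For scripted setups (for example in CI), these flags can be combined to avoid the interactive prompts. A sketch using only the flags listed above, with placeholder values; how connection details are supplied in non-interactive mode is not covered here:

# Non-interactive init with a pre-selected database type and data source name
buster init \
  --no-interactive \
  --database-type postgres \
  --data-source my_analytics \
  --generate-models \
  --project-file ./buster.yml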

Next Steps

After initializing your project:

  1. Review the generated buster.yml file
  2. Examine any generated semantic models and enhance them as needed
  3. Run buster deploy to deploy your project
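
Once you are happy with the configuration and the generated semantic models, deployment is a single command:

# Deploy your project
buster deploy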