Databricks setup

profiles.yml file is for dbt Core users only

If you're using dbt Cloud, you don't need to create a profiles.yml file. This file is only for dbt Core users. To connect your data platform to dbt Cloud, refer to About data platforms.

  • Maintained by: Databricks
  • Authors: some dbt loving Bricksters
  • GitHub repo: databricks/dbt-databricks
  • PyPI package: dbt-databricks
  • Slack channel: #db-databricks-and-spark
  • Supported dbt Core version: v0.18.0 and newer
  • dbt Cloud support: Supported
  • Minimum data platform version: Databricks SQL or DBR 12+

Installing dbt-databricks

Use pip to install the adapter. Before version 1.8, installing the adapter would also install dbt-core and any additional dependencies. Beginning in 1.8, installing an adapter does not automatically install dbt-core, because adapter and dbt Core versions have been decoupled so that installing an adapter no longer overwrites an existing dbt-core installation. Use the following command to install the adapter:
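
python -m pip install dbt-databricks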

Configuring dbt-databricks

For Databricks-specific configuration, please refer to Databricks configs.

dbt-databricks is the recommended adapter for Databricks. It includes features not available in dbt-spark, such as:

  • Unity Catalog support
  • No need to install additional drivers or dependencies for use on the CLI
  • Use of Delta Lake for all models out of the box
  • SQL macros that are optimized to run with Photon

Connecting to Databricks

To connect to a data platform with dbt Core, create the appropriate profile and target YAML keys/values in the profiles.yml configuration file for your Databricks SQL Warehouse/cluster. This dbt YAML file lives in the .dbt/ directory of your user/home directory. For more info, refer to Connection profiles and profiles.yml.

dbt-databricks can connect to Databricks SQL Warehouses and all-purpose clusters. A Databricks SQL Warehouse is the recommended way to get started with Databricks.

Refer to the Databricks docs for more info on how to obtain the credentials for configuring your profile.

Examples

You can use either token-based authentication or OAuth client-based authentication to connect to Databricks. Refer to the following examples for more info on how to configure your profile for each type of authentication.

~/.dbt/profiles.yml

your_profile_name:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: [optional catalog name if you are using Unity Catalog]
      schema: [schema name] # Required
      host: [yourorg.databrickshost.com] # Required
      http_path: [/sql/your/http/path] # Required
      token: [dapiXXXXXXXXXXXXXXXXXXXXXXX] # Required Personal Access Token (PAT) if using token-based authentication
      threads: [1 or more] # Optional, default 1
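
For OAuth client-based authentication, replace the token field with the OAuth fields described under Authentication parameters below. A minimal sketch; the client credentials are placeholders:

~/.dbt/profiles.yml

your_profile_name:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: [optional catalog name if you are using Unity Catalog]
      schema: [schema name] # Required
      host: [yourorg.databrickshost.com] # Required
      http_path: [/sql/your/http/path] # Required
      auth_type: oauth # Required for OAuth-based authentication
      client_id: [<oauth-client-id>] # Required for OAuth-based authentication
      client_secret: [<oauth-client-secret>] # Required for OAuth-based authentication
      threads: [1 or more] # Optional, default 1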

Host parameters

The following profile fields are always required.

Field | Description | Example
host | The hostname of your cluster. Don't include the http:// or https:// prefix. | yourorg.databrickshost.com
http_path | The HTTP path to your SQL Warehouse or all-purpose cluster. | /sql/your/http/path
schema | The name of a schema within your cluster's catalog. Schema names with upper case or mixed case letters are not recommended. | my_schema

Authentication parameters

The dbt-databricks adapter supports both token-based authentication and OAuth client-based authentication.

Refer to the following required parameters to configure your profile for each type of authentication:

Field | Authentication type | Description | Example
token | Token-based | The Personal Access Token (PAT) to connect to Databricks. | dapiXXXXXXXXXXXXXXXXXXXXXXX
client_id | OAuth-based | The client ID for your Databricks OAuth application. | <oauth-client-id>
client_secret | OAuth-based | The client secret for your Databricks OAuth application. | XXXXXXXXXXXXXXXXXXXXXXXXXXX
auth_type | OAuth-based | The type of authorization needed to connect to Databricks. | oauth

Additional parameters

The following profile fields are optional. They configure the Databricks session and how dbt behaves over your connection.

Profile field | Description | Example
threads | The number of threads dbt should use (default is 1). | 8
connect_retries | The number of times dbt should retry the connection to Databricks (default is 1). | 3
connect_timeout | How many seconds before the connection to Databricks times out (default is no timeout). | 1000
session_properties | Sets the Databricks session properties used in the connection. Run SET -v to see the available options. | ansi_mode: true
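
As a sketch, the optional fields slot into the same profile block as the required ones. The values below are illustrative, not defaults, and session_properties is assumed to take a mapping, as the ansi_mode: true example suggests:

your_profile_name:
  target: dev
  outputs:
    dev:
      type: databricks
      schema: my_schema
      host: yourorg.databrickshost.com
      http_path: /sql/your/http/path
      token: dapiXXXXXXXXXXXXXXXXXXXXXXX
      threads: 8            # run up to 8 models concurrently
      connect_retries: 3    # retry a failed connection up to 3 times
      connect_timeout: 1000 # give up on the connection after 1000 seconds
      session_properties:   # passed to the Databricks session
        ansi_mode: true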

Supported functionality

Delta Lake

Most dbt Core functionality is supported, but some features are only available on Delta Lake.

Delta-only features:

  1. Incremental model updates by unique_key instead of partition_by (see merge strategy and the sketch after this list)
  2. Snapshots
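
A minimal sketch of an incremental model that merges on a unique_key; the model, source, and column names (events, raw, event_id, event_ts) are hypothetical:

models/events.sql

{{
  config(
    materialized = 'incremental',
    incremental_strategy = 'merge',
    unique_key = 'event_id'
  )
}}

select * from {{ source('raw', 'events') }}

{% if is_incremental() %}
  -- only pick up rows newer than what is already in the target table
  where event_ts > (select max(event_ts) from {{ this }})
{% endif %}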

Unity Catalog

dbt-databricks>=1.1.1 supports the three-level namespace of Unity Catalog (catalog / schema / relation), so you can organize and secure your data the way you like.
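
For example, setting catalog in the profile puts everything dbt builds under that catalog. The catalog and schema names here (analytics, prod) are hypothetical:

your_profile_name:
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: analytics # Unity Catalog catalog (hypothetical name)
      schema: prod # schema within that catalog (hypothetical name)
      host: yourorg.databrickshost.com
      http_path: /sql/your/http/path
      token: dapiXXXXXXXXXXXXXXXXXXXXXXX

With this profile, a model named orders is created as analytics.prod.orders.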
