
Write queries with exports

The exports feature in the dbt Semantic Layer enhances saved queries by allowing you to write commonly used queries directly within your data platform.

While saved queries are a way to save and reuse commonly used queries in MetricFlow, exports take this functionality a step further by:

  • Enabling you to write these queries within your data platform using dbt Cloud.
  • Providing an integration path for tools that don't natively support the dbt Semantic Layer by exposing tables of metrics and dimensions.

Essentially, exports are like any other table in your data platform. They enable you to query metric definitions through any SQL interface or connect to downstream tools without a first-class Semantic Layer integration. Refer to Available integrations for more information.

Prerequisites

  • You have a multi-tenant dbt Cloud account on a Team or Enterprise plan. Single-tenant is not supported at this time.
  • You use one of the following data platforms: Snowflake, BigQuery, Databricks, or Redshift.
  • You are on dbt version 1.7 or newer.
  • You have the dbt Semantic Layer configured in your dbt project.
  • You have a dbt Cloud environment with a Job scheduler enabled.

Comparison between exports and saved queries

| Feature | Exports | Saved queries |
| --- | --- | --- |
| Availability | Available on dbt Cloud Team or Enterprise plans with dbt versions 1.7 or newer. Note, exports are not supported in the dbt Cloud IDE yet. | Available in both dbt Core and dbt Cloud. |
| Purpose | To write saved queries in your data platform and expose metrics and dimensions as a view or table. | To define and manage common Semantic Layer queries in YAML, including metrics and dimensions. |
| Usage | Automatically runs saved queries and writes them within your data platform. Exports count towards queried metrics usage. Example: Create a weekly aggregated table for active user metrics, automatically updated and stored in the data platform. | Used for organizing and reusing common MetricFlow queries within dbt projects. Example: Group related metrics together for better organization, and include commonly used dimensions and filters. |
| Integration | Must have the dbt Semantic Layer configured in your dbt project. Tightly integrated with the MetricFlow Server and dbt Cloud's job scheduler. | Integrated into the dbt DAG and managed alongside other dbt nodes. |
| Configuration | Defined within the saved_queries configuration. Set up within the dbt Cloud environment and job scheduler settings. | Defined in YAML format within dbt project files. |

Define exports

Exports are an additional configuration added to a saved query. They define how to write a saved query, along with the schema and table name.

You can define exports in YAML format as a key within the saved_queries configuration and in the same file as your metric definitions.

An example of a saved query with an export:

semantic_model.yml
saved_queries:
  - name: order_metrics
    description: Relevant order metrics
    query_params:
      metrics:
        - orders
        - large_order
        - food_orders
        - order_total
      group_by:
        - Entity('order_id')
        - TimeDimension('metric_time', 'day')
        - Dimension('customer__customer_name')
        - ... # Additional group_by
      where:
        - "{{TimeDimension('metric_time')}} > current_timestamp - interval '1 week'"
        - ... # Additional where clauses
    exports:
      - name: order_metrics
        config:
          export_as: table # Options available: table, view
          schema: YOUR_SCHEMA # Optional - defaults to deployment schema
          alias: SOME_TABLE_NAME # Optional - defaults to export name

Run exports

Once you define exports in your dbt project, you can run them in two ways: in development or in production.

Exports in development

If you want to test the output of an export before deploying it to production, you can run it in your development environment using your development credentials. Use the following command to run exports in the dbt Cloud CLI:

dbt sl export

The following table lists the options for the dbt sl export command:

| Parameters | Type | Required | Description |
| --- | --- | --- | --- |
| name | String | Required | Name of the export object. |
| saved-query | String | Required | Name of the saved query to use. |
| select | List or String | Optional | Specify the names of exports to select from the saved query. |
| exclude | String | Optional | Specify the names of exports to exclude from the saved query. |
| export_as | String | Optional | Type of export to create from the export_as types available in the config. Options available are table or view. |
| schema | String | Optional | Schema to use for creating the table or view. |
| alias | String | Optional | Table alias to use to write the table or view. |

You can also run any export defined for the saved query and write the table or view in your development environment. Refer to the following command example and output:

Example

dbt sl export --saved-query sq_name

Output

Polling for export status - query_id: 2c1W6M6qGklo1LR4QqzsH7ASGFs..
Export completed.

Use the select flag

By default, all exports defined for a saved query are run. In development, you can use the --select flag to run only specific exports.

For example, the following command runs only export_1 and export_2. Note that --select doesn't work with the --alias or --export-as flags:

dbt sl export --saved-query sq_name --select export_1,export_2
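
Similarly, you can use the --exclude option from the table above to run every export for the saved query except the ones you name. The following is a minimal sketch; the saved query and export names are illustrative:

dbt sl export --saved-query sq_name --exclude export_1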

Overriding export configurations

The --select flag is mainly used to include or exclude specific exports. If you need to change an export's configured settings instead, you can use the following flags to override them:

  • --export-as Defines the materialization type (table or view) for the export. This creates a new export with its own settings and is useful for testing in development.
  • --schema Specifies the schema to use for the written table or view.
  • --alias Assigns a custom alias to the written table or view. This overrides the default export name.

Note that the --select flag can't be used together with --alias or --schema.

For example, you can use the following command to create a new export named new_export as a table:

dbt sl export --saved-query sq_number1 --export-as table --alias new_export
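
You can also combine this with the --schema flag described above to control where the export is written. The following is only a sketch; the schema name is an example, not a value from this page:

dbt sl export --saved-query sq_number1 --export-as table --alias new_export --schema my_dev_schema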

Exports in production

To enable exports in production using the dbt Cloud job scheduler:

  1. Set environment variable
  2. Create and execute exports

Set environment variable

In your dbt Cloud environment, set the DBT_INCLUDE_SAVED_QUERY environment variable to TRUE so saved queries (and their exports) are included in your build jobs. When you run a build job, any saved queries downstream of the dbt models in that job will also run. To make sure your export data is up to date, run the export as a downstream step (after the model).

Create and execute exports

  1. After dbt finishes building the models, the MetricFlow Server processes the exports, compiles the necessary SQL, and executes this SQL against your data platform. It directly executes a "create table" statement so the data stays within your data platform.
  2. Review the export's execution details in the job logs and confirm the export ran successfully. This helps with troubleshooting and ensures accuracy. Since saved queries are integrated into the dbt DAG, all outputs related to exports are available in the job logs.
  3. Your data is now available in the data platform for querying.

Enabling and executing exports in dbt Cloud streamlines data workflows and keeps exported data current in your data platform, improving efficiency and governance for downstream decisions.

You can use the selector syntax (--select or -s) in your build command to run only the exports downstream of a particular dbt model, or to select a saved query to run. As an example, the following command runs any saved queries that are downstream of the orders semantic model:

dbt build --select orders+
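
You can also select a saved query directly by name to run just that query and its exports. This is a sketch that assumes name-based node selection applies to the order_metrics saved query defined earlier on this page:

dbt build --select order_metrics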

FAQs

 Can I have multiple exports in a single saved query?
 How do I run all exports for a saved query?
 Will I run duplicate exports if multiple models are downstream of my saved query?
 Can I reference an export as a dbt model using ref()?
 How do exports help me use the dbt Semantic Layer in tools that don't support it, such as PowerBI?
 How can I select saved_queries by their resource type?