
Saved queries

Saved queries are a way to save commonly used queries in MetricFlow. You can group logically related metrics, dimensions, and filters into a saved query. A saved query is a node and is visible in the dbt DAG.

Saved queries serve as the foundational building block, allowing you to configure exports in your saved query configuration. Exports take this functionality a step further by enabling you to schedule and write saved queries directly within your data platform using dbt Cloud's job scheduler.


To create a saved query, refer to the following parameters.


Note that we use a double colon (::) to indicate that a parameter is nested within another parameter. For example, query_params::metrics means the metrics parameter is nested under query_params.
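As an illustration of this notation, here is a minimal sketch (the saved query name is hypothetical) showing the nesting that query_params::metrics refers to:

```yaml
saved_queries:
  - name: example_saved_query  # hypothetical name for illustration
    query_params:
      metrics:                 # referenced in the docs as query_params::metrics
        - order_total
```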

If you use multiple metrics in a saved query, you can only reference the dimensions those metrics have in common in the group_by or where clauses. Prefix the dimension with its entity name in the Dimension object, like Dimension('user__ds').
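For example, a minimal sketch (the metric and entity names are hypothetical) where two metrics share the user entity, so the entity-prefixed dimension user__ds is valid in both group_by and where:

```yaml
# Hypothetical metrics that both resolve through the `user` entity,
# so `user__ds` is a common dimension they can share.
saved_queries:
  - name: user_activity
    query_params:
      metrics:
        - active_users
        - new_signups
      group_by:
        - Dimension('user__ds')
      where:
        - "{{ Dimension('user__ds') }} >= '2024-01-01'"
```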

Configure saved query

Use saved queries to define and manage common Semantic Layer queries in YAML, including metrics and dimensions. Saved queries enable you to organize and reuse common MetricFlow queries within dbt projects. For example, you can group related metrics together for better organization, and include commonly used dimensions and filters.

In your saved query config, you can also leverage caching with the dbt Cloud job scheduler to cache common queries, speed up performance, and reduce compute costs.
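A minimal sketch of what enabling caching on a saved query can look like (the saved query name is hypothetical; confirm the exact cache config keys against the dbt docs for your version):

```yaml
saved_queries:
  - name: order_metrics  # hypothetical name for illustration
    config:
      cache:
        enabled: true    # let scheduled runs populate the cache for this query
```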

To build saved_queries, use the --resource-type flag and run the command dbt build --resource-type saved_query.
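If you select saved queries often, one option is a named selector using dbt's resource_type selection method. A sketch of a selectors.yml entry (the selector name is hypothetical):

```yaml
# selectors.yml - a hypothetical selector targeting all saved queries
selectors:
  - name: all_saved_queries
    definition:
      method: resource_type
      value: saved_query
```

You could then run something like dbt build --selector all_saved_queries instead of repeating the --resource-type flag.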

Configure exports

Exports are an additional configuration added to a saved query. They define how to write the results of a saved query, along with the schema and table name.

Once you've configured your saved query and laid that foundation, you can configure exports in the saved_queries YAML configuration file (the same file as your metric definitions). This also allows you to run exports automatically within your data platform using dbt Cloud's job scheduler.

The following is an example of a saved query with an export:

saved_queries:
  - name: order_metrics
    description: Relevant order metrics
    query_params:
      metrics:
        - orders
        - large_order
        - food_orders
        - order_total
      group_by:
        - Entity('order_id')
        - TimeDimension('metric_time', 'day')
        - Dimension('customer__customer_name')
        - ... # Additional group_by
      where:
        - "{{ TimeDimension('metric_time') }} > current_timestamp - interval '1 week'"
        - ... # Additional where clauses
    exports:
      - name: order_metrics
        config:
          export_as: table # Options available: table, view
          schema: YOUR_SCHEMA # Optional - defaults to deployment schema
          alias: SOME_TABLE_NAME # Optional - defaults to Export name

Run exports

Once you've configured exports, you can take things a step further by running them to automatically write saved queries within your data platform using dbt Cloud's job scheduler. This feature is only available with the dbt Cloud Semantic Layer.

For more information on how to run exports, refer to the Exports documentation.


FAQs

Can I have multiple exports in a single saved query?
How can I select saved_queries by their resource type?