1 change: 1 addition & 0 deletions website/docs/docs/dbt-versions/release-notes.md
@@ -22,6 +22,7 @@ Release notes are grouped by month for both multi-tenant and virtual private cloud
- **New**: You can use the `platform_detection_timeout_seconds` parameter to control how long the Snowflake connector waits when detecting the cloud platform where the connection is being made. For more information, see [Snowflake setup](/docs/core/connect-data-platform/snowflake-setup#platform_detection_timeout_seconds).
- **New**: The `cluster_by` configuration is supported in dynamic tables. For more information, see [Dynamic table clustering](/reference/resource-configs/snowflake-configs#dynamic-table-clustering).
- **New**: When jobs exceed their configured timeout, the BigQuery adapter sends a cancellation request to the BigQuery job. For more information, see [Connect BigQuery](/docs/cloud/connect-data-platform/connect-bigquery#job-creation-timeout-seconds).
- **Enhancement**: The BigQuery adapter now supports an [`execution_project`](/reference/resource-configs/bigquery-configs#configuring-execution-projects) config on individual models, allowing you to route specific workloads to alternate GCP billing projects.

## October 2025

79 changes: 79 additions & 0 deletions website/docs/reference/resource-configs/bigquery-configs.md
@@ -18,6 +18,85 @@ To-do:
In our reference documentation, you can declare `project` in place of `database`. This allows you to read from and write to multiple BigQuery projects. The same applies to `dataset`, which you can declare in place of `schema`.
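
As a quick sketch, a model config using those aliases might look like the following (the project and dataset names here are hypothetical):

```sql
{{ config(
    project = 'other-gcp-project',   -- alias for `database` on BigQuery
    dataset = 'marketing_analytics'  -- alias for `schema`
) }}

select * from {{ source('events', 'pageviews') }}
```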

<VersionBlock firstVersion="1.12">

## Configuring execution projects

By default, dbt submits queries to the `execution_project` defined in your
[profile configuration](/docs/core/connect-data-platform/bigquery-setup#profiles.yml). When you
need certain resources to bill to a different GCP project (for example, separating production and
sandbox workloads), you can override the execution project by setting the `execution_project` model
configuration.

- Accepts a single project id string, or a mapping of `{target-name: project-id}`
- When provided as a mapping, dbt uses the value that matches the active target in your profile
- If the configured project matches the default execution project, dbt leaves the current project unchanged
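
As an illustrative sketch of the two accepted shapes (the project names are hypothetical):

```sql
-- A single project ID string: applies regardless of target
{{ config(execution_project = 'analytics-shared-execution') }}

-- A per-target mapping: dbt uses the entry matching the active target,
-- for example `dbt run --target prod` resolves to 'analytics-prod-execution'
{{ config(execution_project = {
    'dev': 'analytics-dev-execution',
    'prod': 'analytics-prod-execution'
}) }}
```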

<Tabs
defaultValue="dbt_project.yml"
values={[
{ label: 'Project file', value: 'dbt_project.yml', },
{ label: 'Property file', value: 'models/my_model.yml', },
{ label: 'SQL config', value: 'models/events/sessions.sql', },
]}
>

<TabItem value="dbt_project.yml">

<File name='dbt_project.yml'>

```yaml
name: my_project
version: 1.0.0

models:
  my_project:
    intensive:
      +execution_project:
        dev: "analytics-dev-execution"
        prod: "analytics-prod-execution"
```

</File>
</TabItem>

<TabItem value="models/my_model.yml">

<File name='models/my_model.yml'>

```yaml
models:
  - name: heavy_compute_model
    config:
      execution_project: "billing-project-for-heavy-models"
```

</File>
</TabItem>

<TabItem value="models/events/sessions.sql">

<File name='models/events/sessions.sql'>

```sql
{{ config(
    execution_project = {
        "dev": "dev-execution-project",
        "prod": "prod-execution-project"
    }
) }}

select * from {{ ref('staging_sessions') }}
```

</File>
</TabItem>
</Tabs>

dbt automatically switches to the configured project before running the model and restores the previous project once the model finishes.

</VersionBlock>

## Using table partitioning and clustering

### Partition clause