
[Bug] ExecutionMode.ASYNC fails on BigQuery incremental models: __dbt_tmp table not found during MERGE #2351

@alistairjkho

Description


Astronomer Cosmos Version

1.12.1

dbt-core version

1.9.4

Versions of dbt adapters

dbt-bigquery adapter 1.9.1

LoadMode

DBT_LS

ExecutionMode

AIRFLOW_ASYNC

InvocationMode

SUBPROCESS

airflow version

2.10.5

Operating System

Cloud Composer 2.13.9 Linux

If you think it's a UI issue, what browsers are you seeing the problem on?

No response

Deployment

Google Cloud Composer

Deployment details

No response

What happened?

I'm testing Cosmos with ExecutionMode.ASYNC instead of ExecutionMode.LOCAL for performance improvements, and I'm running into this issue. When using ExecutionMode.ASYNC with the BigQuery adapter, incremental models (even those using insert_overwrite) fail during the MERGE step. The error indicates that the temporary table created by dbt (e.g., model_name__dbt_tmp) is not found by the time the asynchronous BigQuery job attempts to execute the merge script.

It appears that the DbtRunAirflowAsyncOperator or the setup task is triggering dbt's internal cleanup or closing the BigQuery session before the deferred/asynchronous job can access the temporary table. Normal tables/views work just fine.
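
For context, a merge-based incremental run on BigQuery is roughly a two-step script like the sketch below (all names are placeholders; this is the approximate shape, not the exact SQL dbt generates). If the staging table is dropped, expires, or was created in a different session before the second statement runs, the MERGE fails exactly as observed:

-- Step 1: dbt stages the model's SELECT into a temporary table
-- (placeholder names, illustrative of the shape only)
create or replace table `my-project.my_dataset.my_model__dbt_tmp` as (
  select id, value, updated_at
  from `my-project.my_dataset.my_source`
);

-- Step 2: the staged rows are merged into the target; this is where
-- "Not found: Table ...__dbt_tmp" surfaces if the table above is gone
merge into `my-project.my_dataset.my_model` as target
using `my-project.my_dataset.my_model__dbt_tmp` as tmp
on target.id = tmp.id
when matched then update set value = tmp.value, updated_at = tmp.updated_at
when not matched then insert (id, value, updated_at)
  values (tmp.id, tmp.value, tmp.updated_at);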

Incremental models are an important dbt feature, so I'm puzzled that this isn't listed as a limitation in the async execution mode docs - maybe I'm doing something wrong? The error log is included below.

For reference, this is my setup:

  • BigQuery as the DWH
  • Cosmos v1.12.1
  • dbt-core v1.9.4
  • dbt-bigquery adapter v1.9.1
  • GCP Cloud Composer (composer-2.13.9-airflow-2.10.5)


Relevant log output

[2026-02-03, 02:10:50 UTC] {taskinstance.py:3315} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 769, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 735, in _execute_callable
    return ExecutionCallableRunner(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/utils/operator_helpers.py", line 252, in run
    return self.func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/baseoperator.py", line 1838, in resume_execution
    return execute_callable(context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/python3.11/lib/python3.11/site-packages/cosmos/operators/_asynchronous/bigquery.py", line 243, in execute_complete
    job_id = super().execute_complete(context=context, event=event)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/providers/google/cloud/operators/bigquery.py", line 3064, in execute_complete
    raise AirflowException(event["message"])
airflow.exceptions.AirflowException: Not found: Table data-trustedwarehouse-s:core_dev_staging_composer_test_staging.stg_dim_bq_information_schema_test__dbt_tmp was not found in location EU at [10:46]
[2026-02-03, 02:10:50 UTC] {taskinstance.py:1227} INFO - Marking task as FAILED. dag_id=dbt_trustedwarehouse_models_deferrable_operators_async_run_test

How to reproduce

Steps to Reproduce
Environment Setup:

  • Set up an Airflow environment using GCP Composer (specifically composer-2.13.9-airflow-2.10.5 or similar).
  • Install Cosmos v1.12.1, dbt-core v1.9.4, and the dbt-bigquery adapter v1.9.1 (pins listed below).
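
For reference, the exact pins (assuming a plain PyPI install; Composer's PyPI packages page takes the same constraints):

astronomer-cosmos==1.12.1
dbt-core==1.9.4
dbt-bigquery==1.9.1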

Configuration:

  • Configure a dbt project in Cosmos using ExecutionMode.ASYNC with BigQuery as the target (a minimal DAG sketch follows).
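
A minimal sketch of the DAG wiring (paths, profile names, and schedule are placeholders; async mode also expects Cosmos's remote target path settings to be configured per the Cosmos docs):

from datetime import datetime

from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig
from cosmos.constants import ExecutionMode

# Placeholder paths and names -- adjust to your project layout.
dbt_dag = DbtDag(
    dag_id="dbt_bigquery_async_repro",
    schedule=None,
    start_date=datetime(2026, 1, 1),
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/my_project"),
    profile_config=ProfileConfig(
        profile_name="my_profile",
        target_name="dev",
        profiles_yml_filepath="/usr/local/airflow/dags/dbt/my_project/profiles.yml",
    ),
    # AIRFLOW_ASYNC is the ExecutionMode value reported in the template above
    execution_config=ExecutionConfig(execution_mode=ExecutionMode.AIRFLOW_ASYNC),
)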

Model Creation:

  • Create a dbt incremental model (e.g., using incremental_strategy='insert_overwrite' or merge); a minimal example follows.
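
A model of roughly this shape reproduces it (placeholder names; note that insert_overwrite on BigQuery requires a partition_by config):

-- models/my_incremental_model.sql (placeholder path)
{{ config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partition_by={'field': 'event_date', 'data_type': 'date'}
) }}

select
    event_date,
    count(*) as event_count
from {{ source('raw', 'events') }}  -- placeholder source
{% if is_incremental() %}
-- only rebuild recent partitions on incremental runs
where event_date >= date_sub(current_date(), interval 3 day)
{% endif %}
group by event_date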

Execution:

  • Trigger the Airflow DAG.

Observed Result:

  • The DbtRunAirflowAsyncOperator triggers the BigQuery job but appears to close the session or trigger dbt's internal cleanup prematurely.
  • The task fails with an AirflowException: Not found: Table ...__dbt_tmp was not found.
  • This indicates the temporary table used for the incremental merge is deleted before the asynchronous BigQuery job can access it.
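
One way to confirm the diagnosis after a failed run is to probe for the staging table directly (sketch below; the project/dataset/table ids are placeholders):

from google.api_core.exceptions import NotFound
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id
table_id = "my-project.my_dataset.my_model__dbt_tmp"  # placeholder table id

try:
    client.get_table(table_id)
    print("staging table still exists, so MERGE should have seen it")
except NotFound:
    print("staging table is gone, consistent with the Not found error above")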

Anything else :)?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Contact Details

No response

Metadata

Labels

bug (Something isn't working), execution:async (Related to the Async execution mode), triage-needed (Items need to be reviewed / assigned to milestone)
