Commit 8611c72
fix: enable TestWorkflowJob functional test that was unreachable on all profiles (#1401)
## Summary

Fix `TestWorkflowJob` functional test that was **never actually running** due to three compounding bugs:

1. `skip_profile` listed all 3 available profiles (`databricks_cluster`, `databricks_uc_sql_endpoint`, `databricks_uc_cluster`), so the test was skipped everywhere
2. Used `simple_python_model`, which hardcodes `submission_method='serverless_cluster'` in `dbt.config()`, overriding the YAML's `workflow_job` setting, so the test was exercising the wrong code path
3. `workflow_schema` included `max_retries`, which is not a valid `jobs.create()` parameter

Resolves follow-up to #1360

### Changes

- Add `workflow_python_model` fixture without `submission_method` in `dbt.config()` so the YAML schema's `submission_method: workflow_job` takes effect
- Remove `databricks_uc_cluster` from the skip list so the test runs on at least one profile
- Remove invalid `max_retries` from `workflow_schema`

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [x] This PR includes tests, or tests are not required/relevant for this PR
- [x] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section

## Test plan

- [x] Verified the test was SKIPPED on all 3 profiles before the fix
- [x] Verified the test PASSES on the `databricks_uc_cluster` profile after the fix (ran against a live cluster)
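Bug 2 above comes down to config precedence: values set in a model's `dbt.config()` call take priority over the YAML schema file, so a fixture that hardcodes `submission_method` can never be steered to `workflow_job` from YAML. The toy sketch below illustrates that precedence with a plain dict merge; `resolve_config` is an illustrative helper, not dbt's actual implementation.

```python
def resolve_config(yaml_config: dict, model_config: dict) -> dict:
    """Merge schema-YAML config with in-model dbt.config(); model values win."""
    merged = dict(yaml_config)
    merged.update(model_config)  # later update = higher precedence
    return merged


yaml_cfg = {"submission_method": "workflow_job", "user_folder_for_python": True}

# Old fixture: hardcoded submission_method silently overrides the YAML.
old_model_cfg = {"materialized": "table", "submission_method": "serverless_cluster"}
print(resolve_config(yaml_cfg, old_model_cfg)["submission_method"])  # serverless_cluster

# New fixture: no submission_method in dbt.config(), so the YAML value survives.
new_model_cfg = {"materialized": "table"}
print(resolve_config(yaml_cfg, new_model_cfg)["submission_method"])  # workflow_job
```

This is why the fix introduces a dedicated `workflow_python_model` fixture rather than reusing `simple_python_model`.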
Parent: 5927a32 · Commit: 8611c72

3 files changed: +14 −5 lines

CHANGELOG.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@
 ### Fixes
 
 - Fix `workflow_job` Python model submission method failing with dictionary attribute error ([#1360](https://github.com/databricks/dbt-databricks/issues/1360))
+- Fix `TestWorkflowJob` functional test that was unreachable on all profiles due to incorrect skip list, wrong model fixture, and invalid `max_retries` parameter ([#1360](https://github.com/databricks/dbt-databricks/issues/1360))
 - Fix column order mismatch in microbatch and replace_where incremental strategies by using INSERT BY NAME syntax ([#1338](https://github.com/databricks/dbt-databricks/issues/1338))
 
 ## dbt-databricks 1.11.6 (Mar 10, 2026)
```

tests/functional/adapter/python_model/fixtures.py

Lines changed: 11 additions & 1 deletion

```diff
@@ -102,6 +102,17 @@ def model(dbt, spark):
     identifier: source
 """
 
+workflow_python_model = """
+import pandas
+
+def model(dbt, spark):
+    dbt.config(
+        materialized='table',
+    )
+    data = [[1,2]] * 10
+    return spark.createDataFrame(data, schema=['test', 'test2'])
+"""
+
 workflow_schema = """version: 2
 
 models:
@@ -110,7 +121,6 @@ def model(dbt, spark):
       submission_method: workflow_job
       user_folder_for_python: true
      python_job_config:
-        max_retries: 2
        timeout_seconds: 500
        additional_task_settings: {
          "task_key": "my_dbt_task"
```
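The `max_retries` removal above reflects a general pitfall: keys under `python_job_config` are forwarded to the Databricks Jobs API, and `max_retries` is a task-level retry setting rather than a valid `jobs.create()` parameter, so including it breaks job creation. A hypothetical defensive filter for such config (the names `ALLOWED_JOB_SETTINGS` and `split_job_config` are illustrative, not dbt-databricks or Databricks SDK APIs, and the allow-list is an assumed subset):

```python
# Illustrative subset of top-level jobs.create() settings, NOT the full API surface.
ALLOWED_JOB_SETTINGS = {
    "name",
    "timeout_seconds",
    "tags",
    "schedule",
    "email_notifications",
}


def split_job_config(python_job_config: dict) -> tuple[dict, dict]:
    """Separate recognized job-level settings from everything else."""
    valid = {k: v for k, v in python_job_config.items() if k in ALLOWED_JOB_SETTINGS}
    rejected = {k: v for k, v in python_job_config.items() if k not in ALLOWED_JOB_SETTINGS}
    return valid, rejected


valid, rejected = split_job_config({"timeout_seconds": 500, "max_retries": 2})
# valid keeps timeout_seconds; max_retries is flagged instead of being passed through.
```

In the actual fix, the invalid key was simply deleted from the fixture's YAML rather than filtered at runtime.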
tests/functional/adapter/python_model/test_python_model.py

Lines changed: 2 additions & 4 deletions

```diff
@@ -259,15 +259,13 @@ def project_config_update(self):
 
 
 @pytest.mark.python
-@pytest.mark.skip_profile(
-    "databricks_cluster", "databricks_uc_sql_endpoint", "databricks_uc_cluster"
-)
+@pytest.mark.skip_profile("databricks_cluster", "databricks_uc_sql_endpoint")
 class TestWorkflowJob:
     @pytest.fixture(scope="class")
     def models(self):
         return {
             "schema.yml": override_fixtures.workflow_schema,
-            "my_workflow_model.py": override_fixtures.simple_python_model,
+            "my_workflow_model.py": override_fixtures.workflow_python_model,
         }
 
     def test_workflow_run(self, project):
```
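The skip-list change above is easiest to see as simple set arithmetic: a test marked `skip_profile(...)` runs only on profiles *not* in its list, so listing every available profile guarantees zero executions. A toy model of that logic (the helper `runnable_profiles` is illustrative, not part of pytest or the dbt test harness):

```python
AVAILABLE_PROFILES = {
    "databricks_cluster",
    "databricks_uc_sql_endpoint",
    "databricks_uc_cluster",
}


def runnable_profiles(skip_list: set) -> set:
    """Profiles on which a test marked skip_profile(*skip_list) will actually run."""
    return AVAILABLE_PROFILES - skip_list


# Before the fix: all three profiles were skipped, so no profile ever ran the test.
before = runnable_profiles(
    {"databricks_cluster", "databricks_uc_sql_endpoint", "databricks_uc_cluster"}
)
# After the fix: the test runs on databricks_uc_cluster.
after = runnable_profiles({"databricks_cluster", "databricks_uc_sql_endpoint"})
print(before, after)  # set() {'databricks_uc_cluster'}
```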
