
Commit 404a679

BL-1763: Add retries to tasks that run in parallel. (#1638)
If a task fails because of insufficient resources, a retry may resolve the problem.
1 parent b13ce72 commit 404a679

File tree

1 file changed: +2 -0 lines changed

cob_datapipeline/catalog_full_reindex_dag.py

Lines changed: 2 additions & 0 deletions
@@ -191,6 +191,7 @@ def prepare_alma_data():
     PREPARE_ALMA_DATA = PythonOperator(
         task_id=f"prepare_alma_data_{index}",
         python_callable=xml_parse.prepare_alma_data,
+        retries=3,
         op_kwargs={
             "AWS_ACCESS_KEY_ID": AIRFLOW_S3.login,
             "AWS_SECRET_ACCESS_KEY": AIRFLOW_S3.password,
@@ -248,6 +249,7 @@ def index_sftp_marc():
     INDEX_SFTP_MARC = BashOperator(
         task_id=f"index_sftp_marc_{index}",
         bash_command=AIRFLOW_HOME + "/dags/cob_datapipeline/scripts/ingest_marc.sh ",
+        retries=3,
         env={
             "AWS_ACCESS_KEY_ID": AIRFLOW_S3.login,
             "AWS_SECRET_ACCESS_KEY": AIRFLOW_S3.password,
