
Commit ebc811a

fix: guard accelerate version before allow tp

Signed-off-by: Mehant Kammakomati <[email protected]>

1 parent: 51d7e45

1 file changed (+5, -0 lines)

Diff for: src/transformers/training_args.py

@@ -1989,6 +1989,11 @@ def __post_init__(self):
             warnings.warn("`--xla_fsdp_grad_ckpt` is useful only when `--xla` is set to true.")
 
         if self.tp_size > 1:
+            if not is_accelerate_available("1.3.1"):
+                raise NotImplementedError(
+                    "TP using PyTorch requires `accelerate` >= 1.3.1. "
+                    "Please update your `accelerate` version."
+                )
             os.environ["ACCELERATE_USE_TP"] = "true"
             os.environ["TP_SIZE"] = str(self.tp_size)
         # accelerate integration for FSDP
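For context, below is a minimal, self-contained sketch of the pattern this commit applies: check the installed package version before enabling a feature, and fail fast with a clear error otherwise. The helper name `_accelerate_at_least` and the function `enable_tp` are hypothetical stand-ins for illustration; transformers performs this check inside TrainingArguments.__post_init__ via its internal is_accelerate_available(min_version) utility, whose implementation may differ.

# Hypothetical sketch of the version-guard pattern used in the diff above.
# `_accelerate_at_least` stands in for transformers' is_accelerate_available(min_version).
import os
import importlib.metadata

from packaging import version


def _accelerate_at_least(min_version: str) -> bool:
    # True only if `accelerate` is installed at or above min_version.
    try:
        installed = importlib.metadata.version("accelerate")
    except importlib.metadata.PackageNotFoundError:
        return False
    return version.parse(installed) >= version.parse(min_version)


def enable_tp(tp_size: int) -> None:
    # Mirrors the guarded path: refuse TP on an old accelerate install,
    # otherwise hand the TP configuration to accelerate via environment variables.
    if tp_size > 1:
        if not _accelerate_at_least("1.3.1"):
            raise NotImplementedError(
                "TP using PyTorch requires `accelerate` >= 1.3.1. "
                "Please update your `accelerate` version."
            )
        os.environ["ACCELERATE_USE_TP"] = "true"
        os.environ["TP_SIZE"] = str(tp_size)

With this guard in place, requesting tp_size > 1 against an accelerate older than 1.3.1 fails immediately at argument construction instead of surfacing an opaque error later in training setup.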
