typo fix (#2276)
* typo

* style
stas00 authored Dec 22, 2023
1 parent c5baa05 commit ceb7c69
Showing 1 changed file with 2 additions and 2 deletions.
src/accelerate/accelerator.py: 4 changes (2 additions & 2 deletions)
@@ -963,8 +963,8 @@ def accumulate(self, *models):
         Args:
             *models (list of `torch.nn.Module`):
-                PyTorch Modules that was prepared with `Accelerator.prepare`. Models passed to `accumulate()` will skip
-                gradient syncing during backward pass in distributed training
+                PyTorch Modules that were prepared with `Accelerator.prepare`. Models passed to `accumulate()` will
+                skip gradient syncing during backward pass in distributed training
         Example:
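
For context, the corrected docstring describes `Accelerator.accumulate()`, the gradient accumulation context manager that skips gradient syncing on accumulation steps. A minimal usage sketch follows; the model, optimizer, dataloader, and the gradient_accumulation_steps value are illustrative placeholders, not part of this commit.

# Minimal sketch of Accelerator.accumulate(), per the docstring above.
# The model/optimizer/dataloader definitions are illustrative placeholders.
import torch
from accelerate import Accelerator

accelerator = Accelerator(gradient_accumulation_steps=2)

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randn(64, 2))
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)

# prepare() wraps the objects for the current (possibly distributed) setup.
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

for inputs, targets in dataloader:
    # Inside accumulate(), gradient synchronization across processes is skipped
    # on accumulation steps and only happens when an optimizer step is due.
    with accelerator.accumulate(model):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
        accelerator.backward(loss)
        optimizer.step()
        optimizer.zero_grad()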
