About average in progress bar with on_epoch #14240
When logging a metric with `self.log(..., on_epoch=True)`, is the reported epoch value a simple mean of the per-step values, or does the averaging take the batch size into account?
It takes `batch_size` into account for averaging.
@rohitgr7 I think the docs are a little unclear about what happens under the hood. They read as if Lightning computed `reported_loss = torch.mean(step_values_over_epoch)`, where `step_values_over_epoch` holds the values logged at each training step. However, by inspecting the source code, I think Lightning actually does the following when `on_epoch=True`:
```python
# Before the epoch starts
self.value = 0
self.dataset_size = 0  # e.g. the number of images seen so far

# On each training step
self.value += step_value * batch_size  # undoes the reduction done by the criterion (e.g. MSELoss)
self.dataset_size += batch_size  # handles batches of different sizes (e.g. the last batch when drop_last=False)

# On epoch end
reported_loss = self.value / self.dataset_size
```
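To see why the two schemes differ, here is a small self-contained sketch contrasting the simple mean over steps with the batch-size-weighted accumulator above. The loss values and batch sizes are made up for illustration; the last batch is smaller, as with `drop_last=False`:

```python
# Hypothetical per-step losses (already reduced by the criterion, e.g. MSELoss)
# and the batch sizes they came from. The last batch has only 8 samples.
step_losses = [0.5, 0.3, 0.8]
batch_sizes = [32, 32, 8]

# Simple mean over steps -- what torch.mean(step_values_over_epoch) would give.
simple_mean = sum(step_losses) / len(step_losses)

# Batch-size-weighted mean -- the running-accumulator scheme sketched above.
value = 0.0
dataset_size = 0
for loss, bs in zip(step_losses, batch_sizes):
    value += loss * bs    # undo the per-batch reduction
    dataset_size += bs    # count samples, not steps
weighted_mean = value / dataset_size

print(simple_mean)    # 0.5333...
print(weighted_mean)  # (0.5*32 + 0.3*32 + 0.8*8) / 72 ≈ 0.4444
```

The small final batch pulls the simple mean toward its loss value more than its 8 samples warrant; the weighted mean gives every sample equal weight.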