Replies: 1 comment
Yes. When you use model_0 (or any PyTorch model) in a loop that iterates over a DataLoader, the model processes each batch of data in a single forward pass per iteration. In PyTorch, a DataLoader is typically set up to load data in batches: the batch_size argument in the DataLoader constructor specifies how many samples from your dataset to load per batch. When you iterate over the DataLoader, each iteration yields one batch (X, y), where X is a tensor holding the batch of input features and y is a tensor holding the corresponding batch of labels.
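A minimal sketch of this behavior, assuming a toy dataset and a simple linear layer standing in for the course's model_0 (the actual model and data from the chapter are not shown here):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset: 100 samples with 4 features each
X_all = torch.randn(100, 4)
y_all = torch.randint(0, 2, (100,)).float()

loader = DataLoader(TensorDataset(X_all, y_all), batch_size=32)

# Stand-in for model_0: a single linear layer
model_0 = nn.Linear(in_features=4, out_features=1)

for X, y in loader:
    # X has shape [batch_size, 4]; one forward pass produces
    # one output per sample in the batch, not per-sample loops.
    y_logits = model_0(X).squeeze(dim=1)  # shape: [batch_size]
    print(X.shape, y_logits.shape)
```

Note that the last batch may be smaller than batch_size (here 100 samples with batch_size=32 gives batches of 32, 32, 32, and 4), unless drop_last=True is passed to the DataLoader.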
Chapter 103: Writing training and testing loop for our batched data (minute 20)
My question is: does model_0 process the whole batch in one run?