🚀 The feature
Suggest using register_buffer() with persistent=False, so the buffer (e.g. the window of a spectrogram) will not be included in the module's state dict.
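A minimal sketch of the suggestion (not torchaudio's actual implementation; the module name and n_fft default are illustrative):

```python
import torch


class Spectrogram(torch.nn.Module):
    """Sketch of a transform whose window is a non-persistent buffer."""

    def __init__(self, n_fft: int = 400):
        super().__init__()
        # persistent=False keeps the window out of state_dict(), so
        # checkpoints saved before this transform existed still load
        # cleanly; the window is simply recomputed in __init__.
        self.register_buffer("window", torch.hann_window(n_fft), persistent=False)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        return torch.stft(
            waveform,
            n_fft=self.window.numel(),
            window=self.window,
            return_complex=True,
        )


module = Spectrogram()
print("window" in module.state_dict())  # False: buffer is excluded
```

The buffer still moves with the module on `.to(device)` / `.cuda()` calls; only checkpoint serialization is affected.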
Motivation, pitch
When I add new transforms to my model and load a pre-existing checkpoint, a missing key error is raised, e.g. for the missing resample kernel in transforms.Resample. This can be worked around by specifying strict=False at load time; however, I don't see any reason to save these buffers. They can be recomputed at construction time, which does not affect the model's behavior.
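The failure mode and the strict=False workaround can be reproduced with a toy model (the `kernel` buffer here is a hypothetical stand-in for a resample kernel):

```python
import torch


class OldModel(torch.nn.Module):
    """Model as it existed when the checkpoint was saved."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)


class NewModel(torch.nn.Module):
    """Same model after adding a transform with a persistent buffer."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)
        # Persistent by default, so old checkpoints lack this key:
        self.register_buffer("kernel", torch.ones(3))


ckpt = OldModel().state_dict()

try:
    NewModel().load_state_dict(ckpt)  # strict=True by default
except RuntimeError as e:
    print("strict load failed; missing 'kernel':", "kernel" in str(e))

# Workaround: tolerate missing keys explicitly.
result = NewModel().load_state_dict(ckpt, strict=False)
print(result.missing_keys)  # ['kernel']
```

With persistent=False on the buffer, neither the error nor the workaround would be needed.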
Alternatives
No response
Additional context
Same motivation as pytorch/pytorch#18056.