Is your feature request related to a problem? Please describe.
I couldn't find an option for turning off saving backups (.bk) for faceswap.py train.
The -ss is for snapshots, but I didn't find a way to turn off the .bk file.
The reason for wanting this is that, IMO, the backup is superfluous for educational experiments and unimportant sessions, while it adds extra writes to disk or takes up space on a RAM disk; the effect is multiplied when trying out different models, settings, and datasets while keeping their current states.
Describe the solution you'd like
An option, say:
-bk 0
...with default being 1.
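This is not an existing faceswap option; purely as an illustration, here is a minimal sketch of how such a toggle might be exposed with plain argparse (the flag wiring and helper names are hypothetical, not faceswap's actual CLI code):

import argparse

# Hypothetical sketch only -- not faceswap's real argument definitions.
parser = argparse.ArgumentParser(prog="faceswap.py train")
parser.add_argument("-bk", "--backup", type=int, choices=(0, 1), default=1,
                    help="Set to 0 to disable writing .bk backup files (default: 1)")
args = parser.parse_args()

# The save routine would then gate the backup step on the flag, e.g.:
#     if args.backup and should_backup:
#         write_bk_backup()   # hypothetical helper

With this, running the trainer with -bk 0 would keep saving the model as usual while skipping the .bk copy.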
Describe alternatives you've considered
I made a simple workaround for myself by adding a flag in plugins/train/model/_base.py which is set in the code (see https://github.com/Twenkid/faceswap/blob/master/plugins/train/model/_base.py):

DISABLE_BACKUP = True  # set to True to skip writing the .bk backup
...
    def _save(self):
        """ Backup and save the model and state file. """
        ...
        # Skip the backup step when the module-level flag is set
        if save_averages and self._should_backup(save_averages) and not DISABLE_BACKUP:
            ...
Honestly, only one backup is kept, and it is only created when the loss on both sides has dropped to its lowest average level between save iterations. This is not a huge overhead in disk space or time.
I will tag this as "suggestion", but I cannot see myself implementing this any time soon. As always, I welcome pull requests.
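For context on when that backup fires, here is a rough sketch of a "back up only on a new best average loss" check, in the spirit of the behaviour described above; the class and attribute names are hypothetical and this is not faceswap's actual implementation:

class BackupTracker:
    """ Track the lowest average loss seen per side and signal when to back up. """

    def __init__(self):
        self.best_a = float("inf")
        self.best_b = float("inf")

    def should_backup(self, avg_loss_a, avg_loss_b):
        """ Return True only when both sides' average loss hit a new low. """
        if avg_loss_a < self.best_a and avg_loss_b < self.best_b:
            self.best_a, self.best_b = avg_loss_a, avg_loss_b
            return True
        return False

Because the condition requires both sides to improve, at most one extra .bk write happens per save interval, and only while training is still making progress.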