Create a cron job to clean the dangling assets and cached assets #1284
Comments
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
keep open
As we now use signed URLs, only the assets referenced by a cache entry will be accessible. The others, if any, will "just" waste space on S3.
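For illustration, here is a minimal sketch of what such a cleanup cron job could look like, assuming a helper that can enumerate the S3 keys still referenced by cache entries. The bucket name, prefix, and helper are hypothetical placeholders, not the service's actual configuration:

```python
import boto3

# Hypothetical settings: the real bucket name, prefix, and the way cache entries
# are queried live in the datasets-server configuration and libcommon, not here.
BUCKET = "hf-datasets-server-assets"   # assumption
ASSETS_PREFIX = "assets/"              # assumption


def referenced_asset_keys() -> set[str]:
    """Return the S3 keys referenced by cache entries (placeholder).

    In the real service this would be built by scanning the cache database
    for asset URLs and mapping them back to S3 keys.
    """
    raise NotImplementedError


def delete_dangling_assets(dry_run: bool = True) -> int:
    """List every object under the assets prefix and delete those that no
    cache entry points to. Returns the number of dangling keys found."""
    s3 = boto3.client("s3")
    referenced = referenced_asset_keys()
    dangling: list[str] = []

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=ASSETS_PREFIX):
        for obj in page.get("Contents", []):
            if obj["Key"] not in referenced:
                dangling.append(obj["Key"])

    if not dry_run:
        # delete_objects accepts at most 1000 keys per call
        for i in range(0, len(dangling), 1000):
            s3.delete_objects(
                Bucket=BUCKET,
                Delete={"Objects": [{"Key": k} for k in dangling[i : i + 1000]]},
            )
    return len(dangling)
```

Running it first with `dry_run=True` would give the "which part is not needed anymore" number without touching anything.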
Also: the files on S3 should now be deleted when we delete a dataset (see https://github.com/huggingface/datasets-server/blob/c22f563f350a36148a1deaaf12531d4da991a677/libs/libcommon/src/libcommon/orchestrator.py#L678)
Note: as of today, the production S3 bucket contains 3TB of assets. It would be interesting to know which part is not needed anymore. Also: the /cached-assets directory could have a TTL of 1 day, or less, since those assets are generated on demand.
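If the TTL route is chosen, an S3 lifecycle rule could handle the cached-assets expiration without any cron job at all. A hedged sketch using boto3 (the bucket name and prefix are assumptions, and lifecycle expiration is expressed in whole days, so 1 day is the shortest possible TTL):

```python
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="hf-datasets-server-assets",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-cached-assets",
                # Assumption: cached assets live under a "cached-assets/" prefix
                "Filter": {"Prefix": "cached-assets/"},
                "Status": "Enabled",
                # Objects are deleted by S3 once they are older than 1 day
                "Expiration": {"Days": 1},
            }
        ]
    },
)
```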
Related to #1122