
Implement continual learning benchmark evaluation #5

Open
lguerdan opened this issue Apr 6, 2022 · 0 comments
lguerdan commented Apr 6, 2022

Implement three sets of benchmark evaluations for examining differential privacy convergence behavior under different learning settings.

  • Setting 1: Centralized learning. This is the current setting we consider in our evaluation, where the full training dataset is available in each epoch throughout training, and the full test dataset is also available.
  • Setting 2: Multi-Task learning
  • Setting 3: Single Incremental Task learning
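As a rough sketch of how Settings 2 and 3 differ from Setting 1 in data exposure: both incremental settings partition a dataset's classes into a sequence of tasks, while the centralized setting keeps everything in one "task". The helper name `make_task_splits` is illustrative, not from the repo:

```python
# Hypothetical sketch (not repo code): partition a dataset's class labels into
# sequential tasks, as the Multi-Task and Single Incremental Task settings require.
def make_task_splits(labels, classes_per_task):
    """Group the distinct class labels into ordered tasks, e.g. MNIST
    digits 0-9 into five two-class tasks."""
    classes = sorted(set(labels))
    return [classes[i:i + classes_per_task]
            for i in range(0, len(classes), classes_per_task)]

# Setting 1 (centralized): a single "task" containing every class.
# Settings 2/3: the model sees these tasks one after another; Multi-Task
# evaluation typically uses a separate output head per task, while Single
# Incremental Task shares one head across all tasks.
print(make_task_splits(list(range(10)), 2))
# → [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
```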

Resources:

  • This paper (especially Section 1.2 on continual learning benchmarks) gives a good overview of Multi-Task and Single Incremental Task learning. It also references a few papers that might have good reference implementations.
  • This Stack Exchange post gives a good overview of the difference between Single Incremental Task and Multi Task learning evaluations.
  • PyTorch custom Datasets & DataLoaders are probably the easiest way to implement these. This GitHub repo also gives a good overview.
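A minimal sketch of the custom-Dataset approach mentioned above, assuming a wrapper that exposes only one task's examples. It follows the PyTorch Dataset protocol (`__len__`/`__getitem__`); in the repo this would subclass `torch.utils.data.Dataset` and wrap the existing MNIST/CIFAR-10 datasets, but the class and names here are illustrative:

```python
# Illustrative per-task dataset wrapper (assumption, not repo code). A real
# version would subclass torch.utils.data.Dataset and be fed to a DataLoader.
class TaskSubset:
    """Expose only the examples whose label belongs to one task."""
    def __init__(self, data, targets, task_labels):
        self.data = data
        self.targets = targets
        # Precompute which indices of the underlying dataset fall in this task.
        self.indices = [i for i, y in enumerate(targets) if y in task_labels]

    def __len__(self):
        return len(self.indices)

    def __getitem__(self, idx):
        i = self.indices[idx]
        return self.data[i], self.targets[i]

# Toy usage with strings standing in for image tensors:
data = ["img0", "img1", "img2", "img3"]
targets = [0, 1, 0, 1]
task0 = TaskSubset(data, targets, {0})
print(len(task0), task0[0])  # → 2 ('img0', 0)
```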

Implementation notes:

  • This could be implemented by adding an additional loop in the train function that passes in a new trainloader for each epoch.
  • This evaluation could start with the MNIST and CIFAR-10 baselines that are currently implemented. Once #4 (Implement full set of benchmark datasets evaluated by Andrew et al.) is complete, you could also expand this implementation to include the additional set of benchmarks.
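The extra-loop idea above can be sketched as follows; `train_continual` and `train_one_epoch` are assumed names (the latter standing in for the body of the repo's existing train function), not actual repo APIs:

```python
# Hedged sketch of the implementation note: wrap the existing per-epoch
# training in an outer loop over tasks, handing the inner loop a new
# trainloader each time. All names here are assumptions.
def train_continual(model, task_loaders, epochs_per_task, train_one_epoch):
    history = []
    for task_id, loader in enumerate(task_loaders):
        for epoch in range(epochs_per_task):
            # In the real repo, the DP-SGD training step would run here.
            train_one_epoch(model, loader)
            history.append((task_id, epoch))
    return history

# Stub run: two "tasks", two epochs each, loaders replaced by labels.
log = []
hist = train_continual(model=None,
                       task_loaders=["loader_a", "loader_b"],
                       epochs_per_task=2,
                       train_one_epoch=lambda m, l: log.append(l))
print(hist)  # → [(0, 0), (0, 1), (1, 0), (1, 1)]
```

For Setting 1, `task_loaders` would hold a single loader over the full dataset, recovering the current centralized behavior.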
@cosmic20 linked a pull request Apr 20, 2022 that will close this issue