Currently, we have implemented the CIFAR-10 and MNIST datasets, which were used by Abadi et al. in the DP-SGD paper. Andrew et al. (https://arxiv.org/abs/1905.03871) evaluate a wider set of models on CIFAR-100, EMNIST CR, EMNIST AE, Shakespeare, StackOverflow NWP, and StackOverflow LR.

Add an additional set of benchmark datasets, ideally including language datasets such as Shakespeare and StackOverflow NWP. Torchvision includes a further set of useful datasets that should be easy to add with our current code API.
Adding a dataset involves:
Creating the relevant backbone model in the `models/` folder
Adding the model loader to the `load_model` function in `train.py`
Adding the dataset option in the `load_data()` function in `utils.py`
Adding the benchmark in the `run_exp()` function in `train.py`
Some more complex datasets may also require additional pre-processing code to load dataset batches in the correct form.
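As a rough illustration of the steps above, a new torchvision dataset could be wired into a `load_data()`-style dispatcher roughly as follows. This is a minimal sketch: the registry pattern and the function signatures here are hypothetical, not the repository's actual code, and the torchvision calls are shown only in comments so the sketch stays self-contained.

```python
# Hypothetical sketch of extending a load_data()-style dispatcher with a
# new dataset option. The registry and signatures are illustrative; in the
# real code the loaders would construct torchvision datasets.

DATASET_REGISTRY = {}

def register_dataset(name):
    """Decorator that records a loader function under a dataset name."""
    def wrap(fn):
        DATASET_REGISTRY[name] = fn
        return fn
    return wrap

@register_dataset("cifar10")
def load_cifar10(root="./data"):
    # Real code would call e.g.:
    #   torchvision.datasets.CIFAR10(root, download=True, transform=...)
    return {"name": "cifar10", "num_classes": 10}

@register_dataset("fashion_mnist")
def load_fashion_mnist(root="./data"):
    # A new torchvision benchmark would plug in the same way, e.g.:
    #   torchvision.datasets.FashionMNIST(root, download=True, transform=...)
    return {"name": "fashion_mnist", "num_classes": 10}

def load_data(dataset, root="./data"):
    """Look up the requested dataset, failing loudly on unknown names."""
    try:
        return DATASET_REGISTRY[dataset](root)
    except KeyError:
        raise ValueError(
            f"Unknown dataset {dataset!r}; "
            f"choose from {sorted(DATASET_REGISTRY)}"
        )
```

With this shape, adding a benchmark is a single decorated loader function, and `run_exp()` can select it by name without further branching.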