In this paper, we propose the Imbalanced Noisy Labels Calibration (INLC) approach.
Main dependencies:
- loguru 0.7.3
- torch 2.5.1

The datasets used in the experiments:
- ILSVRC2012: Images - Validation images (all tasks). 6.3 GB. [link]
- ImageNet32: Download downsampled image data (32x32, 64x64) - Train(32x32), 3 GB [link]
- WebVision1.0: Resized Images (small version) - Google Images Resized (16 GB) [link]
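The ImageNet32 train batches are typically distributed as Python pickle files, each a dict with `data` (uint8, N×3072) and 1-indexed `labels`. A minimal loading sketch, assuming that standard downsampled-ImageNet format (demonstrated here on a tiny synthetic batch rather than the real `train_data_batch_1`):

```python
import pickle
import numpy as np

def load_imagenet32_batch(path):
    """Load a downsampled-ImageNet batch: dict with 'data' (N x 3072 uint8)
    and 1-indexed 'labels'. Returns (N, 32, 32, 3) images, 0-indexed labels."""
    with open(path, "rb") as f:
        batch = pickle.load(f)
    data = np.asarray(batch["data"], dtype=np.uint8)
    images = data.reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)  # NCHW -> NHWC
    labels = np.asarray(batch["labels"]) - 1                    # shift to 0-indexed
    return images, labels

# Demo on a synthetic batch; a real file would be ~/data/imagenet32/train_data_batch_1.
fake = {"data": np.zeros((4, 3072), dtype=np.uint8), "labels": [1, 2, 3, 1000]}
with open("/tmp/fake_batch", "wb") as f:
    pickle.dump(fake, f)
images, labels = load_imagenet32_batch("/tmp/fake_batch")
print(images.shape, labels.tolist())  # (4, 32, 32, 3) [0, 1, 2, 999]
```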
We place the datasets in the ~/data directory; the directory structure is as follows:
├─cifar-10-batches-py
├─cifar-100-python
├─ILSVRC2012
│ ├─ILSVRC2012_devkit_t12
│ │ ├─data
│ │ └─evaluation
│ └─ILSVRC2012_img_val
├─imagenet32
│ ├─train_data_batch_1
│ ├─...
│ └─val_data
└─webvision1.0
├─google
│ ├─q0001
│ ├─q0002
│ ├─...
│ └─q1632
├─info
└─val_images_256

Training on CIFAR-10:
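In the commands below, `--r_imb 0.01` presumably sets the class-imbalance ratio. A common convention for building such a long-tailed split (an assumption about this repo, not verified against its code) is an exponential profile where class k keeps `n_max * r_imb ** (k / (K - 1))` samples:

```python
def longtail_counts(n_max, num_classes, r_imb):
    """Per-class sample counts for an exponential long-tailed profile:
    class k keeps n_max * r_imb**(k / (num_classes - 1)) samples."""
    return [int(n_max * r_imb ** (k / (num_classes - 1))) for k in range(num_classes)]

# CIFAR-10 (5000 samples/class) at imbalance ratio 0.01:
counts = longtail_counts(5000, 10, 0.01)
print(counts[0], counts[-1])  # head class keeps 5000, tail class keeps 50
```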
python main_cifar.py
python main_cifar.py --r_ood 0.2
python main_cifar.py --r_ood 0.2 --r_id 0.2
python main_cifar.py --r_ood 0.2 --r_id 0.2 --asym
python main_cifar.py --r_imb 0.01
python main_cifar.py --r_imb 0.01 --r_ood 0.2
python main_cifar.py --r_imb 0.01 --r_ood 0.2 --r_id 0.2
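For reference, `--r_id` and `--asym` presumably control the in-distribution label-noise ratio and its type. A common sketch of the two conventions (assumed, not this repo's exact implementation): symmetric noise flips a label uniformly to any other class, while asymmetric noise maps it to a fixed "next" class:

```python
import random

def inject_noise(labels, num_classes, r_id, asym=False, seed=0):
    """Corrupt a fraction r_id of labels. Symmetric: uniform over the other
    classes; asymmetric (assumed convention): shift to the next class mod K."""
    rng = random.Random(seed)
    noisy = list(labels)
    for i in range(len(noisy)):
        if rng.random() < r_id:
            if asym:
                noisy[i] = (noisy[i] + 1) % num_classes
            else:
                noisy[i] = rng.choice([c for c in range(num_classes) if c != noisy[i]])
    return noisy

clean = [i % 10 for i in range(10000)]
noisy = inject_noise(clean, num_classes=10, r_id=0.2)
flip_rate = sum(a != b for a, b in zip(clean, noisy)) / len(clean)
print(round(flip_rate, 2))  # close to the requested 0.2
```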
python main_cifar.py --r_imb 0.01 --r_ood 0.2 --r_id 0.2 --asym

Try strategy 2 or replace the loss with focal loss on CIFAR-10:
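The `--loss focal` option presumably swaps cross-entropy for focal loss (Lin et al., 2017), FL(p_t) = -(1 - p_t)^γ log p_t, which down-weights well-classified samples. A minimal numpy sketch (γ = 2 assumed; the repo's exact variant may differ):

```python
import numpy as np

def focal_loss(logits, targets, gamma=2.0):
    """Mean of -(1 - p_t)**gamma * log(p_t), where p_t is the softmax
    probability of the true class. gamma=0 recovers cross-entropy."""
    z = logits - logits.max(axis=1, keepdims=True)          # stabilized softmax
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    p_t = probs[np.arange(len(targets)), targets]
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t)))

logits = np.array([[4.0, 0.0, 0.0],    # confident, correct
                   [0.5, 0.4, 0.3]])   # uncertain
targets = np.array([0, 0])
ce = focal_loss(logits, targets, gamma=0.0)   # plain cross-entropy
fl = focal_loss(logits, targets, gamma=2.0)   # easy sample down-weighted
print(ce > fl)  # True
```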
python main_cifar.py --s2
python main_cifar.py --loss focal

Training on CIFAR-100:
python main_cifar.py --dataset 100 --tau 0.6
python main_cifar.py --dataset 100 --tau 0.6 --r_ood 0.2
python main_cifar.py --dataset 100 --tau 0.6 --r_ood 0.2 --r_id 0.2 --warm_epochs 20
python main_cifar.py --dataset 100 --tau 0.6 --r_ood 0.2 --r_id 0.2 --asym --warm_epochs 20
python main_cifar.py --dataset 100 --tau 0.6 --r_imb 0.01
python main_cifar.py --dataset 100 --tau 0.6 --r_imb 0.01 --r_ood 0.2
python main_cifar.py --dataset 100 --tau 0.6 --r_imb 0.01 --r_ood 0.2 --r_id 0.2 --warm_epochs 20
python main_cifar.py --dataset 100 --tau 0.6 --r_imb 0.01 --r_ood 0.2 --r_id 0.2 --asym --warm_epochs 20

Training or evaluating on WebVision 1.0:
python main_webvision.py --s2
python main_webvision.py --test

If training collapses, try increasing the value of --warm_epochs.
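In noisy-label training, a warm-up phase typically means optimizing a plain loss for the first `--warm_epochs` epochs before the method's calibration/selection machinery kicks in; longer warm-up gives the selection statistics a more stable model to work from. A skeleton of that assumed two-phase schedule (phase names are illustrative, not this repo's identifiers):

```python
def epoch_phases(num_epochs, warm_epochs):
    """Assumed two-phase schedule: plain warm-up training first, the main
    method afterwards. Real code would run one training epoch per step."""
    return ["warmup" if epoch < warm_epochs else "method"
            for epoch in range(num_epochs)]

print(epoch_phases(num_epochs=5, warm_epochs=2))
# ['warmup', 'warmup', 'method', 'method', 'method']
```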