May I ask whether the DLPM results on CIFAR-10-LT reported in your paper were obtained by launching training with the cifar10_lt.yml configuration file in the codebase? I used the setting with alpha = 1.5, and when I evaluated FID on 5,000 generated samples I only got 35.56, whereas the paper reports 16.10.