This is the official implementation of the ICIP paper "DISTRIBUTION PADDING IN CONVOLUTIONAL NEURAL NETWORKS".
Recently, I have experienced some NaN errors caused by my custom frac_bilinear_upsampling, so the following environment is highly recommended:
Python == 3.6
Theano == 1.0.3
CUDA == 9.0
CuDNN == 7.5
NVIDIA TITAN X GPU (12 GB)
If you encounter NaNs, one workaround is to use Theano's built-in frac_bilinear_upsampling. However, this requires some modification, as that op does not accept symbolic variables for the resize argument.
To train the model on CIFAR10, simply run
python train.py path/to/CIFAR10
If the dataset is not in path/to/CIFAR10, it will be downloaded automatically.
By default, the model architecture is ResNet34 with mean interpolation padding.
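As a rough illustration of the idea behind mean interpolation padding (a sketch based on this README's mention of fractional bilinear upsampling, not the repository's actual Theano implementation): the feature map is bilinearly resized to the padded size, and the original values are then restored in the interior, so the border is filled with interpolated values rather than zeros. The function names below are hypothetical.

```python
import numpy as np

def bilinear_resize(x, out_h, out_w):
    """Bilinearly resize a 2-D array to (out_h, out_w), sampling with aligned corners."""
    in_h, in_w = x.shape
    # Map each output pixel back to fractional input coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]  # vertical interpolation weights
    wx = (xs - x0)[None, :]  # horizontal interpolation weights
    top = x[y0][:, x0] * (1 - wx) + x[y0][:, x1] * wx
    bot = x[y1][:, x0] * (1 - wx) + x[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def mean_interpolation_pad(x, pad):
    """Pad a 2-D feature map: upsample to the padded size, then restore the original center."""
    h, w = x.shape
    out = bilinear_resize(x, h + 2 * pad, w + 2 * pad)
    out[pad:pad + h, pad:pad + w] = x  # keep the interior unchanged
    return out
```

In a network, this per-map operation would replace the zero-padding step before each convolution; the border values then follow the feature map's own distribution instead of being constant.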
Training runs for 100 epochs and evaluates on the test set every 1000 iterations.
Run
python train.py -h
to see all available options.
If you find this code helpful for your research, please consider citing the paper:
@INPROCEEDINGS{DistPaddingNguyen2019,
  author={A. {Nguyen} and S. {Choi} and W. {Kim} and S. {Ahn} and J. {Kim} and S. {Lee}},
  booktitle={2019 IEEE International Conference on Image Processing (ICIP)},
  title={Distribution Padding in Convolutional Neural Networks},
  year={2019},
  pages={4275-4279},
  keywords={Deep learning;convolutional neural network;image padding},
  doi={10.1109/ICIP.2019.8803537},
  month={Sep.},
}
The partial convolution implementation is adapted and simplified from the official repo.