This repository contains a collection of custom learning rate schedulers. Each scheduler class can be used to decay the learning rate during training.
pip install numpy matplotlib
The learning rate is decayed linearly from the initial learning rate to the minimum learning rate over a specified number of epochs. The formula for the learning rate is given below:
lr = lr_min + (lr_max - lr_min) * (1 - epoch / epochs)
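A minimal sketch of this schedule as a plain function, built directly from the formula above. The function and parameter names (lr_max, lr_min, epochs) are placeholders; the repository's actual class names and constructor signatures may differ.

```python
import matplotlib.pyplot as plt

def linear_decay(epoch, epochs, lr_max=0.1, lr_min=0.001):
    """Interpolate linearly from lr_max down to lr_min over `epochs` epochs."""
    return lr_min + (lr_max - lr_min) * (1 - epoch / epochs)

# Plot the schedule over 100 epochs to visualize the decay.
epochs = 100
lrs = [linear_decay(e, epochs) for e in range(epochs + 1)]
plt.plot(lrs)
plt.xlabel("epoch")
plt.ylabel("learning rate")
plt.title("Linear decay")
plt.show()
```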
The learning rate is decayed exponentially over a specified number of epochs. The formula for the learning rate is given below:
lr = lr_min + (lr_max - lr_min) * (decay_rate) ^ (epoch / epochs)
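A corresponding sketch for the exponential schedule, again assuming hypothetical parameter names (decay_rate controls how quickly the rate approaches lr_min):

```python
def exponential_decay(epoch, epochs, lr_max=0.1, lr_min=0.001, decay_rate=0.01):
    """Decay from lr_max toward lr_min following decay_rate ** (epoch / epochs)."""
    return lr_min + (lr_max - lr_min) * decay_rate ** (epoch / epochs)

# At epoch 0 this returns lr_max; at the final epoch it returns
# lr_min + (lr_max - lr_min) * decay_rate, which is close to lr_min
# for small decay_rate values.
print(exponential_decay(0, 100), exponential_decay(100, 100))
```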
The learning rate is decayed by a factor of gamma every step_size epochs. The formula for the learning rate is given below:
lr = lr_max * gamma ^ floor(epoch / step_size)
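A sketch of the step schedule. The floor ensures the rate only drops once every step_size epochs, matching the description above; the parameter names are illustrative and may differ from the repository's API.

```python
import math

def step_decay(epoch, lr_max=0.1, gamma=0.5, step_size=10):
    """Multiply the learning rate by gamma once every step_size epochs."""
    return lr_max * gamma ** math.floor(epoch / step_size)

# Epochs 0-9 use lr_max, epochs 10-19 use lr_max * gamma, and so on.
print([step_decay(e) for e in (0, 9, 10, 19, 20)])
```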
The learning rate is decayed polynomially from the initial learning rate over a specified number of epochs, with the curvature controlled by the power parameter. The formula for the learning rate is given below:
lr = lr_max * (1 - epoch / epochs) ^ (power)
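A sketch of the polynomial schedule under the same assumptions about naming. With power = 1 this reduces to a linear decay toward zero; larger powers decay faster early on.

```python
def polynomial_decay(epoch, epochs, lr_max=0.1, power=2.0):
    """Decay from lr_max toward zero following (1 - epoch / epochs) ** power."""
    return lr_max * (1 - epoch / epochs) ** power

# Quadratic decay (power=2) over 100 epochs.
print([round(polynomial_decay(e, 100), 4) for e in (0, 25, 50, 75, 100)])
```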
- Add more schedulers
- Create examples for each scheduler
- Add documentation
Everyone is welcome to contribute to this repository.