
Cyclic learning rates #56

Open
2 tasks
JohnGiorgi opened this issue Oct 15, 2018 · 0 comments
Labels: enhancement, feature

Comments

JohnGiorgi (Contributor) commented Oct 15, 2018

We should test whether the cyclical learning rate finder (paper: here), combined with an adaptive optimizer (e.g., Adam), improves on our current optimizer (Nadam).

Todo

  • Use the Cyclic LR Keras Callback to determine an optimal learning rate.
  • Try this learning rate with a few different optimizers (starting with Adam). Does it beat our current optimization method?
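For reference, the triangular policy behind the Cyclic LR Keras Callback can be sketched in a few lines of plain Python. The `base_lr`, `max_lr`, and `step_size` values below are illustrative defaults (borrowed from the paper's examples), not values tuned for our model:

```python
import math

def triangular_clr(iteration, base_lr=1e-3, max_lr=6e-3, step_size=2000):
    """Triangular cyclical learning rate (Smith, 2015).

    The rate climbs linearly from base_lr to max_lr over step_size
    iterations, then descends back, repeating every 2 * step_size.
    """
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)

# Endpoints of the first cycle:
print(triangular_clr(0))      # base_lr at the start of a cycle
print(triangular_clr(2000))   # max_lr at the cycle midpoint
print(triangular_clr(4000))   # back to base_lr
```

Wrapping this in a `keras.callbacks.Callback` that sets the optimizer's learning rate in `on_batch_begin` is essentially what the linked callback does; the LR range test is the same schedule run once while logging loss against the rate.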

Resources

@JohnGiorgi JohnGiorgi changed the title Cyclic learning rates (w/adam) and SGDR Cyclic learning rates Nov 13, 2018
Projects: none yet
Development: no branches or pull requests
2 participants