
crack detection model question #142


Description

@gkseogus

With PyTorch 1.1 or later, if the learning rate scheduler's step() is called before optimizer.step(), a warning is raised saying that the first value of the learning rate schedule will be skipped. How should we deal with this?

The warning is shown below.
UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
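As the warning itself suggests, the usual fix is to call optimizer.step() inside the batch loop and only call scheduler.step() afterwards (typically once per epoch). Below is a minimal sketch of that ordering; the model, data, and hyperparameters are placeholders, not the actual crack detection setup from this repo.

```python
import torch
import torch.nn as nn

# Placeholder model and data; the real project would use its own
# crack detection model and DataLoader here.
model = nn.Linear(10, 2)
loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(8)]

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(10):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()   # update the weights first
    scheduler.step()       # then advance the LR schedule, once per epoch
```

With this ordering the warning disappears and the first learning rate value is actually used for the first optimizer update.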
