
Conversation

seemuch (Collaborator) commented Nov 9, 2018

No description provided.

seemuch requested a review from anj-s November 9, 2018 01:39
seemuch changed the title from "Loss fn" to "Update Keras loss function to include l2 loss, like estimator" Nov 9, 2018
entropy_loss = _softmax_crossentropy_with_logits(y_true, y_pred)
l2_loss = weight_decay * tf.add_n(
    # loss is computed using fp32 for numerical stability.
    [tf.nn.l2_loss(tf.cast(v, tf.float32)) for v in tf.trainable_variables()])
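
For context, a minimal sketch of how the snippet above could be assembled into a standalone Keras-style loss function (TF 1.x era). The weight_decay value and the use of a standard TensorFlow call in place of the PR's _softmax_crossentropy_with_logits helper are assumptions for illustration, not taken from this PR:

import tensorflow as tf

# Assumed value for illustration; the PR's actual weight decay may differ.
weight_decay = 1e-4

def loss_with_l2(y_true, y_pred):
    """Cross-entropy plus an L2 penalty over all trainable variables."""
    # Standard sparse softmax cross-entropy stands in for the PR's
    # _softmax_crossentropy_with_logits helper (an assumption).
    entropy_loss = tf.keras.backend.sparse_categorical_crossentropy(
        y_true, y_pred, from_logits=True)
    # L2 term computed in fp32 for numerical stability, mirroring the
    # estimator's model_fn as referenced in the PR title.
    l2_loss = weight_decay * tf.add_n(
        [tf.nn.l2_loss(tf.cast(v, tf.float32))
         for v in tf.trainable_variables()])
    return entropy_loss + l2_loss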
A collaborator commented on these lines in review:
Is this populated for Keras? I think we may need to add regularization losses to the layers explicitly. What is the difference in loss that you see with this change?
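
The alternative the reviewer describes, adding regularization losses to the layers explicitly, would look roughly like the sketch below. The layer shapes and weight_decay value are placeholders, not part of this PR; in Keras, per-layer kernel_regularizer terms are collected in model.losses and folded into the training loss:

import tensorflow as tf

# Placeholder value for illustration only.
weight_decay = 1e-4

model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        128, activation='relu', input_shape=(784,),
        kernel_regularizer=tf.keras.regularizers.l2(weight_decay)),
    tf.keras.layers.Dense(
        10, kernel_regularizer=tf.keras.regularizers.l2(weight_decay)),
])

# Each layer's L2 term is tracked here and added to the compiled loss by
# Keras during training (model.fit).
print(model.losses)

One scaling detail worth noting: tf.keras.regularizers.l2(weight_decay) computes weight_decay * sum(w**2), while tf.nn.l2_loss includes a factor of 1/2, so the two approaches differ by a factor of two unless the coefficient is adjusted.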
