How to create a custom loss function by adding negative entropy to the cross-entropy?

Issue

I recently read the paper "Regularizing Neural Networks by Penalizing Confident Output Distributions" (https://arxiv.org/abs/1701.06548). The authors regularize neural networks by penalizing low-entropy output distributions: they add a negative entropy term to the negative log-likelihood, yielding a custom loss function for model training:

L(θ) = −∑ log p_θ(y|x) − β H(p_θ(y|x))

where H(p_θ(y|x)) = −∑_i p_θ(y_i|x) log p_θ(y_i|x) is the entropy of the output distribution.

The value β controls the strength of the confidence penalty. I have written a custom function for categorical cross-entropy, shown below, but the negative entropy term still needs to be added to the loss.

import tensorflow as tf

def custom_loss(y_true, y_pred):
    # Standard categorical cross-entropy; the entropy penalty is still missing.
    cce = tf.keras.losses.CategoricalCrossentropy()
    cce_loss = cce(y_true, y_pred)
    return cce_loss

Solution

The entropy of y_pred is exactly the categorical cross-entropy of y_pred with itself:

H(p) = −∑_i p_i log p_i = CE(p, p), with p = y_pred
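
This identity is easy to check numerically. A minimal sketch (the 3-class distribution below is made up for illustration):

import tensorflow as tf

# Arbitrary probability distribution over 3 classes (illustrative values).
p = tf.constant([[0.7, 0.2, 0.1]])

# Entropy computed directly: H(p) = -sum(p * log p).
entropy = -tf.reduce_sum(p * tf.math.log(p), axis=-1)

# Cross-entropy of p with itself gives the same number.
cce = tf.keras.losses.CategoricalCrossentropy()
print(entropy.numpy(), cce(p, p).numpy())  # both print approximately 0.8018

Substituting this identity into the loss gives: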

import tensorflow as tf

def custom_loss(y_true, y_pred, beta):
    cce = tf.keras.losses.CategoricalCrossentropy()
    # Entropy of y_pred is the cross-entropy of y_pred with itself.
    return cce(y_true, y_pred) - beta * cce(y_pred, y_pred)
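
Note that Keras invokes a loss with the signature (y_true, y_pred) only, so β has to be bound before compiling, for example with a closure. A minimal sketch, assuming a generic softmax classifier (the factory name, β value, and architecture below are illustrative, not from the original answer):

import tensorflow as tf

def make_confidence_penalty_loss(beta=0.1):
    # Bind beta so the inner function matches Keras's (y_true, y_pred) signature.
    cce = tf.keras.losses.CategoricalCrossentropy()
    def loss(y_true, y_pred):
        return cce(y_true, y_pred) - beta * cce(y_pred, y_pred)
    return loss

# Illustrative usage with a toy classifier:
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss=make_confidence_penalty_loss(beta=0.1),
              metrics=["accuracy"])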

Answered By – Ivan

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
