tf.keras.losses.KLD

Computes Kullback-Leibler divergence loss between y_true and y_pred.

Aliases: tf.keras.losses.kld, tf.keras.losses.kullback_leibler_divergence

tf.keras.losses.KLD(
    y_true,
    y_pred
)

loss = sum(y_true * log(y_true / y_pred), axis=-1)

See: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

Usage:

loss = tf.keras.losses.KLD([.4, .9, .2], [.5, .8, .12])
print('Loss: ', loss.numpy())  # Loss: 0.11891246
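The printed value can be checked by hand against the formula above; a minimal sketch in plain Python:

```python
import math

# Recompute the documented example directly from
# loss = sum(y_true * log(y_true / y_pred)) over the vector.
y_true = [0.4, 0.9, 0.2]
y_pred = [0.5, 0.8, 0.12]
loss = sum(t * math.log(t / p) for t, p in zip(y_true, y_pred))
print(loss)  # ~0.118912, matching the tensor result above
```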

Args:

  • y_true: Tensor of true targets.
  • y_pred: Tensor of predicted targets.

Returns:

A Tensor with the loss.

Raises:

  • TypeError: If y_true cannot be cast to the dtype of y_pred.
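As a rough sketch of what the function computes for 1-D probability vectors, the following models the clipping that keeps log() finite when a prediction is zero. The EPSILON constant and the exact clipping behavior here are assumptions patterned on typical Keras-backend implementations, not the literal TensorFlow code:

```python
import math

EPSILON = 1e-7  # assumed clipping constant, not the exact backend epsilon

def kld(y_true, y_pred):
    """Sketch of KL divergence loss for two 1-D probability vectors.

    Both inputs are clipped into [EPSILON, 1] so log() stays finite;
    the real op works on tensors and reduces over the last axis.
    """
    clip = lambda x: min(max(x, EPSILON), 1.0)
    return sum(clip(t) * math.log(clip(t) / clip(p))
               for t, p in zip(y_true, y_pred))
```

Identical distributions give a loss of 0, and a zero in y_pred yields a large but finite penalty rather than an infinity.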