tf.keras.backend.elu

Exponential linear unit.

tf.keras.backend.elu(
    x,
    alpha=1.0
)

Arguments:

  • x: A tensor or variable to compute the activation function for.
  • alpha: A scalar, the slope of the negative section.

Returns:

A tensor.
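
A minimal usage sketch, assuming eager execution and an illustrative input tensor: ELU returns x for positive inputs and alpha * (exp(x) - 1) for non-positive inputs.

```python
import tensorflow as tf

# Sample inputs spanning negative and positive values.
x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])

# ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0.
y = tf.keras.backend.elu(x, alpha=1.0)

print(y.numpy())  # approx. [-0.8647, -0.3935, 0., 1., 3.]
```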