Exponential linear unit.
tf.keras.activations.elu(
    x,
    alpha=1.0
)
Arguments:
x: Input tensor.
alpha: A scalar, slope of the negative section.
Returns:
The exponential linear activation: x if x > 0, and alpha * (exp(x) - 1) if x < 0.
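
A minimal usage sketch, assuming TensorFlow 2.x with eager execution; the input values are illustrative only:

import tensorflow as tf

# Sample inputs spanning negative and positive values.
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Default alpha=1.0: negative inputs map to exp(x) - 1, positive inputs pass through.
print(tf.keras.activations.elu(x).numpy())
# approximately [-0.865, -0.632, 0., 1., 2.]

# A larger alpha scales the negative section.
print(tf.keras.activations.elu(x, alpha=2.0).numpy())
# approximately [-1.729, -1.264, 0., 1., 2.]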