Compute the Leaky ReLU activation function.
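Elementwise, it computes `max(alpha * features, features)`, so negative inputs are scaled by `alpha` rather than zeroed.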
```python
tf.nn.leaky_relu(
    features,
    alpha=0.2,
    name=None
)
```
Args:
- `features`: A `Tensor` representing preactivation values. Must be one of the following types: `float16`, `float32`, `float64`, `int32`, `int64`.
- `alpha`: Slope of the activation function at `x < 0`.
- `name`: A name for the operation (optional).
Returns:
The activation value.
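A minimal usage sketch, assuming TensorFlow 2.x with eager execution:

```python
import tensorflow as tf

# Example preactivation values, including negatives.
x = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])

# Negative inputs are scaled by alpha; non-negative inputs pass through.
y = tf.nn.leaky_relu(x, alpha=0.2)

print(y.numpy())  # [-0.4 -0.1  0.   1.   3. ]
```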