CELU
- class torch.nn.CELU(alpha=1.0, inplace=False)
Applies the CELU function element-wise.
\[\text{CELU}(x) = \max(0,x) + \min(0, \alpha * (\exp(x/\alpha) - 1))\]

More details can be found in the paper Continuously Differentiable Exponential Linear Units.
- Parameters:
alpha (float) – the \(\alpha\) value for the CELU formulation. Default: 1.0
inplace (bool) – can optionally do the operation in-place. Default: False
- Shape:
Input: \((*)\), where \(*\) means any number of dimensions.
Output: \((*)\), same shape as the input.
Examples:
>>> m = nn.CELU()
>>> input = torch.randn(2)
>>> output = m(input)
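As a quick sanity check of the formula above, the following sketch (not part of the official example) computes CELU manually with clamp and exp and compares it against nn.CELU; the helper name, tensor size, and alpha value are illustrative assumptions.

>>> import torch
>>> import torch.nn as nn
>>> def manual_celu(x, alpha=1.0):
...     # max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
...     return torch.clamp(x, min=0) + torch.clamp(alpha * (torch.exp(x / alpha) - 1), max=0)
...
>>> x = torch.randn(5)
>>> m = nn.CELU(alpha=0.5)
>>> torch.allclose(m(x), manual_celu(x, alpha=0.5))
True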