description: Batch normalization.

tf.nn.batch_normalization

Batch normalization.

Normalizes a tensor by mean and variance, and optionally applies a scale \(\gamma\) to it, as well as an offset \(\beta\):

\(\frac{\gamma(x-\mu)}{\sigma}+\beta\)

Here \(\mu\) is mean and \(\sigma = \sqrt{\text{variance} + \text{variance\_epsilon}}\).

mean, variance, offset, and scale are all expected to have one of two shapes (both conventions are sketched below):

- In full generality, they can have the same number of dimensions as the input x, with size 1 along the dimensions that are normalized over and the same sizes as x along the remaining "depth" dimension(s). In this case, mean and variance would typically be the outputs of tf.nn.moments(..., keepdims=True) during training, or running averages thereof during inference.
- In the common case where the depth dimension is the last dimension of x, they may be 1-D tensors of the same size as the depth dimension. This covers, for example, the usual [batch, depth] layout of fully connected layers and the [batch, height, width, depth] layout of convolutions. In this case, mean and variance would typically be the outputs of tf.nn.moments(..., keepdims=False) during training, or running averages thereof during inference.

See equation 11 in Algorithm 2 of the source paper: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (Ioffe & Szegedy, 2015).
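
As a concrete illustration, here is a minimal sketch (assuming TensorFlow 2.x; the tensor shapes are illustrative, not required) of how statistics matching both shape conventions can be computed with tf.nn.moments:

```python
import tensorflow as tf

# Illustrative NHWC input: [batch, height, width, depth].
x = tf.random.normal([8, 32, 32, 16])

# Convention 1: statistics keep the rank of x, with size 1 on the normalized axes.
mean_kd, var_kd = tf.nn.moments(x, axes=[0, 1, 2], keepdims=True)   # shape [1, 1, 1, 16]

# Convention 2: 1-D statistics over the trailing "depth" dimension.
mean_1d, var_1d = tf.nn.moments(x, axes=[0, 1, 2], keepdims=False)  # shape [16]
```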

Args:
x: Input Tensor of arbitrary dimensionality.
mean: A mean Tensor.
variance: A variance Tensor.
offset: An offset Tensor, often denoted \(\beta\) in equations, or None. If present, will be added to the normalized tensor.
scale: A scale Tensor, often denoted \(\gamma\) in equations, or None. If present, the scale is applied to the normalized tensor.
variance_epsilon: A small float number to avoid dividing by 0.
name: A name for this operation (optional).

Returns:
The normalized, scaled, offset tensor.
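
Putting it together, a minimal end-to-end usage sketch (assuming TensorFlow 2.x; the shapes, epsilon value, and trainable parameters are illustrative):

```python
import tensorflow as tf

x = tf.random.normal([8, 32, 32, 16])              # [batch, height, width, depth]

# Per-depth statistics over the batch and spatial axes (1-D, shape [16]).
mean, variance = tf.nn.moments(x, axes=[0, 1, 2])

offset = tf.Variable(tf.zeros([16]))               # beta, learned in practice
scale = tf.Variable(tf.ones([16]))                 # gamma, learned in practice

y = tf.nn.batch_normalization(
    x, mean, variance, offset, scale, variance_epsilon=1e-3)
print(y.shape)  # (8, 32, 32, 16)
```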

References:

Ioffe, S., & Szegedy, C. (2015). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. https://arxiv.org/abs/1502.03167