mygrad.nnet.layers.batchnorm
- mygrad.nnet.layers.batchnorm(x: ArrayLike, *, gamma: ArrayLike | None = None, beta: ArrayLike | None = None, eps: float, constant: bool | None = None) -> Tensor
Performs batch normalization on x:

    y(x) = (x - E[x]) / sqrt(Var[x] + eps)
    batchnorm(x) = gamma * y(x) + beta

where \(E[x]\) and \(Var[x]\) are the mean and variance, respectively, computed per channel, i.e. over all axes of x other than the channel axis (axis 1). The subsequent affine transformation on y is optional.

- Parameters:
- x : array_like, shape=(N, C, …)
  The batch to be normalized within each of its C channels.
- gamma : Optional[array_like], shape=(C,)
  Optional per-channel scaling factors to be applied after the normalization step.
- beta : Optional[array_like], shape=(C,)
  Optional per-channel bias terms to be applied after the normalization step.
- eps : Real
  A small non-negative number added to the variance to avoid division by zero.
- constant : bool, optional (default=False)
  If True, the resulting Tensor is a constant.
- Returns:
  - mygrad.Tensor
    The batch-normalized data.
Examples
>>> import mygrad as mg
>>> from mygrad.nnet import batchnorm
>>> x = mg.Tensor([1., 4., 1.]).reshape(3, 1)
>>> batchnorm(x, eps=0)
Tensor([[-0.70710678],
        [ 1.41421356],
        [-0.70710678]])
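For reference, the normalization step can be reproduced with plain NumPy. The following is a minimal sketch, assuming the statistics are computed per channel over all axes of x other than axis 1, as described above; the helper name batchnorm_ref is hypothetical and not part of mygrad:

import numpy as np

def batchnorm_ref(x, gamma=None, beta=None, *, eps):
    # Hypothetical reference implementation, for illustration only.
    # Per-channel statistics: reduce over every axis except axis 1.
    axes = (0,) + tuple(range(2, x.ndim))
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    y = (x - mean) / np.sqrt(var + eps)
    # Optional affine transformation: broadcast the shape-(C,)
    # parameters across every non-channel axis.
    shape = (1, x.shape[1]) + (1,) * (x.ndim - 2)
    if gamma is not None:
        y = y * np.reshape(gamma, shape)
    if beta is not None:
        y = y + np.reshape(beta, shape)
    return y

x = np.array([1., 4., 1.]).reshape(3, 1)
print(batchnorm_ref(x, eps=0))
# [[-0.70710678]
#  [ 1.41421356]
#  [-0.70710678]]

This agrees with the doctest above: for the single channel, the mean is 2 and the variance is 2, so the entries become (1 - 2)/sqrt(2) and (4 - 2)/sqrt(2).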
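To illustrate the optional affine step, gamma and beta can be supplied as shape-(C,) array-likes. A small sketch continuing the example above; the printed values follow directly from gamma * y(x) + beta, though the exact repr formatting is not taken from mygrad's output:

>>> gamma = mg.Tensor([2.])  # per-channel scale, shape-(1,)
>>> beta = mg.Tensor([1.])   # per-channel offset, shape-(1,)
>>> batchnorm(x, gamma=gamma, beta=beta, eps=0)
Tensor([[-0.41421356],
        [ 3.82842712],
        [-0.41421356]])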