B

Batch Normalization

Definition

A technique that normalizes the inputs of each layer in a neural network: activations are standardized using the mean and variance computed over the current mini-batch, then rescaled and shifted by learnable parameters. Batch normalization stabilizes training, allows higher learning rates, and reduces sensitivity to weight initialization.
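The training-time forward pass described above can be sketched in NumPy. This is a minimal illustration, not a production implementation; the function name and parameters (`gamma` for the learnable scale, `beta` for the learnable shift, `eps` for numerical stability) are illustrative assumptions, not from the source.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Illustrative batch-norm forward pass for a 2-D input.

    x: array of shape (batch, features).
    gamma, beta: learnable per-feature scale and shift (assumed names).
    """
    # Per-feature statistics over the mini-batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Standardize to roughly zero mean and unit variance.
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale and shift restore representational capacity.
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 10))
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
```

At inference time, frameworks typically replace the mini-batch statistics with running averages accumulated during training, so the output no longer depends on batch composition.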
