What is batch normalization and why is it important?
Batch normalization improves the stability and speed of neural network training. It normalizes the activations of a layer over each mini-batch to zero mean and unit standard deviation, then applies learnable scale and shift parameters so the layer can still represent any distribution. It is important because it allows higher learning rates, reduces sensitivity to weight initialization, and adds a mild regularizing effect, which makes deep and complex networks converge faster and more reliably.
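As a rough illustration, here is a minimal NumPy sketch of the batch-norm forward pass for a fully connected layer during training. The function name `batch_norm_forward` and the `eps` argument are illustrative assumptions, not part of the question above.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch of activations, then scale and shift.

    x     : activations, shape (batch_size, num_features)
    gamma : learnable scale, shape (num_features,)
    beta  : learnable shift, shape (num_features,)
    """
    mu = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta            # learnable rescaling keeps capacity

# Example: a batch of 4 samples with 3 features, far from zero mean / unit variance
x = np.random.randn(4, 3) * 10 + 5
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # approximately 0 per feature
print(out.std(axis=0))   # approximately 1 per feature
```

At inference time, frameworks replace the per-batch statistics with running averages collected during training, since a single test example has no meaningful batch mean or variance.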