From the course: Deep Learning: Model Optimization and Tuning
Batch normalization
- [Instructor] Batch Normalization is an important technique to manage vanishing and exploding gradients during gradient descent. In Batch Normalization, we normalize the inputs sent to each hidden layer. When we say normalize, we center and scale the values, similar to what a StandardScaler does. For normalization, the outputs of the hidden layer are used to compute the mean and standard deviation. Even if the delta updates and the activation function scale the values up or down, this step brings the inputs back to the same scale. Batch Normalization helps achieve higher accuracy in fewer epochs, hence it is also an optimization technique. On the other hand, it adds computation during training and inference and may increase resource utilization and execution times. The experiments for this chapter are available in the notebook 03_XX Tuning Back…
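As a rough illustration of where these normalization layers sit, here is a minimal sketch in Keras. The layer sizes, input shape, and loss are illustrative assumptions, not taken from the course notebook; the idea is simply that a BatchNormalization layer placed after a hidden layer centers and scales that layer's outputs per mini-batch before they feed the next layer.

```python
# Minimal sketch (assumed setup, not the course notebook):
# Batch Normalization layers inserted after each hidden Dense layer.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(20,)),              # assumed input size for illustration
    layers.Dense(128, activation="relu"),
    layers.BatchNormalization(),           # center and scale the first hidden layer's outputs
    layers.Dense(64, activation="relu"),
    layers.BatchNormalization(),           # center and scale again before the output layer
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

During training, each BatchNormalization layer computes the mean and standard deviation over the current mini-batch; at inference it uses running averages collected during training, which is part of the extra computation mentioned above.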