From the course: Artificial Intelligence Foundations: Neural Networks


Regularization techniques to improve overfitting models

- [Instructor] The purpose of a neural network is to capture the dominant trends in the data. Overfitting is bad because it means the machine learning algorithm did not capture those dominant trends and therefore won't be able to recognize any trend in new data it has never seen. In other words, the model did not really learn anything; it only memorized the training data without understanding it. As a result, the model cannot make accurate predictions, so your validation error is large while your training error is small, as shown in the image on the left. Regularization is a hyperparameter technique for improving overfitting models. It refers to a set of different techniques that lower the complexity of a neural network model during training and thus may prevent overfitting. The image on the right shows a list of regularization techniques that help mitigate overfitting. Let's take a look at three of them.…
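The transcript refers to a list of techniques shown on screen without naming them here, so as a minimal sketch, here are two widely used regularization techniques, L2 weight decay and dropout, illustrated with NumPy. The weights, loss value, and rates below are made up for illustration and are not from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights and data loss, standing in for a real training step.
weights = rng.normal(size=10)
data_loss = 0.42  # e.g., mean squared error on a training batch

# L2 regularization ("weight decay"): add a penalty proportional to the
# squared weights, nudging the model toward simpler functions that are
# less able to memorize the training data.
lam = 0.01  # regularization strength (hypothetical value)
l2_penalty = lam * np.sum(weights ** 2)
total_loss = data_loss + l2_penalty

# Dropout: during training, randomly zero a fraction of activations and
# rescale the survivors ("inverted dropout"), so no single unit can be
# relied upon and the network learns more robust features.
activations = rng.normal(size=8)
keep_prob = 0.5
mask = rng.random(activations.shape) < keep_prob
dropped = np.where(mask, activations / keep_prob, 0.0)
```

At inference time dropout is switched off entirely; the inverted-dropout rescaling during training keeps the expected activation magnitude the same in both modes.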