From the course: Machine Learning Foundations: Calculus
Connecting partial derivatives with backpropagation - Python Tutorial
- [Instructor] In simplified terms, we can think of a neural network as a collection of neurons, each performing its own calculation and passing the result on to other neurons. Some neural networks employ supervised learning, which uses a training set to teach the model to produce the desired output. When we design a neural network, the weights are initialized with random values at the start, so the model's output is naturally very different from the desired output and the error is large. To address this, we want the model to adjust its weights so that the error is reduced. Backpropagation, or backprop for short, is one way to train the model. It is the algorithm that propagates the errors backward from the output nodes to the input nodes. Instead of directly computing the gradient of the cost function with respect to each individual weight, the backpropagation algorithm…
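Below is a minimal sketch of this idea, assuming a tiny fully connected network with one hidden layer, sigmoid activations, and a squared-error loss. The dataset (XOR), the layer sizes, the learning rate, and all names (W1, W2, b1, b2, lr) are illustrative assumptions for this sketch, not details taken from the course.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(seed=42)

# Toy data: the 4 samples of the XOR problem (chosen here for illustration).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Weights start as random values, so at first the model output is far
# from the desired output and the error is large.
W1 = rng.normal(size=(2, 4))   # input -> hidden
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output
b2 = np.zeros((1, 1))
lr = 1.0                       # learning rate (assumed value)

for step in range(10_000):
    # Forward pass: each neuron computes its value and passes it on.
    h = sigmoid(X @ W1 + b1)              # hidden activations
    y_hat = sigmoid(h @ W2 + b2)          # model output
    error = y_hat - y                     # dLoss/dy_hat for 0.5 * sum(error**2)

    # Backward pass: propagate the error from the output nodes back
    # toward the input nodes, applying the chain rule layer by layer
    # to get the partial derivative of the loss w.r.t. each weight.
    delta2 = error * y_hat * (1 - y_hat)    # error at the output layer
    grad_W2 = h.T @ delta2
    grad_b2 = delta2.sum(axis=0, keepdims=True)

    delta1 = (delta2 @ W2.T) * h * (1 - h)  # error pushed back to the hidden layer
    grad_W1 = X.T @ delta1
    grad_b1 = delta1.sum(axis=0, keepdims=True)

    # Change the weights in the direction that reduces the error.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(np.round(y_hat, 2))  # should end up close to the targets [0, 1, 1, 0]
```

Note how delta1 is built from delta2: the hidden-layer gradients reuse the error term already computed at the output layer, which is how backprop avoids recomputing each partial derivative from scratch.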