From the course: Training Neural Networks in C++
Challenge: Write your own backpropagation function - C++ Tutorial
(bright upbeat music) - [Instructor] Are you ready to finish up your multi-layer perceptron class? This time your task is to write a backpropagation trainer function, which will run one sample through the network with the backpropagation algorithm. Don't worry, you only have to write a few lines per step in the provided code; you'll simply be filling in the blanks. You can do this. And feel free to go back to the backpropagation videos if you get stuck. The function is called BP, and it starts at line 84. It receives a feature vector x and a label vector y. Inside the function, I have placed a comment for each step and skeleton code for the loops. Now, for steps three and four, you will need a vector of vectors that I added to the class to store the error terms, or lowercase deltas. That's why I named it lowercase d. Let's go to MLP.h to look for this vector's declaration. You will see in line 33 that it has the same…
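The challenge's actual skeleton code isn't reproduced here, so the following is only a rough sketch of what a finished BP trainer might look like, assuming a minimal MLP class with sigmoid activations, an activations vector `out`, and the `d` vector of vectors for the error terms described above (all member and function names beyond `d` are assumptions, not the course's exact code):

```cpp
#include <vector>
#include <cmath>
#include <cstdlib>

// Hypothetical minimal multi-layer perceptron; layer sizes given in the constructor.
struct MLP {
    std::vector<int> layers;                         // neurons per layer
    std::vector<std::vector<std::vector<double>>> w; // w[l][i][j]: weight j of neuron i in layer l (last = bias)
    std::vector<std::vector<double>> out;            // activations per layer
    std::vector<std::vector<double>> d;              // error terms (lowercase deltas), one per neuron
    double lr = 0.5;                                 // learning rate

    explicit MLP(const std::vector<int>& sizes) : layers(sizes) {
        std::srand(42);
        out.resize(layers.size());
        d.resize(layers.size());
        w.resize(layers.size());
        for (size_t l = 0; l < layers.size(); ++l) {
            out[l].assign(layers[l], 0.0);
            d[l].assign(layers[l], 0.0);
            if (l > 0) {
                w[l].resize(layers[l]);
                for (int i = 0; i < layers[l]; ++i)
                    for (int j = 0; j <= layers[l - 1]; ++j)   // extra slot for the bias
                        w[l][i].push_back(std::rand() / (double)RAND_MAX - 0.5);
            }
        }
    }

    static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

    // Feed one sample forward; leaves activations in out.
    std::vector<double> run(const std::vector<double>& x) {
        out[0] = x;
        for (size_t l = 1; l < layers.size(); ++l)
            for (int i = 0; i < layers[l]; ++i) {
                double sum = w[l][i].back();                   // bias term
                for (int j = 0; j < layers[l - 1]; ++j)
                    sum += w[l][i][j] * out[l - 1][j];
                out[l][i] = sigmoid(sum);
            }
        return out.back();
    }

    // One backpropagation pass for a single (x, y) sample; returns the MSE.
    double bp(const std::vector<double>& x, const std::vector<double>& y) {
        // Step 1: feed the sample forward.
        run(x);
        const size_t L = layers.size() - 1;
        // Step 2: compute the mean squared error at the output.
        double mse = 0.0;
        for (int i = 0; i < layers[L]; ++i) {
            double e = y[i] - out[L][i];
            mse += e * e;
        }
        mse /= layers[L];
        // Step 3: output-layer error terms: d = (y - o) * o * (1 - o).
        for (int i = 0; i < layers[L]; ++i)
            d[L][i] = (y[i] - out[L][i]) * out[L][i] * (1.0 - out[L][i]);
        // Step 4: backpropagate error terms through the hidden layers.
        for (size_t l = L - 1; l >= 1; --l)
            for (int i = 0; i < layers[l]; ++i) {
                double sum = 0.0;
                for (int k = 0; k < layers[l + 1]; ++k)
                    sum += w[l + 1][k][i] * d[l + 1][k];
                d[l][i] = sum * out[l][i] * (1.0 - out[l][i]);
            }
        // Step 5: gradient-descent weight update (the delta rule).
        for (size_t l = 1; l <= L; ++l)
            for (int i = 0; i < layers[l]; ++i) {
                for (int j = 0; j < layers[l - 1]; ++j)
                    w[l][i][j] += lr * d[l][i] * out[l - 1][j];
                w[l][i].back() += lr * d[l][i];                // bias input is 1
            }
        return mse;
    }
};
```

Calling `bp` repeatedly on the same sample should drive the returned error down, which is a quick sanity check that the deltas and updates are wired correctly.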
Contents
- The need for training (4m 45s)
- The training process (3m 47s)
- Error function (2m 27s)
- Gradient descent (2m 53s)
- The delta rule (3m 34s)
- The backpropagation algorithm (9m 12s)
- Challenge: Write your own backpropagation function (3m 29s)
- Solution: Write your own backpropagation function (5m)