How do I compute gradients using the Neural Network Toolbox software through a backpropagation process?
I've been reading the Neural Network Toolbox User's Guide and I have a question about the section below. As stated on page 3-19:
In fact, the gradients and Jacobians for any network that has differentiable transfer functions, weight functions and net input functions can be computed using the Neural Network Toolbox software through a backpropagation process. You can even create your own custom networks and then train them using any of the training functions in the table above. The gradients and Jacobians will be automatically computed for you.
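As far as I can tell, the toolbox does expose this directly. For example, something like the following should return the gradient of the network's performance with respect to every weight and bias (a minimal sketch; feedforwardnet, configure, simplefit_dataset, and defaultderiv are documented toolbox functions, and the dataset choice is arbitrary):

[x, t] = simplefit_dataset;                    % example data shipped with the toolbox
net = feedforwardnet(10);                      % one hidden layer with 10 neurons
net = configure(net, x, t);                    % set input/output sizes, initialize weights
gwb = defaultderiv('dperf_dwb', net, x, t);    % gradient of performance w.r.t. all weights/biases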
But I don't understand exactly how this works in my setting. Here is the setup I'm trying to handle:
- We use a neural network as a non-linear function approximator to estimate the real function.
- We have inputs (x) and approximate the real function (y) with weights that we estimate using the neural network.
- The network is trained by minimizing a loss function that changes at each iteration.
- The loss function can be the mean of (y' - y)^2, where y' is the target function. Briefly speaking, holding y' fixed, the gradient of this loss with respect to the weights is proportional to (y' - y)∇y, where ∇y is the gradient of the network output with respect to the weights.
- Note that here the target function (y') depends on the weights. This is different from ordinary learning (e.g. supervised learning), where the targets are fixed before training, so the target function changes at each iteration.
- Therefore, what I'm looking for is how to calculate ∇y at each iteration in order to update the weights (see the sketch after this list for what I have in mind).
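Here is a minimal sketch of what I mean. The toolbox calls (feedforwardnet, configure, getwb, setwb) are real functions; the moving-target rule yPrime, the learning rate lr, and the brute-force finite-difference gradient are my own stand-ins for whatever backpropagation computes internally:

% Tiny network and some inputs (all sizes are arbitrary)
x   = rand(1, 20);
net = feedforwardnet(5);
net = configure(net, x, zeros(1, 20));   % fix layer sizes, initialize weights
lr  = 0.01;                              % learning rate (assumption)
h   = 1e-6;                              % finite-difference step (assumption)

for iter = 1:200
    wb = getwb(net);                     % all weights and biases as one vector
    y  = net(x);                         % current network output
    yPrime = 0.9 * y + 0.1;              % stand-in weight-dependent target (assumption)

    % Semi-gradient step: hold yPrime fixed and differentiate the loss
    % L(wb) = mean((yPrime - net(x)).^2) numerically, one weight at a time.
    L0 = mean((yPrime - y).^2);
    g  = zeros(size(wb));
    for k = 1:numel(wb)
        wbk  = wb;  wbk(k) = wbk(k) + h;
        netk = setwb(net, wbk);
        g(k) = (mean((yPrime - netk(x)).^2) - L0) / h;
    end
    net = setwb(net, wb - lr * g);       % gradient-descent update
end

The inner loop is just a brute-force substitute for the backpropagated gradient the manual promises; what I want is the toolbox's own way to get ∇y (or the Jacobian de/dwb) at each iteration instead.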
Could you help me understand this and give an example?
Answers (1)
Greg Heath
on 16 Jan 2019
You are confused.
The target function is constant and independent of the weights.
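In ordinary toolbox training the targets are just data. A minimal sketch (all calls shown are standard toolbox functions; the dataset choice is arbitrary):

[x, t] = simplefit_dataset;   % t is fixed data, not a function of the weights
net = feedforwardnet(10);
net = train(net, x, t);       % training adjusts the weights; t never changes
y = net(x);
perf = perform(net, t, y);    % mse between the fixed targets and the outputs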
Hope this helps.
*Thank you for formally accepting my answer*
Greg