How to display weight distribution in hidden layers of neural network?
I have 8 inputs in the input layer. I want to display the weight distribution of these 8 inputs in the hidden layer to observe the importance of the features. To make it clearer, an example is shown in this figure ( https://pasteboard.co/GKCpA6Q.png ). I used MATLAB's `plotwb` function, but it didn't display the weights of every input.
Specifically, I want to look at the weights connecting the inputs to the first hidden layer. The idea is: the larger the weight, the more important the input.
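For reference, here is a minimal sketch of reading those weights directly from a trained network object (this assumes a variable `net` holding an already-trained feedforward network with 8 inputs; `net.IW{1,1}` is the standard property for the input-to-first-layer weight matrix):

```matlab
% Sketch: assumes `net` is a trained feedforward network with 8 inputs.
% net.IW{1,1} holds the input weights: one row per hidden neuron,
% one column per input feature.
W = net.IW{1,1};               % size: [hiddenSize x 8]

% Aggregate the magnitude of the weights fanning out of each input.
importance = sum(abs(W), 1);   % 1x8 vector, one value per input

% Visualize the per-input weight magnitudes.
bar(importance);
xlabel('Input feature');
ylabel('Sum of |w|');
title('Input-to-hidden weight magnitudes');
```

Note that, as the answer below this question argues, raw weight magnitudes are a crude importance measure when inputs are correlated.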
Answers (1)
Greg Heath
on 17 Sep 2017
That will not work. It does not account for the correlations between inputs.
The best way to rank correlated inputs is:
1. Use NO HIDDEN LAYERS!
2. Run 10 or more trials each (with different random initial weights) using
   a. A single input
   b. All inputs except the one in a.
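The procedure above could be realized as follows. This is only a sketch under assumptions not stated in the answer: `x` is a hypothetical 8-by-N input matrix and `t` a 1-by-N target vector, and the no-hidden-layer model is fit here in closed form with least squares (`mrdivide`), which makes the repeated random-initialization trials unnecessary for this linear case; an iteratively trained net would still need them:

```matlab
% Sketch: rank inputs by fitting linear (no-hidden-layer) models.
% Assumes x is an 8-by-N input matrix and t is a 1-by-N target vector.
nInputs = size(x, 1);
mseSingle  = zeros(1, nInputs);   % model trained on input i alone
mseWithout = zeros(1, nInputs);   % model trained on all inputs but i

for i = 1:nInputs
    % a. A single input
    Xi = [x(i,:); ones(1, size(x,2))];   % add a bias row
    wi = t / Xi;                          % least-squares fit: t ~ wi*Xi
    mseSingle(i) = mean((t - wi*Xi).^2);

    % b. All inputs except the one in a.
    rest = setdiff(1:nInputs, i);
    Xr = [x(rest,:); ones(1, size(x,2))];
    wr = t / Xr;
    mseWithout(i) = mean((t - wr*Xr).^2);
end

% An input is important if it predicts well alone (low mseSingle)
% and its removal hurts the full model (high mseWithout).
```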
Hope this helps.
Thank you for formally accepting my answer
Greg