What are the reasons for having different outputs after several runs?
I use the same initial condition for my network, but I still get different outputs. What are the other factors that can cause this difference?
Accepted Answer
Greg Heath on 14 Sep 2013 (edited 14 Sep 2013)
The short answer is that train (only if all weights are zero) or configure (any time) assigns the weights depending on the state of the RNG.
If you are training in a loop over random initial weights, the statistically unbiased way to choose the best net is by the minimum MSE on the validation set.
rng(4151941)                     % initialize the RNG with a famous birthday
% (assumes net, x, t and Ntrials already exist, e.g. net = fitnet(H))
for i = 1:Ntrials
    s{i} = rng;                  % save the ith state of the RNG (may not need a cell)
    net = configure(net,x,t);    % reinitialize the weights from the current RNG state
    [net tr] = train(net,x,t);
    mseval(i) = tr.best_vperf;   % best validation MSE over all epochs of the ith run
end
[minmseval ibest] = min(mseval);
rng(s{ibest});                   % restore the RNG state for repeating the best design
bestnet = configure(net,x,t);    % reproduces the best initial weights
bestIW0 = bestnet.IW             % initial input weights
bestb0  = bestnet.b              % initial biases
bestLW0 = bestnet.LW             % initial layer weights
[bestnet tr] = train(bestnet,x,t) % NO SEMICOLON TO REVEAL ALL DETAILS!!!
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Walter Roberson on 12 Sep 2013
By default, Neural Networks are initialized randomly.
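A minimal sketch of what that randomness means in practice: seeding the RNG before configuring makes the initial weights repeatable. (fitnet, the hidden-layer size 10, and the data x, t are placeholders; substitute your own network and data.)

```
% Same RNG seed => same random initial weights
rng(0);                          % fix the RNG state
net1 = configure(fitnet(10), x, t);

rng(0);                          % reset to the same state
net2 = configure(fitnet(10), x, t);

isequal(net1.IW, net2.IW)        % true: identical initial input weights
```

Without the rng calls, each configure (or automatic initialization by train) draws different weights, which is why repeated runs give different outputs.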
Greg Heath on 29 Sep 2013 (edited 29 Sep 2013)
The obsolete newfit, newpr and newff initialize the weights when the net is created.
The current fitnet, patternnet and feedforwardnet are initialized either by configure before training or, if configure is not used, automatically by train.
In the latter case it is difficult to obtain the initial values. If you really need them, call configure, then save or print the weights before calling train.
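A sketch of the pattern described above, assuming fitnet and some data x, t (the hidden-layer size 10 is illustrative):

```
net = fitnet(10);                % network object, weights not yet initialized
net = configure(net, x, t);      % initialization happens here, from the RNG state
IW0 = net.IW;                    % save the initial input weights
LW0 = net.LW;                    % save the initial layer weights
b0  = net.b;                     % save the initial biases
[net, tr] = train(net, x, t);    % train does not reinitialize an already-configured net
```

If configure is skipped, train initializes the weights internally and the pre-training values are not easy to recover afterwards.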