The ANN program doesn't run for all 50 epochs that are specified.

Here is the code:
inputs = readmatrix('C:\Users\tanmo\Downloads\Input-Station-2.xlsx');
targets = readmatrix('C:\Users\tanmo\Downloads\Target-Station-2.xlsx');
inputs = inputs'; % Transpose if necessary
targets = targets'; % Transpose if necessary
net = feedforwardnet([10 10]);
net.trainFcn = 'trainlm'; % Using Levenberg-Marquardt backpropagation
net.trainParam.epochs = 50;
net.trainParam.lr = 0.01;
net.trainParam.max_fail = 50; % Maximum validation failures
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
net.trainParam.goal = 0; % Essentially remove the performance goal (added later)
net.trainParam.min_grad = 1e-10; % Set a very small gradient goal
[net, tr]= train(net, inputs, targets);
outputs = net(inputs);
performance = perform(net, targets, outputs);
plotperform(tr)
I can't fix the number of iterations (epochs). Each run stops after a different, seemingly random number of epochs and plots the performance graph for that epoch count.

Answers (1)

Jayanti on 16 Sep 2024
When you train a neural network, it may not run for the specified number of epochs if any of the other stopping criteria is satisfied first. If the gradient value falls below the "min_grad" value of 1e-10, the training process stops before all 50 epochs have run.
From the attached image, you can see that the gradient is already below the min_grad threshold at epoch 15, and the message "Minimum gradient reached" is shown at the bottom of the nntraintool dialog box.
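If the training window is not available, the training record returned by train also reports why training stopped. A minimal sketch (the field names below follow the standard training record, so check them against your release):
reason = tr.stop; % e.g. 'Minimum gradient reached.'
nEpochs = tr.num_epochs; % number of epochs actually executed
lastGrad = tr.gradient(end); % gradient magnitude at the final epoch
fprintf('Stopped after %d epochs: %s (gradient %g)\n', nEpochs, reason, lastGrad);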
To solve this issue, lower the "min_grad" threshold further (setting it to 0 disables this stopping criterion entirely) so that training can continue for all 50 epochs.
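As a rough sketch of the adjusted settings (only the stopping-related parameters change; the rest of your script stays the same):
net.trainParam.epochs = 50; % run the full 50 epochs
net.trainParam.min_grad = 0; % disable the minimum-gradient stopping criterion
net.trainParam.max_fail = 50; % validation failures cannot stop training within 50 epochs
[net, tr] = train(net, inputs, targets);
If training still stops early, check tr.stop for the remaining criteria; with trainlm, for example, training also ends when the mu parameter exceeds net.trainParam.mu_max.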
Hope this helps!
