How do I improve the performance of fitnet for eye-height/width prediction

I am currently working on a project where I need to use an ANN to predict the eye-height (EH) and eye-width (EW) of an eye diagram. The eye-height values range from 0.1 to 0.5 V, whereas the eye-width usually varies from 4e-10 s to 10e-10 s. I'm trying to search for the best design by looping over an increasing number of hidden neurons, but I get very bad results. Other than the train, validation, and test subsets, is another set of unseen data needed to ensure generalization?
I've attached the code below. The 'xtra' variables in the code refer to another independent set of unseen data. Thank you if you are willing to help me. Much appreciated.
[ I Nn ] = size(x);
[ O Nn ] = size(t);
[xn XS] = mapminmax(x); %normalizing input
[tn TS] = mapminmax(t); %normalizing target
MSE00 = mean(var(t',1))    % reference MSE of the raw targets (naive constant model)
MSEn00 = mean(var(tn',1))  % reference MSE of the normalized targets
Ntrn = Nn-2*round(0.15*Nn) % number of training cases (default 0.70/0.15/0.15 division)
Ntrneq = Ntrn*O            % number of training equations
% For H hidden nodes, the number of unknown weights is
% Nw = (I+1)*H+(H+1)*O
% Ntrneq >= Nw <==> H <= Hub (upper bound) where
H = 10;
Hub = -1+ceil( (Ntrneq-O)/(I+O+1)) %O = 16
Nw = (I+1)*H+(H+1)*O %O = 16
Ndof = Ntrneq -Nw
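% Ndof > 0 means there are more training equations than unknown weights (overdetermined fit)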
Ntrials = 100; % if Ntrials > 1, leave the rng('default') line inside the loop commented out
Hmin = 1; dH = 1; Hmax = 10;
R2_mat2 = [];
training_time2 = [];
net_mat = [];
%**************************************************************************
%coarse search
for H = Hmin:dH:Hmax
for k = 1:Ntrials
net = fitnet(H);
s = rng;
%rng('default') % For reproducibility
net.trainParam.epochs = 1000;
net.trainParam.goal = 0;
net.performParam.normalization = 'percent';
%net.trainParam.mu_max = 1e+020;
net.trainParam.min_grad = 1e-009;
%net.efficiency.memoryReduction = 3; %saving memory by slowing down training
[ net tr y e ] = train(net,x,t); %**********************
trainTime = tr.time(end) % elapsed training time
R2 = 1 - mse(e)/MSE00 % overall performance
evaInd = [tr.valInd, tr.testInd];
yn = mapminmax('apply',y,TS);
R2n = 1 - mse(yn-tn)/MSEn00
eva_yn = yn(:,evaInd);
eva_tn = tn(:,evaInd);
eva_en = eva_yn - eva_tn;
eva_R2n = 1 - mse(eva_en)/MSEn00
%----------------------------------------
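% evaluate the trained net on the extra, completely unseen data set ('xtra')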
xtra_y = net(xtra_x);
xtra_yn = mapminmax('apply',xtra_y,TS);
xtra_tn = mapminmax('apply',xtra_t,TS);
xtra_en = xtra_yn - xtra_tn;
xtra_R2n = 1 - mse(xtra_en)/MSEn00
%----------------------------------------------
temp_net_mat = {net; R2n; eva_R2n; xtra_R2n; H; trainTime; s; tr}; % column: net, R2n, eva_R2n, xtra_R2n, H, trainTime, rng state, training record
net_mat = [net_mat,temp_net_mat];
end
end
[row col] = size(net_mat);
for k = 1:col
if k == 1
R2n = net_mat(2,k); R2n = R2n{1,1};
eva_R2n = net_mat(3,k); eva_R2n = eva_R2n{1,1};
xtra_R2n = net_mat(4,k); xtra_R2n = xtra_R2n{1,1};
best_netInd = k;
continue
else
temp_R2n = net_mat(2,k); temp_R2n = temp_R2n{1,1};
temp_eva_R2n = net_mat(3,k); temp_eva_R2n = temp_eva_R2n{1,1};
temp_xtra_R2n = net_mat(4,k); temp_xtra_R2n = temp_xtra_R2n{1,1};
if (temp_R2n + temp_xtra_R2n) > (R2n + xtra_R2n)
R2n = temp_R2n;
eva_R2n = temp_eva_R2n;
xtra_R2n = temp_xtra_R2n;
best_netInd = k;
end
end
end
H = net_mat(:,best_netInd); H = H(5); H = H{1,1};
%*************************************************************************
%fine search
for k = 1:10*Ntrials
%break
net = fitnet(H);
s = rng;
%rng('default') % For reproducibility
net.trainParam.epochs = 1000;
net.trainParam.goal = 0;
net.performParam.normalization = 'percent';
%net.trainParam.mu_max = 1e+020;
net.trainParam.min_grad = 1e-009;
%net.efficiency.memoryReduction = 3; %saving memory by slowing down training
[ net tr y e ] = train(net,x,t); %**********************
trainTime = tr.time(end) % elapsed training time
R2 = 1 - mse(e)/MSE00 % overall performance
evaInd = [tr.valInd, tr.testInd];
yn = mapminmax('apply',y,TS);
R2n = 1 - mse(yn-tn)/MSEn00
eva_yn = yn(:,evaInd);
eva_tn = tn(:,evaInd);
eva_en = eva_yn - eva_tn;
eva_R2n = 1 - mse(eva_en)/MSEn00
%----------------------------------------
xtra_y = net(xtra_x);
xtra_yn = mapminmax('apply',xtra_y,TS);
xtra_tn = mapminmax('apply',xtra_t,TS);
xtra_en = xtra_yn - xtra_tn;
xtra_R2n = 1 - mse(xtra_en)/MSEn00
%----------------------------------------------
temp_net_mat = {net; R2n; eva_R2n; xtra_R2n; H; trainTime; s; tr}; % column: net, R2n, eva_R2n, xtra_R2n, H, trainTime, rng state, training record
net_mat = [net_mat,temp_net_mat];
end
[row col] = size(net_mat);
net_mat2 = [];
for k = 1:col
R2n = net_mat(2,k); R2n = R2n{1,1};
eva_R2n = net_mat(3,k); eva_R2n = eva_R2n{1,1};
xtra_R2n = net_mat(4,k); xtra_R2n = xtra_R2n{1,1};
if R2n > 0.8 && eva_R2n > 0.7 && xtra_R2n > 0.8
net_mat2 = [net_mat2,net_mat(:,k)];
end
end
if isempty(net_mat2) % no net met the R2 thresholds
[row col] = size(net_mat);
for k = 1:col
if k == 1
R2n = net_mat(2,k); R2n = R2n{1,1};
eva_R2n = net_mat(3,k); eva_R2n = eva_R2n{1,1};
xtra_R2n = net_mat(4,k); xtra_R2n = xtra_R2n{1,1};
best_netInd = k;
continue
else
temp_R2n = net_mat(2,k); temp_R2n = temp_R2n{1,1};
temp_eva_R2n = net_mat(3,k); temp_eva_R2n = temp_eva_R2n{1,1};
temp_xtra_R2n = net_mat(4,k); temp_xtra_R2n = temp_xtra_R2n{1,1};
if temp_xtra_R2n > xtra_R2n
R2n = temp_R2n;
eva_R2n = temp_eva_R2n;
xtra_R2n = temp_xtra_R2n;
best_netInd = k;
end
end
end
net = net_mat(:,best_netInd)
net = net(1); net = net{1,1};
else
disp('net_mat2 is not empty')
[row2 col2] = size(net_mat2); best_netInd2 = 0;
for k = 1:col2
if k == 1
R2n = net_mat2(2,k); R2n = R2n{1,1};
eva_R2n = net_mat2(3,k); eva_R2n = eva_R2n{1,1};
xtra_R2n = net_mat2(4,k); xtra_R2n = xtra_R2n{1,1};
best_netInd2 = k;
continue
else
temp_R2n = net_mat2(2,k); temp_R2n = temp_R2n{1,1};
temp_eva_R2n = net_mat2(3,k); temp_eva_R2n = temp_eva_R2n{1,1};
temp_xtra_R2n = net_mat2(4,k); temp_xtra_R2n = temp_xtra_R2n{1,1};
if temp_xtra_R2n > xtra_R2n
R2n = temp_R2n;
eva_R2n = temp_eva_R2n;
xtra_R2n = temp_xtra_R2n;
best_netInd2 = k;
end
end
end
net = net_mat2(:,best_netInd2)
net = net(1); net = net{1,1};
end
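% time_mat rows: 1 = H, 2 = number of trained nets, 3 = total training time,
% 4 = mean training time, 5 = mean R2n, 6 = mean xtra_R2n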
time_mat = zeros(6,Hmax-Hmin+1); % one column per hidden-layer size H (dH = 1)
for H = Hmin:dH:Hmax
time_mat(1,(H-Hmin+1)) = H;
end
for Hind = Hmin:dH:Hmax
count = 0; totalTime = 0; total_R2n = 0; total_xtra_R2n = 0;
for k = 1:col
H = net_mat(:,k); H = H(5); H = H{1,1};
R2n = net_mat(:,k); R2n = R2n(2); R2n = R2n{1,1};
xtra_R2n = net_mat(:,k); xtra_R2n = xtra_R2n(4); xtra_R2n = xtra_R2n{1,1};
trainTime = net_mat(:,k); trainTime = trainTime(6); trainTime = trainTime{1,1};
if Hind == H
count = count + 1; totalTime = totalTime + trainTime; total_R2n = total_R2n + R2n; total_xtra_R2n = total_xtra_R2n + xtra_R2n;
end
end
time_mat(2,Hind-Hmin+1) = count;
time_mat(3,Hind-Hmin+1) = totalTime;
time_mat(4,Hind-Hmin+1) = totalTime/count;
time_mat(5,Hind-Hmin+1) = total_R2n/count;
time_mat(6,Hind-Hmin+1) = total_xtra_R2n/count;
end
%plot the result
y = net(x); xtra_y = net(xtra_x); e = y-t; xtra_e = xtra_y - xtra_t;
close all;
figure;
plot(t(1,:)); hold on; plot(y(1,:),'--r'); plot(e(1,:),'g');
title('Eye-Height'); ylabel('EH/V');xlabel('Test Cases');
legend('ADS','ANN Prediction','error');
figure;
plot(t(2,:)); hold on; plot(y(2,:),'--r'); plot(e(2,:),'g');
title('Eye-Width'); ylabel('EW/s');xlabel('Test Cases');
legend('ADS','ANN Prediction','error');
figure;
plot(xtra_t(1,:)); hold on; plot(xtra_y(1,:),'--r'); plot(xtra_e(1,:),'g');
title('Eye-Height'); ylabel('EH/V');xlabel('Test Cases');
legend('ADS','ANN Prediction','error');
figure;
plot(xtra_t(2,:)); hold on; plot(xtra_y(2,:),'--r'); plot(xtra_e(2,:),'g');
title('Eye-Width'); ylabel('EW/s');xlabel('Test Cases');
legend('ADS','ANN Prediction','error');
  4 Comments
goay chan hong on 28 Sep 2016
Sir, may I know whether it is OK not to normalize the data and to use net.performParam.normalization = 'percent' instead? Another question: is the extra unseen data needed, in addition to the standard 'train', 'val', and 'test' subsets?
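For reference, a minimal sketch of the two settings being compared, assuming x and t are the raw input and target matrices from the code above: fitnet already applies mapminmax to inputs and targets through its default processing functions, while 'percent' only changes how the mse performance function scales the errors relative to the target range.
% Option 1: raw data, relying on fitnet's built-in mapminmax pre/post-processing
net1 = fitnet(10);
net1.inputs{1}.processFcns   % default list includes 'mapminmax'
net1.outputs{2}.processFcns  % default list includes 'mapminmax'
net1 = train(net1,x,t);
% Option 2: raw data, letting mse scale the errors relative to the target range
net2 = fitnet(10);
net2.performParam.normalization = 'percent';
net2 = train(net2,x,t);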
Greg Heath on 28 Sep 2016
I think you are trying to do too much at once.
[ I N ] = size(input) % [ ? ? ]
[ O N ] = size(target) % [ ? ? ]
Using all defaults, what is the minimum H that will yield
NMSE = mse(target-output)/mean(var(target',1)) < 0.01 ?
You should be able to do this in a few tens of lines of code.
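A minimal sketch of that search, assuming x and t hold the inputs and targets and all fitnet/train defaults are kept:
[I, N] = size(x)
[O, N] = size(t)
MSE00 = mean(var(t',1));            % MSE of a naive constant (mean) model
for H = 1:10
    net = fitnet(H);                % all defaults: trainlm, 0.70/0.15/0.15 data division
    [net, tr] = train(net, x, t);
    y = net(x);
    NMSE = mse(t - y)/MSE00         % normalized MSE over all cases
    if NMSE < 0.01
        fprintf('Smallest H meeting the goal: %d\n', H)
        break
    end
end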


Answers (0)
