Different results between y = net(test) and manual matrix multiplication with the same network structure

Hi,
I have a neural network with the following structure: 5 inputs, one hidden layer with 17 neurons, and 1 output. I use the logsig (sigmoid) transfer function from the input to the hidden layer and purelin from the hidden layer to the output, and training of the network has completed.
When I apply yy = net(test) I get a result. However, when I extract the weights and biases and do the matrix multiplication for the same network structure, I get a different result than yy = net(test). I do not know what causes the different results.
This is my code:
hidden_layer_output = logsig(net.IW{1}*test + net.b{1});   % hidden layer (logsig)
output = net.LW{2,1}*hidden_layer_output + net.b{2};       % output layer (purelin)
This is the full code I use for training the network:
x = input1;
t = target1;
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Fitting Network
hiddenLayerSize = 17;
net = fitnet(hiddenLayerSize,trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net.layers{1}.transferFcn = 'logsig';
%net.layers{2}.transferFcn = 'purelin';   % purelin is already the default output transfer function for fitnet
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = y - t;
mse1 = immse(t,y);
rmse = sqrt(mse1);
weights = getwb(net)     % all weights and biases stacked into one column vector
Iw = net.IW{1,1}         % input-to-hidden weights (17x5)
b1 = net.b{1}            % hidden layer biases (17x1)
Lw = net.LW{2,1}         % hidden-to-output weights (1x17)
b2 = net.b{2}            % output layer bias (1x1)
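As a side note, getwb returns every weight and bias stacked into a single column vector. If the per-layer arrays are needed again, separatewb can split that vector back using the network's own dimensions; a minimal sketch:
wb = getwb(net);                    % all weights and biases as one column vector
[b, IW, LW] = separatewb(net, wb);  % b{1}, b{2}, IW{1,1}, LW{2,1} match net.b, net.IW, net.LW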
  1 Comment
HASAN AL-KAF on 4 Feb 2023
I got the answer. One way is to turn off the normalization using
net.inputs{1}.processFcns = {};
net.outputs{2}.processFcns = {};
Thank you.
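For completeness: the mismatch comes from the input/output processing that fitnet attaches by default ('removeconstantrows' and 'mapminmax'). net(test) applies this processing automatically, while the hand-written matrix multiplication skips it. Instead of turning the processing off, it can also be reproduced manually; the following is a minimal sketch that assumes the default process functions are still attached to the trained network and reuses the settings stored on it:
% Manual forward pass that should match yy = net(test) with processing enabled
xn = test;
for k = 1:numel(net.inputs{1}.processFcns)
    % apply each input process function with the settings saved at training time
    xn = feval(net.inputs{1}.processFcns{k}, 'apply', xn, net.inputs{1}.processSettings{k});
end
hidden = logsig(net.IW{1,1}*xn + net.b{1});   % hidden layer (logsig)
yn = net.LW{2,1}*hidden + net.b{2};           % output layer (purelin)
y_manual = yn;
for k = numel(net.outputs{2}.processFcns):-1:1
    % undo the output processing in reverse order
    y_manual = feval(net.outputs{2}.processFcns{k}, 'reverse', y_manual, net.outputs{2}.processSettings{k});
end
max(abs(y_manual - net(test)))   % should be close to zero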

Answers (0)

Release

R2021a
