Neural network script that uses a specific subset of the input data for testing.

Hello everyone,
I have been trying to write a neural network script that uses DivideInd to train my neural network. I'm using all the data for my input as the training and validating data for the neural network, however for the testing data I am using only the initial third of the data. However when I run the script, I run into the issue of,
??? Undefined function or method 'patternnet' for input arguments of type 'double'.
My code is shown below. Yes, I have made sure to be in the working directory when I run the script, so that is not the issue.
inputs = Hand_Data_Inputs;
targets = Hand_Data_Targets;
% Create a Pattern Recognition Network
hiddenLayerSize = 31;
net = patternnet(hiddenLayerSize);
% Set up Division of Data for Training, Validation, Testing
net.divideFcn = 'divideind';
net.divideParam.trainInd = 1:3000;
net.divideParam.valInd = 1:3000;
net.divideParam.testInd = 1:1000;
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs);
Any help would be greatly appreciated, thank you

4 Comments

In the Command Window, type:
which patternnet -all
It should return a string with the full directory path and the file name.
Hmmmmmm, I typed
which patternnet -all
and it came up as,
'patternnet' not found.
If you have the Neural Network Toolbox installed (I assume you do), see if my Answer solves the problem.


 Accepted Answer

Your data division makes no sense.
Although the trn, val, and tst subsets can overlap in input space, they should not contain common points:
total = design + test
design = train + val
  • train: used to estimate the weights.
  • val: stops training when performance on the NONTRAINING validation subset reaches a local optimum. The resulting performance estimate is still biased, however, because the validation data is part of the design.
  • test: yields an UNBIASED estimate of performance on unseen nontraining data.
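As a minimal sketch of that split (my own illustration, assuming N = 3000 samples stored column-wise as train expects), the three index sets can be built and checked for overlap:

```matlab
% Sketch: build disjoint train/val/test index sets for N samples
% and verify that no sample appears in more than one subset.
N = 3000;                       % total number of samples (assumed)
trainInd = 1:3:N;               % ~1/3 of the data for weight estimation
valInd   = 2:3:N;               % ~1/3 for early stopping (validation)
testInd  = 3:3:N;               % ~1/3 held out for the unbiased estimate

% Disjointness check: every pairwise intersection must be empty.
assert(isempty(intersect(trainInd, valInd)))
assert(isempty(intersect(trainInd, testInd)))
assert(isempty(intersect(valInd,   testInd)))
```

This interleaved 1:3 / 2:3 / 3:3 pattern is only one valid choice; any three mutually disjoint index sets satisfy the requirement.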
Hope this helps
  • Thank you for formally accepting my answer
Greg

7 Comments

Thank you for the answer,
My neural network is to be trained to recognize someone's hand from the 31 features it looks at. I have 10 targets, and I have obtained data from these people as follows:
  • 1st 1000 elements are a still hand capture of points.
  • 2nd 1000 elements are the hand opening and closing a closed fist.
  • 3rd 1000 elements are the hand rotating left and right with the arm stationary.
The test data should look only at the 1st 1000 points, whereas the training and validation data should look at the data as a whole. I am considering collecting another set of still-hand data and then using that as the test data, in the same order as the targets I have taken.
No. No. No.
Your training data performance is too biased to be used for testing.
The validation data performance is also biased, but it is less biased.
The test data performance is unbiased and the only reliable estimate of performance on new data.
No professor or customer will accept results from data that was used in any way for the design.
You have enough data to form three independent subsets:
trainInd = 1:3:3000;
valInd = 2:3:3000;
testInd = 3:3:3000;
Results are easily verified via the 9 different ways to use these 3 subsets.
In addition, for each of the 9 combinations you can obtain multiple (e.g., 10) designs via the random weight initialization.
If none of those 90 designs succeeds with the default number of hidden nodes, you can always vary the number of hidden nodes (I typically search over 10 different values).
Good Luck
Professor Heath
Thank you very much Professor Heath for the reply,
I am still very new to the use of neural networks, but your help has been immensely useful. Though I am still slightly confused about what you said regarding random weight initialization. May I ask what it is and how it affects my results?
Training starts with randomly assigned weights. In general, if you repeatedly design the net you will get different answers. If you choose enough hidden nodes, you will get a spectrum of answers including good ones. Choose the best design using the validation subset performance summarized in the training record, tr. The successful net with the smallest number of hidden nodes is usually the best choice for a robust design.
Make sure you initialize the RNG before creating the net; that way you can duplicate your work.
I have posted hundreds of multi-design examples. Typically, I design Ntrials = 10 nets for each trial value of hidden nodes in the range h = Hmin:dH:Hmax (numel(h) ~ 10 ==> ~100 designs).
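A hedged sketch of that multi-design loop (the variable names and the candidate range here are illustrative, not from the original posts):

```matlab
% Sketch: Ntrials random-weight designs for each candidate hidden-layer
% size, recording the best validation performance from each training
% record. Assumes `inputs` and `targets` are already in the workspace,
% with one sample per column.
rng(0)                               % seed the RNG first so runs repeat
Hvec    = 5:5:50;                    % candidate hidden-node counts (assumed)
Ntrials = 10;                        % designs per candidate
vperf   = zeros(numel(Hvec), Ntrials);
for i = 1:numel(Hvec)
    for j = 1:Ntrials
        net = patternnet(Hvec(i));   % fresh net => fresh random weights
        net.divideFcn = 'divideind';
        net.divideParam.trainInd = 1:3:3000;
        net.divideParam.valInd   = 2:3:3000;
        net.divideParam.testInd  = 3:3:3000;
        [net, tr] = train(net, inputs, targets);
        vperf(i,j) = tr.best_vperf;  % validation perf at the stopping epoch
    end
end
% Choose the smallest Hvec(i) whose best trial meets your goal;
% smaller successful nets are usually the more robust designs.
```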
Search the NEWSGROUP and ANSWERS with
greg patternnet Ntrials
Hope this helps.
Thank you for formally accepting my answer
Greg
Okay I think I understand this fairly well except for what you mean by RNG, are you referring to the Random Number Generator function?
Also, I have been training the neural network and only just realised that it reports the test performance as NaN. I am not sure why; does that mean my neural network is not actually testing the dataset to see whether it is accurate?
I've made small changes to the code, as follows.
inputs = Hand_Data_Inputs;
targets = Hand_Data_Targets;
% Create a Pattern Recognition Network
hiddenLayerSize = 22;
net = patternnet(hiddenLayerSize);
% Choose Input and Output Pre/Post-Processing Functions
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'divideind'; % Divide data by index
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainInd = 1:2:3000;
net.divideParam.valInd = 2:2:2999;
net.divideParam.testInd = 3:2:2998;
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% Choose a Performance Function
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
I just had a check of tr and realised that the test indices don't appear to be declared. I'm not too sure why this is happening.
Again thank you for all the help you have provided so far Professor Heath
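One possible explanation (my reading; not confirmed in the thread): testInd = 3:2:2998 contains only odd indices, all of which already appear in trainInd = 1:2:3000, so the test subset may end up empty, leaving tr without test indices and testPerformance as NaN. Making the three sets disjoint, for example:

```matlab
% Possible fix (assumption): disjoint index sets, so every subset is
% populated and the test mask is no longer all-NaN.
net.divideParam.trainInd = 1:3:3000;
net.divideParam.valInd   = 2:3:3000;
net.divideParam.testInd  = 3:3:3000;
```

would give each subset 1000 unique samples.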


More Answers (0)

Asked on 12 Oct 2014
Last commented on 19 Oct 2014
