Error using trainnet (line 46)
Dear sir,
My XTrain is a 48941x1 cell array and TTrain is a 48941x1 categorical array, as shown below.

Why do I get this error?
Error using trainnet (line 46)
Number of observations in predictors (48941) and targets (1) must match. Check that
the data and network are consistent.
layers = [
    sequenceInputLayer([30 30 1],'Name','input') % 2-D image sequence input: InputSize is [h w c] = [height width channels]

    % Layer names must be unique within a network, so each block gets its own suffix.
    convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv1','DilationFactor',1)
    batchNormalizationLayer('Name','bn1') % normalizes each mini-batch per channel; speeds up training and reduces sensitivity to initialization
    reluLayer('Name','relu1') % ReLU: identity on positive inputs, zero on negative inputs

    convolution2dLayer(5,8,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv2','DilationFactor',1)
    batchNormalizationLayer('Name','bn2')
    reluLayer('Name','relu2')

    convolution2dLayer(5,8,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv3','DilationFactor',1)
    batchNormalizationLayer('Name','bn3')
    reluLayer('Name','relu3')

    convolution2dLayer(5,16,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv4','DilationFactor',1)
    batchNormalizationLayer('Name','bn4')
    reluLayer('Name','relu4')

    convolution2dLayer(5,16,'Stride',1,'Padding','same','WeightsInitializer','he','Name','conv5','DilationFactor',1)
    batchNormalizationLayer('Name','bn5')
    reluLayer('Name','relu5')

    convolution2dLayer(5,32,'Stride',2,'Padding','same','WeightsInitializer','he','Name','conv6','DilationFactor',1)
    batchNormalizationLayer('Name','bn6')
    reluLayer('Name','relu6')

    globalAveragePooling2dLayer('Name','gap1')
    fullyConnectedLayer(7,'Name','fc')
    softmaxLayer('Name','softmax')];
options = trainingOptions("adam", ...
MaxEpochs=4, ...
InitialLearnRate=0.002,...
MiniBatchSize=128,...
GradientThreshold=1, ...
LearnRateSchedule="piecewise", ...
LearnRateDropPeriod=20, ...
LearnRateDropFactor=0.8, ...
L2Regularization=1e-3,...
Shuffle="every-epoch", ...
Plots="training-progress", ...
ObjectiveMetricName="loss", ...
OutputNetwork="best-validation", ...
ValidationPatience=5, ... % Stop training if the validation loss has not decreased for five consecutive validation evaluations.
ValidationFrequency=50, ...
Verbose=false, ...
Metrics="accuracy", ...
ValidationData={XValidation,TValidation});
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);
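Before calling trainnet, it may help to confirm how many observations each input actually reports. A minimal sanity check, assuming XTrain and TTrain have the structure described above, could look like this:

```matlab
% Sanity check: trainnet expects one target per predictor observation.
% For sequence input, XTrain should be an N-by-1 cell array of sequences
% and TTrain an N-by-1 categorical array (one label per sequence).
numObsX = numel(XTrain);   % expected: 48941
numObsT = numel(TTrain);   % should also be 48941
assert(numObsX == numObsT, ...
    "Predictors (%d) and targets (%d) must have the same number of observations.", ...
    numObsX, numObsT);

% Each cell should hold an h-by-w-by-c-by-t array that matches
% sequenceInputLayer([30 30 1]).
disp(size(XTrain{1}));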
3 Comments
Matt J
on 15 Oct 2025
"My XTrain is a 48941x1 cell array and TTrain is a 48941x1 categorical array, as shown below."
Please attach them (or non-proprietary data of the same structure) so that we can try to reproduce the error.
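For a self-contained reproduction without proprietary data, synthetic arrays with the same structure could be generated along these lines (sizes reduced for speed; the sequence length is an assumption, since the question does not state it):

```matlab
% Synthetic stand-in data: N-by-1 cell of 30x30x1 image sequences
% and N-by-1 categorical labels, mirroring the described structure.
N = 500;            % reduced from 48941 for a quick repro
numFrames = 10;     % assumed sequence length (not given in the question)
XTrain = cell(N,1);
for i = 1:N
    XTrain{i} = rand(30,30,1,numFrames);   % h-by-w-by-c-by-t
end
TTrain = categorical(randi(7,N,1));        % 7 classes, matching fullyConnectedLayer(7)
```

If trainnet reproduces the "targets (1)" error with data shaped like this, the problem is in the network or call; if not, the original data layout (for example a 1-by-N row vector of labels, or labels wrapped in a cell) is the more likely culprit.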
Bahadir
on 15 Oct 2025
Walter Roberson
on 15 Oct 2025
In order to test, we would also need the corresponding XValidation and TValidation.