Deep Learning Layers to increase training accuracy

Hi everyone.
I want to use deep learning to model a problem with 6 inputs and one output.
I've used the following layers, but I cannot get good accuracy, not only on unseen samples but also on the training samples.
I've checked different structures and tried to build as complex a model as possible to at least get good results during the training stage.
numFeatures = 6;        % six input features
numOut = 1;             % one regression target
numHiddenNeuron = 100;
layers = [
    featureInputLayer(numFeatures,'Normalization','rescale-symmetric')
    fullyConnectedLayer(numHiddenNeuron)
    reluLayer('Name','relu')
    batchNormalizationLayer
    fullyConnectedLayer(numOut)
    regressionLayer('Name','regression')];
I would appreciate it if you could help me.
Regards,

Answers (1)

yanqi liu on 23 Feb 2022
Yes, sir. Maybe add a dropoutLayer to the network layers, such as:
numHiddenNeuron = 100;
layers = [
    featureInputLayer(numFeatures,'Normalization','rescale-symmetric')
    fullyConnectedLayer(numHiddenNeuron)
    reluLayer('Name','relu')
    batchNormalizationLayer
    dropoutLayer                      % default dropout probability of 0.5
    fullyConnectedLayer(numOut)
    regressionLayer('Name','regression')];
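A minimal training sketch for this feedforward network, assuming the inputs are stored in an N-by-6 matrix X and the targets in an N-by-1 vector Y (the option values below are only a starting point, not tuned for your data):
% Minimal training sketch; X (N-by-6) and Y (N-by-1) are assumed variable names
options = trainingOptions('adam', ...
    'MaxEpochs',300, ...
    'MiniBatchSize',64, ...
    'InitialLearnRate',1e-3, ...
    'Shuffle','every-epoch', ...
    'Plots','training-progress', ...
    'Verbose',false);
net = trainNetwork(X, Y, layers, options);
% check the fit on the training data first
YPred = predict(net, X);
trainRMSE = sqrt(mean((YPred - Y).^2))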
% or
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(100,'OutputMode','sequence')
    dropoutLayer(0.3)
    lstmLayer(50,'OutputMode','sequence')
    dropoutLayer(0.2)
    fullyConnectedLayer(numOut)
    regressionLayer];
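Note that this second network only applies if each observation is a sequence. A sketch of the data format trainNetwork would expect in that case (XSeq and YSeq are illustrative names, with random values used only to show the shapes):
% each cell holds one numFeatures-by-T sequence; responses match per time step
XSeq = {rand(numFeatures,50); rand(numFeatures,80)};
YSeq = {rand(numOut,50); rand(numOut,80)};
netSeq = trainNetwork(XSeq, YSeq, layers, trainingOptions('adam'));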
If possible, maybe upload your data for analysis.
  2 Comments
Jahetbe on 23 Feb 2022
Thank you for your response.
I've checked the first one and the results didn't change.
For the second one, my data is not time series, so I cannot use "sequenceInputLayer" or "lstmLayer".
Do you have any other recommendations? Unfortunately the results are very bad.
Regards
Jahetbe on 23 Feb 2022
I've checked it again; unfortunately, after some iterations (fewer than 20), the RMSE does not decrease.
