Deep learning network regularization, and how to normalize multiple outputs like net.performParam.normalization in the Neural Network Toolbox?
Hi. I am a building science researcher studying built-environment control algorithms based on deep learning.
I have some questions from studying the Neural Network Toolbox and the Deep Learning Toolbox:
1. In the Neural Network Toolbox, multi-output prediction can be improved by setting net.performParam.normalization = 'normalized'.
I am wondering how to improve multi-output prediction with a deep learning network (not a shallow neural net, but a CNN, LSTM, GRU, and so on)
in the same way. Is there any code that improves a multi-output deep learning network's predictions like net.performParam.normalization = 'normalized'?
2. Does the L2 factor in a deep learning network's training options work similarly to "net.performParam.regularization = r" (I suspect it does not)?
Also, is there any regularization parameter for deep learning networks, like "net.performParam.regularization = r", that changes mse to msereg?
3. How do I choose a ConcatenationLayer's concatenation order for a 2-D array?
e.g., X = [1,2,3,4,5], Y = [6,7,8,9,10], Z = [11,12,13,14,15]
I want to concatenate X, Y, and Z into W = [Z; Y; X].
How can I choose the concatenation dimension and the order of the inputs in a ConcatenationLayer so that I get the arrangement I want, as in the example above?
Answers (1)
Shubham
on 14 Feb 2024
I'm glad to help with your questions regarding the improvement of multi-output predictions and regularization in deep learning networks, as well as the use of concatenation layers in MATLAB. Let's address each of your questions:
- In deep learning networks such as CNNs, LSTMs, and GRUs, normalization is typically applied to the input data rather than set as a parameter of the network's performance function. To improve multi-output predictions, you can standardize or normalize your input data before training. For multi-output regression tasks, it is also common to normalize the target data. Additionally, batch normalization layers can be used within the network to normalize the activations and gradients as the data moves through the network. You can refer to this documentation for preprocessing in MATLAB: https://in.mathworks.com/help/matlab/preprocessing-data.html
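As a minimal sketch of this idea: standardize both the inputs and the multi-output targets before training, then invert the target transform after prediction. The variable names (XTrain, YTrain, XTest, YPredN) are hypothetical placeholders; mapstd expects one observation per column.

```matlab
% Standardize inputs and targets (zero mean, unit variance per row).
% XTrain / YTrain are hypothetical arrays with one observation per column.
[XTrainN, psX] = mapstd(XTrain);
[YTrainN, psY] = mapstd(YTrain);   % normalizing each target output
                                   % puts all outputs on a comparable scale

% ... train the deep network on XTrainN / YTrainN ...

% At prediction time, apply the SAME input settings, then map the
% network's normalized predictions back to the original target scale:
XTestN = mapstd('apply',   XTest,  psX);
YPred  = mapstd('reverse', YPredN, psY);
```

Reusing the stored settings (psX, psY) at test time is important; re-normalizing the test set independently would leak different statistics into the model.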
- In MATLAB's Deep Learning Toolbox, L2 regularization is applied through the L2Regularization parameter in the training options. This works similarly to the net.performParam.regularization parameter in the Neural Network Toolbox, applying a penalty to the weights of the network to prevent overfitting. The L2Regularization parameter multiplies the L2 norm of the weights by the regularization factor and adds it to the loss function during training, which helps to control the complexity of the model. You can set this parameter using trainingOptions; see this documentation: https://in.mathworks.com/help/deeplearning/ref/trainingoptions.html
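As a short sketch, the global L2 factor is set like this (the solver choice and epoch count are illustrative, not prescriptive):

```matlab
% Set the global L2 (weight decay) factor via trainingOptions.
options = trainingOptions('adam', ...
    'MaxEpochs', 50, ...
    'L2Regularization', 0.0005, ...  % default is 0.0001
    'Verbose', false);
```

Note that learnable layers also expose per-layer factors (e.g. a WeightL2Factor property) that are multiplied by this global value, so you can regularize individual layers more or less strongly.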
- When using the ConcatenationLayer in MATLAB, you specify the dimension along which concatenation occurs via the layer's first argument (the concatenation dimension). The order of concatenation, however, depends on how the inputs are connected to the layer. If you want to concatenate along the first dimension and control the order, ensure that the inputs to the concatenation layer are connected in the desired order. Here's an example of how you can concatenate arrays X, Y, and Z in the order Z; Y; X:
% Define the arrays
X = [1,2,3,4,5];
Y = [6,7,8,9,10];
Z = [11,12,13,14,15];
% Concatenate in the order Z; Y; X
W = cat(1, Z, Y, X);
If you're using a ConcatenationLayer in a deep learning network, you need to ensure that the inputs to this layer are connected in the order Z, Y, X. The concatenation layer is then created with the concatenation dimension set to 1 (for concatenating along rows) and the number of inputs set to 3:
concatLayer = concatenationLayer(1, 3, 'Name', 'concat');
When constructing the network, make sure the layers feeding into the ConcatenationLayer are connected such that Z feeds the first input, Y the second, and X the third.
You can refer to this documentation to read more about concatenationLayer: https://in.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.concatenationlayer.html
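The wiring described above can be sketched with connectLayers; the input connected to 'concat/in1' comes first along the concatenation dimension, 'in2' second, and so on. The branch names ('zBranch', 'yBranch', 'xBranch') are hypothetical placeholders for the last layer of each branch in your own network.

```matlab
% Sketch: control concatenation order via which branch feeds which input.
lgraph = layerGraph();
% ... add the three branches (ending in 'zBranch', 'yBranch', 'xBranch') ...
lgraph = addLayers(lgraph, concatenationLayer(1, 3, 'Name', 'concat'));
lgraph = connectLayers(lgraph, 'zBranch', 'concat/in1');  % Z comes first
lgraph = connectLayers(lgraph, 'yBranch', 'concat/in2');  % Y second
lgraph = connectLayers(lgraph, 'xBranch', 'concat/in3');  % X third
```

Swapping which branch connects to 'concat/in1', 'concat/in2', and 'concat/in3' is how you change the stacking order without touching the layer itself.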