LSTM Python hyperparameters v MATLAB
I am reading a LSTM research paper and it states:
The following experiments investigate deep RNN models parameterized by the following hyperparameters:
1. num_layers – the number of memory cell layers
2. rnn_size – the number of hidden units per memory cell (i.e. hidden state dimension)
3. wordvec – dimension of vector embeddings
4. seq_length – number of frames before truncating the BPTT gradient
I can see that 2 and 3 correspond to the number of hidden units and the input size, but I cannot find where one would set 1 and 4.
Answers (2)
David Willingham
on 1 Jun 2022
Hi Philip,
For 1, the number of layers is not a "settable" parameter by default. You need to set up an experiment that trains networks of different depths and see which one gives the best results. The example Try Multiple Pretrained Networks for Transfer Learning shows how to use the Experiment Manager app in MATLAB to do this.
For 4, I don't have an example to share, but you could use Experiment Manager to set up an experiment that varies the sequence length of the input data used to train the LSTM.
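As a rough sketch of such an experiment without the app, the loop below builds networks of increasing depth by stacking lstmLayer calls; the input size, hidden units, and class count are illustrative placeholders, not values from the question.

```matlab
% Sketch: manually sweep num_layers (all sizes below are illustrative)
inputSize  = 100;   % wordvec: dimension of the vector embeddings fed in
numHidden  = 64;    % rnn_size: hidden units per memory cell layer
numClasses = 10;    % placeholder output size

for numLayers = 1:3
    layers = sequenceInputLayer(inputSize);
    for k = 1:numLayers
        layers = [layers; lstmLayer(numHidden)]; %#ok<AGROW> % stack memory cell layers
    end
    layers = [layers; fullyConnectedLayer(numClasses); softmaxLayer; classificationLayer];
    % train and evaluate each candidate here, e.g.:
    % net = trainNetwork(XTrain, YTrain, layers, options);
end
```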
2 Comments
David Willingham
on 1 Jun 2022
For 1, in MATLAB this isn't a settable parameter; however, you can stack the layers manually:
[lstmLayer(64); lstmLayer(64)]
For 4, there is a SequenceLength option for the mini-batches in trainingOptions.
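Putting the two points above together, a minimal sketch might look like the following; the sizes (100-dimensional input, 64 hidden units, 10 classes, sequence length 50) are illustrative placeholders, not values from the paper.

```matlab
% num_layers = 2 by stacking two lstmLayer calls; 64 plays the role of rnn_size
layers = [
    sequenceInputLayer(100)      % 100 stands in for wordvec, the embedding dimension
    lstmLayer(64)
    lstmLayer(64)
    fullyConnectedLayer(10)      % 10 = placeholder number of classes
    softmaxLayer
    classificationLayer];

% seq_length: SequenceLength truncates or pads each mini-batch to 50 time steps
options = trainingOptions('adam', ...
    'SequenceLength', 50, ...
    'MaxEpochs', 20);
```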