LSTM in MATLAB compared to Python TensorFlow for time-series predictions
I wish to understand the differences between the TensorFlow Keras implementation of an LSTM and MATLAB's.
First of all, Keras requires a "lookback window" for making predictions, whereas MATLAB's implementation does not have this feature. Does MATLAB's LSTM make a prediction based on a fixed-size time window and, if yes, where can I adjust this length? If not, how does MATLAB's LSTM work in this regard? For instance, Keras requires users to restructure the dataset from a time-series problem into a supervised problem by creating a 3D array from the usual 2D matrices of predictors and targets: smaller "batches" of predictor and target matrices of length lookback are sliced out and stacked into the 3D structure fed to the LSTM. Intuitively, the original matrix for the time series of dimensions (n. of time-series observations) x (predictors OR targets) becomes (n. of time-series observations / window) x (window) x (predictors OR targets), as in the sketch below.
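For concreteness, here is an illustrative MATLAB sketch of the kind of restructuring I mean; the data, sizes, and variable names are just placeholders:
% Illustrative sketch only: restructure a 2-D time series into Keras-style
% lookback windows (samples x window x predictors).
X = rand(1000, 4);                     % (observations x predictors), stand-in data
Y = rand(1000, 1);                     % (observations x targets),    stand-in data
lookback = 10;                         % the "lookback window" length
[numObs, numPred] = size(X);
numWindows = numObs - lookback;
Xwin = zeros(numWindows, lookback, numPred);    % windowed predictors
Ywin = Y(lookback+1:end, :);                    % target that follows each window
for k = 1:numWindows
    Xwin(k, :, :) = reshape(X(k:k+lookback-1, :), 1, lookback, numPred);
end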
Secondly, I am struggling to understand the number of hidden units in the LSTM layer of MATLAB's implementation and what increasing or decreasing this number actually does.
Finally, where is the cell state of the LSTM implemented in MATLAB and how can I adjust this?
Kind regards,
Angelo
Answers (1)
Vatsal on 22 Sep 2023 (Edited: 29 Sep 2023)
I understand that you want to know the differences between the TensorFlow Keras implementation of an LSTM and MATLAB's implementation.
MATLAB’s LSTM makes predictions based on the input sequence it is given: the layer steps through the sequence and carries information forward in its hidden and cell states, so the effective time window is the number of time steps in the input sequence rather than a separate lookback parameter. You adjust this simply by choosing how long the sequences you feed to the network are. Padding, truncating, or splitting of sequences within mini-batches is controlled by the "SequenceLength" option of trainingOptions, not by a network-creation parameter (see the sketch below).
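For example, a minimal sketch of the training-options side; the option values here are placeholders, not recommendations:
% Sketch: mini-batch sequence handling is set in trainingOptions, not in the layer.
% 'SequenceLength' pads or truncates the sequences within each mini-batch.
options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'MiniBatchSize', 32, ...
    'SequenceLength', 'longest', ...   % or 'shortest', or a fixed integer
    'Shuffle', 'never');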
Unlike Keras, MATLAB's LSTM implementation does not require restructuring the dataset into a 3D lookback array. Instead, you supply sequences directly, typically as numFeatures-by-numTimeSteps matrices (or cell arrays of such matrices when you have several sequences), and the LSTM layer handles the sequential nature of the data internally.
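A rough sketch of a sequence-to-sequence regression workflow, reusing the options object from above; the layer sizes and random data are placeholders:
% Sketch: each observation is a (numFeatures x numTimeSteps) matrix; several
% observations go in a cell array. No 3-D lookback array is built by hand.
numFeatures    = 3;                                % placeholder sizes
numResponses   = 1;
numHiddenUnits = 100;

XTrain = {rand(numFeatures, 200); rand(numFeatures, 150)};    % predictor sequences
YTrain = {rand(numResponses, 200); rand(numResponses, 150)};  % target sequences

layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, 'OutputMode', 'sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];

net = trainNetwork(XTrain, YTrain, layers, options);          % options from the sketch above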
In MATLAB’s LSTM implementation, the number of hidden units is the length of the hidden state (and cell state) vector and determines the capacity of the LSTM layer; it plays the same role as the "units" argument in Keras. Increasing the number of hidden units lets the model capture more complex patterns at the cost of more learnable parameters and a higher risk of overfitting, whereas decreasing it reduces the model's capacity.
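In code, the number of hidden units is simply the first argument of lstmLayer; for illustration:
% Sketch: the hidden state (and cell state) of each layer below is a vector of
% length numHiddenUnits per observation; more units means more learnable weights.
smallLSTM = lstmLayer(50);     % lower capacity, fewer parameters
largeLSTM = lstmLayer(200);    % higher capacity, more parameters, higher overfitting risk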
In MATLAB, the cell state is maintained inside the LSTM layer itself: it is updated automatically at every time step during training and prediction, so you normally do not need to adjust it explicitly.
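That said, if you want to carry the state across prediction calls yourself, functions such as resetState and predictAndUpdateState expose it at prediction time. A minimal sketch, assuming the trained net from above and a hypothetical test sequence XTest:
% Sketch: step through a test sequence one time step at a time; the network
% carries its hidden and cell state forward between calls.
XTest = rand(numFeatures, 50);                 % stand-in test sequence
net = resetState(net);                         % clear the state before a new sequence
numSteps = size(XTest, 2);
YPred = zeros(numResponses, numSteps);
for t = 1:numSteps
    [net, YPred(:, t)] = predictAndUpdateState(net, XTest(:, t));
end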
You can also refer to the MATLAB documentation for lstmLayer to obtain more information on its usage and syntax.