LSTM SequenceLength and Batch Explained
19 views (last 30 days)
Hi MATLAB staff,
For the current example https://www.mathworks.com/help/deeplearning/examples/time-series-forecasting-using-deep-learning.html
In this example, the SequenceLength option is left at its default, 'longest'. Does that mean the sequence length is 500? (I don't think it is, as that would make the algorithm much more computationally expensive.) Or does it mean the sequence length is 1? It is unclear how the data is fed into the model. Furthermore, a single batch appears to be used (I assume so, because changing the mini-batch size does not change the algorithm's behavior). Could you elaborate on the sequence length and batch size in this particular example?
I think it is very important for the MATLAB LSTM community to have a full understanding of how the LSTM in MATLAB is designed.
Cheers, MB
2 comments
John D'Errico
2019-3-17
We are not MATLAB staff, although some MathWorks employees may drop in here in their free time.
Accepted Answer
Abhishek Singh
2019-3-25
Hi MB Sylvest,
According to the documentation, you can find the sequence length and mini-batch size in the options returned by trainingOptions.
If you run the example and inspect the trainingOptions values, you'll see that it uses the default sequence length of 'longest' (which, for this example's single training sequence, means 500) and the default mini-batch size of 128.
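For reference, this is roughly the trainingOptions call from the linked example (copied from the documentation page; exact values may differ between releases). Note that it does not set SequenceLength or MiniBatchSize, so both take their defaults, which you can confirm by displaying the returned object:

```matlab
% Options as in the time-series forecasting example; SequenceLength and
% MiniBatchSize are not specified, so the defaults apply.
options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.2, ...
    'Verbose',0, ...
    'Plots','training-progress');

% Inspect the defaults that were filled in automatically.
disp(options.SequenceLength)   % 'longest'
disp(options.MiniBatchSize)    % 128
```

'longest' means each mini-batch is padded to the length of its longest sequence; since this example trains on a single sequence, that is the full sequence length.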
You may also find these links to be useful:
- https://www.mathworks.com/help/deeplearning/ug/long-short-term-memory-networks.html
- https://www.mathworks.com/help/deeplearning/ref/trainingoptions.html
0 comments
More Answers (0)