Hi NGR,
Based on my understanding of your question, you have a small dataset of sequences and are seeking to determine the optimal mini-batch size for training an LSTM network to ensure efficient learning and strong model performance. Given the relatively small size of your dataset (51 training samples), opting for a smaller mini-batch size might be more suitable. This approach allows the model to update more frequently and can enhance learning from limited data.
For your dataset, starting with a mini-batch size between 1 and 8 could be beneficial. With 51 samples, a batch size of 8 yields about 7 weight updates per epoch, while a batch size of 1 yields 51, so smaller batches give the network noticeably more frequent updates. You can begin with these smaller sizes and gradually increase them, observing the effect on validation performance and training duration.
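As a rough, framework-agnostic sanity check (plain Python; the helper name `updates_per_epoch` is my own, not from any library), this sketch shows how the mini-batch size trades off against the number of gradient updates per epoch for your 51 training samples:

```python
import math

def updates_per_epoch(num_samples, batch_size):
    # One gradient update per mini-batch; the final batch may be
    # smaller than batch_size, hence the ceiling division.
    return math.ceil(num_samples / batch_size)

num_samples = 51  # training-set size from the question
for batch_size in (1, 2, 4, 8):
    print(f"batch size {batch_size}: "
          f"{updates_per_epoch(num_samples, batch_size)} updates/epoch")
```

For example, this prints 51 updates/epoch for batch size 1 and 7 updates/epoch for batch size 8, which is why smaller batches can help the network learn more from a limited dataset, at the cost of noisier gradients and longer wall-clock time per epoch.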
You can refer to this article for more detailed information:
I hope this information is helpful.
Thank you.