Matlab Shallow Network Mini Batch Training
17 views (last 30 days)
Hello, I have been training my data with MATLAB's patternnet. I really like its functionality and I have been seeing great results with it. I have a problem, however: I want to start investigating all the options that can be adjusted under the hood of the default patternnet, but my dataset is so large that, even connected to a cluster, my model takes at least 10 hours to train. I know training on the GPU is supported, but after several attempts it reports that there is not enough memory for training. I suspect mini-batch training could work around this, but I'm not sure whether I need to create a datastore for the mini-batches to be effective. If anyone has fed mini-batches into the shallow network inputs and trained on a GPU, please point me in the right direction. Thanks in advance.
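For reference, this is roughly the kind of setup being described. The data variables `x` and `t` below are placeholders, and the `'useGPU'` flag of `train` requires Parallel Computing Toolbox:

```matlab
% Sketch of GPU training with a shallow pattern-recognition network.
% x and t are placeholder data standing in for the real dataset.
x = rand(10, 50000);                    % features x samples
t = full(ind2vec(randi(3, 1, 50000))); % one-hot targets for 3 classes

net = patternnet(20);        % one hidden layer with 20 neurons
net.trainFcn = 'trainscg';   % scaled conjugate gradient (patternnet default)

% 'useGPU' moves the computation to the GPU; this is where an
% out-of-memory error appears when the dataset is too large.
[net, tr] = train(net, x, t, 'useGPU', 'yes');
```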
4 Comments
sayak nag
2019-3-15
Please help. I am following your advice, but it seems that whatever I specify as my mini-batch size, the network trains in batch mode, i.e. the number of iterations per epoch is 1.
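As far as I know, the shallow-network `train` interface always performs full-batch optimization, which would explain the one-iteration-per-epoch behavior. Mini-batch training is exposed through the deep learning interface (`trainNetwork` with `trainingOptions`) instead. A hedged sketch of that alternative, with placeholder data and layer sizes:

```matlab
% Hypothetical example: mini-batch training via trainNetwork instead of
% the shallow train() call. Data, layer sizes and options are placeholders.
XTrain = rand(50000, 10);                  % observations x features
YTrain = categorical(randi(3, 50000, 1)); % class labels

layers = [
    featureInputLayer(10)
    fullyConnectedLayer(20)
    tanhLayer
    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];

opts = trainingOptions('adam', ...
    'MiniBatchSize', 256, ...        % this is the actual mini-batch knob
    'MaxEpochs', 10, ...
    'ExecutionEnvironment', 'gpu');  % or 'auto' / 'cpu'

net = trainNetwork(XTrain, YTrain, layers, opts);
```

Because each iteration only moves one mini-batch to the GPU, this path avoids loading the whole dataset into GPU memory at once.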
Answer (1)
Greg Heath
2018-12-20
If you have a huge dataset, it is often rewarding to just randomly divide it into m subsets. Then design with 1 and test on the other m-1. If the subsets are sufficiently large, it is not necessary to use m-fold cross-validation. However, you may want to design more than one net.
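A minimal sketch of that idea, assuming the data lives in columns of placeholder matrices `x` (inputs) and `t` (one-hot targets):

```matlab
% Randomly partition N samples into m subsets, design (train) on one
% subset and test on the remaining m-1. x and t are placeholders.
N = size(x, 2);      % number of samples (one per column)
m = 10;              % number of random subsets
idx = randperm(N);   % random shuffle of sample indices
edges = round(linspace(0, N, m + 1));

designIdx = idx(1 : edges(2));         % subset 1: used for training
testIdx   = idx(edges(2) + 1 : end);   % subsets 2..m: used for testing

net = patternnet(20);
net = train(net, x(:, designIdx), t(:, designIdx));

% Evaluate on the held-out m-1 subsets.
yTest   = net(x(:, testIdx));
testAcc = mean(vec2ind(yTest) == vec2ind(t(:, testIdx)));
```

Training on one subset of N/m samples keeps each design cheap; repeating this with different subsets gives the "more than one net" variant mentioned above.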
Hope this helps.
Thank you for formally accepting my answer
Greg
0 Comments