Best way to deal with large data for deep learning?
Hi, I have been trying image classification with CNNs. I have about 350,000 images that I read and stored as a 4D matrix of size 170 x 170 x 3 x 350,000 in a data.mat file, using matfile to keep appending new images (a rough sketch of this is shown below). The resulting file is almost 20 GB.
The problem now is that I cannot access the saved images because I run out of memory.
Does anyone have suggestions for a more efficient way to build a large dataset for deep learning?
One solution I could apply is to split the data and train two networks, the second with its weights initialized from the first network's final weights, but I would rather not take that route!
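For reference, here is a minimal sketch of the incremental matfile approach described above, plus a partial read of one batch; matfile supports indexed reads and writes against a v7.3 MAT-file so only the requested slices touch memory. The variable name X, the uint8 type, and the batch sizes are assumptions for illustration, not the asker's actual code.

% Build data.mat incrementally: each indexed assignment grows X on disk,
% so the full 170 x 170 x 3 x 350,000 array never has to fit in RAM.
m = matfile('data.mat', 'Writable', true);   % creates a v7.3 MAT-file
batchSize = 1000;                            % hypothetical chunk size
for k = 0:batchSize:349000
    n = min(batchSize, 350000 - k);
    batch = zeros(170, 170, 3, n, 'uint8');  % ...fill with the next n images...
    m.X(1:170, 1:170, 1:3, k+1:k+n) = batch; % append this chunk to the file
end

% Partial loading: only the indexed slices are read from disk, so a
% training mini-batch can be pulled in without loading the other ~20 GB.
idx = 1:256;                                 % hypothetical mini-batch indices
Xbatch = m.X(:, :, :, idx);

The same indexed-read pattern can be used inside a training loop to fetch one mini-batch at a time instead of loading the whole array before training.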
2 Comments
KSSV
2016-6-22
Do you want to process the whole data set (170 x 170 x 3 x 350,000) at once, or do you use only one 170 x 170 x 3 matrix at each step?
Accepted Answer
More Answers (0)