Out of memory gather()
I am following the Denoise Speech Using Deep Learning Networks documentation example. After this line my computer stops responding:
[targets,predictors] = gather(targets,predictors);
Error: Out of memory.
If I reduce the subset size in this section to 300 files, everything works:
adsTrain = audioDatastore(fullfile(dataFolder,'train'),'IncludeSubfolders',true);
reduceDataset = true;
if reduceDataset
    adsTrain = shuffle(adsTrain);
    adsTrain = subset(adsTrain,1:1000); % to 300
end
but sometimes I still get this error.
How can I process the entire dataset if it is really big (87,000 files), or at least 1000 files? I have read the documentation for gather(), but I still don't understand how to get around this error.
Accepted Answer
jibrahim
2021-4-23
Hi Maxim,
In many practical situations, your data might not all fit in memory. There is a modified workflow for that case. See, for example:
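One common workaround, sketched below under assumptions (the helper name extractDenoiseFeatures is a placeholder for your own per-file feature extraction, and chunkSize is a tuning knob, neither is part of the original example): process the datastore in chunks, gather only one chunk at a time, and write each chunk's features to disk, so memory usage stays bounded regardless of the total number of files.

```matlab
% Sketch: chunked feature extraction so gather() only ever materializes
% one chunk of results in memory at a time.
% Assumptions: extractDenoiseFeatures(audio) is your own function that
% returns [targets, predictors] for one file; chunkSize fits your RAM.
chunkSize = 1000;
numFiles  = numel(adsTrain.Files);

for startIdx = 1:chunkSize:numFiles
    stopIdx  = min(startIdx + chunkSize - 1, numFiles);
    adsChunk = subset(adsTrain, startIdx:stopIdx);

    % Tall array over just this chunk of audio files
    T = tall(adsChunk);
    [targetsT, predictorsT] = cellfun(@(x) extractDenoiseFeatures(x), ...
        T, 'UniformOutput', false);

    % gather() now only brings chunkSize files' worth of features
    % into memory, instead of all 87,000 files at once
    [targets, predictors] = gather(targetsT, predictorsT);

    % Persist this chunk and free memory before the next iteration
    save(sprintf('features_%05d.mat', startIdx), ...
        'targets', 'predictors', '-v7.3');
    clear targets predictors
end
```

The saved .mat files can then be fed to training through a datastore (for example, fileDatastore over the saved files), so the full feature set never has to live in memory at once.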
More Answers (1)
neal paze
2021-9-1
I also have a problem with gather in this example. Have you solved your problem? Can you share how you modified this line of code?