Out of memory during neural network training

I know this is a common problem, but all the solutions I have tried have failed.
Basically I want to train a big neural network, and I get an 'Out of memory' error.
My training set is a 729x3456 matrix of doubles, and the network is a so-called 'autoencoder' with layers of these sizes:
3456 - 4000 - 2000 - 1000 - 300 - 1000 - 2000 - 4000 - 3456
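For scale, the weight matrices alone for this architecture hold about 48 million doubles, roughly 370 MB, before the optimizer allocates anything (a quick back-of-the-envelope check):

layers = [3456 4000 2000 1000 300 1000 2000 4000 3456];
nW = sum(layers(1:end-1) .* layers(2:end));          % ~48.2 million weights
fprintf('%d weights = %.0f MB as doubles\n', nW, nW*8/1024^2);

and a conjugate-gradient trainer keeps several working vectors of that size (gradient, search direction), so the footprint multiplies quickly.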
In my code, first of all I do:
net = feedforwardnet([layer1, layer2, layer3, layer4, layer3, layer2, layer1], 'trainscg');
net = configure(net, Dtrain', Dtrain');
where I use the 'trainscg' function because I read that it is the one that uses the least memory. Then I initialize the weights and biases to some values (which I have already calculated), set the 'transferFcn', and start training, as sketched below.
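In outline (W1, b1, and the 'tansig' choice here are placeholders for my actual values), that part looks like:

net.IW{1,1} = W1;   % precomputed input weights (placeholder name)
net.b{1} = b1;      % precomputed bias for layer 1 (placeholder)
% ... likewise for the remaining net.LW{i,i-1} and net.b{i}
for i = 1:net.numLayers
    net.layers{i}.transferFcn = 'tansig';   % placeholder choice
end
[net, tr] = train(net, Dtrain', Dtrain');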
I tried cleaning the workspace as much as possible and I also tried to put
net.efficiency.memoryReduction = 4;
before training, since I read it can help. I still get 'Out of memory', even after increasing the value to 60.
Here is the output of the command 'memory', executed when the workspace contains just the training set and four numbers (the sizes of the layers):
>> memory
Maximum possible array: 4508 MB (4.727e+09 bytes) *
Memory available for all arrays: 4508 MB (4.727e+09 bytes) *
Memory used by MATLAB: 1927 MB (2.020e+09 bytes)
Physical Memory (RAM): 8080 MB (8.472e+09 bytes)
* Limited by System Memory (physical + swap file) available.
What else can I do to solve the problem?

Accepted Answer

Greg Heath, 2015-4-20
You will never be able to solve a problem of that size. I suggest:
1. Using feature extraction to SUBSTANTIALLY reduce the input dimensionality (see the sketch below).
2. Using no more than 1 or 2 hidden layers.
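A minimal sketch of both suggestions, assuming PCA (from the Statistics and Machine Learning Toolbox) for the feature extraction; the 100 retained components and the 50-unit hidden layer are arbitrary example values, not recommendations:

[coeff, score] = pca(Dtrain);          % rows of Dtrain are observations
k = 100;                               % example number of components to keep
X = score(:, 1:k)';                    % k x 729 reduced inputs
net = feedforwardnet(50, 'trainscg');  % a single small hidden layer
net = train(net, X, X);                % autoencode the reduced features

With 100 inputs and one 50-unit hidden layer, the weight count drops from ~48 million to about ten thousand.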