How to deal with an out-of-memory error when training a large data set or a large neural network (NARX)?

2 views (last 30 days)
How do I deal with an out-of-memory error when training a large data set or a large neural network (stacked autoencoder)? I have just started analyzing deep neural networks for time series prediction. My guess is to use batch training, but can anyone help me in detail?
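For the NARX case mentioned in the title, the usual first steps are to switch away from the Jacobian-based `trainlm` (which stores a large Jacobian) or to tell `trainlm` to compute the Jacobian in pieces. A minimal sketch, assuming the Neural Network Toolbox and placeholder data (the series `X`, `T` and the delay/hidden sizes here are illustrative, not from the question):

```matlab
% Illustrative data: replace with your own input and target series
X = num2cell(rand(1, 2795));    % input time series as a cell array
T = num2cell(rand(1, 2795));    % target time series

net = narxnet(1:2, 1:2, 10);    % input delays, feedback delays, hidden size
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);

% Option 1: a memory-lean training function (no Jacobian storage)
net.trainFcn = 'trainscg';      % scaled conjugate gradient

% Option 2 (alternative): keep trainlm but split the Jacobian computation
% net.efficiency.memoryReduction = 8;   % trade training speed for memory

net = train(net, Xs, Ts, Xi, Ai);
```

Both options reduce peak memory at the cost of slower training; `memoryReduction` of N roughly divides the Jacobian memory by N.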

Answers (2)

Greg Heath 2015-4-15
What are
1. input and target
2. size(input), size(target)
3. Significant lags of
a. target autocorrelation function
b. input/target cross correlation function
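The significant lags Greg asks about can be estimated from the correlation functions against an approximate 95% white-noise bound. A sketch assuming the Signal Processing Toolbox (`xcorr`) and a single input row `x` with target `t`:

```matlab
% x, t: row vectors of equal length (one input variable and the target)
x = rand(1, 2795);  t = rand(1, 2795);   % placeholder data
N = numel(t);

acf = xcorr(t - mean(t), 'coeff');                % target autocorrelation
ccf = xcorr(t - mean(t), x - mean(x), 'coeff');   % input/target cross-correlation
lags = -(N-1):(N-1);

thresh = 1.96 / sqrt(N);                 % approx. 95% significance bound
sigAcfLags = lags(abs(acf) > thresh & lags >= 0);   % significant feedback delays
sigCcfLags = lags(abs(ccf) > thresh & lags >= 0);   % significant input delays
```

The nonnegative significant lags are then candidates for the feedback and input delay vectors of `narxnet`.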

SAM 2015-4-15
Edited: SAM 2015-4-15
I am extremely sorry, I wrote NARX but I actually tried it with a stacked autoencoder, hidden layers [100 150]. Even on an HPC server I was getting the same out-of-memory error.
Inputs are 4×2795 and outputs are 1×2795.
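For the stacked-autoencoder variant with these sizes, one greedy layer-by-layer approach in newer MATLAB releases (R2015b+) is `trainAutoencoder`/`encode`, training each layer separately so only one layer's weights are in memory at a time. A hedged sketch; the epoch counts and the final `fitnet` regression stage are illustrative choices, not from the thread:

```matlab
X = rand(4, 2795);    % 4 inputs x 2795 samples, as in the question (placeholder values)
T = rand(1, 2795);    % 1 output x 2795 samples

% Greedy layer-wise pretraining: hidden layers [100 150]
ae1 = trainAutoencoder(X, 100, 'MaxEpochs', 200);
F1  = encode(ae1, X);
ae2 = trainAutoencoder(F1, 150, 'MaxEpochs', 200);
F2  = encode(ae2, F1);

% Fit the regression output on the learned features with a memory-lean trainer
net = fitnet(10, 'trainscg');
net = train(net, F2, T);
```

Note that a 4×2795 data set is small, so an out-of-memory error here more likely comes from the training algorithm's workspace (e.g. Jacobian storage) than from the data itself, which is why a conjugate-gradient trainer is worth trying first.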
