RBF newrb, array exceeds maximum size

My input dataset is 13x778162. I tried to create an RBF network with newrb, but I got this error:
Error using zeros
Requested 778162x778162 (4511.6GB) array exceeds maximum array size preference.
My RBF network:
eg = 0.1; % sum-squared error goal
sc = 0.2; % spread constant
mn = 10; % maximum number of neurons
df = 1; % number of neurons to add between displays
net = newrb(input,target,eg,sc,mn);
I understand that using all 778162 neurons would be far too many. But since I use the function newrb, I thought I could cap the maximum number of neurons with the parameter mn, which is set to 10, yet MATLAB still tries to allocate far too much memory.
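For reference, the requested array is exactly the number of training vectors squared, so the allocation seems to depend on the dataset size rather than on mn. A quick check, assuming double precision (8 bytes per element):
Q = 778162;                          % number of training vectors (input is 13xQ)
bytes = Q^2 * 8;                     % one QxQ double array
fprintf('%.1f GB\n', bytes / 2^30)   % prints ~4511.6 GB, matching the error message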
1 Comment
SHAUIFENG JIANG on 6 Dec 2018
Hey EdWood,
I am facing the same problem and had the same thought you did. I cannot find the link to Greg's answer. Could you please help me a little bit?


Accepted Answer

Greg Heath on 26 Aug 2016
Edited: Greg Heath on 26 Aug 2016
1. See my NEWRB posts in the NEWSGROUP and ANSWERS. Searching on "greg NEWRB" returns 149 NEWSGROUP hits and 63 ANSWERS hits.
2. Reading them in reverse chronological order is probably the most efficient.
Your data appears to be 13-dimensional. Typically, 30 random points per dimension are sufficient for a good training set (13 x 30 ≈ 390 here).
You don't say whether this is classification or regression. The procedures will be different.
I would start with 10 random sets of ~400 or 500 vectors and design 10 nets. Then run the rest of the data through the nets, saving all misclassified vectors to be used as training vectors for new clusters.
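A rough sketch of that procedure, here treating "misclassified" generically as vectors with large prediction errors (the subset size, the number of sets, and the 0.05 error threshold are illustrative assumptions; input, target, eg, sc and mn are from the question):
Q       = size(input, 2);             % total number of training vectors
setSize = 400;                        % ~30 points per dimension for 13-D data
numSets = 10;
nets    = cell(1, numSets);
badIdx  = [];                         % indices of poorly predicted vectors

for k = 1:numSets
    idx     = randperm(Q, setSize);   % random training subset
    nets{k} = newrb(input(:,idx), target(:,idx), eg, sc, mn);
    rest    = setdiff(1:Q, idx);      % run the remaining data through this net
    err     = abs(sim(nets{k}, input(:,rest)) - target(:,rest));
    badIdx  = union(badIdx, rest(max(err,[],1) > 0.05));   % keep the worst vectors
end
% badIdx now holds candidate training vectors for new clusters or a follow-up net.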
Hope this helps.
Thank you for formally accepting my answer
Greg
1 Comment
EdWood on 26 Aug 2016
I use it for regression. Couldn't choosing all misclassified vectors actually lead to overfitting?


More Answers (0)
