How to avoid memory problems while processing a huge table?

I have a huge observation table with around 30 lakh (3 million) rows and 12 columns. While training a k-NN classifier in MATLAB R2016a I get memory-related errors. Is there any way to avoid this? I have tried reducing the number of rows, but that hurts the output quality.
Each row of the table is one pixel, and the columns hold that pixel's feature values. One set of MRI scans contains around 20 images of 512x512, and I load one set to build the observation table. Is there another way to pass a large amount of data to the k-NN classifier?
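
For a rough sense of scale, here is a back-of-the-envelope memory estimate for a matrix of the size described above (the cast to single is an editorial aside and assumes the features tolerate reduced precision):

```matlab
% Back-of-the-envelope footprint for ~3e6 rows x 12 feature columns.
nRows = 3e6;
nCols = 12;
fprintf('double: %.0f MB\n', nRows * nCols * 8 / 2^20);   % ~275 MB
fprintf('single: %.0f MB\n', nRows * nCols * 4 / 2^20);   % ~137 MB

% If the features tolerate single precision, casting halves the
% footprint before the classifier makes its own internal copies.
% X = single(X);
```

The table itself is only a few hundred MB, so the out-of-memory errors likely come from the additional copies made during training; every halving of the base footprint helps.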

Answers (1)

KSSV 2016-8-31
doc datastore, doc memmapfile, doc mapreduce.
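
For example, a minimal datastore sketch, assuming the observation table has been exported to a CSV file (the file name 'observations.csv' and the chunk size are placeholders):

```matlab
% Read the observation table in row chunks instead of all at once.
ds = tabularTextDatastore('observations.csv');   % hypothetical export
ds.ReadSize = 100000;        % rows per chunk; tune to available memory

while hasdata(ds)
    T = read(ds);            % T is an ordinary in-memory table (one chunk)
    % ... process or accumulate statistics over the chunk here ...
end
```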
  1 Comment
Nitinkumar Ambekar
Thanks @Dr. Siva, one small query: can I pass one of these to a function that takes a `table` or a `matrix`?
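
Both interfaces end in standard types, so in principle yes: read on a datastore returns a regular table one chunk at a time, and the Data field of a memmapfile object can be indexed like a numeric array. A minimal sketch, with hypothetical file names and dimensions:

```matlab
% 1) datastore -> table: each read() yields a regular table, so a
%    function expecting a table can consume one chunk at a time.
ds = tabularTextDatastore('observations.csv');   % hypothetical file
T  = read(ds);                                   % ordinary table (one chunk)

% 2) memmapfile -> matrix: Data is indexed like a numeric array, and
%    only the indexed portion is materialized in memory.
m = memmapfile('observations.bin', ...           % hypothetical binary file
               'Format', {'double', [3000000 12], 'X'});
Xblock = m.Data.X(1:100000, :);                  % 100000-by-12 double block
```

One caveat: fitcknn still needs its full training matrix in memory, so chunked reading mainly helps with feature extraction and preprocessing rather than the training call itself.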

