Performance issue while fetching data from a C MEX function
Hi,
We have a C MEX function that fetches data from an external source over the network and returns it to MATLAB: it allocates the output with mxCreateDoubleMatrix and copies the data into the plhs array. When the data is large (around 22 MB), accessing it in MATLAB is very slow, taking about 20 seconds. If we instead copy the data in small chunks (about 100 KB each), the MATLAB variable is filled and can be accessed much faster. Is there a memory limit that applies when allocating data, copying it, and accessing it as a MATLAB variable? Appreciate your suggestions.
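Roughly, the gateway looks like the following simplified sketch (not our actual code): fetch_remote_data is a placeholder for our network read, and the buffer size is only illustrative.

#include <string.h>
#include "mex.h"

/* Simplified sketch: allocate the output matrix once and copy the
 * fetched buffer with a single memcpy. fetch_remote_data() is a
 * placeholder for the network read and is not shown here. */
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    (void)nlhs; (void)nrhs; (void)prhs;

    /* ~22 MB of doubles coming from the external source. */
    const mwSize nElems = (mwSize)(22 * 1024 * 1024 / sizeof(double));

    /* Staging buffer filled by the network fetch (placeholder call). */
    double *netBuf = (double *)mxMalloc(nElems * sizeof(double));
    /* fetch_remote_data(netBuf, nElems); */

    /* One allocation and one bulk copy into the output mxArray. */
    plhs[0] = mxCreateDoubleMatrix(1, nElems, mxREAL);
    memcpy(mxGetPr(plhs[0]), netBuf, nElems * sizeof(double));

    mxFree(netBuf);
}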
Thank you.
2 Comments
Jan
2022-6-8
Without seeing the code it is impossible to guess whether it contains an avoidable bottleneck. 22 MB is not usually considered "huge". "100Kbs" is a strange unit, and it is not clear how this defines a chunk.
The memory limit is defined in MATLAB's preferences => Workspace => Array size limit.
Answers (0)