Rick Amos
MathWorks
Followers: 0 Following: 0
Developer at MathWorks, working on Parallel Computing Toolbox.
Feeds
Answered
Integrating Parallel Computing with App Designer for Real-Time UI Updates
My recommendation is to use a mix of parfeval (to launch parallel work) with afterEach/afterAll (to update the app in response t...
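A minimal sketch of that pattern, assuming hypothetical component names (app.UIAxes, app.StatusLabel, app.NumRunsSpinner) and a hypothetical worker function longComputation:

function StartButtonPushed(app, event)
    % Queue the long-running work on the pool so the UI stays responsive.
    futures = parallel.FevalFuture.empty;
    for k = 1:app.NumRunsSpinner.Value
        futures(k) = parfeval(@longComputation, 1, k);
    end
    % Update the app on the client as each result arrives ...
    afterEach(futures, @(result) plot(app.UIAxes, result), 0);
    % ... and once more when everything has finished.
    afterAll(futures, @(varargin) set(app.StatusLabel, 'Text', 'Done'), 0);
end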
21 days ago | 1
Answered
Reading portions of several CSV files
You might want to take a look at tabularTextDatastore and/or tall arrays. These are geared up to do exactly this kind of thing:-...
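A minimal sketch of that approach, assuming a folder of CSV files and hypothetical variable names:

ds = tabularTextDatastore('data/*.csv', ...
    'SelectedVariableNames', {'Timestamp', 'Value'});   % read only the columns you need
tt = tall(ds);                        % one lazily-evaluated tall table spanning every file
avgValue = gather(mean(tt.Value));    % computed by streaming through the files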
5 years ago | 5
Answered
tallArray sorting and plot error
Instead of converting State to a character vector, I would recommend using one of the string utilities. These are all supported ...
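For example (a sketch, assuming tt is a tall table with a State variable, hypothetical data):

tt.State = strtrim(tt.State);            % string functions operate element-wise on tall arrays
byState  = sortrows(tt, 'State');        % supported for tall tables (can be expensive)
previewT = gather(head(byState, 10));    % bring a small slice back into memory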
5 years ago | 0
Answered
Pass fileDatastore explicit AWS credentials
It is currently not possible to pass explicit credentials directly into fileDatastore. Datastore retrieves the credentials in th...
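One common route is to supply the credentials through environment variables before creating the datastore (a sketch with placeholder values; exact variable names are per the MATLAB cloud-storage documentation and may vary by release):

setenv('AWS_ACCESS_KEY_ID',     'YOUR_ACCESS_KEY_ID');
setenv('AWS_SECRET_ACCESS_KEY', 'YOUR_SECRET_ACCESS_KEY');
setenv('AWS_DEFAULT_REGION',    'us-east-1');                  % region of the bucket
ds = fileDatastore('s3://my-bucket/data', 'ReadFcn', @load);   % hypothetical bucket and reader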
5 years ago | 0
Answered
Too many .mat files when using Write Tall Table
As you've likely guessed, vertical concatenation of 1600 arrays results in at least 1600 files from tall/write. This is the case...
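A sketch of the behaviour being described (hypothetical paths): tall/write emits output files per underlying chunk of the tall array, so many input files lead to many output files.

ds = datastore('parts/*.csv');        % 1600 input files -> many underlying partitions
tt = tall(ds);
write('out/tall_table', tt);          % roughly one output file per partition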
5 years ago | 0
| Accepted
Answered
Generate a Datastore for binary data?
As of R2017b, MATLAB now allows you to write custom datastores. This will allow you to create datastores that handle large binar...
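A minimal sketch of such a custom datastore, assuming fixed-format '.bin' files of raw doubles (a hypothetical format; the real framework is described under "Develop Custom Datastore" in the documentation, whose example also implements copyElement for correct copying):

classdef BinaryDatastore < matlab.io.Datastore
    properties (Access = private)
        FileSet              % matlab.io.datastore.DsFileSet listing the files
        NumFilesRead = 0
    end
    methods
        function ds = BinaryDatastore(location)
            ds.FileSet = matlab.io.datastore.DsFileSet(location, ...
                'FileExtensions', '.bin');
            reset(ds);
        end
        function tf = hasdata(ds)
            tf = hasfile(ds.FileSet);
        end
        function [data, info] = read(ds)
            fileInfo = nextfile(ds.FileSet);
            fid = fopen(char(fileInfo.FileName), 'r');
            data = fread(fid, inf, 'double');       % assumes raw doubles
            fclose(fid);
            info = struct('FileName', fileInfo.FileName);
            ds.NumFilesRead = ds.NumFilesRead + 1;
        end
        function reset(ds)
            reset(ds.FileSet);
            ds.NumFilesRead = 0;
        end
    end
    methods (Hidden)
        function frac = progress(ds)
            frac = ds.NumFilesRead / ds.FileSet.NumFiles;
        end
    end
end

It can then be driven like any other datastore, e.g. ds = BinaryDatastore('/data/*.bin'); while hasdata(ds), chunk = read(ds); end.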
7 years ago | 2
Answered
How to check Hadoop and Matlab Integrated Properly or Not ???
To look at whether MATLAB is running on the Hadoop cluster correctly, your best bet is to look at the Hadoop/Yarn Web UI. By def...
7 years ago | 0
| Accepted
Answered
How to create tall datastore from multiple data parts?
A datastore can be created from a collection of folders and so the easiest way to achieve this is to place each block of data in...
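A sketch, with hypothetical folder names for the individual blocks of data:

ds = datastore({'output/block1', 'output/block2', 'output/block3'}, ...
    'IncludeSubfolders', true);    % one datastore over every block
tt = tall(ds);                     % a single tall array backed by all of them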
8 years ago | 0
Answered
Datastore for custom files
For files on Hadoop, the File Datastore makes a temporary copy of the file on the local drive to be read by the ReadFcn. The Rea...
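A sketch of how the ReadFcn side of this looks (hypothetical HDFS path and file format); the function receives a local path, which on Hadoop is the temporary copy described above:

ds = fileDatastore('hdfs:///user/me/data', 'ReadFcn', @readOneFile);

function data = readOneFile(filename)
    % filename is the (possibly temporary, local) path to a single file
    fid = fopen(filename, 'r');
    data = fread(fid, inf, 'uint8');
    fclose(fid);
end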
8 years ago | 0
| Accepted
Answered
How to solve ' All the input files must be Sequence files. Invalid file: '' '
This error message occurred because MATLAB could not find the output files generated by the Hadoop Job. For now, this error mess...
9 years ago | 1
| Accepted
Answered
How to solve" The user home directory '/homes/' for the cluster either did not exist or was not writable by the HADOOP user."
The '/homes/' folder is related to the Hadoop property 'mapreduce.admin.user.home.dir'. This is used by Hadoop to set the HOME e...
How to solve" The user home directory '/homes/' for the cluster either did not exist or was not writable by the HADOOP user."
The '/homes/' folder is related to the Hadoop property 'mapreduce.admin.user.home.dir'. This is used by Hadoop to set the HOME e...
9 years ago | 1
| Accepted
Answered
How to make a MAT-file that can be used to create a Datastore for MapReduce?
In R2014b there is currently not a direct way of creating a MAT file datastore. However, there are several indirect ways that wi...
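One such indirect route, as a sketch only (assuming the MAT-file contents fit in memory once, and not necessarily the route the answer goes on to describe), is to export the data to text and build the datastore from that:

S = load('mydata.mat');                      % hypothetical MAT-file of column vectors
writetable(struct2table(S), 'mydata.csv');   % one-off conversion to text
ds = datastore('mydata.csv');                % usable as an input to mapreduce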
10 years ago | 2
| Accepted
Answered
Mapreduce does not seem to use all available cores
In R2014b, there are some limitations with the minimum size of data that can be parallelized. To avoid this limitation, the inpu...
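As a sketch (hypothetical file pattern and map/reduce functions), attaching mapreduce to a pool and giving it several input files provides more chunks that can run in parallel:

mapreducer(gcp);                                 % execute map/reduce tasks on pool workers
ds  = tabularTextDatastore('data/part*.csv');    % several files -> several parallel chunks
out = mapreduce(ds, @myMapper, @myReducer);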
10 years ago | 0
| Accepted
Answered
MCR problems with linux/hadoop
This behavior arises because Hadoop defines the 'HOME' environment variable of task processes to be '/homes' by default. MATLAB ...
10 years ago | 0