Testing Guidelines for Custom Datastores
All datastores that are derived from the custom datastore classes share some common behaviors. This test procedure provides guidelines to test the minimal set of behaviors and functionalities that all custom datastores should have. You will need additional tests to qualify any unique functionalities of your custom datastore.
If you have developed your custom datastore based on instructions in Develop Custom Datastore, then follow these test procedures to qualify your custom datastore. First perform the unit tests, followed by the workflow tests:
Unit tests qualify the datastore constructor and methods.
Workflow tests qualify the datastore usage.
For all these test cases:
Unless specified in the test description, assume that you are testing a nonempty datastore ds.
Verify the test cases on the file extensions, file encodings, and data locations (such as Hadoop®) that your custom datastore is designed to support.
Unit Tests
Construction
The unit test guidelines for the datastore constructor are as follows.
| Test Case Description | Expected Output |
| --- | --- |
| Check if your custom datastore constructor works with the minimal required inputs. | Datastore object of your custom datastore type, with the minimal expected properties and methods |
| Check if your datastore object is derived from `matlab.io.Datastore`. Run this command: `isa(ds,'matlab.io.Datastore')` | `true` (logical `1`) |
| Call your custom datastore constructor with the required inputs and any supported input arguments and name-value pair arguments. | Datastore object of your custom datastore type, with the minimal expected properties and methods |
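As an illustration, the constructor checks above can be sketched as a short MATLAB script. `MyDatastore` and its input path are hypothetical placeholders for your own class and its required inputs:

```matlab
% MyDatastore and 'path/to/data' are hypothetical placeholders.
ds = MyDatastore('path/to/data');
assert(isa(ds,'matlab.io.Datastore'))  % must derive from matlab.io.Datastore
disp(properties(ds))                   % confirm the expected properties exist
disp(methods(ds))                      % confirm the expected methods exist
```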
read
Unit test guidelines for the read method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `read` method for the first time after creating the datastore: `t = read(ds);` | Data from the beginning of the datastore. If you specify read size, then the size of the returned data is equivalent to read size. |
| Call the `read` method again: `t = read(ds);` | Data starting from the end point of the previous `read` operation. If you specify read size, then the size of the returned data is equivalent to read size. |
| Continue calling the `read` method until no more data is available: `while hasdata(ds), t = read(ds); end` | No errors. Correct data in the correct format. |
| When data is available to read, check the `info` output: `[t,info] = read(ds);` | No error. The `info` output contains the expected additional information about the extracted data. |
| When no more data is available to read, call the `read` method. | Either the expected output or an error message, based on your custom datastore implementation. |
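A minimal sketch of these read checks, again assuming a hypothetical `MyDatastore` constructor:

```matlab
ds = MyDatastore('path/to/data');  % hypothetical constructor
t1 = read(ds);                     % data from the beginning of the datastore
t2 = read(ds);                     % data continuing after the previous read
while hasdata(ds)                  % drain the remaining data without errors
    t = read(ds);
end
reset(ds);
[t,info] = read(ds);               % the info output must also be populated
```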
readall
Unit test guidelines for the readall method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `readall` method on the datastore: `readall(ds)` | All data |
| Read from the datastore until no more data is available, and then call the `readall` method: `while hasdata(ds), t = read(ds); end` followed by `readall(ds)` | All data |
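One way to check that `readall` is independent of the current read position (a sketch; `MyDatastore` is a placeholder):

```matlab
ds = MyDatastore('path/to/data');       % hypothetical constructor
allData = readall(ds);                  % all data, regardless of position
while hasdata(ds), read(ds); end        % exhaust the datastore with read
assert(isequaln(readall(ds), allData))  % readall still returns all data
```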
hasdata
Unit test guidelines for the hasdata method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `hasdata` method right after creating the datastore: `hasdata(ds)` | `true` |
| Call the `hasdata` method after calling `reset`: `reset(ds); hasdata(ds)` | `true` |
| When more data is available to read, call the `hasdata` method. | `true` |
| When no more data is available to read, call the `hasdata` method. | `false` |
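These cases can be exercised together in one pass (a sketch; `MyDatastore` is a placeholder for a nonempty datastore):

```matlab
ds = MyDatastore('path/to/data');  % hypothetical, nonempty datastore
assert(hasdata(ds))                % true before any read
while hasdata(ds), read(ds); end   % read until the data is exhausted
assert(~hasdata(ds))               % false when no more data is available
reset(ds);
assert(hasdata(ds))                % true again after reset
```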
reset
Unit test guidelines for the reset method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `reset` method right after creating the datastore, and then call `read`. Verify that the returned data is from the beginning of the datastore: `reset(ds); t = read(ds);` | No errors. The `read` method returns data from the beginning of the datastore. If you specify read size, then the size of the returned data is equivalent to read size. |
| When more data is available to read (after a single call to `read`), call the `reset` method, and then call `read`. Verify that the returned data is from the beginning of the datastore. | No errors. The `read` method returns data from the beginning of the datastore. If you specify read size, then the size of the returned data is equivalent to read size. |
| When more data is available to read (after several calls to `read`), call the `reset` method, and then call `read`. Verify that the returned data is from the beginning of the datastore. | No errors. The `read` method returns data from the beginning of the datastore. If you specify read size, then the size of the returned data is equivalent to read size. |
| When no more data is available to read, call the `reset` method, and then call `read`. Verify that the returned data is from the beginning of the datastore. | No errors. The `read` method returns data from the beginning of the datastore. If you specify read size, then the size of the returned data is equivalent to read size. |
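A compact way to verify that `reset` rewinds the read position (a sketch with a hypothetical constructor):

```matlab
ds = MyDatastore('path/to/data');  % hypothetical constructor
first = read(ds);                  % remember the first chunk of data
read(ds);                          % advance the read position further
reset(ds);                         % rewind to the beginning
assert(isequaln(read(ds), first))  % read returns data from the beginning
```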
progress
Unit test guidelines for the progress method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `progress` method before making any calls to `read`: `progress(ds)` | `0` |
| Call the `readall` method, and then call the `progress` method: `readall(ds)` followed by `progress(ds)` | The progress value is unchanged by the `readall` call. |
| After reading some, but not all, of the data, call the `progress` method. | A fraction between `0` and `1` |
| When no more data is available to read, call the `progress` method. | `1` |
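The expected progression of `progress` values can be asserted directly (a sketch; assumes a hypothetical nonempty `MyDatastore`):

```matlab
ds = MyDatastore('path/to/data');  % hypothetical, nonempty datastore
assert(progress(ds) == 0)          % nothing read yet
read(ds);
p = progress(ds);
assert(p > 0 && p <= 1)            % fraction of data read so far
while hasdata(ds), read(ds); end
assert(progress(ds) == 1)          % all data has been read
```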
preview
Unit test guidelines for the preview method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `preview` method right after creating the datastore: `preview(ds)` | The `preview` method returns a small subset of data from the beginning of the datastore, in the same form as the output of `read`. |
| Call the `preview` method after a call to `read`: `read(ds); preview(ds)` | The `preview` output contains data from the beginning of the datastore and is unaffected by the read position. |
| Call the `preview` method after reading all available data: `while hasdata(ds), read(ds); end` followed by `preview(ds)` | The `preview` output still contains data from the beginning of the datastore. |
| Call the `preview` method after calling `readall`: `readall(ds); preview(ds)` | The `preview` output is unchanged. |
| Call the `preview` method after calling `reset`: `reset(ds); preview(ds)` | The `preview` output contains data from the beginning of the datastore. |
| Call the `preview` method twice in succession. | The `preview` method returns the same data both times. |
| Call the `read` method after calling `preview`: `preview(ds); t = read(ds);` | The `read` output is unaffected by the `preview` call. If you specify read size, then the size of the returned data is equivalent to read size. |
| While the datastore has data available to read, alternate calls to `preview` and `read`. | The `preview` calls do not affect the `hasdata` state or the sequence of data returned by `read`. |
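A sketch of the key invariant, that `preview` returns the same data regardless of the read position (hypothetical constructor):

```matlab
ds = MyDatastore('path/to/data');  % hypothetical constructor
p1 = preview(ds);                  % small subset from the beginning
read(ds);                          % move the read position forward
assert(isequaln(preview(ds), p1))  % preview is unaffected by read
reset(ds);
assert(isequaln(preview(ds), p1))  % ... and by reset
```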
partition
Unit test guidelines for the partition method.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `partition` method with a valid number of partitions `n` and a valid partition index, and then read from the partition. Verify that the partition is valid: `subds = partition(ds,n,index)`, `read(subds)`, `isequal(properties(ds),properties(subds))`, `isequal(methods(ds),methods(subds))` | The `partition` method returns a datastore object of the same type as `ds`. The returned partition contains a subset of the data. The partitioned datastore has the same properties and methods as `ds`, so both `isequal` commands return `true`. Calling `read` on the partition works without errors. If you specify read size, then the size of the returned data is equivalent to read size. |
| Call the `partition` method with one partition and index `1`. Verify the data returned by calling `read` and `preview` on the partition: `subds = partition(ds,1,1)`, `isequal(properties(ds),properties(subds))`, `isequal(methods(ds),methods(subds))`, `isequaln(read(subds),read(ds))`, `isequaln(preview(subds),preview(ds))` | The partition has the same properties and methods as `ds`. The partition contains the same data as `ds`, so all of the comparison commands return `true`. |
| Call the `partition` method on a partition of the datastore, for example: `subds2 = partition(subds,n2,index2)` | The repartitioning of a partition of the datastore works without errors. |
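A sketch of the partition checks; `MyDatastore` is a placeholder, and the datastore is assumed to inherit from `matlab.io.datastore.Partitionable`:

```matlab
ds = MyDatastore('path/to/data');  % hypothetical partitionable datastore
n = numpartitions(ds);             % default number of partitions
subds = partition(ds,n,1);         % first of n partitions
assert(isequal(properties(ds),properties(subds)))
assert(isequal(methods(ds),methods(subds)))
whole = partition(ds,1,1);         % one partition holds all the data
assert(isequaln(preview(whole), preview(ds)))
```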
initializeDatastore
If your datastore inherits from matlab.io.datastore.HadoopFileBased, then verify the behavior of initializeDatastore using the guidelines in this table.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `initializeDatastore` method with a valid `info` struct. For example, initialize the datastore with `info = struct('FileName','myFileName.ext','Offset',0,'Size',500)` followed by `initializeDatastore(ds,info)`. Verify the initialization by examining the properties of your datastore object: `ds` | The properties of the datastore object `ds` reflect the values specified in the `info` struct. |
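For example (the file name, offset, and size here are placeholders; use values valid for your Hadoop data):

```matlab
% Hypothetical split information for a Hadoop-based datastore.
info = struct('FileName','myFileName.ext','Offset',0,'Size',500);
initializeDatastore(ds,info)
disp(ds)   % the properties of ds should now reflect the info values
```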
getLocation
If your datastore inherits from matlab.io.datastore.HadoopFileBased, then verify the behavior of getLocation using these guidelines.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `getLocation` method: `location = getLocation(ds)`. Based on your custom datastore implementation, the `location` output is either a list of files or folders, or a `matlab.io.datastore.DsFileSet` object. If `location` is a `DsFileSet` object, then call the `resolve` method on it: `resolve(location)` | The `location` output contains the expected location of the data in Hadoop. The `resolve` method returns the file information (such as `FileName`, `Offset`, and `SplitSize`) for the `DsFileSet` object. |
isfullfile
If your datastore inherits from matlab.io.datastore.HadoopFileBased, then verify the behavior of isfullfile using these guidelines.
| Test Case Description | Expected Output |
| --- | --- |
| Call the `isfullfile` method: `tf = isfullfile(ds)` | Based on your custom datastore implementation, the `isfullfile` method returns `true` if your datastore reads entire files, or `false` if it reads portions (splits) of files. |
Workflow Tests
Verify your workflow tests in the appropriate environment.
If your datastore inherits only from matlab.io.Datastore, then verify all workflow tests in a local MATLAB® session.
If your datastore has parallel processing support (inherits from matlab.io.datastore.Partitionable), then verify your workflow tests in parallel execution environments, such as Parallel Computing Toolbox™ and MATLAB Parallel Server™.
If your datastore has Hadoop support (inherits from matlab.io.datastore.HadoopFileBased), then verify your workflow tests in a Hadoop cluster.
Tall Workflow
Testing guidelines for the tall workflow.
| Test Case Description | Expected Output |
| --- | --- |
| Create a tall array by calling the `tall` function on the datastore: `t = tall(ds)` | The `tall` function returns a tall array without errors. |
| For this test step, create a datastore object with data that fits in your system memory. Then, create a tall array using this datastore object: `t = tall(ds)`. If your data is numeric, then apply an appropriate function, like the `mean` function, to both the tall array and the in-memory data, and gather the result. If your data is of another data type, apply a function appropriate to that type. For examples, see Big Data Workflow Using Tall Arrays and Datastores (Parallel Computing Toolbox). | No errors. The function returns an output of the correct data type (not of a tall data type). The function returns the same result whether it is applied to the tall array or to the in-memory data. |
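A sketch of the comparison between the tall and in-memory results. It assumes tabular data with a numeric variable named `Var1` (a placeholder name):

```matlab
ds = MyDatastore('path/to/small_data');  % hypothetical; data fits in memory
t = tall(ds);                            % tall array backed by the datastore
inMem = readall(ds);                     % the same data, in memory
% Var1 is a placeholder for a numeric variable in your data.
tallResult = gather(mean(t.Var1));       % gather evaluates the tall result
assert(isequal(tallResult, mean(inMem.Var1)))
```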
MapReduce Workflow
Testing guidelines for the MapReduce workflow
| Test Case Description | Expected Output |
| --- | --- |
| Call the `mapreduce` function on the datastore: `outds = mapreduce(ds,@mapper,@reducer)`. To support the use of the `mapreduce` function, the `read` method of your custom datastore must return both the data and the `info` output arguments. | No error. The MapReduce operation returns the expected result. |
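A sketch of a MapReduce check with a simple row-counting mapper/reducer pair; `MyDatastore` and the data layout are hypothetical:

```matlab
ds = MyDatastore('path/to/data');          % hypothetical constructor
outds = mapreduce(ds, @mapper, @reducer);  % outds is a KeyValueDatastore
result = readall(outds)                    % inspect the MapReduce output

% Example mapper/reducer pair that counts rows; adapt to your data.
function mapper(data, info, intermKVStore)
    add(intermKVStore, 'rows', size(data,1));
end
function reducer(key, intermValIter, outKVStore)
    total = 0;
    while hasnext(intermValIter)
        total = total + getnext(intermValIter);
    end
    add(outKVStore, key, total);
end
```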
Next Steps
Note
This test procedure provides guidelines to test the minimal set of behaviors and functionalities for custom datastores. Additional tests are necessary to qualify any unique functionalities of your custom datastore.
After you complete the implementation and validation of your custom datastore, your custom datastore is ready to use.
To add help for your custom datastore implementation, see Create Help for Classes.
To share your custom datastore with other users, see Create and Share Toolboxes.
See Also
matlab.io.Datastore
| matlab.io.datastore.Partitionable
| matlab.io.datastore.HadoopLocationBased