matlab.io.datastore.PartitionableByIndex Class

Namespace: matlab.io.datastore

(Not recommended) Add parallelization support to datastore

matlab.io.datastore.PartitionableByIndex is not recommended. For more information, see Compatibility Considerations.

Description

matlab.io.datastore.PartitionableByIndex is an abstract mixin class that adds parallelization support to your custom datastore for use with Deep Learning Toolbox™. This class requires Parallel Computing Toolbox™.

To use this mixin class, you must inherit from the matlab.io.datastore.PartitionableByIndex class in addition to inheriting from the matlab.io.Datastore base class. Type the following syntax as the first line of your class definition file:

classdef MyDatastore < matlab.io.Datastore & ...
                       matlab.io.datastore.PartitionableByIndex
    ...
end

To add support for parallel processing to your custom datastore, you must:

  • Inherit from the additional class matlab.io.datastore.PartitionableByIndex

  • Define the additional method: partitionByIndex

For more details and steps to create your custom datastore with parallel processing support, see Develop Custom Mini-Batch Datastore.
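As a rough sketch only, a partitionByIndex implementation in such a class might look like the following. The class name MyDatastore and its Files property are hypothetical, and the other methods required by matlab.io.Datastore (read, hasdata, reset, and so on) are omitted.

function subds = partitionByIndex(myds,indices)
    % Return a copy of the datastore that contains only the
    % observations selected by indices.
    subds = copy(myds);                   % datastores are handle objects with copy support
    subds.Files = myds.Files(indices);    % hypothetical Files property, one file per observation
    reset(subds);
end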

Methods

partitionByIndex    (Not recommended) Partition datastore according to indices

Attributes

Abstract    true
Sealed      false

For information on class attributes, see Class Attributes.

Copy Semantics

Handle. To learn how handle classes affect copy operations, see Copying Objects.

Version History

Introduced in R2018a


R2019a: matlab.io.datastore.PartitionableByIndex is not recommended

Before R2018a, to perform custom image preprocessing for training deep learning networks, you had to specify a custom read function using the ReadFcn property of imageDatastore. However, reading files with a custom read function was slow because imageDatastore did not prefetch files.

In R2018a, four classes, including matlab.io.datastore.MiniBatchable and matlab.io.datastore.PartitionableByIndex, were introduced as a solution for performing custom image preprocessing with support for prefetching, shuffling, and parallel training. However, implementing a custom mini-batch datastore using matlab.io.datastore.MiniBatchable has several challenges and limitations:

  • In addition to specifying the preprocessing operations, you must also define properties and methods to support reading data in batches, reading data by index, and partitioning and shuffling data.

  • You must specify a value for the NumObservations property, but this value may be ill-defined or difficult to define in real-world applications.

  • Custom mini-batch datastores are not flexible enough to support common deep learning workflows, such as deployed workflows using GPU Coder™.

Starting in R2019a, datastores natively support prefetch, shuffling, and parallel training when reading batches of data. The transform function is the preferred way to perform custom data preprocessing, or transformations. The combine function is the preferred way to concatenate read data from multiple datastores, including transformed datastores. Concatenated data can serve as the network inputs and expected responses for training deep learning networks. The transform and combine functions have several advantages over matlab.io.datastore.MiniBatchable and matlab.io.datastore.PartitionableByIndex.

  • The functions enable data preprocessing and concatenation for all types of datastores, including imageDatastore.

  • The transform function only requires you to define the data processing pipeline.

  • When used on a deterministic datastore, the functions support tall data types and MapReduce.

  • The functions support deployed workflows.
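
As a rough illustration of this workflow, the following sketch preprocesses an image datastore with transform and pairs it with a datastore of responses using combine. The folder name, the trainingLabels variable, and the preprocessing function are hypothetical placeholders.

imds = imageDatastore('trainingImages');       % hypothetical folder of training images
ldsTrain = arrayDatastore(trainingLabels);     % hypothetical array of expected responses

% transform applies the preprocessing function to every read from imds.
tdsTrain = transform(imds,@(img) single(img)/255);

% combine pairs each preprocessed image with its response; the combined
% datastore can then serve as training data for functions such as trainNetwork.
cdsTrain = combine(tdsTrain,ldsTrain);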

Note

The recommended solution to transform data with basic image preprocessing operations, including resizing, rotation, and reflection, is augmentedImageDatastore. For more information, see Preprocess Images for Deep Learning.
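
For example, a typical augmentedImageDatastore setup might look like the following sketch; the output size, augmentation ranges, and the imds variable are placeholder assumptions.

% Define random augmentations to apply while training.
augmenter = imageDataAugmenter( ...
    'RandRotation',[-20 20], ...
    'RandXReflection',true);

% Resize each image to the network input size and apply the augmentations.
augimds = augmentedImageDatastore([224 224],imds, ...
    'DataAugmentation',augmenter);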

There are no plans to remove matlab.io.datastore.PartitionableByIndex at this time.