TFLiteModel
Description
A TFLiteModel object enables support for simulation and code generation for deep learning inference by using TensorFlow™ Lite models.
Use a TFLiteModel object with the predict function in your MATLAB® code to perform inference during MATLAB execution, in generated code, or in a MATLAB Function block in Simulink® models. For more information, see Prerequisites for Deep Learning with TensorFlow Lite Models.
To use this object, you must install the Deep Learning Toolbox Interface for TensorFlow Lite support package.
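For code generation and Simulink workflows, the TFLiteModel object is typically created inside an entry-point function. The following sketch shows one possible pattern under these assumptions: the model file name mobilenet_v2_1.0_224.tflite and the function name tflite_predict are placeholders, and the input in is already preprocessed to the size and type the model expects.

```matlab
function out = tflite_predict(in)
% Sketch of an entry-point function for code generation with a
% TensorFlow Lite model. The model file name is a placeholder.
persistent net;
if isempty(net)
    % Load the TensorFlow Lite model once and reuse it across calls.
    net = loadTFLiteModel('mobilenet_v2_1.0_224.tflite');
end
% Run inference on the preprocessed input.
out = net.predict(in);
end
```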
Creation
To create a TFLiteModel object from a pretrained TensorFlow Lite model file, use the loadTFLiteModel function.
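For example, a minimal creation sketch, assuming a pretrained model file named mobilenet_v2_1.0_224.tflite on the MATLAB path (the file name and the inspected settings are illustrative):

```matlab
% Load a pretrained TensorFlow Lite model into a TFLiteModel object.
net = loadTFLiteModel('mobilenet_v2_1.0_224.tflite');

% Display the object to inspect its properties, such as the input and
% output sizes reported by the model.
disp(net)

% Optionally adjust the number of threads used for inference.
net.NumThreads = 4;
```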
Properties
Object Functions
- predict: Compute deep learning network output for inference by using a TensorFlow Lite model
Examples
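A minimal inference sketch for MATLAB execution, assuming an image classification model file mobilenet_v2_1.0_224.tflite (a placeholder name) and the peppers.png image shipped with MATLAB; adjust the preprocessing to match the input your model expects:

```matlab
% Load the TensorFlow Lite model.
net = loadTFLiteModel('mobilenet_v2_1.0_224.tflite');

% Read a test image and resize it to the model's expected input size.
img = imread('peppers.png');
img = imresize(img, [224 224]);

% Convert to single precision and run inference with predict.
scores = predict(net, single(img));

% Report the index of the highest-scoring class.
[~, idx] = max(scores);
disp(idx)
```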
Extended Capabilities
Version History
Introduced in R2022a
See Also
Topics
- Deploy Pose Estimation Application Using TensorFlow Lite Model (TFLite) on Host and Raspberry Pi
- Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi
- Deploy Super Resolution Application That Uses TensorFlow Lite (TFLite) Model on Host and Raspberry Pi
- Prerequisites for Deep Learning with TensorFlow Lite Models