MATLAB to OpenVINO (Intel-Inference)

Version 1.0.0 (1.8 MB) by Kevin Chng
Deploy and optimise your trained model on Intel processors
221 downloads
Updated 18 Feb 2019

View License

Overview :

If you train your deep learning network in MATLAB, you can use OpenVINO to accelerate your solution on Intel®-based accelerators (CPUs, GPUs, FPGAs, and VPUs). This script does not compare OpenVINO against MATLAB's own deployment options (MATLAB Coder, HDL Coder); instead, it only gives you a rough idea of how to complete the MATLAB > OpenVINO workflow from a technical perspective.
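As a rough illustration of the MATLAB side of this workflow, the sketch below exports a network to ONNX with exportONNXNetwork (Deep Learning Toolbox, R2018a or later; it requires the ONNX Model Format converter support package). The choice of squeezenet and the file name are assumptions made for the example, not part of this submission; substitute your own trained network.

% Minimal sketch, assuming a pretrained SqueezeNet is available
% (it needs its own support package); replace it with your trained network.
net = squeezenet;

% Write the network to an ONNX file that OpenVINO's Model Optimizer can read.
exportONNXNetwork(net, 'squeezenet.onnx');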

Refer to the link below to understand OpenVINO:
https://software.intel.com/en-us/openvino-toolkit

Highlights :
Deep Learning and Prediction
How to export a deep learning model to ONNX format
How to deploy a simple classification application in OpenVINO R4 (third-party software) - see the sketch after this list
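The OpenVINO step itself is driven from the command line; as one possible sketch, the lines below call the Model Optimizer from MATLAB via system() to turn the exported ONNX file into OpenVINO IR (.xml/.bin). The mo.py path, Python command and flags shown are assumptions that depend on your OpenVINO R4 installation and are not defined by this submission; the resulting IR files are then passed to an Inference Engine classification sample outside MATLAB.

% Minimal sketch: convert the exported ONNX file to OpenVINO IR from within MATLAB.
% The mo.py location below is an assumed default for OpenVINO 2018 R4 on Windows;
% adjust the path and the python command to match your installation.
mo  = 'C:\Intel\computer_vision_sdk\deployment_tools\model_optimizer\mo.py';
cmd = sprintf('python "%s" --input_model squeezenet.onnx --output_dir ir --data_type FP32', mo);

% Run the Model Optimizer; a zero exit status means the IR (.xml/.bin) files were produced.
status = system(cmd);
if status ~= 0
    error('Model Optimizer failed; check the OpenVINO installation and the command above.');
end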

Product Focus :
MATLAB
Deep Learning Toolbox
OpenVINO R4 (third-party software)

Written on 28 January 2018

Cite As

Kevin Chng (2025). MATLAB to OpenVINO (Intel-Inference) (https://www.mathworks.com/matlabcentral/fileexchange/70330-matlab-to-openvino-intel-inteference), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2018b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Sequence and Numeric Feature Data Workflows

Version  Published  Release Notes
1.0.0