It is my understanding that you want to run inference in Python on a model trained in MATLAB.
You can export a trained neural network to an ONNX file using the 'exportONNXNetwork' function in MATLAB.
net = imagePretrainedNetwork("resnet50")
filename = "resnet50.onnx";
exportONNXNetwork(net,filename);
You can use 'onnxruntime' to load the ONNX file in a Python script and perform inference.
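Because the input and output tensor names of the exported model depend on how the network was defined in MATLAB, it can help to first inspect them with ONNX Runtime. Here is a minimal sketch, assuming the exported file is named 'resnet50.onnx' as in the export step above:
import onnxruntime
session = onnxruntime.InferenceSession("resnet50.onnx")
# List the model's input and output names, shapes, and types
for inp in session.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in session.get_outputs():
    print("output:", out.name, out.shape, out.type)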
The following Python implementation uses the 'onnxruntime', 'numpy', and 'cv2' libraries to load an image and run inference with the ResNet-50 model from the exported ONNX file.
!pip install onnx onnxruntime opencv-python numpy
import onnxruntime
import numpy as np
import cv2
import time
# Load the ONNX model
model_path = '/content/resnet50.onnx'
onnxModel = onnxruntime.InferenceSession(model_path)
# Load and preprocess the image
image_path = '/content/sample_traffic.png'
image = cv2.imread(image_path)
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)  # OpenCV reads images as BGR; ResNet-50 expects RGB
image = cv2.resize(image, (224, 224))
image = np.expand_dims(image, axis=0)
image = image.astype(np.float32)
image = np.transpose(image, (0, 3, 1, 2))  # NHWC -> NCHW, the layout of the exported model's input
# Note: depending on how the network was exported, additional normalization
# (for example mean subtraction) may be required here.
# Run inference (query the input name instead of hard-coding it)
input_name = onnxModel.get_inputs()[0].name
start = time.time()
output = onnxModel.run(None, {input_name: image})
end = time.time()
print('Inference time: ', end - start)
# Print the output
print(output)
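To map the raw output to a prediction, you can take the index of the highest score. This is a minimal sketch; it assumes the exported network ends in a softmax layer (as the pretrained ResNet-50 does), and that the class names are obtained separately, for example from the second output of 'imagePretrainedNetwork' in MATLAB:
# Map the raw output to a class index
scores = np.asarray(output[0])[0]   # scores for the single image in the batch
top1 = int(np.argmax(scores))       # index of the highest-scoring class
print('Top-1 class index:', top1, 'score:', float(scores[top1]))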
Refer to the following MathWorks documentation to learn more about training and exporting neural networks in MATLAB:
- https://www.mathworks.com/help/deeplearning/ref/exportonnxnetwork.html
- https://www.mathworks.com/help/deeplearning/ug/deep-learning-in-matlab.html
- https://www.mathworks.com/help/deeplearning/ref/trainnet.html
Additionally, you can refer to the following documentation to learn more about ONNX Runtime:
- https://onnxruntime.ai/docs/