
exportONNXNetwork/importONNXNetwork functions not working correctly

I'm having trouble exporting to ONNX. As a test case, I tried the example at https://www.mathworks.com/help/deeplearning/ug/classify-sequence-data-using-lstm-networks.html. I added a section between the training and testing blocks and got a data format error on import.
exportONNXNetwork doesn't raise any issues, but importONNXNetwork does.
If I import with the following option set,
importONNXNetwork('onnxEx1.onnx','OutputDataFormats','TBC')
it works fine. All is well, right? Except that onnxruntime in other environments doesn't work: trying to load the ONNX file with onnxruntime in another environment (Python, Java) results in a 'ShapeInferenceError' complaining that the tensor must have rank 3.
It must be that exportONNXNetwork is failing to define something that is required. Are there any known workarounds for this issue? I've tried this locally with R2022b and on MATLAB Online, which uses R2023a. Both give me the same issue.
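For reference, a minimal sketch of the export/import round trip I'm doing (assuming net is the trained LSTM from the linked example; the file name onnxEx1.onnx matches the call above):

```matlab
% Export the trained LSTM from the sequence-classification example.
exportONNXNetwork(net, "onnxEx1.onnx");

% Importing with no options fails with a data format error, but
% specifying the output data format works (the documented name-value
% argument is 'OutputDataFormats'):
netImported = importONNXNetwork("onnxEx1.onnx", "OutputDataFormats", "TBC");
```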
  1 Comment
Sivylla Paraskevopoulou on 4 Apr 2023
Does it help to specify the BatchSize name-value argument when you use the exportONNXNetwork function?
Also, if you are planning to use the exported network in TensorFlow, you can export directly to TensorFlow with the exportNetworkToTensorFlow function, without first converting to ONNX format.
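For example, fixing the batch dimension at export time might look like this (BatchSize is a documented name-value argument of exportONNXNetwork; the value 1 here is just an illustration):

```matlab
% Export with a fixed batch size so the model's input tensor has a
% static batch dimension; this can help ONNX shape inference succeed
% in other runtimes such as onnxruntime.
exportONNXNetwork(net, "onnxEx1.onnx", "BatchSize", 1);
```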


Answers (0)

Release

R2023a
