ST Edge AI Developer Tool: Onnx IR version error, Matlab
Hello everyone,
I am trying to use the ST Edge AI Developer Tool, and I am encountering the following error when I try to quantize my model (an RNN with an input layer, 3 GRU layers, and a fully connected layer):
2024-12-11 07:25:35.451880: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2024-12-11 07:25:35.479788: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
ONNX Runtime Model Optimization Failed! Consider rerun with option `--skip_optimization'.
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/onnxruntime/quantization/shape_inference.py", line 91, in quant_pre_process
    _ = onnxruntime.InferenceSession(input_model_path, sess_option, providers=["CPUExecutionProvider"])
  File "/usr/local/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 384, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /tmp/quantization-service/3e006691-c53a-4aca-bf0e-804efdb92544/RNN_Final_FT.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model.cc:147 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&, const onnxruntime::ModelOptions&) Unsupported model IR version: 9, max supported IR version: 8
My network is exported from MATLAB R2024b as an ONNX file. The key part of the error seems to be the last line: the tool's ONNX Runtime only supports IR version 8, while MATLAB exported the model with IR version 9. Does anyone know how to fix this error?
Thank you in advance!
Silvia
