Associate III
April 11, 2025
Solved

Where can I find information about which layers X-Cube-AI supports?

  • April 11, 2025
  • 2 replies
  • 386 views

I am trying to deploy a model onto an STM32N6570-DK board, and I encountered warnings that X-Cube-AI does not support unsigned int weights, nor certain layers such as linear_dynamic and ATen. I am wondering where I can find information about which layers and operations X-Cube-AI actually supports, so that I can fix my models.


2 replies

hamitiya (Best answer)
ST Employee
April 11, 2025

Hello,

You can find more information here:

ST Neural-ART NPU - Supported operators and limitations


Best regards,

Yanis

Z-YF (Author)
Associate III
April 11, 2025

Hi,

Thanks for the response and your help. It turns out that I have to modify the quantization program again, since torch always skips certain layers instead of converting them to an int8 type.

Thanks again for your time and suggestions. :)
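For anyone hitting the same "skipped layers" behavior: PyTorch's quantization APIs only convert the module types you explicitly list, and silently leave everything else in float, which then shows up as unsupported-layer warnings in X-Cube-AI. Below is a minimal sketch illustrating this with dynamic quantization; `TinyNet` and its layer names are made up for the example, and the original poster's actual model and quantization flow may differ (static quantization with calibration is typically needed for NPU deployment).

```python
import torch
import torch.nn as nn

# A toy model; the layer names here are illustrative only.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 8)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(self.relu(self.fc1(x)))

model = TinyNet().eval()

# Only the module types listed in the set are converted to int8;
# any layer type not in the set is left as-is in float. This is
# why some layers appear to be "skipped" during quantization.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# fc1/fc2 now hold qint8 weights, while relu is untouched.
print(quantized.fc1.weight().dtype)
print(type(quantized.relu).__name__)
```

Inspecting the converted model like this is a quick way to check which layers actually ended up in int8 before exporting the model for X-Cube-AI.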