
Keras import "NOT IMPLEMENTED: Unsupported layer type: ReLU"

ACapo
Associate II

Dear Community,

I am trying to import a standard MobileNet model from Keras.

Unfortunately, I am getting this error while loading the model inside CubeMX.AI:

 [AI:Generator] Command ***/STM32Cube/Repository//Packs/STMicroelectronics/X-CUBE-AI/3.4.0/Utilities/windows/generatecode --auto -c C:/Users/ALESSA~1/AppData/Local/Temp/mxAI256000625112004454394714515898287/config.ai
  [AI:Generator] NOT IMPLEMENTED: Unsupported layer type: ReLU
  [AI:Generator] Python generation ended
  [AI:Generator] Invalid network

According to the user manual UM2526 (page 46, https://www.st.com/resource/en/user_manual/dm00570145.pdf), this should be supported:

Activation: nonlinear activation layer, decoded also when part of Conv2D, DepthwiseConv2D, SeparableConv2D, or Dense. The following attributes are supported:

– nonlinearity: type of nonlinear activation; the following functions are supported: linear, relu, relu6, softmax, tanh, sigmoid, and hard_sigmoid.
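
For reference, here is a minimal sketch of the different forms (assuming Keras 2.x layer names); my model uses the last one:

from keras.layers import Conv2D, Activation, ReLU

conv_fused = Conv2D(32, (3, 3), activation='relu')  # decoded as part of Conv2D
act_layer = Activation('relu')                      # Activation layer, listed as supported
relu_layer = ReLU()                                 # standalone ReLU layer type, the one being rejected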

Does anyone have any insight into this problem?

Thanks,

Alessandro


6 REPLIES
fauvarque.daniel
ST Employee

Are you on the latest X-Cube-AI 3.4.0 or on 3.3.0? Some layer implementation bugs have been fixed in X-Cube-AI 3.4.0.

Regards

Daniel


ACapo
Associate II

Dear @fauvarque.daniel,

unfortunately, I am already using 3.4.0.

Alessandro

Romain LE DONGE
Associate III

Hi Alessandro,

I found a solution to your problem.

I suppose you used the "ReLU" layer from Keras.

I suggest replacing that layer with "Activation('relu')", which applies the relu function through a layer of type Activation.
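
A minimal sketch of what I mean, assuming a Sequential model (the layers below are only illustrative):

from keras.models import Sequential
from keras.layers import Conv2D, Activation, Flatten, Dense

model = Sequential([
    Conv2D(32, (3, 3), input_shape=(96, 96, 3)),
    # instead of keras.layers.ReLU(), express the nonlinearity as
    # an Activation layer, which UM2526 lists as supported
    Activation('relu'),
    Flatten(),
    Dense(10, activation='softmax'),
])
model.save('model_for_cubeai.h5')  # hypothetical output name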

Romain


Dear Romain,

thanks, I think your solution could work! I will try it soon.

Best,

Alessandro

EShai
Associate

But what about models that have already been trained with the "ReLU" layer? Will CubeMX.AI be fixed to handle "ReLU" properly?
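
In the meantime, one possible workaround might be to rebuild the trained model with the ReLU layers swapped out and copy the weights across; ReLU has no trainable weights, so no retraining should be needed. A rough sketch, assuming a tf.keras version whose clone_model accepts a clone_function argument (file names here are illustrative):

import tensorflow as tf

def swap_relu(layer):
    if isinstance(layer, tf.keras.layers.ReLU):
        # preserve a relu6 cap (MobileNet uses ReLU(max_value=6));
        # note that a callable activation needs custom_objects when
        # the saved model is loaded again
        fn = tf.nn.relu6 if layer.max_value == 6 else 'relu'
        return tf.keras.layers.Activation(fn, name=layer.name)
    # all other layers are recreated unchanged
    return layer.__class__.from_config(layer.get_config())

trained = tf.keras.models.load_model('my_trained_model.h5')
converted = tf.keras.models.clone_model(trained, clone_function=swap_relu)
converted.set_weights(trained.get_weights())  # weight lists match, ReLU holds none
converted.save('my_trained_model_no_relu_layer.h5')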