2019-03-15 07:48 AM
Dear Community,
I am trying to import a standard MobileNet model from Keras.
Unfortunately, I get this error while loading the model in CubeMX.AI:
[AI:Generator] Command ***/STM32Cube/Repository//Packs/STMicroelectronics/X-CUBE-AI/3.4.0/Utilities/windows/generatecode --auto -c C:/Users/ALESSA~1/AppData/Local/Temp/mxAI256000625112004454394714515898287/config.ai
[AI:Generator] NOT IMPLEMENTED: Unsupported layer type: ReLU
[AI:Generator] Python generation ended
[AI:Generator] Invalid network
According to the user manual UM2526 (page 46, https://www.st.com/resource/en/user_manual/dm00570145.pdf), it should be supported:
Activation: nonlinear activation layer, decoded also when part of Conv2D, DepthwiseConv2D, SeparableConv2D, or Dense. The following attributes are supported:
– nonlinearity: type of nonlinear activation; the following functions are supported: linear, relu, relu6, softmax, tanh, sigmoid, and hard_sigmoid.
Does anyone have any insight into this problem?
Thanks,
Alessandro
Solved! Go to Solution.
2019-03-15 07:53 AM
Are you on the latest X-Cube-AI 3.4.0 or still on 3.3.0? Some layer-implementation bugs have been fixed in X-Cube-AI 3.4.0.
Regards
Daniel
2019-03-15 07:55 AM
Dear @fauvarque.daniel,
unfortunately, I am already using 3.4.0.
Alessandro
2019-03-15 08:21 AM
Hi Alessandro,
I found a solution to your problem.
I suppose you have used Keras's "ReLU" layer.
I suggest replacing that layer with "Activation('relu')", which applies the relu function through a layer of type Activation.
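A minimal sketch of what I mean, assuming the tf.keras API (the same change applies to standalone Keras):

```python
# Sketch: use Activation('relu') instead of the standalone ReLU layer,
# which X-CUBE-AI 3.4.0 reports as an unsupported layer type.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(8, 3, input_shape=(32, 32, 3)),
    # layers.ReLU()  <- rejected by the X-CUBE-AI importer
    layers.Activation('relu'),  # same function, imported as an Activation layer
    layers.Flatten(),
    layers.Dense(10, activation='softmax'),
])
```

Both layers compute the same function, so the change does not affect the network's behavior.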
Romain
2019-03-18 01:44 AM
Dear Romain,
thanks, I think your solution could work! I will try it soon.
Best,
Alessandro
2019-03-30 09:41 AM
But what about models that have already been trained with the "ReLU" layer? Will CubeMX.AI be fixed to work with "ReLU" directly?
2019-04-01 06:51 AM
Here you can find a notebook I created to perform this operation:
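A rough sketch of that kind of conversion (not the notebook itself, and assuming the tf.keras API): clone the trained model while swapping each standalone ReLU layer for an equivalent Activation layer, then copy the trained weights across.

```python
# Sketch: convert an already-trained model by replacing every standalone
# ReLU layer with Activation('relu'), preserving the trained weights.
import tensorflow as tf
from tensorflow.keras import layers, models

def replace_relu(layer):
    # Swap a standalone ReLU layer for an equivalent Activation layer;
    # clone every other layer unchanged from its config.
    if isinstance(layer, layers.ReLU):
        return layers.Activation('relu', name=layer.name)
    return layer.__class__.from_config(layer.get_config())

# Small stand-in for a trained model with a ReLU layer.
original = models.Sequential([
    layers.Conv2D(4, 3, input_shape=(8, 8, 1)),
    layers.ReLU(),
    layers.Flatten(),
    layers.Dense(2),
])

converted = tf.keras.models.clone_model(original, clone_function=replace_relu)
converted.set_weights(original.get_weights())  # ReLU/Activation carry no weights
```

Since neither ReLU (with default arguments) nor Activation('relu') has weights, the two models compute identical outputs; the converted model can then be exported and imported into CubeMX.AI.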