If you're talking about Keras models, I ran into this issue in this discussion: https://community.st.com/s/question/0D50X0000AVUpc1SQD/keras-import-not-implemented-unsupported-layer-type-relu
Basically, you need to convert each ReLU layer into an Activation('relu') layer.
Here is the function from a notebook I created that performs this operation:
# ReLU Converter Function
#
# According to the discussion [1], keras.layers.ReLU layers are not usable
# directly in ST CubeMX.AI. This function rebuilds the model, replacing each
# ReLU layer with an equivalent Activation('relu') layer, which uses the
# supported keras.activations.relu.
#
# Notes:
# - This assumes a sequential (single-branch) topology, since the graph is
#   rebuilt by chaining the layers in order.
# - A plain 'relu' drops any non-default ReLU parameters (max_value,
#   threshold); see the sketch below for one way to preserve them.
#
# [1] https://community.st.com/s/question/0D50X0000AVUpc1SQD/keras-import-not-implemented-unsupported-layer-type-relu

from keras.models import Model
from keras.layers import Activation, ReLU

def convert_to_cubemx_ai(model):
    layers = list(model.layers)
    x = layers[0].output
    for i in range(1, len(layers)):
        if isinstance(layers[i], ReLU):
            # Replace the unsupported ReLU layer with a plain Activation
            # layer, keeping the original layer's name.
            x = Activation('relu', name=layers[i].name)(x)
        else:
            # Reuse the original layer (and its weights) unchanged.
            x = layers[i](x)
    return Model(inputs=layers[0].input, outputs=x)
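A quick usage sketch (the file names here are illustrative, not from the notebook; the weights carry over automatically because the original layers are reused):

from keras.models import load_model

model = load_model('model.h5')         # your trained Keras model
converted = convert_to_cubemx_ai(model)
converted.summary()                    # ReLU layers now appear as Activation
converted.save('model_cubemxai.h5')    # import this file in CubeMX.AI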
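If your ReLU layers use non-default parameters (for example max_value=6 for ReLU6), substituting a plain 'relu' changes the computation. Here is a minimal sketch of how the lambda-based approach I tried in the notebook could be completed, wrapping keras.activations.relu with the original layer's configuration (the helper name make_relu_activation is mine, just for illustration):

import keras
from keras.layers import Activation

def make_relu_activation(relu_layer):
    # Read max_value / threshold from the original ReLU layer's config.
    cfg = relu_layer.get_config()
    max_value = cfg.get("max_value")
    threshold = cfg.get("threshold", 0.0)

    def clipped_relu(x):
        return keras.activations.relu(x, max_value=max_value,
                                      threshold=threshold)

    # Caveat: a model containing this custom activation must be loaded with
    # custom_objects, and the CubeMX.AI importer may still reject it.
    return Activation(clipped_relu, name=relu_layer.name)

I haven't verified that the importer accepts such a custom activation, so the plain 'relu' substitution remains the safe path.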
Alessandro