
But what about models that have already been trained with the "ReLU" layer? Will CubeMX.AI be fixed to work with "ReLU" directly?

EShai
Associate
 
1 REPLY
ACapo
Associate II

If you are talking about Keras models, I faced this issue in this discussion: https://community.st.com/s/question/0D50X0000AVUpc1SQD/keras-import-not-implemented-unsupported-layer-type-relu

Basically, you need to convert each ReLU layer into an Activation('relu') layer.

Here is a notebook I created to perform this conversion:

https://github.com/alessandrocapotondi/mobilenet_stmcube_ai/blob/master/keras_model/keras_mobilenet_stm_cubemx_ai.ipynb

# ReLU Converter Function
# According to the discussion [1], 'keras.layers.ReLU' layers are not
# supported directly by ST CubeMX.AI. This function removes such layers
# and substitutes the supported 'keras.activations.relu' (wrapped in an
# Activation layer), keeping the original layer names.
# [1] https://community.st.com/s/question/0D50X0000AVUpc1SQD/keras-import-not-implemented-unsupported-layer-type-relu
from keras.layers import Activation, ReLU
from keras.models import Model

def convert_to_cubemx_ai(model):
    layers = model.layers

    x = layers[0].output
    for layer in layers[1:]:
        if isinstance(layer, ReLU):
            # Replace the ReLU layer with an equivalent Activation layer,
            # passing name= in the constructor (assigning to .name after
            # construction fails on recent Keras versions). Reusing the
            # original name keeps the weight mapping intact.
            # Note: this drops any max_value/threshold configured on the
            # ReLU layer (e.g. ReLU6); those cases need a custom activation.
            x = Activation('relu', name=layer.name)(x)
        else:
            x = layer(x)

    # Works for linear (sequential) topologies, where each layer feeds
    # the next one.
    return Model(inputs=layers[0].input, outputs=x)
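For reference, the rebuild pattern used above (walk the layers, substitute the unsupported ones, re-chain everything else) can be illustrated without Keras installed. The tiny classes below are stand-ins for Keras layers, not the real Keras API; they only trace which layer types end up in the rebuilt chain:

```python
# Keras-free sketch of the rebuild-and-substitute pattern.
# Layer, ReLU, and Activation here are stand-ins for the Keras classes,
# used only to show the control flow of convert_to_cubemx_ai().

class Layer:
    def __init__(self, name):
        self.name = name

    def __call__(self, x):
        # Record (layer type, layer name) instead of computing tensors.
        return x + [(type(self).__name__, self.name)]

class ReLU(Layer):
    pass

class Activation(Layer):
    def __init__(self, fn, name):
        super().__init__(name)
        self.fn = fn

def rebuild(layers):
    """Re-chain layers, replacing every ReLU with an Activation('relu')."""
    x = []
    for layer in layers:
        if isinstance(layer, ReLU):
            # Substitute, reusing the original layer name.
            x = Activation('relu', name=layer.name)(x)
        else:
            x = layer(x)
    return x

trace = rebuild([Layer('input'), ReLU('re_lu_1'), Layer('dense_1')])
# The ReLU is replaced by an Activation while its name is preserved.
```

The same idea carries over to the real function: the topology and layer names are unchanged, only the unsupported layer class is swapped.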

Alessandro