
Is the SoftMax function supported by CubeMX-AI?

HAlzo
Senior

I am building a CNN model in TensorFlow, and then I generate C code for an STM32 microcontroller using CubeMX-AI.

Here's my model

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32,1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(32, activation='relu'))
model.add(layers.Dense(4))

I tested it on the microcontroller and got good results.

Here is the output for the 4 classes when the input belongs to the 3rd class:

Output =-9.430237
Output =-7.165620
Output =1.203057
Output =-10.133130
 

But in order to get probabilities, I added a SoftMax activation to the last layer. Here's the new model:

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32,1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(32, activation='relu'))
model.add(layers.Dense(4, activation='softmax'))
 

After testing the modified model, the results are OK for 3 classes (1st, 2nd, and 4th), but I get wrong results for the 3rd class.

Here is the output for the 4 classes when using an input of the 3rd class:

Output =0.523477 
Output =0.476523 
Output =0.000000 
Output =0.000000

According to my understanding, SoftMax is just exp(out_i) / sum_j(exp(out_j)), so the results should be:

(image: expected SoftMax probabilities computed from the logits above)

Am I doing something wrong, or do I misunderstand the SoftMax function?

Or is there something wrong in my TensorFlow model?

jean-michel.d
ST Employee

Hi HAlzo,

Your understanding of Softmax is correct, and X-CUBE-AI normally implements it correctly. Can you tell me how you generated the results (with and without softmax)?

Is it an STM32 project generated through STM32CubeIDE? Is the project gcc-based?

Have you generated the project with the option: -mfloat-abi=soft?

br,

Jean-Michel

HAlzo
Senior

Hi Jean-Michel,

I am using STM32H734, so my FPU options are "-mfloat-abi=hard -mfpu=fpv5-d16"

Yes, my project is generated using STM32CubeMX v5.6.0 with X-CUBE-AI version 5.0.0.

Yes, the project is gcc-based; I am using the SW4STM32 IDE.

When SoftMax is included in my .h5 model, I just perform the inference with the input data and print the output data:

        res = aiRun(in_data /* object */, out_data);
        if (res) {
            printf("Error %d\n", res);
            return;
        }

        /* Post-process: print the output buffer */
        for (int i = 0; i < 4; i++) {
            printf("Output =%f \n", out_data[i]);
        }

And when SoftMax is not included in the model, I perform the inference, then apply softmax to the output in my own code and print the results:

        res = aiRun(in_data /* object */, out_data);
        if (res) {
            printf("Error %d\n", res);
            return;
        }

        /* Post-process: apply softmax to the output buffer (needs <math.h>) */
        ai_double softmax[4], sum = 0.0, max_prob = 0.0;
        ai_i8 max_index = 0;

        for (int i = 0; i < 4; i++) {   /* exponentiate the outputs */
            softmax[i] = exp(out_data[i]);
            sum += softmax[i];
        }
        for (int i = 0; i < 4; i++) {   /* normalize the results */
            softmax[i] = softmax[i] / sum;
        }
        for (int i = 0; i < 4; i++) {   /* find the max */
            if (softmax[i] > max_prob) {
                max_prob = softmax[i];
                max_index = i;
            }
        }

        printf("pattern %d, with prob %f\n", max_index + 1, max_prob);

br,

Hossam Alzomor