No source available for "ai_platform_inputs_get()" (X-CUBE-AI version 7.1.0)

I attach a basic code snippet which might be helpful in recreating the problem.

rickforescue
Associate III
/* USER CODE BEGIN PV */
static ai_buffer *ai_input;
static ai_buffer *ai_output;
/* USER CODE END PV */
 
int main(void){
...
	// Chunk of memory used to hold the intermediate values for the neural network
	AI_ALIGNED(32) ai_u32 activations[AI_NETWORK_DATA_ACTIVATIONS_SIZE];
 
	// Buffers used to store input and output tensors
	AI_ALIGNED(32) ai_i32 in_data[AI_NETWORK_IN_1_SIZE_BYTES];
	AI_ALIGNED(32) ai_i32 out_data[AI_NETWORK_OUT_1_SIZE_BYTES];
 
	// Pointer to our model
	ai_handle my_model = AI_HANDLE_NULL;
 
	// Structs that point to our input and output data and some metadata
	ai_input = ai_network_inputs_get(my_model, NULL);
	ai_output = ai_network_outputs_get(my_model, NULL);
 
....
}

As seen above, the function ai_network_inputs_get(ai_handle network, ai_u16 *n_buffer) resides in the generated network.c file and forwards the call to ai_platform_inputs_get(network, n_buffer), which is declared in ai_platform_interface.h.
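
For reference, here is a minimal sketch of that call chain, paraphrased from the signatures quoted above rather than copied verbatim from the generated files (the outputs path is analogous):

    /* network.c / network.h (generated): thin wrapper that forwards to the runtime */
    ai_buffer* ai_network_inputs_get(ai_handle network, ai_u16 *n_buffer);

    /* ai_platform_interface.h: runtime-side declaration; the implementation lives in the
     * precompiled network runtime library, which is presumably why the debugger reports
     * "no source available" when execution stops inside it */
    ai_buffer* ai_platform_inputs_get(ai_handle network, ai_u16 *n_buffer);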

It doesn't report any errors, but the code breaks at runtime while getting the ai_buffer for the input of the model.

I don't know what could possibly be going wrong. Should I explicitly implement these functions in main()? There is some literature and there are videos on the web, but they are old.

Is there anyone who has faced similar problems or knows a solution to this?

Thanks in advance

1 REPLY
rickforescue
Associate III

Alright, I think I found the solution to the problem.

The problem is that the AI model has to be created (and initialized) before calling ai_network_inputs_get.

So the solution looks like this:

    /* USER CODE BEGIN PV */
    static ai_buffer *ai_input;
    static ai_buffer *ai_output;
    /* USER CODE END PV */
     
    int main(void){
    ...
        // Chunk of memory used to hold the intermediate values for the neural network
        AI_ALIGNED(32) ai_u32 activations[AI_NETWORK_DATA_ACTIVATIONS_SIZE];

        // Buffers used to store input and output tensors
        AI_ALIGNED(32) ai_i32 in_data[AI_NETWORK_IN_1_SIZE_BYTES];
        AI_ALIGNED(32) ai_i32 out_data[AI_NETWORK_OUT_1_SIZE_BYTES];

        // Pointer to our model
        ai_handle my_model = AI_HANDLE_NULL;

        ...
        /* USER CODE BEGIN 2 */
        ...

        ai_network_params ai_params =
        {
            {
                AI_NETWORK_DATA_WEIGHTS(ai_network_data_weights_get()),
                AI_NETWORK_DATA_ACTIVATIONS(activations)
            }
        };

        // Create instance of the neural network
        ai_err = ai_network_create(&my_model, AI_NETWORK_DATA_CONFIG);
        if (ai_err.type != AI_ERROR_NONE)
        {
            buf_len = sprintf(buf, "Error: Could not create NN instance\r\n");
            HAL_UART_Transmit(&huart3, (uint8_t *)buf, buf_len, 100);
            while(1);
        }

        // Initialize neural network. At this point we are ready to make inferences
        if (!ai_network_init(my_model, &ai_params))
        {
            buf_len = sprintf(buf, "Error: could not initialize NN\r\n");
            HAL_UART_Transmit(&huart3, (uint8_t *)buf, buf_len, 100);
            while(1);
        }

        // Structs that point to our input and output data and some metadata
        ai_input = ai_network_inputs_get(my_model, NULL);
        ai_output = ai_network_outputs_get(my_model, NULL);

    ...
    }
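
As a follow-up, here is a minimal sketch of what typically comes next once ai_input and ai_output are valid: attach the user buffers and run one inference. It reuses the names from the snippet above (in_data, out_data, my_model, ai_err, buf, buf_len, huart3); ai_network_run() and ai_network_get_error() come from the generated network.h, but treat the details below as an illustration under those assumptions rather than a drop-in implementation.

    // Attach the user I/O buffers to the tensor descriptors returned above
    ai_input[0].data  = AI_HANDLE_PTR(in_data);
    ai_output[0].data = AI_HANDLE_PTR(out_data);

    // ... fill in_data[] with pre-processed input samples here ...

    // Run one inference; the return value is the number of batches processed
    if (ai_network_run(my_model, ai_input, ai_output) != 1)
    {
        ai_err = ai_network_get_error(my_model);
        buf_len = sprintf(buf, "Error: inference failed (type %d, code %d)\r\n",
                          (int)ai_err.type, (int)ai_err.code);
        HAL_UART_Transmit(&huart3, (uint8_t *)buf, buf_len, 100);
    }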