
Big data array problem in the STM32F407.

Question asked by CS on Oct 3, 2013
Latest reply on Nov 12, 2013 by CS


Hi Folks 


We are porting a complex algorithm from the PC to the STM32F407, and we are testing it on a large test file (15000 items long) which I have included in a .h file in the form:


const int testdataA[15000] = { 16985337,16985390,16985531,16985397,16985364,16985320,16985257....... 
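For reference, here is a minimal sketch of how a large constant table like this is typically declared so the linker can place it in flash (.rodata) rather than scarce SRAM. The table below is a shortened illustrative stand-in, not the real test data:

    #include <stdio.h>

    /* Illustrative shortened table. 'const' at file scope lets the
     * linker place it in read-only flash instead of RAM. */
    const int testTable[8] = { 16985337, 16985390, 16985531, 16985397,
                               16985364, 16985320, 16985257, 16985200 };

    int main(void)
    {
        /* sizeof reports the footprint the table needs in flash. */
        printf("%zu bytes\n", sizeof(testTable));
        return 0;
    }

On the F407 the full 15000-entry int array needs 60000 bytes, which fits comfortably in the 1 MB of on-chip flash but not necessarily in the 128 KB of main SRAM, so keeping the const qualifier matters.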


I have this main loop that feeds data to the algorithm:


    for (long ix = 0; ix <= 14999; ix++) // size of the data array
    {
        if (j < 256)
        {
            int index = ix;
            res[j] = testdataA[index];
            j++; // increment the buffer index
        }
        else
        {
            int index = ix;
            res[j] = testdataA[index];
            // OK, now we have a full array of data
            j = 0; // reset the array counter
            IOCresult[k] = Form1::btc(&res[0]); // let the algorithm do its stuff
            k++; // the next cell in the results array
        }
    }
     
Note how I have had to substitute 'index' for 'ix', as after 256 samples the call to testdataA corrupts the index into that data.
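A minimal, self-contained PC version of the same chunking loop can help narrow this down. The sketch below assumes res holds exactly 256 ints (as the j < 256 test suggests) and substitutes a hypothetical btc() for Form1::btc(); the assert catches any write past the end of res. If res really is 256 ints, the original else branch writes res[j] while j == 256, i.e. one element past the buffer, and stack neighbours such as the loop index are exactly what such an overrun tends to corrupt:

    #include <assert.h>
    #include <stdio.h>

    #define N_SAMPLES 15000
    #define CHUNK     256

    static int testdataA[N_SAMPLES]; /* stand-in for the real test data */
    static int res[CHUNK];           /* assumed size, from the j < 256 test */

    /* Hypothetical stand-in for Form1::btc(): sums one chunk. */
    static long btc(const int *buf)
    {
        long sum = 0;
        for (int i = 0; i < CHUNK; i++)
            sum += buf[i];
        return sum;
    }

    int main(void)
    {
        long IOCresult[N_SAMPLES / CHUNK];
        int j = 0, k = 0;

        for (long ix = 0; ix < N_SAMPLES; ix++)
        {
            assert(j < CHUNK);      /* traps any out-of-range write into res */
            res[j++] = testdataA[ix];
            if (j == CHUNK)         /* buffer full: hand it to the algorithm */
            {
                IOCresult[k++] = btc(res);
                j = 0;
            }
        }
        printf("%d chunks processed\n", k);
        return 0;
    }

In this version each chunk is exactly 256 samples and every write is bounds-checked, so it processes 58 full chunks from the 15000 samples (with 152 left over).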
     
Has anyone got any thoughts on what the problem is, whether I am declaring the big data array correctly, and what the biggest array is that can be created?
     
     Thanks CS
     
