Big data array problem on the STM32F407
Posted on October 03, 2013 at 21:20

Hi Folks

We are porting a complex algorithm from the PC to the STM32F407, and we are testing it on a large test file (15000 items long) which I have included in a .h file in the form:

const int testdataA[15000] = { 16985337, 16985390, 16985531, 16985397, 16985364, 16985320, 16985257, ...

I have this main loop that feeds data to the algorithm:

for (long ix = 0; ix <= 14999; ix++) // size of the data array
{
    if (j < 256)
    {
        int index = ix;
        res[j] = testdataA[index];
        j++; // increment the index
    }
    else
    {
        int index = ix;
        res[j] = testdataA[index]; // OK, now have a full array of data
        j = 0; // reset the array counter
        IOCresult[k] = Form1::btc(&res[0]); // let the alg do its stuff
        k++; // the next cell in the results array
    }
}

Note how I have had to substitute 'index' for 'ix', as after 256 samples the call to testdataA is corrupting the index to that data. Has anyone got any thoughts on what the problem is, whether I am declaring the big data array correctly, and what the biggest array is that can be created?

Thanks
CS

#32f407-data-arrays
Labels:
- STM32F4 Series
3 REPLIES
Posted on October 03, 2013 at 21:27
More likely res[] isn't defined with enough scope (i.e., int res[257];), or, if it's a local, the stack is inadequately large.
The selective cut-and-paste seems to have lost some important context; present it as a complete subroutine if you can. The size shouldn't be an issue, though it might depend on the tools; check the .MAP file to confirm the array's size and placement.
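A minimal sketch of that diagnosis and fix, keeping the names from the original post (res, j, k, IOCresult; btc's prototype is assumed here, since the Form1:: wrapper is PC-side): when j reaches 256 the else branch writes res[256], one element past the end of a 256-entry buffer, which can trample whatever the compiler placed next to it, such as the loop index. Sizing the buffer for that worst-case write, and moving it off the stack, addresses both failure modes.

#define BLOCK_SIZE 256

/* File scope, so it lives in .bss rather than on the stack; sized
   BLOCK_SIZE + 1 because the loop writes res[j] one final time when
   j == 256 before resetting j to 0. */
static int res[BLOCK_SIZE + 1];

extern const int testdataA[15000]; /* the big table from the .h file */
extern int btc(int *block);        /* the algorithm under test (assumed prototype) */

void feed_algorithm(int *IOCresult)
{
    int j = 0, k = 0;

    for (long ix = 0; ix < 15000; ix++)
    {
        res[j] = testdataA[ix];    /* in bounds for j = 0..256 */

        if (j < BLOCK_SIZE)
        {
            j++;                   /* keep filling the current block */
        }
        else
        {
            j = 0;                 /* block complete: process it and start over */
            IOCresult[k++] = btc(&res[0]);
        }
    }
}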
Posted on November 01, 2013 at 17:44
Clive

Thanks for the reply. In the end we worked around the restriction, as we were only using the big data arrays during algorithm testing, and we validated the algorithm using an 8000-point array. As you say, this may well be a configuration issue; however, I have had to move on to other areas, so this is one issue that will have to stay on the shelf. Thanks for your time and your great support, and apologies for the late reply.
Posted on November 12, 2013 at 09:52
Hi Folks

...just come across a piece of code from ST (in the audio examples); they seem to define a large dataset as:

const uint16_t AUDIO_SAMPLE[] = {0x4952, 0x4646, 0x4f6e, 0xf, 0x4157, 0x4556, 0x6d66, 0x2074, 0x12, 0, 0x1, 0x2, 0xbb80, 0, 0xee00, 0x2, 0x4, 0x10, 0, 0x6166, 0x7463, 0x4, 0, 0xd3cf, 0x3, 0x6164, 0x6174, 0x4f3c, 0xf, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, etc...

so this should work (but I've not tested it).

CS
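For what it's worth, the same pattern applied to the array from the first post would look something like the sketch below (the values shown are the ones quoted earlier; TESTDATA_COUNT is a name introduced here for illustration). Declaring the table const lets the linker place it in flash (.rodata) instead of the F407's 192 KB of RAM, and leaving the bound unsized with a sizeof-derived count keeps the loop limit in step with the data:

/* const, so the linker can keep the table in flash rather than RAM */
const int testdataA[] = {
    16985337, 16985390, 16985531, 16985397,
    16985364, 16985320, 16985257,
    /* ... remaining test samples ... */
};

/* Element count derived from the initializer, so no hand-maintained
   15000 can drift out of step with the actual data. */
#define TESTDATA_COUNT (sizeof(testdataA) / sizeof(testdataA[0]))

The loop bound then becomes: for (long ix = 0; ix < TESTDATA_COUNT; ix++).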