2017-04-24 03:28 AM
Hi, there!
First time developing with FatFs on the lovely STM32F091 Nucleo board.
I have some serious memory problems here.
-----------------------------------------------------------------------------------
Hardware: STM32F091 Nucleo board.
Library: Cube F0 1.7
IDE: Keil uVision 5.14
-----------------------------------------------------------------------------------
I am basically following the structure as in the example in
...\STM32Cube_FW_F0_V1.7.0\Projects\STM32F091RC-Nucleo\Applications\FatFs\FatFs_uSD
The simple calls like f_mount, f_open, f_read and f_write were working in the beginning.
As I wrote more and more code, I started to run into stack or heap memory problems,
for example, with the following settings in the startup file (optimization level 1):

Stack_Size EQU 0x800
Heap_Size EQU 0x400
When execution reaches

uint8_t BSP_SD_ReadBlocks(uint32_t* pData, uint32_t ReadAddr, uint16_t BlockSize, uint32_t NumberOfBlocks)
{
  ...
  ptr = malloc(sizeof(uint8_t) * BlockSize);
  if (ptr == NULL) { goto error; }
  memset(ptr, SD_DUMMY_BYTE, sizeof(uint8_t) * BlockSize); /* Data transfer */
  while (NumberOfBlocks--) {
    ...
  }
  ...
}
sometimes I get a ptr with an address like 0x200005C8 while NumberOfBlocks lies at 0x20000658,
so when the memset function is called,
0x200005C8 + 512 > 0x20000658, and I get a hard fault error.
Another case occurred when I used less stack memory: sometimes initialized objects got changed.
For example, I have a SPI_HandleTypeDef sd_Spi, which is initialized in
DSTATUS USER_initialize (BYTE pdrv);
As the program runs, sd_Spi.Instance may take on an incorrect value, no longer the original SPIx base address.
I know that this is about stack and heap.
I would like to know: is there any simple method to assess how much heap and stack I need? Or is there any recommended size for FatFs?
Thanks for reading this post!
#stm32f0 #spi #fatfs
2017-04-24 07:15 PM
Ooooops! I just found that I made some serious mistakes.
I've created some big arrays in the main function; I should have made them global!
So, the moral today: Never declare big arrays in the main function!
2017-04-24 03:50 AM
Library: Cube F0 1.7

I don't use Cube, nor do I comment on this code. But malloc() takes memory from the heap, not from the stack. I'm not sure if Cube generates sensible values for both (heap + stack); better check yourself.
2017-04-24 01:45 PM
For the heap, one can probe the memory arena by walking the allocator list, or use an instrumented malloc/free to track usage. I would tend to avoid the type of malloc use seen here, i.e. rapid and repetitive allocation.
For the stack, one can generally probe the maximum depth periodically by checking a fill byte written into the stack allocation to see where it gets overwritten. Remember the stack moves downward from the end of the allocation, so you'd write a check that starts at the beginning of the allocation and moves upward until it no longer encounters the fill byte.
It normally fills with zero; you might want a character you can recognize easily in the debugger, say 0xCD or 0x23.
2017-04-24 02:59 PM
Hi
I found it silly to use malloc here and replaced it:
...
uint8_t block[BlockSize];
...
// ptr = malloc(sizeof(uint8_t) * BlockSize);
ptr = (uint8_t *) &block;
// if (ptr == NULL)
// {
//     goto error;
// }
...
// if (ptr != NULL)
//     free(ptr);
Dieter
2017-04-24 06:25 PM
Thanks, I just checked the heap and stack as you guys told me!
Wow, I found something incredible!!
*****************************************************************************************
First I set my stack and heap smaller than the defaults in the startup file:

Stack_Size      EQU     0x600
                AREA    STACK, NOINIT, READWRITE, ALIGN=3
Stack_Mem       SPACE   Stack_Size
__initial_sp

Heap_Size       EQU     0x400
                AREA    HEAP, NOINIT, READWRITE, ALIGN=3
__heap_base
Heap_Mem        SPACE   Heap_Size
__heap_limit

Then I checked the output map file xxxx.map and found these lines:
*****************************************************************************************
...
HEAP          0x200005c0   Section   1024   startup_stm32f091xc.o(HEAP)
STACK         0x200009c0   Section   1536   startup_stm32f091xc.o(STACK)
...
__heap_base   0x200005c0   Data      0      startup_stm32f091xc.o(HEAP)
__heap_limit  0x200009c0   Data      0      startup_stm32f091xc.o(HEAP)
__initial_sp  0x20000fc0   Data      0      startup_stm32f091xc.o(STACK)
...
*****************************************************************************************
So, it should mean that I have:

---------------------------------  (initial stack pointer 0x20000FC0, growing down)
| stack, size 0x600             |
---------------------------------  (stack/heap limit 0x200009C0)
| heap, size 0x400              |
---------------------------------  (heap base 0x200005C0, growing up)
| data memory ...               |
*****************************************************************************************
Now I open my IDE and prepare to see how the SP register is going to grow.
Right at the start of main, before doing anything, I see these two bizarre instructions in the disassembly:
0x08003C6C 4B51 LDR r3,[pc,#324] ; @0x08003DB4
0x08003C6E 449D ADD sp,sp,r3
(r3 is loaded with a large negative constant here, so the ADD actually subtracts from SP to reserve space for locals.)
When these two instructions are executed and HAL_Init() then begins, my stack pointer immediately drops to
0x20000988: already a stack overflow!
**************************************************************************************************************
I would like to know what is taking this gigantic amount of stack memory, right at the beginning of main!
attached files are my map file and startup file.
I use CubeMX, and the MicroLib option is enabled in the Keil IDE.
Please help me and thanks for reading this post!
________________
Attachments :
STM32F091xxx.map.zip : https://st--c.eu10.content.force.com/sfc/dist/version/download/?oid=00Db0000000YtG6&ids=0680X000006Hyho&d=%2Fa%2F0X0000000bBD%2FKke0pN5ZTm9jnmYwzb589pmyWygj7npYs0QgQHnW4iQ&asPdf=false
startup_stm32f091xc.s : https://st--c.eu10.content.force.com/sfc/dist/version/download/?oid=00Db0000000YtG6&ids=0680X000006HyZg&d=%2Fa%2F0X0000000bBE%2FxDov8bMiaGwAIaNdCcR0t2bSLYBSkA1HqCvWqQPIVKI&asPdf=false
2017-04-25 02:20 AM
The map file would be one of the first places to check in those cases.
So, the moral today: Never declare big arrays in the main function!
There is nothing inherently wrong with declaring such arrays on the stack, if the lifetime and visibility fit your requirements, AND you make the stack large enough.
Mistakes and omissions are just human ...