
Debugging with ST-Link MINIE while using ADC, GPDMA and TIM: strange behaviour

GDC
Associate II

Hi everyone!

I'm using an STM32U535, without HAL, with TIM1, ADC1 and GPDMA1_Channel1.
TIM1 regularly (every 312.5us) triggers ADC1 (MSIK = 16 MHz, SMP = 0b011 = 20 ADC clock cycles, 14 bits), which performs a sequence of 12 channels with 4x oversampling and no shift. ADC1 triggers GPDMA1_Channel1 to move "ADC->DR" into "int32_t iCampioncini[12]", and when GPDMA1 has completed the twelve data transfers it raises the interrupt request handled by GPDMA1_Channel1_IRQHandler(void){...}.
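To make the setup concrete, here is a minimal sketch of what that transfer-complete handler could look like. The register and flag names are taken from the STM32U5 CMSIS device header; the pin used for the yellow trace (PB3 here) and the exact flag handling are placeholders, not my actual code.

#include "stm32u5xx.h"                     /* CMSIS device header */

volatile int32_t iCampioncini[12];         /* destination of the 12-sample sequence */

/* Fires once per completed 12-transfer sequence (GPDMA1 channel 1 transfer complete). */
void GPDMA1_Channel1_IRQHandler(void)
{
    if (GPDMA1_Channel1->CSR & DMA_CSR_TCF)      /* transfer-complete flag set? */
    {
        GPDMA1_Channel1->CFCR = DMA_CFCR_TCF;    /* clear the TC flag           */

        GPIOB->BSRR = GPIO_BSRR_BS3;             /* raise the yellow-trace pin (placeholder PB3) */
        /* ... use iCampioncini[] here ... */
        GPIOB->BSRR = GPIO_BSRR_BR3;             /* lower the pin               */
    }
}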

Everything works really well.
The TIM1 trigger also has an interrupt that raises and lowers a pin (green oscilloscope trace).
GPDMA1_Channel1_IRQHandler raises and lowers a pin (yellow oscilloscope trace).
The time interval between the green and blue traces on the oscilloscope is exactly as expected: (17 + 20) / 16 MHz * 12 * 4 = 111us.
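The same figure worked out step by step, as a small host-side check (just an illustration of the arithmetic above; the cycle counts are the ones quoted in the post):

#include <stdio.h>

/* Expected duration of one 12-channel, 4x oversampled sequence, assuming
   17 conversion cycles + 20 sampling cycles (SMP = 0b011) per conversion
   at a 16 MHz ADC kernel clock (MSIK). */
int main(void)
{
    const double adc_clk  = 16e6;          /* MSIK = 16 MHz            */
    const double cycles   = 17.0 + 20.0;   /* cycles per conversion    */
    const int    channels = 12;
    const int    oversamp = 4;

    double t_us = cycles / adc_clk * channels * oversamp * 1e6;
    printf("expected sequence time: %.1f us\n", t_us);   /* ~111 us */
    return 0;
}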

Everything works really well... but:
If I use the debugger, for example enabling or disabling a breakpoint on the first instruction of main() (which is never hit during the while(1) loop), something strange happens. The same strange behaviour occurs when I terminate the debugger (Ctrl+F2). On power-up without the debugger connected, everything works fine.
Here is the strange behaviour:

1 - The values measured by the ADC are almost exactly halved.

2 - The time between the green and blue oscilloscope traces changes on every debug communication (for example, enabling and disabling the previously mentioned breakpoint, which is never hit). These are the time intervals I recorded between the two traces: 111us, 102us, 19us, 65us, 74us, 83us, 19us, 102us, 74us, 102us, 65us, 102us, 65us, 102us, 102us, 102us, 46us, 111us... and it goes on randomly.
These intervals, however, seem to shift in steps of roughly 9us, and whenever the interval randomly comes back to 111us the ADC value is no longer halved but correct.

3 - The registers of ADC1, GPDMA1_Channel1 and TIM1 are unchanged on every check.

4 - When I reset and restart the chip from the debugger, everything works fine again, but my debugging options are severely limited because, for example, I can't insert a breakpoint without influencing the measurements!

Does anyone have an idea how to solve this problem?

Where am I going wrong?

 

scope_0.png

THE PICTURE ABOVE: yellow = DMA interrupt handler, one per completed 12-sample sequence transfer; blue = toggled while idle in the while(1) loop; green_1 = the 100us periodic interrupt from TIM2; green_2 = TIM1 triggering the ADC sequence (every 312.5us). In this example picture please note the cursors: DeltaX = 102us.

-----------------------------------------------------------------------------------------------------------------------------------------

Here is some additional information:


ST-Link MINIE just updated
STM32CubeIDE 1.15.0 just updated

void main(void){
    __disable_irq();   // BREAKPOINT DISABLED
    BspRCC_Init();     // SYSCLK = 160MHz from MSIS = 16MHz through PLL1
    Analog_Init();     // Init TIM1, ADC1, GPDMA1_Channel1
    __enable_irq();
    while(1){
        // rise and fall of the blue oscilloscope pin
    }
}  // void main(void)
