ADC sampling rate checked by callback

Kévin
Associate III

Hello,

I am using a NUCLEO-H743ZI2.

I read the reference manual RM0433 Rev8.

I also read a bunch of topics in the forum but didn't find what I wanted.

 

I am trying to check my ADC sampling rate by looking at the timestamps of the half-complete/complete callbacks of the DMA buffer. For now I don't have an AWG to generate a sine wave, so I don't know another way to do this.

For the experiment, I generate a number on the microcontroller and increase it by 100 every second. It goes to the DAC, and a jumper cable connects the DAC output to ADC1 so I can read the value back.

I want a sample rate of 10 ksamples/second with an ADC resolution of 12 bits.

I changed the value of PLL2 in order to have an ADC clock frequency of 1.2 MHz:

ADCclock.png

Then I tried to work out the link between the ADC frequency and the sampling rate, but didn't find a clear formula.

From what I understood the formula is something like this:

f_sample = f_ADC / (prescaler × (nbcycle_samplingtime + nbcycle_convertingtime))

with:
- f_sample = 10 kHz, the rate I want
- f_ADC = 1.2 MHz, which I configured in the clock configuration
- prescaler = 8, which I chose in the ADC parameters
- nbcycle_samplingtime = 8.5, which I chose in the ADC parameters
- nbcycle_convertingtime, which depends on the ADC resolution.

 

ADCconfig.png

Some tutorials on the internet say the converting time is nbbits_ADC + 0.5 cycles, others say nbbits_ADC/2 + 0.5.

In the reference manual for this board, it appears to be nbbits_ADC/2 + 0.5:

ADCdoc.png

 

So I wrote my parameters with this hypothesis (nbcycle_convertingtime = 6.5 at 12 bits).
I set a DMA buffer of 40,000 points; as the sampling rate is 10 kS/s, the buffer should be half filled after 2 seconds and completely filled after 4 seconds, i.e. a 4-second refill period.
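
Numerically, the expected rate is 1.2 MHz / (8 × (8.5 + 6.5)) = 1.2 MHz / 120 = 10 kHz, and 40,000 samples at 10 kS/s is 4 s per full buffer.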

This part of the code is where I initialize the ADC, DAC, and DMA. I also generate the value for the DAC, which increases by 100 every second.

 

#define DMAsize 40000
uint16_t buffer_adc[DMAsize] = {0};   // DMA target buffer (circular)
uint16_t buffer_adc2[DMAsize] = {0};  // snapshot copied in the callbacks

  int value_dac = 0;
  HAL_DAC_Start(&hdac1, DAC_CHANNEL_2);
  HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET, ADC_SINGLE_ENDED);
  HAL_ADC_Start_DMA(&hadc1, (uint32_t*)buffer_adc, DMAsize);
  printf("ADC DMA TEST BEGINS\r\n");
  /* USER CODE END 2 */

  /* Infinite loop */
  /* USER CODE BEGIN WHILE */
  while (1)
  {
	  // Ramp the DAC output: +100 every second, wrapping before the 12-bit maximum
	  HAL_DAC_SetValue(&hdac1, DAC_CHANNEL_2, DAC_ALIGN_12B_R, value_dac);
	  if (value_dac < 4095 - 100) {
		  value_dac += 100;
	  } else {
		  value_dac = 0;
	  }

	  HAL_Delay(1000);

    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
  }
  /* USER CODE END 3 */

In this part of the code I write the callback functions: I copy 8 values out of the DMA buffer and print them together with an RTC timestamp.
As the buffer holds 40k points and the sampling rate is 10 kS/s, one quarter of the buffer should be overwritten every second; since the DAC only changes once per second, the values printed at index N and index N+0.5 (in seconds) should be equal.

 

// Called when the first half of the buffer has been filled
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef* hadc) {

	RTC_TimeTypeDef sTime;
	RTC_DateTypeDef sDate;

	// Snapshot one sample every 5000 points (0.5 s at 10 kS/s), at offset 200
	for (int indice = 0; indice < 8; indice++) {
		buffer_adc2[indice*5000 + 200] = buffer_adc[indice*5000 + 200];
	}

	HAL_RTC_GetTime(&hrtc, &sTime, RTC_FORMAT_BIN);
	HAL_RTC_GetDate(&hrtc, &sDate, RTC_FORMAT_BIN);  // GetDate must follow GetTime to unlock the shadow registers

	printf("DMA half full \r\n");
	printf("ADC: 0: %d  0.5:%d  |  1: %d   1.5: %d  | 2:%d    2.5:%d   |  3:%d      3.5:%d \r\n",
			buffer_adc2[200], buffer_adc2[5200], buffer_adc2[10200], buffer_adc2[15200],
			buffer_adc2[20200], buffer_adc2[25200], buffer_adc2[30200], buffer_adc2[35200]);
	printf("Time: %02d:%02d:%02d\r\n\r\n", sTime.Hours, sTime.Minutes, sTime.Seconds);
}

// Called when the buffer is completely filled
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {

	RTC_TimeTypeDef sTime;
	RTC_DateTypeDef sDate;

	// Same snapshot as above, now with the second half of the buffer updated
	for (int indice = 0; indice < 8; indice++) {
		buffer_adc2[indice*5000 + 200] = buffer_adc[indice*5000 + 200];
	}

	HAL_RTC_GetTime(&hrtc, &sTime, RTC_FORMAT_BIN);
	HAL_RTC_GetDate(&hrtc, &sDate, RTC_FORMAT_BIN);

	printf("DMA completely full -----------------\r\n");
	printf("ADC: 0: %d  0.5:%d  |  1: %d   1.5: %d  | 2:%d    2.5:%d   |  3:%d      3.5:%d \r\n",
			buffer_adc2[200], buffer_adc2[5200], buffer_adc2[10200], buffer_adc2[15200],
			buffer_adc2[20200], buffer_adc2[25200], buffer_adc2[30200], buffer_adc2[35200]);
	printf("Time: %02d:%02d:%02d\r\n\r\n", sTime.Hours, sTime.Minutes, sTime.Seconds);
}

When I connect the NUCLEO board to my laptop, I read the values with a serial terminal:

 

ADC DMA TEST BEGINS<\r><\n>
ADC DMA TEST BEGINS<\r><\n>
DMA half full <\r><\n>
ADC: 0: 23  0.5:29  |  1: 101   1.5: 203  | 2:0    2.5:0   |  3:0      3.5:0 <\r><\n>
Time: 12:00:03<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 23  0.5:29  |  1: 101   1.5: 203  | 2:302    2.5:402   |  3:501      3.5:599 <\r><\n>
Time: 12:00:07<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 699  0.5:799  |  1: 899   1.5: 994  | 2:302    2.5:402   |  3:501      3.5:599 <\r><\n>
Time: 12:00:11<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 699  0.5:799  |  1: 899   1.5: 994  | 2:1098    2.5:1194   |  3:1297      3.5:1396 <\r><\n>
Time: 12:00:15<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 1496  0.5:1595  |  1: 1693   1.5: 1796  | 2:1098    2.5:1194   |  3:1297      3.5:1396 <\r><\n>
Time: 12:00:19<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 1496  0.5:1595  |  1: 1693   1.5: 1796  | 2:1897    2.5:1991   |  3:2103      3.5:2199 <\r><\n>
Time: 12:00:23<\r><\n>

As you can see, the value changes every 1/8th of the DMA buffer, which means the sampling rate is not what it should be. Also, the period to fill the buffer is approximately 8 seconds instead of 4.


Now I suppose the hypothesis nbcycle_convertingtime = nbbits_ADC/2 + 0.5 is wrong, and that instead nbcycle_convertingtime = nbbits_ADC + 0.5. So I keep my previous parameters but change the sampling time to 2.5 cycles.
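
Under this second hypothesis the formula still gives the target: 1.2 MHz / (8 × (2.5 + 12.5)) = 1.2 MHz / 120 = 10 kHz.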

The results are here:

 

ADC DMA TEST BEGINS<\r><\n>
ADC DMA TEST BEGINS<\r><\n>
DMA half full <\r><\n>
ADC: 0: 28  0.5:29  |  1: 23   1.5: 103  | 2:0    2.5:0   |  3:0      3.5:0 <\r><\n>
Time: 12:00:02<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 28  0.5:29  |  1: 23   1.5: 103  | 2:199    2.5:203   |  3:299      3.5:303 <\r><\n>
Time: 12:00:04<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 399  0.5:501  |  1: 500   1.5: 603  | 2:199    2.5:203   |  3:299      3.5:303 <\r><\n>
Time: 12:00:07<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 399  0.5:501  |  1: 500   1.5: 603  | 2:599    2.5:698   |  3:803      3.5:802 <\r><\n>
Time: 12:00:09<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 901  0.5:901  |  1: 997   1.5: 1093  | 2:599    2.5:698   |  3:803      3.5:802 <\r><\n>
Time: 12:00:11<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 901  0.5:901  |  1: 997   1.5: 1093  | 2:1095    2.5:1194   |  3:1194      3.5:1295 <\r><\n>
Time: 12:00:14<\r><\n>
<\r><\n>
DMA half full <\r><\n>
ADC: 0: 1393  0.5:1396  |  1: 1495   1.5: 1492  | 2:1095    2.5:1194   |  3:1194      3.5:1295 <\r><\n>
Time: 12:00:16<\r><\n>
<\r><\n>
DMA completely full -----------------<\r><\n>
ADC: 0: 1393  0.5:1396  |  1: 1495   1.5: 1492  | 2:1593    2.5:1696   |  3:1698      3.5:1797 <\r><\n>
Time: 12:00:19<\r><\n>

It seems better but it is still not exact: the period to fill the buffer is closer to 5 seconds than to 4, and values within the same quarter of the buffer sometimes differ.


So I have several questions:
- What is the correct formula linking the parameters I just mentioned?
- What is the number of conversion cycles for my NUCLEO-H743ZI2?
- How can I get my 10k samples per second?
- Is PLL2 used only for the ADC clock on this board, or is it connected to other peripherals? (I didn't find this in the clock configuration or the reference manual.)

Thank you 

5 REPLIES

TDK
Guru
(Accepted Solution)

> As you can see, the value changes every 1/8th of the DMA buffer, which means the sampling rate is not what it should be. Also, the period to fill the buffer is approximately 8 seconds instead of 4.

You are on the right track here but are likely missing an x2 divider in the ADC clock. You probably have a revision V device. See the clock tree in the reference manual:

TDK_0-1705674080399.png
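
(With that extra divide-by-2, the effective ADC kernel clock is 600 kHz rather than 1.2 MHz, so the free-running rate works out to 600 kHz / (8 × (8.5 + 6.5)) = 5 kS/s, which is why the 40k buffer takes about 8 s to fill instead of 4.)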

 

> How can I get my 10k samples per second?

Use a timer to trigger the ADC conversions instead of letting the ADC free-run. You still need to be aware of how long each conversion takes so you don't trigger it too often, but this lets you get whatever conversion frequency you want.
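
(For example, with the settings above each conversion takes 8.5 + 6.5 = 15 ADC clock cycles at 600 kHz / 8 = 75 kHz, i.e. 200 µs, so anything faster than a 5 kHz trigger would outrun the ADC; the prescaler or sampling time has to come down before a 10 kHz trigger can work.)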

Kévin
Associate III

Hello TDK,

I just found your answer about this default x2 divider in another post. Thank you for your answer, it made this clear too.

Concerning the trigger of the ADC conversion, I don't think I got to that part yet. Are you talking about something that triggers at the beginning of the experiment, or something that triggers for every filled buffer?


TDK
Guru

Rather than continuous conversion, you can set up the ADC so that a conversion happens at each timer TRGO event. If you do so and set your timer frequency to 10 kHz, then you'll get conversions at exactly that rate, regardless of the ADC sample time chosen.
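
For reference, a minimal sketch of that setup with the HAL. It assumes TIM6 as the trigger source and a 200 MHz timer kernel clock, both of which you'd adjust to your project; CubeMX would normally generate this init code for you:

/* Sketch: trigger ADC1 from TIM6 TRGO at 10 kHz.
   Assumes a 200 MHz timer kernel clock; adjust Prescaler/Period otherwise. */

/* TIM6 time base: 200 MHz / 200 / 100 = 10 kHz update events */
htim6.Instance = TIM6;
htim6.Init.Prescaler = 200 - 1;   /* 200 MHz -> 1 MHz counter clock */
htim6.Init.Period    = 100 - 1;   /* 1 MHz / 100 -> 10 kHz updates */
HAL_TIM_Base_Init(&htim6);

TIM_MasterConfigTypeDef sMasterConfig = {0};
sMasterConfig.MasterOutputTrigger = TIM_TRGO_UPDATE;   /* pulse TRGO on each update event */
HAL_TIMEx_MasterConfigSynchronization(&htim6, &sMasterConfig);

/* ADC1: one conversion per TRGO edge instead of free-running */
hadc1.Init.ContinuousConvMode   = DISABLE;
hadc1.Init.ExternalTrigConv     = ADC_EXTERNALTRIG_T6_TRGO;
hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_RISING;
HAL_ADC_Init(&hadc1);

/* Start order: ADC+DMA armed first, then the timer begins triggering */
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)buffer_adc, DMAsize);
HAL_TIM_Base_Start(&htim6);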

Kévin
Associate III

So I corrected for the divide-by-2 and I now get approximately a 4-second period (4 seconds minus 40 ms) for filling the buffer:

Time: 12:00:03 Subsec: 964<\r><\n>
Time: 12:00:07 Subsec: 921<\r><\n>
Time: 12:00:11 Subsec: 882<\r><\n>
Time: 12:00:15 Subsec: 839<\r><\n>
Time: 12:00:19 Subsec: 800<\r><\n>
Time: 12:00:23 Subsec: 761<\r><\n>
Time: 12:00:27 Subsec: 722<\r><\n>
Time: 12:00:31 Subsec: 679<\r><\n>
Time: 12:00:35 Subsec: 636<\r><\n>
Time: 12:00:39 Subsec: 597<\r><\n>
Time: 12:00:43 Subsec: 554<\r><\n>
Time: 12:00:47 Subsec: 511<\r><\n>
Time: 12:00:51 Subsec: 468<\r><\n>
Time: 12:00:55 Subsec: 429<\r><\n>
Time: 12:00:59 Subsec: 386<\r><\n>
Time: 12:01:03 Subsec: 343<\r><\n>
Time: 12:01:07 Subsec: 304<\r><\n>
Time: 12:01:11 Subsec: 261<\r><\n>
Time: 12:01:15 Subsec: 218<\r><\n>
Time: 12:01:19 Subsec: 175<\r><\n>
Time: 12:01:23 Subsec: 136<\r><\n>
Time: 12:01:27 Subsec: 093<\r><\n>
Time: 12:01:31 Subsec: 050<\r><\n>
Time: 12:01:35 Subsec: 007<\r><\n>
Time: 12:01:38 Subsec: 964<\r><\n>
Time: 12:01:42 Subsec: 925<\r><\n>
Time: 12:01:46 Subsec: 882<\r><\n>
Time: 12:01:50 Subsec: 839<\r><\n>
Time: 12:01:54 Subsec: 796<\r><\n>
Time: 12:01:58 Subsec: 753<\r><\n>
Time: 12:02:02 Subsec: 714<\r><\n>
Time: 12:02:06 Subsec: 675<\r><\n>
Time: 12:02:10 Subsec: 636<\r><\n>
Time: 12:02:14 Subsec: 597<\r><\n>
Time: 12:02:18 Subsec: 558<\r><\n>
Time: 12:02:22 Subsec: 519<\r><\n>
Time: 12:02:26 Subsec: 476<\r><\n>
Time: 12:02:30 Subsec: 437<\r><\n>
Time: 12:02:34 Subsec: 398<\r><\n>
Time: 12:02:38 Subsec: 355<\r><\n>
Time: 12:02:42 Subsec: 316<\r><\n>
Time: 12:02:46 Subsec: 273<\r><\n>
Time: 12:02:50 Subsec: 234<\r><\n>
Time: 12:02:54 Subsec: 191<\r><\n>
Time: 12:02:58 Subsec: 152<\r><\n>
Time: 12:03:02 Subsec: 109<\r><\n>
Time: 12:03:06 Subsec: 070<\r><\n>
Time: 12:03:10 Subsec: 035<\r><\n>
Time: 12:03:13 Subsec: 996<\r><\n>
Time: 12:03:17 Subsec: 957<\r><\n>
Time: 12:03:21 Subsec: 917<\r><\n>

 

So the sample frequency is something like 10.10 kHz instead of 10 kHz, faster than expected. Could it be that the clock used for the timestamps runs faster than the clock for the ADC, leading to this difference?
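
(Each buffer takes about 4.000 s - 0.040 s = 3.96 s, and 40,000 samples / 3.96 s ≈ 10.1 kS/s, roughly 1% fast.)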

A 1% error is within the tolerance of the HSI. The 8 MHz input on HSE is driven by the HSI of the ST-LINK. If you use a timer based on the internal clock rather than the RTC, you should see exactly the expected value. I'm also not sure of the accuracy of your RTC clock; that could be contributing as well.
