Start DAC and ADC at the same time

Kévin
Associate III

Hello,
I am using a NUCLEO-H743ZI2 and program it with STM32CubeIDE 1.13.2.
The purpose of my project is to create a 2kHz sine and to sample it at 40kHz at the same time.
I would like to have no delay between the 12-bit DAC and the 12-bit ADC: the first point of my sampled sine should be the first point of the buffer read by the DAC, i.e. a phase of 0°.

In order to start the ADC and DAC at the same time I use TIM1, running on the internal clock of 32MHz, which after a delay of 2 seconds generates a TRGO trigger event (one-pulse mode).

TIM2 is the timer which triggers the DAC, and is itself started by TIM1's TRGO; it makes the DAC read its DMA buffer at 400kHz (presc=0, counter period=79).

TIM3 is the timer which triggers the ADC, and is itself started by TIM1's TRGO; it makes the ADC sample and write to the DMA buffer at 40kHz (presc=0, counter period=799).
The ADC clock is 1.2MHz, 12 bits, 2.5 sampling cycles, prescaler 1.
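
For reference, the HAL-level equivalent of this trigger routing (normally generated by CubeMX from the .ioc) would look roughly like the sketch below. This is only a sketch: the function names follow recent HAL versions, TRGO-on-update is assumed as the TIM1 trigger source, and ITR0 is assumed to be the internal trigger input that maps TIM1_TRGO to TIM2/TIM3 on this device (check the timer interconnection table in the reference manual).

/* Sketch of the trigger routing described above (assumptions noted):
   TIM1 emits TRGO on its update event (one-pulse, after the 2 s delay),
   and TIM2/TIM3 start counting in trigger slave mode on that event. */
TIM_MasterConfigTypeDef sMasterConfig = {0};
sMasterConfig.MasterOutputTrigger = TIM_TRGO_UPDATE;      /* TIM1 -> TRGO on update */
sMasterConfig.MasterSlaveMode     = TIM_MASTERSLAVEMODE_DISABLE;
HAL_TIMEx_MasterConfigSynchronization(&htim1, &sMasterConfig);

TIM_SlaveConfigTypeDef sSlaveConfig = {0};
sSlaveConfig.SlaveMode    = TIM_SLAVEMODE_TRIGGER;        /* start counting on the trigger */
sSlaveConfig.InputTrigger = TIM_TS_ITR0;                  /* assumed: ITR0 = TIM1_TRGO */
HAL_TIM_SlaveConfigSynchro(&htim2, &sSlaveConfig);        /* TIM2 paces the DAC at 400 kHz */
HAL_TIM_SlaveConfigSynchro(&htim3, &sSlaveConfig);        /* TIM3 paces the ADC at 40 kHz */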

I generate the sine over 200 points to achieve the 2kHz frequency.
I write the values from the ADC into a buffer of 40000 points, so that one buffer is filled in one second.
I copy the values of the ADC DMA buffer into a big buffer, and I send all the data after acquiring the signal for 3 seconds.

buffer1time is a variable I added because the very first 2 ADC values are 0 (I guess the ADC was not "ready" yet), so I start recording one second later; since the sine frequency is a multiple of 1Hz, the phase at 1s should be the same as at 0s.


#define DMAsize 40000
#define nbDMAsize 3
uint16_t buffer_adc[DMAsize]={0};
uint16_t buffer_adc2[nbDMAsize*DMAsize]={0};
uint16_t bufferCounter=0;
uint16_t buffer1time=0;
#define NSsinus  200

.....
uint32_t sinusDAC[NSsinus];
uint32_t amplitude=(uint32_t)floor(powf(2,11)-1);   // 2047
uint32_t offset=(uint32_t)floor(powf(2,11));        // 2048, mid-scale of the 12-bit DAC

// build one period of the sine (200 points), centered at 2048
for(indice=0;indice<NSsinus;indice++){
   sinusDAC[indice]=floor(amplitude*sin(2* M_PI*indice/NSsinus)+offset);
 }


  // calibrate the ADC offset before starting conversions
  HAL_ADCEx_Calibration_Start(&hadc1,ADC_CALIB_OFFSET,ADC_SINGLE_ENDED);

  // start ADC and DAC in circular DMA mode; conversions are paced by TIM3 and TIM2
  HAL_ADC_Start_DMA(&hadc1,(uint32_t*)&buffer_adc,DMAsize);
  HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, (uint32_t*)sinusDAC, NSsinus, DAC_ALIGN_12B_R);

  // start TIM1, whose TRGO starts TIM2 and TIM3 after the 2 s delay
  HAL_TIM_Base_Start(&htim1);



/* USER CODE BEGIN WHILE */


  while (1)
  {
	  HAL_Delay(3000);
	  if(bufferCounter>=nbDMAsize*2){
		  // total payload in bytes (the samples are 16-bit)
		  int nbTotalSizeToSend= nbDMAsize*DMAsize*2;
		  // HAL_UART_Transmit takes a uint16_t size, so send in 16 kB packets
		  int maxSizePerPacket=1<<14;
		  int nbHeap=nbTotalSizeToSend/maxSizePerPacket;
		  int restHeap=nbTotalSizeToSend%maxSizePerPacket;
		  int indice=0;
		  for(indice=0;indice<nbHeap;indice++){
			 HAL_UART_Transmit(&huart3,  (uint8_t *)&(buffer_adc2[indice*maxSizePerPacket/2]), (uint16_t) maxSizePerPacket, HAL_MAX_DELAY);
		  }

		  // send the remaining bytes, if any
		  HAL_UART_Transmit(&huart3,  (uint8_t *)&(buffer_adc2[nbHeap*maxSizePerPacket/2]), (uint16_t) restHeap, HAL_MAX_DELAY);

		  HAL_Delay(1000000);
	  }
  }
  /* USER CODE END WHILE */


//callback when ADC DMA buffer half full
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef* hadc) {
	int indice=0;
	if(buffer1time>0){
		//fill the big buffer only once (3 s of data)
		if(bufferCounter<nbDMAsize*2){
			//copy the first half of the DMA buffer into the big send buffer
			for(indice=0;indice<DMAsize/2;indice++){
				buffer_adc2[indice+bufferCounter*DMAsize/2]= buffer_adc[indice];
			}
			bufferCounter++;
		}
	}
	else{
		buffer1time++;
	}
}

//callback when ADC DMA buffer full
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc) {
	int indice=0;
	if(buffer1time>1){
		if(bufferCounter<nbDMAsize*2){
			//copy the second half of the DMA buffer into the big send buffer
			for(indice=0;indice<DMAsize/2;indice++){
				buffer_adc2[indice+bufferCounter*DMAsize/2]= buffer_adc[indice+DMAsize/2];
			}

			bufferCounter++;
		}
	}
	else{
		buffer1time++;
	}
}

After downloading the data, the spectrum (one-sided FFT) shows a signal at 2kHz.

sinus_raw_1s_trigger_correct-spect.png

However, the phase of the peak (i.e. the phase of the first point of my sine) is not 0° but 260°.
At 2kHz, this is a 361µs delay (or 139µs in advance), modulo 500µs.
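
For reference, the conversion used here, assuming the 2kHz fundamental (period 500µs):

\[
\Delta t = \frac{\varphi}{360^{\circ}}\,T = \frac{260^{\circ}}{360^{\circ}} \times 500\,\mu\text{s} \approx 361\,\mu\text{s}
\]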

sinus_raw_3s_trigger-phaseMax.png

This can be confirmed in the time series: the first point is at 1470 instead of 2048, since the sine is centered around 2048.

 

sinus_3s_tempo_delay.png

 

In order to determine which signal is delayed relative to the other, I made another experiment.
I wrote a 1s ramp going from the value 1000 to 3000, in order to avoid any saturation near 0 and 4095 in the 12-bit DAC.

#define NSramp  2000
uint32_t rampeDAC[NSramp];
  // 2000-point ramp from 1000 to 2999, played by the DAC via DMA
  for(indice=0;indice<NSramp;indice++){
	  rampeDAC[indice]=indice+1000;
  }

And I changed TIM2 to a counter period of 799 so the signal lasts 1 second.

The output shows the ramp lasting 1s.
ramp3s_whole.png

However, the value 1000 doesn't start exactly at 0, 1, 2s but a little earlier: 17 samples at 40kHz, i.e. 425µs.

 

rampe3s_zoom.png

 

If I keep all the same parameters for the timers but change the ADC clock frequency, this delay changes:
fADC=1.2MHz, delay=-17/40000=-425µs

fADC=1.5MHz, delay=-7/40000=-175µs

fADC=2MHz, delay=+1/40000=+2.5µs

fADC=3MHz, delay=+10/40000=+25µs

fADC=6MHz, delay=+20/40000=+50µs


The delay also changes for the sine (DAC updated at 400kHz):

fADC=1.2MHz, delay=261°=362µs

fADC=1.5MHz, delay=254°=352µs

fADC=2MHz, delay=250°=347µs

I also tried with the sine generated at a 40kHz DAC rate (a little ugly, because there are only 20 points per period):

fADC=1.2MHz, delay=216°=300µs

fADC=1.5MHz, delay=216°=300µs

fADC=2MHz, delay=234°=325µs

So I don't know how to interpret the effect of the ADC frequency. I thought that as long as it is faster than the timer trigger frequency it would be fine, but it seems to have a huge effect on the delay between the ADC and the DAC.


How can I achieve zero delay between my DAC and ADC?

 

Thank you

 

8 REPLIES
TDK
Guru

Realistically, an easy approach would be to start the timers yourself, manually, right after one another. This would provide a very small, known offset which could be compensated for in software.

 

__disable_irq();
TIM2->CR1 |= TIM_CR1_CEN;
TIM3->CR1 |= TIM_CR1_CEN;
__enable_irq();

Could also use slave timers to achieve this, or a number of other ways.

There is also going to be some delay (on the order of several timer ticks) between when the timers start and the ADC/DAC samples. Perhaps this is the same for both, perhaps not, but it should be constant.
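
A minimal sketch of one way such a constant offset could be compensated, assuming it has already been measured; START_OFFSET_TICKS is a hypothetical, board-specific value:

/* Sketch: compensate a known, constant start offset by pre-loading one
   counter before enabling both timers back to back.
   START_OFFSET_TICKS is hypothetical and must be measured on the board. */
#define START_OFFSET_TICKS 2u

__disable_irq();
TIM3->CNT = START_OFFSET_TICKS;   /* give the later-started timer a head start */
TIM2->CR1 |= TIM_CR1_CEN;
TIM3->CR1 |= TIM_CR1_CEN;
__enable_irq();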


Do you use HAL functions or macros to release ADC and DAC?

Even if you act on the same TIM event, the overhead in the HAL functions to release both "at the same time" matters.

BTW: it would never be possible to release/start the ADC and DAC at exactly the same time: your SW is sequential - you release the first one (via the HAL driver), then the second one. The "delay" might come from this SW overhead (doing the first one first, the second one afterwards).

It is quite impossible to start them truly "synchronously": the SW is sequential.

And you do not know whether there are "clock synchronizers" used in HW (in the ADC and DAC): it can happen that your start action is synchronized with an internal clock and does not take effect at the same time for the ADC and DAC (the AHB/APB bus fabric might have clock synchronizers, which you cannot avoid or influence).

Also, providing data to the DAC and getting values from the ADC have a noticeable "offset": they are not really in sync. They might compete with each other (if you handled the data in an event), and your SW, being sequential, postpones one or the other. So I would expect a noticeable phase offset and even some "phase jitter".

BTW:
Your spectrum for the 2 kHz sine wave generation shows a lot of large harmonics and maybe also a sinc modulation somewhere. Maybe your code writing to the DAC and reading results from the ADC has a "modulation" effect (when done on the same MCU, the SW overhead and "sequential code execution" can cause it).

A very delicate topic...

Hello,

In fact I start the DAC and ADC in main.c, but each conversion is triggered by a hardware timer:

 

  HAL_ADC_Start_DMA(&hadc1,(uint32_t*)&buffer_adc,DMAsize);
  HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, (uint32_t*)sinusDAC, NSsinus, DAC_ALIGN_12B_R);

  HAL_TIM_Base_Start(&htim1);

 

ADC.png DAC.png TIM1.png TIM2.png TIM3.png

 

If I use a hardware timer, is there still software overhead?

I thought a hardware timer would bypass the software layer and directly trigger the sampling and conversion.

My experiments show that, for the same parameters, if I power up the Nucleo board several times, the phase delay between the DAC and the ADC is always the same.

My problem is that I want to control this delay (add a delay to either the ADC or the DAC until it completes a whole period of my sine) so that I can have a 0° phase delay.

 

I did this kind of development on an FPGA, where I could control the exact number of ticks in order to get a 0° phase delay for a sine; I wonder if I can do the same thing with an STM32 board.

Kévin
Associate III

Hello,
Yes, I also tried to compensate the delay manually by starting one timer later than the other (using a fourth timer whose counter I adjust), but the phase didn't change as expected.

I don't really understand your lines of code: 

__disable_irq();
TIM2->CR1 |= TIM2_CR1_CEN;
TIM3->CR1 |= TIM3_CR1_CEN;
__enable_irq();

TIM2->CR1 is the control register, but I can configure it in the .ioc file too, right? (see my screenshots in my other post)
I can't find TIM2_CR1_CEN in the reference manual or in the auto-generated code; what does that mean?

It was a typo. The right-hand sides should be TIM_CR1_CEN instead. Those lines enable the timers after they have already been initialized.


For the spectrum of the sine, I forgot to mention that the Y axis is in dB, so the harmonics are 50dB below the fundamental, which is OK for my application.
I can lower the harmonic peaks by 20dB by lowering the amplitude, because the signal was saturating near the value 4095.
I would use a filter to get lower noise, but I wanted to focus on the true delay without a filter.

It is difficult to change the relative time between timers that have already started, but it can be done.

If you don't know the phase offset when you start the timer (which you should, right? Maybe not) then you're stuck starting the timers, measuring the phase, and adjusting.

Consider the following phase adjustment scheme for delaying/speeding up TIM1 relative to the other timers (a rough code sketch follows the list):

  • Disable interrupts
  • Wait for TIM1->CNT to be > TIM1->ARR/2.
  • Wait for TIM1->CNT to be < TIM1->ARR/2.
  • Change TIM1->ARR up/down by the desired number of ticks.
  • Wait for TIM1->CNT to be > TIM1->ARR/2.
  • Wait for TIM1->CNT to be < TIM1->ARR/2.
  • Reset TIM1->ARR to the original value.
  • Enable interrupts
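
A minimal sketch of this scheme in C, assuming ARR preload (ARPE) is disabled so that ARR writes take effect immediately; ARR_NOMINAL and PHASE_ADJ_TICKS are hypothetical values:

/* Sketch of the ARR-nudge scheme above: stretch exactly one TIM1 period by
   PHASE_ADJ_TICKS, then restore the nominal period. Assumes ARR preload is
   disabled; ARR_NOMINAL and PHASE_ADJ_TICKS are hypothetical values. */
#define ARR_NOMINAL     79u
#define PHASE_ADJ_TICKS  3u

__disable_irq();
while (TIM1->CNT <= (TIM1->ARR / 2)) { }    /* wait for CNT to pass ARR/2 */
while (TIM1->CNT >  (TIM1->ARR / 2)) { }    /* wait for the counter to wrap */
TIM1->ARR = ARR_NOMINAL + PHASE_ADJ_TICKS;  /* lengthen (or shorten) one period */
while (TIM1->CNT <= (TIM1->ARR / 2)) { }
while (TIM1->CNT >  (TIM1->ARR / 2)) { }    /* the adjusted period has elapsed */
TIM1->ARR = ARR_NOMINAL;                    /* restore the original period */
__enable_irq();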

Kévin
Associate III

Hello,
Sorry, I didn't understand your scheme: waiting for TIM1->CNT to be bigger than ARR/2, how do I program that? Does it go in main.c or in the .ioc? And isn't the condition the same as smaller than ARR/2?

I made a few more tests, and it seems that starting hardware timers from an event is less repeatable than starting them by calling HAL_TIM_Base_Start(&htim2) in main.c.


Method 1:
DAC is triggered by TIM2

ADC is triggered by TIM3

And I start TIM2 and TIM3 manually in main.c:

  //start the ADC and DAC in DMA mode (sinusDAC feeds the DAC)
  HAL_ADC_Start_DMA(&hadc1,(uint32_t*)&buffer_adc,DMAsize);
  HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, (uint32_t*)sinusDAC, NSsinus, DAC_ALIGN_12B_R);

  HAL_TIM_Base_Start(&htim2);
  HAL_TIM_Base_Start(&htim3);

 

I run the code and measure the delay of the sampled sine.
I obtain almost the same phase (around 0.01°=140ns) each time I flash and run the code on the microcontroller.

sinusOffset_t3t2.png

 

Method 2:
DAC is triggered by TIM2

ADC is triggered by TIM3

TIM2 and TIM3 are in slave mode and triggered by TIM1.
I start TIM1 manually; it is in one-pulse mode.

 

  //start the ADC and DAC in DMA mode (sinusDAC feeds the DAC)
  HAL_ADC_Start_DMA(&hadc1,(uint32_t*)&buffer_adc,DMAsize);
  HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, (uint32_t*)sinusDAC, NSsinus, DAC_ALIGN_12B_R);

  HAL_TIM_Base_Start(&htim1);

 

I proceed the same way as before, flashing and running the code several times. This time the phase delay is less repeatable (about 17° = 23µs of variation).

 

sinusOffset_t1.png

I thought starting the hardware timers from the same event (the rising edge of another hardware timer) would lead to a constant offset delay. On the contrary, it is more repeatable to start the timers individually in software (main.c).

It is quite confusing.