DAC kernel clock and CPU clock domains: how to configure them to work asynchronously?

jeremierafin
Associate II

Hi,

My problem: how do I configure the DAC so that the CPU core clock and the DAC kernel clock can be asynchronous (e.g. 250MHz and 70MHz)?

Details:
- my hardware is based on the STM32H523CET6
- configuration:
  - SYSCLK (HCLK) running at 250MHz (with PLL1 from HSE=TCXO @26MHz)
  - TIM7 generates a 48kHz rectangle signal
  - DAC1 is triggered by TIM7 trgo
  - PLL2 configured for PLL2R=70MHz
- code loop:
  - polling TIM7 SR's UIF bit (update interrupt flag) until it is set
  - clear the UIF bit
  - toggle the DAC output between 0 and 0xFFF0
  - loop again (infinitely)
- results:
  - when DAC is clocked by HCLK (ADC,DAC Clock Mux = HCLK): OK :)
  => output is toggling with a rectangle frequency of 24kHz (as expected)
  - when DAC is clocked by PLL2 (ADC,DAC Clock Mux = PLL2R): KO :(
  => output is randomly toggling

My feeling is that the transfer of the DAC output value (DHR12L1 register) does not work if the CPU and DAC clocks are asynchronous. Can you help me?

(Note: I need to run the ADC at an accurate frequency, so I need to set the mux to PLL2R. And I need the DAC and ADC to work at the same time. So far, the DAC does not satisfy me, as its output does not reliably follow the signal I program...)
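For reference, the kernel clock mux selection in the Cube-generated code looks roughly like this (a sketch from memory, not copied from the attached projects, so the macro names should be checked against the generated main.c; PLL2 itself is configured separately for 70MHz on its R output):

  RCC_PeriphCLKInitTypeDef PeriphClkInit = {0};
  PeriphClkInit.PeriphClockSelection = RCC_PERIPHCLK_ADCDAC;
  PeriphClkInit.AdcDacClockSelection = RCC_ADCDACCLKSOURCE_PLL2R;   /* RCC_ADCDACCLKSOURCE_HCLK in the "OK" case */
  if (HAL_RCCEx_PeriphCLKConfig(&PeriphClkInit) != HAL_OK)
  {
      Error_Handler();
  }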

Thanks!

P.S.: I attach the Cube project "Hello" in the two configurations above (ok.tar.xz and ko.tar.xz). Feel free to ask me for any detail!

9 REPLIES
TDK
Guru

> output is randomly toggling

Can you show this? Just a bit of jitter?

A better solution would be to handle this in DMA. Transfer to the DAC register when TRGO happens.
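Something along these lines (untested sketch, not from your project: it assumes the DAC1 channel trigger is set to TIM7 TRGO and that the DMA channel linked to hdac1 is configured as circular in CubeMX):

  static uint32_t wave[2] = { 0x0000, 0xFFF0 };   /* pattern replayed on each TRGO */

  HAL_TIM_Base_Start(&htim7);                     /* update event -> TRGO at 48 kHz */
  HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_1, wave, 2, DAC_ALIGN_12B_L);

That takes the CPU polling loop out of the picture entirely.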

If you feel a post has answered your question, please click "Accept as Solution".

On the oscilloscope:

- when it is OK: perfect rectangle signal, freq is 24kHz

- when it is KO: the rectangle high or low levels last between 1 and about 20 times the expected duration; the average freq is about 5kHz

It is not simple to give you a picture from the oscilloscope (it is too old to export screenshots). But trust me: this is not "a bit of jitter", unless 2000% counts as "a bit" for you! :D

Regarding the DMA: of course, I use it in my full application. But I see exactly the same behavior whether the DAC is fed by DMA or by the CPU. Since I preferred to provide a simpler example, I gave this code snippet without DMA.

Once again: my feeling is that when the DAC register is changed, whether by the CPU directly or by the DMA, a clock domain synchronization would be required, since the bus master clock differs from the peripheral kernel clock. Does my feeling make sense? Am I missing a configuration to get this synchronization?
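One diagnostic I could add (a sketch, not in the attached code): after each trigger, compare the holding register with the data output register, to see whether it is the DHR-to-DOR transfer that gets lost:

  hdac1.Instance->DHR12L1 = 0xfff0;
  /* ... wait for the next trigger plus a few DAC kernel clock cycles ... */
  uint32_t expected = hdac1.Instance->DHR12L1 >> 4;   /* DHR12L1 is left-aligned, DOR1 holds 12 bits */
  if (hdac1.Instance->DOR1 != expected)
  {
      /* the holding-to-output transfer did not happen on this trigger */
  }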

Note that I did not find anything in the errata list...

 

Merged threads treating the same subject.

*** Sorry I closed my previous post (https://community.st.com/t5/stm32-mcus-products/dac-kernel-clock-and-cpu-clock-domains-how-to-configure-them-to/m-p/759757#M270108) by mistake. This one reopens it. ***

Hi,

My problem: how do I configure the DAC so that the CPU core clock and the DAC kernel clock can be asynchronous (e.g. 250MHz and 70MHz)?

Details:
- my hardware is based on the STM32H523CET6
- configuration:
  - SYSCLK (HCLK) running at 250MHz (with PLL1 from HSE=TCXO @26MHz)
  - TIM7 generates a 48kHz rectangle signal
  - DAC1 is triggered by TIM7 trgo
  - PLL2 configured for PLL2R=70MHz
- code loop (case 1):
  - polling TIM7 SR's UIF bit (update interrupt flag) until it is set
  - clear the UIF bit
  - toggle a GPIO pin to probe CPU activity
  - toggle the DAC output between 0 and 0xFFF0
  - loop again (infinitely)
- results:
  - when DAC is clocked by HCLK (ADC,DAC Clock Mux = HCLK): OK :)
  => output is toggling with a rectangle frequency of 24kHz (as expected)
  - when DAC is clocked by PLL2 (ADC,DAC Clock Mux = PLL2R): KO :(
  => output is randomly toggling

Attached are oscilloscope results:
- caseX_OK.png corresponds to DAC MUX set to HCLK
- caseX_KO.png corresponds to DAC MUX set to PLL2R

Source 1 (yellow) is the DAC OUT.
Source 2 (magenta) is a GPIO "PROBE" that I added for better investigation (to check when the DAC register write happens).

Different cases (caseX) are tried (code extracts below):
- case 1: same as before, the DAC OUT is toggled on each update
  => this should make the above "randomly toggling" clearer
- case 2: the DAC OUT follows a simple 16-step staircase signal ("(cnt & 0xf) << 12"); expected freq = 48k/16 = 3kHz
  => this better illustrates my feeling that the DAC OUT is not synchronized with the CPU;
  reminder: the magenta trace corresponds to CPU activity...
- case 3: the DAC OUT follows the raw 48kHz counter value, giving a ramp frequency of 48k/4096 = 11.719Hz
  => hard to see any problem, even if the signal is a bit noisy;
  I created this case to raise a warning about measurements: in some cases, we might think everything is OK...

Also attached, for reference, are the code sources for case 1 (case1_KO_sources.tar.xz and case1_OK_sources.tar.xz).

My feeling is that the transfer of the DAC output value (DHR12L1 register) does not work if the CPU and DAC clocks are asynchronous. Can you help me?

Notes:
- I need to run the ADC at an accurate frequency, so I need to set the mux to PLL2R. And I need the DAC and ADC to work at the same time. So far, the DAC does not satisfy me, as its output does not reliably follow the signal I program...
- in my program I use DMA to change the DAC output value; this does not improve anything, which is not surprising, since the DMA runs at the bus clock and accesses the DAC register in the same way as the CPU does.

Thanks!


CASE 1 code:
============

  /* USER CODE BEGIN WHILE */
  uint32_t cnt = 0;
  while (1)
  {
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
    while (!READ_BIT(htim7.Instance->SR, TIM_SR_UIF));
    CLEAR_BIT(htim7.Instance->SR, TIM_SR_UIF);

    HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);

    cnt++;
    if (cnt & 0x1)
      hdac1.Instance->DHR12L1 = 0;
    else
      hdac1.Instance->DHR12L1 = 0xfff0;
  }
  /* USER CODE END 3 */

CASE 2 code:
============

  /* USER CODE BEGIN WHILE */
  uint32_t cnt = 0;
  while (1)
  {
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
    while (!READ_BIT(htim7.Instance->SR, TIM_SR_UIF));
    CLEAR_BIT(htim7.Instance->SR, TIM_SR_UIF);

    HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);

    hdac1.Instance->DHR12L1 = (cnt & 0xf) << 12;
    cnt++;
  }
  /* USER CODE END 3 */

CASE 3 code:
============

  /* USER CODE BEGIN WHILE */
  uint32_t cnt = 0;
  while (1)
  {
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
    while (!READ_BIT(htim7.Instance->SR, TIM_SR_UIF));
    CLEAR_BIT(htim7.Instance->SR, TIM_SR_UIF);

    HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);

    hdac1.Instance->DHR12L1 = (cnt << 4) & 0xfff0;
    cnt++;
  }
  /* USER CODE END 3 */

Edit: code formatting

Try different TIM prescaler/ARR values.

Try to generate TRGO from OCxREF rather than Update.
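Roughly like this (untested sketch; OCxREF needs a timer with compare channels, so a general-purpose timer such as TIM2, here a hypothetical htim2, rather than the basic TIM6/TIM7; macro names from the standard HAL):

  TIM_MasterConfigTypeDef sMasterConfig = {0};
  sMasterConfig.MasterOutputTrigger = TIM_TRGO_OC1REF;          /* TRGO follows OC1REF instead of Update */
  sMasterConfig.MasterSlaveMode = TIM_MASTERSLAVEMODE_DISABLE;
  HAL_TIMEx_MasterConfigSynchronization(&htim2, &sMasterConfig);
  HAL_TIM_OC_Start(&htim2, TIM_CHANNEL_1);                      /* the compare channel must run for OC1REF to toggle */

The DAC trigger source then has to be changed to that timer's TRGO accordingly.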

JW

I tried these configurations for Prescaler/Period:
- 2604 / 1 (initial)
- 2604 / 2
- 2604 / 3
- 2605 / 1
- 2605 / 2
- 2605 / 3
- 2606 / 1
- 2606 / 2
- 2606 / 3
- 10000 / 100
- 1000 / 1
- 100 / 1
The "signature" of the "randomness" is changed, but the problem remains.

Regarding OCxREF, this is not available for basic timers TIM6 and TIM7.

*************************************

Anyway, I think this is not the right track: the problem occurs when the DAC output value register (e.g. DHR12L1) is written; sometimes the change is not applied. This problem is seen _only if_ the DAC is clocked from PLL2R.

To better illustrate this, I configured the DAC with a software trigger. The code then becomes:

  /* USER CODE BEGIN WHILE */
  uint32_t cnt = 0;
  while (1)
  {
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
    while (!READ_BIT(htim7.Instance->SR, TIM_SR_UIF));
    CLEAR_BIT(htim7.Instance->SR, TIM_SR_UIF);

    HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);

    cnt++;
    if (cnt & 0x1)
      hdac1.Instance->DHR12L1 = 0;
    else
      hdac1.Instance->DHR12L1 = 0xfff0;
    SET_BIT(hdac1.Instance->SWTRIGR, DAC_SWTRIGR_SWTRIG1);
  }
  /* USER CODE END 3 */

In that case, the timer is only used by the CPU, which takes care of driving the DAC.
If the DAC clock source is HCLK, it works well (with, of course, some very light jitter due to software driving), like all the other "OK" cases. However, if the DAC is sourced from PLL2R, I still observe the same issue ("KO" cases). For that reason, I think the problem is definitely not the timer, but the DAC. Do you see what I mean?

Edit: code formatting

What's the content of the DAC registers?

JW

The attached file corresponds to the state when the CPU is stopped just before the infinite loop (so after all the inits).

Those look OK to me.

At this point I have no more ideas, sorry.

[EDIT]

One more thing, though. You mentioned DMA, but the code snippets above don't use it, so I'm not quite sure how it is used, exactly.

Regardless of that, could you please, in those "manually filled" examples, insert a delay between detecting the TIM7 Update event and writing to the DAC holding register? Something like 1us. Any time-wasting delay would do; perhaps toggle the pin not just once but several times.

[/EDIT]

JW

[DMA]

No, actually, the code in this post does NOT use any DMA.

I talked about DMA because:
- in my "main" application, I use it (I would say: "of course")
- this does not improve anything (that's the main information)
- this is less flexible for tests (like inserting a delay, as you suggested)
- this is a bit more complex so harder to reproduce (I prefer to provide a simpler code)

[/DMA]

[DELAY]

I inserted a loop of NOPs, calibrated on the oscilloscope to about 1.2us (definitely more than 1us). The code is:

  while (!READ_BIT(htim7.Instance->SR, TIM_SR_UIF));
  CLEAR_BIT(htim7.Instance->SR, TIM_SR_UIF);

  HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);
  for (int i = 0; i < 10; i++) __asm__("nop");
  HAL_GPIO_TogglePin(PROBE_GPIO_Port, PROBE_Pin);

  cnt++;
  if (cnt & 0x1)
    hdac1.Instance->DHR12L1 = 0;
  else
    hdac1.Instance->DHR12L1 = 0xfff0;

Result: this does not change anything.

I also tried a loop on 100 nop's to get about 400us of delay: no more improvement.

[/DELAY]

Could you please double-check this with the design team? Could it be a silicon bug?

Edit: code formatting