STM32H563 ADC Oddity

tw1
Associate II

Hello,

I'm working on a program that sends data from the ADC to the computer. My problem is that the data from my STM32H563ZIT6 is bad. I first noticed this in the data arriving at the computer and confirmed it by outputting the ADC data through the DAC. Attached is the DAC output for a 50 kHz sine wave input.

(Attached image tw1_0-1744079456321.png: DAC output on the H563 for the 50 kHz sine input, showing the stair-stepping.)

I am not sure what is causing this stair-stepping pattern. For reference, the ADC clock is 75 MHz, the sample time is 6.5 cycles, and conversions are triggered by a 1 MHz timer.
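As a quick sanity check on the timing (my numbers, not from the attached project, and assuming a 12-bit conversion adds roughly 12.5 ADC clock cycles on top of the sample time; the exact figure should be confirmed against the H563 datasheet):

#include <stdio.h>

int main(void)
{
    /* Rough ADC timing check for the configuration described above. */
    const double adc_clk_hz   = 75e6;         /* ADC clock */
    const double cycles_total = 6.5 + 12.5;   /* sample time + approx. conversion time */
    const double t_conv_us    = cycles_total / adc_clk_hz * 1e6;
    const double t_trig_us    = 1.0;          /* 1 MHz timer trigger period */

    printf("conversion ~%.3f us, trigger period %.3f us\n", t_conv_us, t_trig_us);
    /* ~0.25 us per conversion, well under 1 us, so the trigger rate itself looks feasible. */
    return 0;
}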

I did notice that the data in the buffer seems to be mostly shifted left by 4 bits (though it never goes above 4095); in other words, the 4 LSBs appear to be missing.

(Attached image tw1_2-1744079710265.png: adc_buff contents as seen in the debugger.)
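One quick check (a hypothetical helper, not part of the attached project) is to scan the buffer and see whether the low 4 bits of every sample really are zero; if they are, the data is genuinely left-shifted, for example by an oversampling/left-shift setting, rather than just noisy:

#include <stdint.h>

/* Returns 1 if every sample in buf has its 4 LSBs clear. */
static int lsb4_always_zero(const uint16_t *buf, unsigned len)
{
    for (unsigned i = 0; i < len; i++)
    {
        if ((buf[i] & 0x000Fu) != 0u)
        {
            return 0;   /* at least one sample uses the low 4 bits */
        }
    }
    return 1;
}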

I ran the same program on an STM32F303RET6 and got the expected output from the DAC.

(Attached image tw1_3-1744079851314.png: DAC output on the F303 for the same input.)


What am I missing that could be the cause of this oddity? Attached is the H563 program which exhibits this behavior.

Thank you!

3 Replies
mƎALLEm
ST Employee

Hello,

You said: "confirmed this by outputting the ADC data through the DAC"

Did you check the output of the DAC before converting it with the ADC? Did you test with another source instead of the DAC?

AScha.3
Super User

Calibration of ADC done?
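For example (sketch only; hadc1 and Error_Handler come from the generated project, and the exact signature of the calibration call differs between STM32 families, so check the H5 HAL headers):

/* Run the ADC self-calibration once, before the first HAL_ADC_Start_DMA(). */
if (HAL_ADCEx_Calibration_Start(&hadc1, ADC_SINGLE_ENDED) != HAL_OK)
{
    Error_Handler();
}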

TDK
Super User

The update rate is 1 MHz, so steps occur every 1 µs. That seems consistent. If you want a lower bandwidth, use a low-pass filter on the output. It looks like an error in some of the calculations is causing the spike near the top of the wave; it's repeatable across periods, so it's probably a bad value.
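For the output filter, a first-order RC is usually enough; the cutoff is f_c = 1 / (2 * pi * R * C). Example values only (my own pick, not a recommendation for this exact board):

#include <stdio.h>

int main(void)
{
    /* First-order RC low-pass on the DAC output. */
    const double pi    = 3.14159265358979;
    const double r_ohm = 1000.0;   /* 1 kOhm */
    const double c_f   = 1e-9;     /* 1 nF   */
    const double f_c   = 1.0 / (2.0 * pi * r_ohm * c_f);

    /* ~159 kHz: passes the 50 kHz signal, attenuates the 1 MHz update steps. */
    printf("cutoff ~ %.0f Hz\n", f_c);
    return 0;
}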

 

The adc_buff values seem off. Lots of duplicates. Probably a bug in the code there.

 

Not sure I believe these are the same program. The amplitude is 3 V in the H5 example and 50 mV in the F3 example.

 

> HAL_ADC_Start_DMA(&hadc1, adc_buff, BUFF_SIZE);
> HAL_DAC_Start_DMA(&hdac1, DAC_CHANNEL_2, adc_buff, BUFF_SIZE, DAC_ALIGN_12B_R);

Using the same buffer for the ADC and the DAC? That doesn't seem like a good idea; interference and noise are going to dominate after a few cycles.
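A sketch of one way to decouple them (dac_buff and the BUFF_SIZE value are mine, not from the attached project, and this assumes BUFF_SIZE is even): let the ADC DMA fill adc_buff and copy each finished half into a separate buffer for the DAC from the standard HAL half/full-transfer callbacks:

#include "main.h"      /* pulls in the HAL as in a CubeMX project */
#include <string.h>
#include <stdint.h>

#define BUFF_SIZE 256u                 /* placeholder; use the project's real size */

static uint16_t adc_buff[BUFF_SIZE];   /* filled by the ADC DMA */
static uint16_t dac_buff[BUFF_SIZE];   /* read by the DAC DMA */

/* First half of adc_buff is complete: hand it to the DAC buffer. */
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef *hadc)
{
    memcpy(&dac_buff[0], &adc_buff[0], (BUFF_SIZE / 2u) * sizeof(adc_buff[0]));
}

/* Second half of adc_buff is complete. */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    memcpy(&dac_buff[BUFF_SIZE / 2u], &adc_buff[BUFF_SIZE / 2u],
           (BUFF_SIZE / 2u) * sizeof(adc_buff[0]));
}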

 

I don't see initialization of adc_buff done anywhere, so it's zero initially. How is this code even giving you a sine wave at all?

 

adc_buff is uint32_t in the code and uint16_t in your screenshot. Clearly not getting a consistent story here.
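If the buffer is meant to hold 16-bit samples, one consistent way to write it (sketch only; assumes the DMA is configured for half-word transfers) is to declare it as uint16_t and cast only at the HAL call:

static uint16_t adc_buff[BUFF_SIZE];   /* 12-bit samples, one per half-word */

/* HAL_ADC_Start_DMA() takes a uint32_t pointer, so cast at the call site.
 * The DAC would then be started from a separate buffer, per the point above. */
HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buff, BUFF_SIZE);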

 

Edit: this post is also two months old. Probably OP has solved it or moved on.