One of our projects uses the STM32U575, and we rely on the ADC analog watchdog feature. At the beginning of the project we did not use ADC oversampling, but for stability and reliability reasons we have now enabled oversampling on the ADCs.
Reading the RM, the ADC analog watchdog thresholds need some adjustment when oversampling is used, but the RM is not clear about it, and the STM32 HAL / LL comments do not give the same values as the RM.
1/ In the RM rev 4.0, the note of Table 313 (chapter 33.4.30) says that the watchdog comparison is performed once oversampling, gain and offset are complete.
2/ In RM chapter 33.4.31 (about the oversampler), it is also written that the comparison is done on the oversampled accumulated value BEFORE shifting. However, the note right after (probably a copy/paste from the same chapter for ADC4) says that the oversampler shift value has an impact on the thresholds. It also says the comparison is done between ADC_DR and HT/LT, but ADC_DR is the final data result AFTER the shift is applied. That is the first clarification I need.
3/ In the STM32Cube HAL driver (file stm32u5xx_hal_adc.c), which I read for reference (we use the LL drivers for more flexibility and control), HAL_ADC_AnalogWDGConfig takes an ADC_AnalogWDGConfTypeDef structure as parameter. Looking at its definition, there is an interesting comment about the thresholds:
"Note: If ADC oversampling is enabled, ADC analog watchdog thresholds are impacted: the comparison of analog watchdog thresholds is done on oversampling intermediate computation (after ratio, before shift application): intermediate register bitfield [32:7] (26 most significant bits)."
Well, it is interesting, but where does the [32:7] come from? There is no trace of it anywhere else. Does it mean that the "oversampling intermediate computation" is left-aligned? Also, a 26-bit comparison against HT/LT values that are only 25 bits wide seems strange.
4/ After reading all the documentation probably 20 times and combing through the HAL code for any tricks, I have not been able to find how the threshold is really computed, so I started experimenting on the real board with debug access. ADC1 is configured with 10-bit resolution, and Analog Watchdog 1 monitors a channel fed with a constant DC voltage. Here are the results:
=> It seems that I need to left-shift the threshold by 4 bits, which is not what I expect from reading the RM (it is what I would expect for ADC4, but not for ADC1). However, I have not yet tested a right shift larger than the oversampling ratio compensates for (say, 8x oversampling with a right shift of 4).
These are observations on real hardware, with a constant DC signal at the ADC input, made by manipulating the ADC registers directly, so no software layer is involved: only the ADC1 CFGR2 and HT1/LT1 registers are changed.
Therefore I need help understanding how the ADC1 analog watchdog threshold computation is supposed to be done. Examples to illustrate the explanation are welcome, because the RM is not very helpful on these points (there is no description of the raw ADC value format: is it right- or left-aligned, how is it handled when the resolution is less than the full resolution, is the data right- or left-aligned in the accumulator, ...).