I am using ADC1 to sample A0 as a voltage input, plus channels 17 and 18 (temperature sensor and VrefInt) for temperature and exact Vdda calculation. So far my implementation works fine. I also calibrate the ADC every time before starting it. Data handling is done via DMA in circular mode, so conversion runs continuously.
I also want to use the analog watchdog to check whether the voltage on A0 is inside a specified range (0.8 V – 3 V). Sadly, when I program the lower and upper threshold values for the watchdog and test it with my laboratory power supply, the watchdog already triggers from about 2.7 V instead of 3 V. When I compare the converted values I see some overshoots in the readings. Since I am averaging my calculated values this does not bother me there, but for the accuracy of the watchdog it is a problem.
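For reference, the watchdog setup follows the usual HAL pattern. A minimal sketch, assuming a HAL project where `hadc1` is already initialized; the exact struct fields and macro names vary slightly between STM32 families, so treat this as illustrative:

```c
/* Sketch: guard one regular channel (A0) with the analog watchdog.
   Thresholds are raw 12-bit codes; hadc1 is assumed initialized elsewhere. */
ADC_AnalogWDGConfTypeDef awd = {0};
awd.WatchdogMode  = ADC_ANALOGWATCHDOG_SINGLE_REG; /* watch a single regular channel */
awd.Channel       = ADC_CHANNEL_0;                 /* A0 */
awd.ITMode        = ENABLE;                        /* interrupt when outside the window */
awd.HighThreshold = 3722;                          /* ~3.0 V at Vdda = 3.3 V */
awd.LowThreshold  = 992;                           /* ~0.8 V at Vdda = 3.3 V */
HAL_ADC_AnalogWDGConfig(&hadc1, &awd);
```

The out-of-window event then arrives in the `HAL_ADC_LevelOutOfWindowCallback` callback.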
Has anybody seen this kind of behaviour before? Am I doing something wrong, or did I forget something? Or is this inherent, because the integrated ADC is simply not very accurate?
For the thresholds I used the following formula: HT = 3 V * 4095 / 3.3 V and LT = 0.8 V * 4095 / 3.3 V.
I would be glad for any help.