The ADC calibration on an STM32F303 is making my head hurt!!
I'm using the latest HAL library (v1.9.0). I start by calling HAL_ADCEx_Calibration_Start() for each ADC I'm using, then read the factory VREFINT calibration value from 0x1FFFF7BA and sample internal channel 18 (VREFINT) to work out the actual reference for my VDD value.
One example chip has a factory calibration value of 1547, and I read 1486 on channel 18 with VDD at 3.367 V. That gives me a HUGE error when I scale the values read during normal program execution. In all cases (reading both VREFINT and external pins) I'm using 181.5 clocks/sample, as speed is not a requirement.
With VDD at 3.367 V, I sample half of it (just a couple of equal resistors across the supply for testing) and read 2054. With these calibration values that works out to 1.721 V rather than the expected 1.683 V. For the result to come out at 1.683 V, the channel-18 reading would need to be around 1524. Alternatively, if I take the 1.721 V figure and scale it by 3.3 V over my actual 3.367 V, I get 1.687 V, which is pretty close to the correct value.
I've tested with multiple samples, so I doubt it's a single duff chip, and I've tried SO many different ways of running the ADC (polled, interrupt, DMA), but the results are still way off.