
STM32F100 ADC performance variations

Question asked by Moore.Patrick on Oct 4, 2011
I have a few hundred identical STM32F100C4-based assemblies. On 90% of them the ADC is accurate within a couple of LSBs in 12-bit mode when reading signals within the first 100 LSBs of the bottom of the range.

On roughly 10% of the units, the ADC value read is roughly 30 LSBs lower than it should be with a 100 LSB signal applied.  Vref is filtered, but equal to Vdd, which is 3.3V (from a nice quiet LDO regulator), and Vref is drawing about 100uA.  All of the units read correctly (within 2 to 5 LSBs) at mid-scale.  All the voltages to the pins are the same (within 1mV) on the "good" and "bad" units.  The signal applied to the ADC has a filter to roll off noise above about 10 kHz.  Below about 30 LSB of input, the ADC pin is sourcing roughly 700nA on all units.

We checked the ADC calibration reset and calibration process and varied the sample time on the channel in question, for a net improvement of about 7 LSB; the remaining error is still almost 5 times the error specified in the datasheet.  (Originally the calibration values were not being cleared prior to calibrating, and we changed from 1.5 Ts to 7.5 Ts; greater Ts values did not help.)

This is not keeping the application from working, but it seems wrong, as the datasheet lists 5 LSB as the worst-case ADC error.  Yes, I have read and followed AN2834 in this design.

I understand one can only expect so much from this great little 1 USD part, but I'm wondering if anyone else is seeing ADC oddities like this and, if so, whether they are also seen in the higher-end F105/7, F2xx, and F4xx parts?

Perhaps I just got a funky production lot?