Temperature compensation on ADC?

Noname1937
Associate III
Posted on August 11, 2017 at 05:02

Hello, 

I am using an Olimex-E407 board to measure some AC signals. After some reading on ADCs I learned that there are a few errors that have to be taken into consideration when doing measurements, and some of them can be reduced by calibration.

Offset, gain, temperature drift, time drift, integral non-linearity (INL) and differential non-linearity (DNL) are some of the most common errors I've come across. Offset and gain errors can be calibrated with the two-point method. INL is typically 1.5 LSB according to the datasheet, but can go as high as 3 LSB, which could be a problem; however, a curve-fitting algorithm can reportedly reduce this error by 20-50% according to some articles. Apparently it is even possible to mitigate it further by using a table of calibration data (if someone knows how to implement this method, please let me know ... I checked the ADC readings and they vary between 2014 and 2048 when reading a 1.262 V signal at 3.28 V, so I wouldn't even know how to begin building such a table). DNL is negligible compared to the others, so it doesn't bother me. The one that is really bothering me is temperature drift ...
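
For the offset/gain part, this is the kind of two-point correction I have in mind (just a sketch; the reference codes are made-up numbers from a hypothetical calibration run, not my measured values):

```c
#include <stdint.h>

/* Two-point calibration: measure two known inputs once (e.g. near 10% and
 * 90% of full scale), store the ideal and the actual ADC codes, then map
 * every later reading onto the ideal transfer line.
 * The values below are placeholders. */
#define CAL_LOW_IDEAL    409     /* ideal code for the low reference  */
#define CAL_LOW_ACTUAL   414     /* code actually read at that input  */
#define CAL_HIGH_IDEAL   3686    /* ideal code for the high reference */
#define CAL_HIGH_ACTUAL  3679    /* code actually read at that input  */

static int32_t adc_correct_2pt(int32_t raw)
{
    /* gain = (ideal span) / (actual span), applied around the low point */
    int32_t corrected = CAL_LOW_IDEAL +
        ((raw - CAL_LOW_ACTUAL) * (CAL_HIGH_IDEAL - CAL_LOW_IDEAL)) /
        (CAL_HIGH_ACTUAL - CAL_LOW_ACTUAL);

    if (corrected < 0)    corrected = 0;
    if (corrected > 4095) corrected = 4095;
    return corrected;
}
```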

From what I've been reading in a few places, I could build a correction table for my device. Does this mean that I need to define several temperature ranges, perform measurements and then create a table entry for each range? Isn't it possible to compute the temperature drift with an equation? For that I'd need more information about the internals of the ADC, but the datasheet doesn't even state a temperature coefficient. How would I proceed to mitigate the temperature error?
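
For the table approach, is something like the sketch below the right idea? A per-device table measured at a few temperatures, with linear interpolation in between (all the numbers are placeholders, just to show the structure I mean):

```c
#include <stdint.h>

/* Hypothetical per-device correction table: at a few temperatures the
 * gain error of the ADC is measured against the external reference.
 * Entries must be sorted by temperature; the numbers are placeholders. */
typedef struct {
    int16_t temp_c10;      /* temperature in 0.1 degC steps            */
    int16_t gain_ppm;      /* measured gain error at that temperature  */
} temp_cal_t;

static const temp_cal_t cal_table[] = {
    { -100, -350 },   /* -10.0 C                                       */
    {  250,    0 },   /*  25.0 C (reference point)                     */
    {  600,  420 },   /*  60.0 C                                       */
};
#define CAL_POINTS  (sizeof cal_table / sizeof cal_table[0])

/* Linearly interpolate the gain error for the current temperature and
 * remove it from the raw reading. */
static int32_t adc_temp_correct(int32_t raw, int16_t temp_c10)
{
    int32_t ppm;

    if (temp_c10 <= cal_table[0].temp_c10) {
        ppm = cal_table[0].gain_ppm;
    } else if (temp_c10 >= cal_table[CAL_POINTS - 1].temp_c10) {
        ppm = cal_table[CAL_POINTS - 1].gain_ppm;
    } else {
        unsigned i = 1;
        while (cal_table[i].temp_c10 < temp_c10) i++;
        const temp_cal_t *a = &cal_table[i - 1], *b = &cal_table[i];
        ppm = a->gain_ppm +
              (int32_t)(b->gain_ppm - a->gain_ppm) *
              (temp_c10 - a->temp_c10) / (b->temp_c10 - a->temp_c10);
    }

    return raw - (raw * ppm) / 1000000;
}
```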

PS: I hacked my board to use an external Vref (a 0.1% 3 V reference from TI ... planning to change to an Analog Devices one with a better temperature coefficient).

#error #stm32f4 #adc #inl #temperature
1 REPLY
terdf
Associate

In my project, I have set up a system which displays the ADC reading of a constant voltage source. The MCU is an STM32F103C8T6. VDDA and VSSA are connected to a separate linear voltage regulator. The voltage source is stable and well filtered. When I cold-start the STM32 board it reads 2240, then slowly drops to 2200 over 2 minutes. When I restart the board while it has been running and showing 2200, it keeps showing 2200. This means the STM32 gives different ADC results when it is cold and when it is warm.

I set up the project to show the internal temperature sensor reading together with the voltage reading. When I cold-start the board it shows a voltage reading of 2240 and a temperature sensor reading of 1800. Both slowly change for about 2 minutes and stay very stable once the STM32 board warms up, but the ratio of their changes is not the same.

Frankly, I expected better ADC performance from ST. Now I need to set up a formula to compensate for the temperature changes to get reliable ADC readings.
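
A first attempt at such a formula would be a simple linear correction against the internal temperature sensor code. Just a sketch; the warm temperature code below is assumed, since I only noted the cold one:

```c
#include <stdint.h>

/* Two observed states of the same constant input (the warm temperature
 * code is a placeholder -- it was not recorded): */
#define TEMP_COLD   1800    /* internal temp-sensor code at cold start  */
#define VREAD_COLD  2240    /* ADC code of the constant source, cold    */
#define TEMP_WARM   1730    /* temp-sensor code after warm-up (assumed) */
#define VREAD_WARM  2200    /* ADC code of the constant source, warm    */

/* Assume the drift is linear in the temperature-sensor code and project
 * every reading back to the warm reference state. */
static int32_t adc_compensate(int32_t v_raw, int32_t temp_code)
{
    int32_t drift_counts = VREAD_COLD - VREAD_WARM;  /* total drift     */
    int32_t temp_span    = TEMP_COLD - TEMP_WARM;    /* over this range */

    return v_raw - (temp_code - TEMP_WARM) * drift_counts / temp_span;
}
```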

Do you think I have better options, other than using an external ADC?
