ADC needs calibration after each power-up?

- Hi Sander,

This is how I solved the problem.

I am using 3 analog ports to read in one signal now. On the first I put the signal I want to read in (the signal carrying the information). On the second input I put an LT1009 voltage reference to generate an exact 2.500 VDC reference. I connected the third analog port to AVss (0.000 V). This is all the hardware needed.

With the corresponding digital values coming from the 2.500 V and 0.000 V analog channels you can calculate the gain correction factor; see page 251 of the datasheet. This factor will change with temperature and with drift of the ADC over time. Read the datasheet carefully!

Now multiply your analog value by X so that your analog input (input one), 0-2.5 V, corresponds to 0-4095.

X consists of Y and the gain correction factor. So if the gain factor is increasing, you have to decrease the Y factor to correct the error introduced by the changing gain factor. This way you correct the digital output values of your ADC.
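A minimal sketch of this two-point correction in C. The function name `correct` and the constants are illustrative, not taken from ST's calibration code; `zero_code` and `ref_code` are the raw readings from the AVss and LT1009 channels:

```c
#include <stdint.h>

#define FULL_SCALE 4095  /* ideal code for 2.500 V on a 12-bit ADC */

/* Two-point correction: map a raw reading onto the ideal 0..4095 range
   using the codes measured on the 0.000 V and 2.500 V reference channels.
   Integer math with rounding, suitable for an MCU without an FPU. */
static int32_t correct(int32_t raw, int32_t zero_code, int32_t ref_code)
{
    int32_t span = ref_code - zero_code;            /* usable codes */
    return ((raw - zero_code) * FULL_SCALE + span / 2) / span;
}
```

For example, with zero_code = 430 and ref_code = 3522 (the readings quoted in the question below), a raw reading of 3522 maps to 4095 and a raw reading of 430 maps to 0.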

You can check/calibrate every cycle or once a minute, whatever you like.

Regards,

Jimmey

I am using the ADC of the STR710 MCU at its maximum resolution. Now I have some trouble reading in analog signals with mV accuracy.

If I put 0.000 V on the input, the ADC outputs (after averaging 32 samples) the decimal value 430, and at 2.500 V it outputs 3522. So the ADC outputs 3522 - 430 = 3092 steps covering 0-2.5 V. In my opinion this is about 11.6 bits of resolution.

I read the calibration code from ST and it just multiplies the output of the ADC to scale it up to the full 12-bit range (0-4095). This does not increase the resolution of the ADC. So, is 11.6 bits the maximum resolution?

I read in the STR71xF datasheet, page 63, that the total unadjusted error after calibration is typically 2.36 LSB and at most 3.95 LSB. Does this mean the output can be off by typically 2.36 LSB? So, for example, at 2.500 V the ADC can output 3522, but also 3520 or 3524?
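To put the quoted TUE figures into voltage terms, a back-of-envelope sketch (assuming 1 LSB = 2500 mV / 4096 for a 12-bit converter with a 2.5 V reference; the helper name is hypothetical):

```c
/* Convert a total-unadjusted-error figure given in LSB into millivolts,
   assuming a 12-bit converter with a 2.5 V reference, so
   1 LSB = 2500/4096 ≈ 0.61 mV. The datasheet numbers quoted above
   are 2.36 LSB typical and 3.95 LSB max. */
static double tue_mv(double tue_lsb)
{
    return tue_lsb * 2500.0 / 4096.0;
}
```

So the typical error corresponds to roughly 1.4 mV and the worst case to roughly 2.4 mV, before any averaging.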

On page 62, note 3 says calibration is needed once after each power-up. Does this mean that the ADC outputs different values for the same input voltage after each startup?

I really do need answers to the above questions; otherwise I cannot guarantee the accuracy of the instrument I am working on.

Regards,

Jimmey