offset and linearity calibration

R_DSP
Associate II

Hello,

I'm working with the offset and linearity calibration on an STM32H755. My board will run the full calibration at production, save the resulting offset and linearity calibration constants, and then retrieve and apply them each time the processor restarts. I use the HAL and LL calls to read the constants for saving to NVRAM, and then to recall and write them back at startup. To confirm these were in fact being installed in the ADC at startup, I ran a test where all offset and linearity calibration constants were intentionally set to 0. I would have assumed that setting the offsets and linearity constants to 0 would always result in ADC readings of 0, yet I am consistently seeing non-zero values.
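For reference, here is a minimal sketch of the save/restore flow I'm describing. It assumes the STM32H7 HAL's calibration accessors (HAL_ADCEx_Calibration_GetValue/SetValue for the offset factor, HAL_ADCEx_LinearCalibration_GetValue/SetValue for the linearity words); nvram_write/nvram_read are placeholders for my storage layer, and exact names may differ between HAL versions:

#include <stddef.h>
#include "stm32h7xx_hal.h"

extern ADC_HandleTypeDef hadc1;                        /* ADC handle, initialized elsewhere */
extern void nvram_write(const void *data, size_t len); /* placeholder storage helpers */
extern void nvram_read(void *data, size_t len);

typedef struct {
    uint32_t offset;                               /* single-ended offset calibration factor */
    uint32_t linear[ADC_LINEAR_CALIB_REG_COUNT];   /* linearity calibration words (6 on the H7) */
} adc_cal_t;

/* At production: run the full calibration, then capture and store the factors. */
void adc_cal_save(void)
{
    adc_cal_t cal;

    HAL_ADCEx_Calibration_Start(&hadc1, ADC_CALIB_OFFSET_LINEARITY, ADC_SINGLE_ENDED);

    cal.offset = HAL_ADCEx_Calibration_GetValue(&hadc1, ADC_SINGLE_ENDED);
    HAL_ADCEx_LinearCalibration_GetValue(&hadc1, cal.linear);

    nvram_write(&cal, sizeof(cal));
}

/* At every startup: restore the stored factors instead of re-calibrating. */
void adc_cal_restore(void)
{
    adc_cal_t cal;

    nvram_read(&cal, sizeof(cal));

    HAL_ADCEx_Calibration_SetValue(&hadc1, ADC_SINGLE_ENDED, cal.offset);
    HAL_ADCEx_LinearCalibration_SetValue(&hadc1, cal.linear);
}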

Have I misunderstood something? I don't see much explanation of how the calibration constants are actually applied in any of the reference manuals or documentation I've read so far.

Please advise,

Robert

2 REPLIES
uri_fridman_23
Associate

If the ADC input pin is floating, you will get noise, and that will show up in the ADC reading.

uri_fridman_23
Associate

If the pin is not connected to a reference voltage, you will get noise at the input.