
STM32F373 SDADC linearity drops off at low input

Question asked by haselwood.don on Mar 20, 2016
Latest reply on Mar 22, 2016 by haselwood.don
With a 'F373 I find that the counts-per-volt of the SDADC drops off rather dramatically when the input voltage decreases below about 40% of full scale.  I would like to understand the mechanism behind this before locking in the rest of the design and a board layout.

Before laying out a PC board, a test setup was made using a DiscoveryF3 board with the 'F303 processor removed and replaced with a 'F373, along with an added 8 MHz crystal, several jumpers, and capacitors for Vrefsd+.

SDADC3 is set up to scan 9 ports: single-ended, DMA, 1.8 V internal reference.
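For reference, the setup amounts to something like the register-level sketch below. It is written against RM0313 and the stm32f37x.h CMSIS names; the PWR bit name, the zero-reference single-ended mode, and the injected-group scan are my assumptions about the configuration, so treat this as a sketch rather than the actual code:

#include "stm32f37x.h"

/* Sketch: SDADC3 scanning 9 channels as an injected group, single-ended,
 * DMA, internal 1.8 V reference.  Bit names per RM0313; verify against
 * your CMSIS header before relying on this. */
static void sdadc3_setup(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_PWREN;        /* PWR block clock */
    PWR->CR      |= PWR_CR_ENSD3;             /* SDADC3 analog power (bit name assumed) */
    RCC->APB2ENR |= RCC_APB2ENR_SDADC3EN;     /* SDADC3 peripheral clock */

    /* Select the internal 1.8 V reference (REFV = 10); the selection is
     * shared between the SDADCs per RM0313.  Allow ~2 ms to stabilize. */
    SDADC3->CR1 = SDADC_CR1_REFV_1;

    SDADC3->CR1 |= SDADC_CR1_INIT;            /* enter init mode to write configs */
    while (!(SDADC3->ISR & SDADC_ISR_INITRDY)) {}

    SDADC3->CONF0R   = SDADC_CONF0R_SE0_1;    /* conf 0: single-ended zero-reference */
    SDADC3->CONFCHR1 = 0;                     /* channels 0..7 -> conf 0 */
    SDADC3->CONFCHR2 = 0;                     /* channel 8     -> conf 0 */
    SDADC3->JCHGR    = 0x1FF;                 /* injected group: channels 0..8 */
    SDADC3->CR1     |= SDADC_CR1_JDMAEN;      /* DMA request on injected conversions */

    SDADC3->CR1 &= ~SDADC_CR1_INIT;           /* leave init mode */
    SDADC3->CR2 |= SDADC_CR2_ADON;            /* power up the SDADC */

    /* ...run calibration (SDADC_CR2_STARTCALIB, wait for EOCALF), point a
     * DMA channel at SDADC3->JDATAR, then start continuous injected scanning: */
    SDADC3->CR2 |= SDADC_CR2_JCONT | SDADC_CR2_JSWSTART;
}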

The input to one port: a 13.8 V regulated bench supply into a 500 ohm pot.  The wiper of the pot drives a resistive divider, e.g. 6.8K|1.8K, with a 100u cap across the 1.8K (which is similar to the application).  A 900 ohm series resistor runs from the junction of the divider to the SDADC port pin.
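As a sanity check on the numbers below: with a 1.8 V reference and a 16-bit result, the ideal transfer works out to 65535 / 1.8 ≈ 36408 counts per volt, which is about what the table approaches at the high end.  A quick sketch of that arithmetic (the zero-reference single-ended mapping of 0..Vref onto 0..65535 is my assumption, and the wiper voltage is a made-up example):

#include <stdio.h>

int main(void)
{
    const double vref = 1.8;                       /* internal reference, volts */
    const double counts_per_volt = 65535.0 / vref; /* ~36408 counts/V ideal */

    /* Divider example from above: 6.8K over 1.8K from the pot wiper */
    const double r_top = 6800.0, r_bot = 1800.0;
    const double v_wiper = 2.0;                    /* hypothetical wiper voltage */
    const double v_junction = v_wiper * r_bot / (r_top + r_bot);

    printf("ideal counts/V = %.1f\n", counts_per_volt);
    printf("junction = %.3f V -> ideal reading = %.0f counts\n",
           v_junction, v_junction * counts_per_volt);
    return 0;
}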

The following table is an example of the SDADC counts-per-volt changing versus the input level.  The first column is the voltage at the junction of the resistive divider; the second column is the reading after filtering with two three-section CIC filters; and the third column is the SDADC counts per volt.  The total change is roughly 23%, which seems rather high.

Vin (V)   Filtered reading   Counts/V
0.015     445                29666.7
0.045     1540               34222.2
0.112     3973               35473.2
0.164     5878               35841.5
0.241     8703               36112.0
0.308     11148              36194.8
0.423     15321              36219.9
0.563     20426              36280.6
0.815     29616              36338.7
0.984     35804              36386.2
1.263     45984              36408.6
1.560     56847              36440.4
1.753     63910              36457.5
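For anyone reproducing the second column: a minimal sketch of one of the three-section CIC (sinc^3) decimators mentioned above is below.  The decimation ratio of 64 is an assumed value for illustration, not the actual ratio used; two of these in cascade give the sinc^6 response described.

#include <stdint.h>

/* One three-section CIC (sinc^3) decimator: three integrators at the input
 * rate, decimation by R, three combs at the output rate.  DC gain is R^3,
 * so a 16-bit input with R = 64 needs 16 + 3*log2(64) = 34 bits; int64_t
 * is comfortable.  Scale the output by 1/R^3 to recover input units. */
#define CIC_R 64

typedef struct {
    int64_t i1, i2, i3;          /* integrator accumulators */
    int64_t c1, c2, c3;          /* comb delay elements */
    int     n;                   /* input samples since last output */
} cic3_t;

/* Feed one raw SDADC sample; returns 1 and writes *out once per R inputs. */
static int cic3_step(cic3_t *f, int32_t x, int64_t *out)
{
    f->i1 += x;                  /* three cascaded integrators */
    f->i2 += f->i1;
    f->i3 += f->i2;

    if (++f->n < CIC_R)
        return 0;
    f->n = 0;

    int64_t y = f->i3;           /* three cascaded combs (delay = 1 output) */
    int64_t d1 = y - f->c1;      f->c1 = y;
    int64_t d2 = d1 - f->c2;     f->c2 = d1;
    int64_t d3 = d2 - f->c3;     f->c3 = d2;

    *out = d3;
    return 1;
}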

Running this test with 75K|12K|10u and 604K|100K|1u dividers gives virtually identical results.  Normalized, the curves match quite well, which suggests this is not a source-impedance issue.

The application is a battery management monitoring system, and since the voltage is expected to stay within a 2:1 range, the non-linearity in the lower half of the SDADC range is not as important; the change in the upper half, though, still cuts into the total tolerances allowable in the design.

Applying a correction computation, such as a least-squares polynomial fit or a table with interpolation, is a possibility; however, how this non-linearity changes with temperature and time is unknown at the moment.
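For what it's worth, the table-with-interpolation option is cheap at runtime.  A sketch using the measured points from the table above as the calibration data (piecewise-linear, filtered counts in, volts out):

/* Piecewise-linear correction: map a filtered SDADC reading back to volts
 * using the measured (reading, voltage) pairs from the table above.
 * Readings outside the table extrapolate along the end segments. */
static const double cal_counts[] = {   445,  1540,  3973,  5878,  8703, 11148,
                                     15321, 20426, 29616, 35804, 45984, 56847, 63910 };
static const double cal_volts[]  = { 0.015, 0.045, 0.112, 0.164, 0.241, 0.308,
                                     0.423, 0.563, 0.815, 0.984, 1.263, 1.560, 1.753 };
#define CAL_N (sizeof cal_counts / sizeof cal_counts[0])

double counts_to_volts(double c)
{
    unsigned i;
    for (i = 1; i < CAL_N - 1; i++)     /* find the segment containing c */
        if (c < cal_counts[i])
            break;
    double f = (c - cal_counts[i - 1]) / (cal_counts[i] - cal_counts[i - 1]);
    return cal_volts[i - 1] + f * (cal_volts[i] - cal_volts[i - 1]);
}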

Another issue that is a bit puzzling is the effect of the series resistor.  Using the 6.8K|1.8K|100u divider and varying the series resistor changes the readings, of course, but the curve shows a peak around 900 ohms.  One would expect the readings to increase as the series resistance drops from, say, 100K toward zero, with the rate of change diminishing as the resistance nears zero; reaching a peak and then dropping off was a surprise.
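To put that expectation in concrete terms: if the SDADC input behaved like a fixed equivalent resistance Rin to ground, the reading would scale as Rin/(Rs + Rin), which is monotonic in the series resistance Rs.  The 50K figure below is an arbitrary assumed value; the point is only that a peak at 900 ohms does not fit this simple model:

#include <stdio.h>

/* Naive source-impedance model: treat the SDADC input as an equivalent
 * resistance Rin to ground, so reading ~ Rin / (Rs + Rin).  Rin is an
 * assumed value; note the curve is monotonic in Rs, with no peak. */
int main(void)
{
    const double rin = 50e3;                  /* assumed equivalent input R */
    const double rs[] = { 0, 100, 900, 10e3, 100e3 };
    for (unsigned i = 0; i < sizeof rs / sizeof rs[0]; i++)
        printf("Rs = %8.0f ohm -> relative reading %.4f\n",
               rs[i], rin / (rs[i] + rin));
    return 0;
}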

Also, I noticed the noise on the raw readings is worse when the capacitor goes from the port pin to ground rather than being on the resistive-divider side of the series resistor.

The noise also rides on top of a low-level waveform (roughly 5-20 ADC counts) that repeats at 4 and at 64 SDADC clock intervals.  I couldn't find anything that correlates with these rates, and the CIC filtering takes it out.

Early in the testing I discovered that switching an LED on the Discovery board had a large effect (150-200 ADC counts) on the readings.  The LEDs are driven by port pins associated with the SDADC and share the same power pin, so clearly one should not be switching any port pins associated with the SDADC.  Eliminating the LED switching removed the gross noise.

The DiscoveryF3 board layout is probably not optimal for minimizing noise, but the only thing running besides the SDADC is USART1 on port A, and given the heavy filtering the serial port's effect should be small.  The bigger issue, however, is the dependence of the calibration on input voltage.

Before locking in a design based on the 'F373 SDADC, I need a better understanding of the underlying mechanism that causes the SDADC counts-per-volt to change with input level.

Any insights, or directions to documentation would be appreciated.
