STM32F3 Discovery - ADC Offset

mattfox
Associate II
Posted on July 10, 2013 at 23:04

I'm working on getting the ADC fully integrated with the rest of my code.  When I test the ADC, however, I keep getting an output of about 300-400 from ADC1->DR when I apply 0 V to the pin.  I have six pins configured for the ADC, and all of them exhibit this behavior.  Furthermore, if I put in a voltage of 2.9 V or higher, I max out at 4095.  Does anyone know why voltages between 2.9 and 3.3 V don't register a difference in output?

One of my speculated reasons for this behavior is that I have stopped waiting for the hardware to tell me that it is done with calibration (for reasons I stated in another thread recently).  I am wondering if this could be a product of incomplete calibration.  I'm not sure what really happens during ADC calibration, and if someone knows, I'd love to have a quick description.

One other question I have is whether this offset is consistent across devices.  I'm currently working with just one Discovery board, but I'm planning on putting this chip on a new board, and I'd like to know whether I can expect this sort of offset everywhere, or whether it's a quirk of just this chip.

Thanks so much for any help you can provide!  I really appreciate it.
5 REPLIES
mattfox
Associate II
Posted on July 10, 2013 at 23:09

And of course, as soon as I post something after doing my research, I find the answer I was looking for.  I found this in the data sheet for anyone else who has a similar question:

Calibration is preliminary to any ADC operation. It removes the offset error which may vary from chip to chip due to process or bandgap variation.

I'm not sure if the maxing out at the top is also a product of not waiting for calibration, so if anyone has information on that, it would be great.

frankmeyer9
Associate II
Posted on July 11, 2013 at 12:45

Calibration is preliminary to any ADC operation.

 

Correct so far, but I think your problem is a different one.

To read zero from an ADC channel, you really need to apply 0 V, i.e. short the input to ground.

 Does anyone know why the values between 2.9 and 3.3 don't register a difference in output?

 

Just measure the supply voltage of the MCU, which happens to be Vdda, too. It is not 3.3V, but only about 3.0V, because of the Schottky diode between the 3.3V regulator and the Vdd/Vdda pins of the MCU. So the ADC value saturates once the input exceeds Vdda.

mattfox
Associate II
Posted on July 11, 2013 at 15:27

Yeah, I was definitely supplying 0 V and had the input shorted to ground.  I'm going to keep working on getting this calibration going to see if that fixes my problem.

mattfox
Associate II
Posted on July 11, 2013 at 21:34

Update:  I now have it calibrating fully.  However, I cannot get the offset to disappear.  Furthermore, the limit at the top is now around 2.75 V.  This is using the example code provided by ST directly.  Does anyone know why this might be?

As a side note, I'm working in Eclipse with GCC.  I downloaded a trial version of IAR to see if that would work, and it does output 0 when the input is 0.  It also maxes out around 2.95 V, which is much better.  However, when I use the exact same code that gives this good behavior in IAR, but in Eclipse, I get a lower output limit of around 350-400 and an upper voltage limit of 2.75 V.  Any ideas?

mattfox
Associate II
Posted on July 13, 2013 at 05:04

Final Update:  I figured out what was wrong.  I had been using the USART to debug my code, sending the converted value back over a serial line.  In IAR, however, I hadn't configured the USART, so I was debugging by setting a pin high when the value reached a certain threshold.  It turns out the USART was causing the weird behavior.

I realized that if I detached the RX line on the Discovery board, I would start getting good values.  This made me wonder why driving the RX pin could change something with the ADC, so I went through attaching the RX line to every pin on the board.  There was a one-to-one correlation between the pins that produced this offset and the pins that were channels for ADC1.  I then tried configuring those pins as outputs and found that this did not create an offset.  I could even drive another of these error-prone pins from the one I set as an output without creating the offset.  That made me wonder why driving the pin with the computer's TX line, specifically, would cause the offset.

I'm using a breakout board for the USART communication.  This board is supposed to have 3.3 V as its default output level.  However, when I tested the voltage, it turned out to be 5 V.  The board has an option to break a solder bridge and solder a new one to switch to 3.3 V.  Once I did that and connected everything back together, I had no offset whatsoever.

It turned out to be a really stupid mistake that was very difficult to catch.  If anyone runs across this error in the future, make sure you aren't overdriving any pins.  Thanks to everyone for the help!