
Cortex-M3 STM32 ADC low threshold

jim239955_stm1_st
Associate II
Posted on November 03, 2009 at 06:13

Cortex-M3 STM32 ADC low threshold

16 REPLIES
johnfitzgerald9
Associate II
Posted on May 17, 2011 at 13:27

There's lots of guidance in

http://www.st.com/stonline/products/literature/an/15067.pdf

which explains all about noise / pcb layout etc.

Also have a look at,

http://www.micromouseonline.com/blog/2009/05/25/simple-adc-use-on-the-stm32

which has a clearly explained, simple routine.

Good luck. Please post if you reach some resolution (see that pun?).

jim239955_stm1_st
Associate II
Posted on May 17, 2011 at 13:27

I looked at

http://www.st.com/stonline/products/literature/an/15067.pdf

and it is the ADC accuracy application note, which focuses mainly on hardware issues. Since one version of the code seems to work OK, I am focusing on software for now.

At this point the 2 mV transition only works when using the HSE clock. The ST Micro libraries used by both Rowley and IAR have great support for the HSE clock but not the HSI, other than to say it is the default after reset. I am attempting to switch the working IAR/Rowley software from the HSE to the HSI clock. But so far the software using the HSI clock has only supported transitions at 80 mV.

I will definitely post the solution when I have confidence the problem is isolated.

And thanks again.


jim239955_stm1_st
Associate II
Posted on May 17, 2011 at 13:27

I have all versions of the software working under the Rowley toolset. I have verified that when using the Rowley toolset, I need a time delay after turning on the ADC CR2 ADON bit before doing the calibration.

Below is a list of the number of iterations I spend in the busy loop and the ADC reading I get. Note that these readings will vary 1 or 2 either way but seem to be consistent after the calibration. That is, with a 3,000 iteration busy loop, subsequent readings will be between 0x39 and 0x3D.

10,000 iterations: 0x43

5,000 iterations: 0x3D

3,000 iterations: 0x3B

2,000 iterations: 0x3B

1,000 iterations: 0x2C

500 iterations: 0x0D

The 0x43 reading corresponds to a 0 to 1 count transition at about 2 mV. The 0x0D corresponds to about 86 mV. These numbers are at the board connector. For the voltage at the STM32F103 ADC pin, divide by about 2.5.

I draw three conclusions:

- The IAR toolset masks the problem.

- The IAR and Rowley toolsets differ somehow.

- Something needs to happen after asserting CR2 ADON before calibration.

I have not found any bit to poll to know when the calibration may be started. Still searching.

johnfitzgerald9
Associate II
Posted on May 17, 2011 at 13:27

Have a look at other forum entries ...

62.193.238.133/forums-cat-8198-23.html

and

62.193.238.133/forums-cat-6845-23.html

seems to back up what you're finding.

The IAR and Rowley (GCC-based) toolsets use different compilers. Also check the optimisation settings, which may affect how fast the program runs.

johnfitzgerald9
Associate II
Posted on May 17, 2011 at 13:27

There is a warning in the Reference Manual RM0008: ''Note: If RSTCAL is set when conversion is ongoing, additional cycles are required to clear the calibration registers.'' It is not stated whether RSTCAL will remain set until the calibration registers are cleared.

I had wondered about ''tstab'' stabilisation time but that's spec'd at 1 microsecond. Also when is the ADC clock first applied before you try to use the ADC? (In case some internal state machine has to clock itself through to some state).

The 200 microsecond delay you seem to need makes me think a conversion may be completing while you are waiting.

I agree that testing CAL and RSTCAL bits appears to control the process. Could you just repeat the reset calibration / calibration process to avoid having a delay loop? Does that fix the problem?

Is your VRef stable and settled before you attempt calibration?

Beyond this, I have no ideas. Sayonara.

jim239955_stm1_st
Associate II
Posted on May 17, 2011 at 13:27

Hi,

The compiler difference seems the likely cause. However, I do not find this delay in the sample software or in the documentation. Did I miss that? Also, checking both the RSTCAL and CAL bits during calibration gave me the impression that I could not overrun the hardware. But it seems I can. Is there a spec anywhere on how long I must wait after turning on the ADON bit before starting calibration? My guess is my loop takes about 200 usec.

I looked at the two links you provided. They seem to discuss DMA engine problems where the data is occasionally DMA'd to the wrong memory location. Our initial tests over 12,500 readings do not seem to exhibit any problem like this. All of the data seems reasonably consistent.

Once again, thanks for the help.