STM32G071RBT6 > Vbat monitoring characteristics > ts_vbat min = 12us (fastest sampling time).
2025-03-31 8:17 AM
Hi,
I'm using an STM32G071RBT6 microcontroller and I need the fastest possible ADC sampling time. I'm sampling 12 ADC channels, one of which is Vbat.
According to the microcontroller datasheet, section 5.3.22 "Vbat monitoring characteristics", the minimum ADC sampling time when reading VBAT is ts_vbat = 12 µs. This is a limitation for my application and I would like to reduce this time.
My questions are:
1. What is the technical reason for this minimum ts_vbat = 12 µs, while others like "ts_temp" or "ts_vrefint" have much lower values?
2. Is there any workaround to reduce this time?
3. What would be the impact on the "Error on Q" if we configured ts_vbat = 10 µs in our application?
Thanks in advance,
Marco.
Solved! Go to Solution.
Labels: ADC, Documentation, STM32G0 Series
2025-03-31 10:21 AM
> 1. What is the technical reason for this minimum ts_vbat = 12us, while others like "ts_temp" or "ts_vrefint" have much lower values?
VBAT is measured through a resistor divider (possibly followed by some amplifier), and this source has a higher impedance than the outputs of the temperature sensor and the VREFINT source.
> 2. Are there any workaround to reduce this time?
No.
> 3. Which could be the impact in the "Error on Q" if we configure in our application the ts_vbat = 10us?
I don't know what "Error on Q" is.
Failing to observe an adequate sampling time for the signal source impedance results in the ADC readout being influenced to some degree by the previous conversion (i.e. the previous channel's voltage), but the relationship is neither straightforward nor documented.
The 'G0 ADC allows two different sampling times to be used and selected individually for each channel; see the description of the ADC_SMPR register.
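A minimal sketch of that per-channel setup, assuming the STM32CubeG0 HAL; the channels, ranks and cycle counts below are only illustrative, and hadc1 is assumed to be defined elsewhere in your project:

```c
/* Sketch only: assumes the STM32CubeG0 HAL and an hadc1 handle set up
   elsewhere (e.g. by CubeMX). Channels, ranks and cycle counts are
   illustrative, not a recommendation. */
#include "stm32g0xx_hal.h"

extern ADC_HandleTypeDef hadc1;

void configure_sampling_times(void)
{
    ADC_ChannelConfTypeDef sConfig = {0};

    /* The G0 ADC provides two common sampling times (SMP1/SMP2 in ADC_SMPR);
       each channel then picks one of them via its SMPSEL bit. */
    hadc1.Init.SamplingTimeCommon1 = ADC_SAMPLETIME_7CYCLES_5;   /* fast, for low-impedance sources */
    hadc1.Init.SamplingTimeCommon2 = ADC_SAMPLETIME_160CYCLES_5; /* slow, e.g. for the VBAT divider */
    HAL_ADC_Init(&hadc1);

    /* A fast external input uses the short common sampling time */
    sConfig.Channel      = ADC_CHANNEL_0;
    sConfig.Rank         = ADC_REGULAR_RANK_1;   /* ranks assume the fully configurable sequencer */
    sConfig.SamplingTime = ADC_SAMPLINGTIME_COMMON_1;
    HAL_ADC_ConfigChannel(&hadc1, &sConfig);

    /* VBAT uses the long common sampling time (must satisfy ts_vbat from the datasheet) */
    sConfig.Channel      = ADC_CHANNEL_VBAT;
    sConfig.Rank         = ADC_REGULAR_RANK_2;
    sConfig.SamplingTime = ADC_SAMPLINGTIME_COMMON_2;
    HAL_ADC_ConfigChannel(&hadc1, &sConfig);
}
```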
JW
2025-04-01 1:46 PM
Thanks, JW! Very helpful response.
"Error on Q" is the error in [%] on "Ratio on Vbat measurement" described in the microcontroller electrical characteristics:
I have one question more:
4. Is there any practical formula to calculate which is the minimum sampling rate (ts) for a given analogic input? For instance, in an ADC PT1000 sensor circuitry with next data: Vin=3,3V; Rain=10Kohms; Cain=1nF; Cadc = 5pF; fADC=8Mhz; 12 bit resolution; Accuracy = +/-3LSB. Where:
Radc is not specified in the STM32G071RBT6 datasheets. So, may I guess it's a negligible value.
From my point of view, in a PT1000 sensor with constant Vin, the time constant (tc) could be rounded to tc = (Rain * Cadc) = 50ns. So, setting an ADC time sample around 500ns-600ns for +/-3LSB should be a good approach. Am I missing something??
Thanks in advance,
Marco.
2025-04-03 12:03 PM - edited 2025-04-03 12:14 PM
Cain sounds like you have connected an external capacitor, like the one shown in that figure as Cparasitic.
In that case, at the moment sampling begins, Cadc is at an unknown, presumably worst-case voltage; let's assume it is 0 V and that the voltage at Cain (let's call it Vain) is at its maximum, i.e. VREF+. The two capacitors then form a capacitive divider, so the voltage at Cain drops to Vain * Cain/(Cain+Cadc), i.e. with your values it drops by some 0.5 %. Assuming Vain = VREF+, one LSB of a 12-bit ADC is approx. 0.0244 %, so the drop is approx. 20 LSB.
This drop is then recharged from the external voltage source through Rain into both capacitors in parallel, of which Cain prevails, so the charging has a time constant tau = 10 kOhm * 1 nF = 10 µs; if I am not mistaken, it takes about two tau to get from 20 LSB down to 3 LSB, so that's your minimum sampling time.
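Just to put numbers on this reasoning, here is a small stand-alone sketch of the same arithmetic (not an official formula, only the charge-sharing and RC-settling estimate above, with the values quoted in this thread):

```c
/* Back-of-the-envelope check of the reasoning above:
   Rain = 10 kOhm, Cain = 1 nF, Cadc = 5 pF, 12-bit ADC, +/-3 LSB target.
   Compile with: cc settle.c -lm */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double Rain = 10e3;    /* external source resistance [Ohm] */
    const double Cain = 1e-9;    /* external capacitor [F] */
    const double Cadc = 5e-12;   /* ADC sampling capacitor [F] */
    const double lsb  = 1.0 / 4096.0;   /* 12-bit LSB as a fraction of full scale */
    const double target_lsb = 3.0;      /* allowed settling error */

    /* Charge sharing when the sampling switch closes: Cain and Cadc form a divider */
    double drop_frac = Cadc / (Cain + Cadc);    /* ~0.5 % of Vain */
    double drop_lsb  = drop_frac / lsb;         /* ~20 LSB */

    /* The drop is recharged through Rain into Cain || Cadc (dominated by Cain) */
    double tau    = Rain * (Cain + Cadc);        /* ~10 us */
    double n_tau  = log(drop_lsb / target_lsb);  /* time constants to settle to 3 LSB */
    double ts_min = n_tau * tau;                 /* minimum sampling time, ~20 us */

    printf("drop: %.2f %% = %.1f LSB\n", drop_frac * 100.0, drop_lsb);
    printf("tau = %.2f us, need %.2f tau -> ts_min = %.1f us\n",
           tau * 1e6, n_tau, ts_min * 1e6);
    return 0;
}
```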
Using a relatively big capacitor at the input works only if it is big enough that the drop is negligible, and if the period *between* samplings is long enough that the capacitor gets recharged from the external high-impedance source. If this won't suit your purposes, you'd have to resort to buffering.
See also AN2834, especially ch.4.4.
JW
