
STM32G071RBT6 > Vbat monitoring characteristics > ts_vbat min = 12us (fastest sampling time).

marcoantoniovlh-vm
Associate

Hi,

I'm using an STM32G071RBT6 microcontroller, and I need the fastest possible ADC sampling time. I'm sampling 12 ADC channels, one of which is VBAT.

According to the microcontroller datasheet, section "5.3.22 Vbat monitoring characteristics", the minimum ADC sampling time when reading VBAT is ts_vbat = 12us. This is a limitation for my application, and I would like to reduce this time.

My questions are:

1. What is the technical reason for this minimum ts_vbat = 12us, while others like "ts_temp" or "ts_vrefint" have much lower values?

2. Is there any workaround to reduce this time?

3. What would be the impact on the "Error on Q" if we configure ts_vbat = 10us in our application?

Thanks in advance,

Marco.

 


> 1. What is the technical reason for this minimum ts_vbat = 12us, while others like "ts_temp" or "ts_vrefint" have much lower values?

VBAT is measured through an internal resistor divider (possibly followed by some amplifier), and this source has a higher impedance than the outputs of the temperature sensor and the VREFINT source.

> 2. Is there any workaround to reduce this time?

No.

> 3. What would be the impact on the "Error on Q" if we configure ts_vbat = 10us in our application?

I don't know what "Error on Q" is.

Failing to observe an adequate sampling time for the given signal source impedance results in the ADC readout being influenced by the previous conversion (i.e. the previous channel's voltage), but the relationship is neither straightforward nor documented.

The 'G0 ADC allows two different sampling times to be used, selectively assigned to individual channels; see the description of the ADC_SMPR register.
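For illustration, a minimal bare-metal sketch of that per-channel split. This is my own configuration fragment, not from the datasheet: the register and bit-field names are assumed from the CMSIS headers and RM0444, and the VBAT channel number (14 on the 'G0's internal-channel mapping) should be double-checked against the reference manual:

```
/* Sketch (assumptions: CMSIS names ADC_SMPR_SMP1_Pos/SMP2_Pos/SMPSEL_Pos,
 * VBAT on internal channel 14 -- verify against RM0444).
 * SMP1 = fast sampling for the 11 "normal" channels,
 * SMP2 = long sampling, selected per-channel via the SMPSEL bits. */
ADC1->SMPR = (0u << ADC_SMPR_SMP1_Pos)            /* SMP1: 1.5 ADC clock cycles  */
           | (7u << ADC_SMPR_SMP2_Pos)            /* SMP2: 160.5 ADC clock cycles */
           | (1u << (ADC_SMPR_SMPSEL_Pos + 14u)); /* channel 14 (VBAT) uses SMP2  */
```

Note that at fADC = 8 MHz, 160.5 cycles is roughly 20 us, which comfortably covers the 12 us ts_vbat minimum, while all other channels keep the short SMP1 time.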

JW

Thanks, JW! Very helpful response.

"Error on Q" is the error in [%] on "Ratio on Vbat measurement" described in the microcontroller electrical characteristics:

[Attached image: marcoantoniovlhvm_0-1743535948395.png]

 

I have one more question:

4. Is there any practical formula to calculate the minimum sampling time (ts) for a given analog input? For instance, a PT1000 sensor circuit with the following data: Vin = 3.3V; Rain = 10 kohm; Cain = 1 nF; Cadc = 5 pF; fADC = 8 MHz; 12-bit resolution; accuracy = +/-3 LSB. Where:

[Attached image: marcoantoniovlhvm_1-1743538694771.png]

Radc is not specified in the STM32G071RBT6 datasheet, so I assume it is negligible.

From my point of view, for a PT1000 sensor with constant Vin, the time constant could be rounded to tc = (Rain * Cadc) = 50ns. So, setting an ADC sampling time of around 500ns-600ns for +/-3 LSB should be a good approach. Am I missing something?

Thanks in advance,

Marco.