
STM32F3 using 2 channels ADC with high impedance inputs causing cross-channel interference

Question asked by Mark Walters on Dec 19, 2016
Latest reply on Dec 21, 2016 by Burton.Mark


While I have found a solution to this problem, I still do not understand the root cause and would appreciate any insight as to why it occurs.


To simplify the issue I am using the standard STM32F334 Nucleo hardware. It has two ADCs.

I have connected two 500 kΩ pots between 3V3 and GND, with their wipers connected to PC5 (ADC2 channel 11) and PB12 (ADC2 channel 13).

The system configuration is as follows:

  • Analog supply voltages VDDA = VDD = 3V3 and VSSA is tied to GND.
  • System clock running at 64MHz
  • ADC clock running at 32MHz
  • ADCs are configured as follows:
    • ADC1 is master, ADC2 is slave (dual mode)
    • Interleaved mode (regular simultaneous mode has the same issue)
    • continuous conversion
    • DMA1 channel 1 used to transfer ADC result data to buffer
    • ADC channel sample time 1.5 cycles
    • conversion sequence is ADC2 channel 11, then channel 13.
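
For reference, the setup above can be sketched with the STM32 HAL roughly as follows. This is an outline only, not my exact code: handle names (`hadc1`, `hadc2`), the buffer (`adc_buf`, `ADC_BUF_LEN`), clock setup, GPIO and DMA init, and error handling are all omitted or assumed.

```
/* Sketch of the ADC2 regular sequence described above (identifier names
 * assumed from the STM32F3 HAL; error handling omitted). */
ADC_ChannelConfTypeDef sConfig = {0};

/* Rank 1: channel 11 (PC5), 1.5-cycle sample time */
sConfig.Channel      = ADC_CHANNEL_11;
sConfig.Rank         = ADC_REGULAR_RANK_1;
sConfig.SamplingTime = ADC_SAMPLETIME_1CYCLE_5;
HAL_ADC_ConfigChannel(&hadc2, &sConfig);

/* Rank 2: channel 13 (PB12), same sample time */
sConfig.Channel = ADC_CHANNEL_13;
sConfig.Rank    = ADC_REGULAR_RANK_2;
HAL_ADC_ConfigChannel(&hadc2, &sConfig);

/* Dual interleaved mode, ADC1 master / ADC2 slave, results via DMA */
ADC_MultiModeTypeDef multimode = {0};
multimode.Mode             = ADC_DUALMODE_INTERL;
multimode.DMAAccessMode    = ADC_DMAACCESSMODE_12_10_BITS;
multimode.TwoSamplingDelay = ADC_TWOSAMPLINGDELAY_1CYCLE;
HAL_ADCEx_MultiModeConfigChannel(&hadc1, &multimode);

HAL_ADCEx_MultiModeStart_DMA(&hadc1, (uint32_t *)adc_buf, ADC_BUF_LEN);
```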


Now I am aware that the ADC clock and channel sample time are far too fast for such a high ADC source impedance. This is deliberate for this example, to exacerbate the issue so it is easy to replicate and identify.


The problem is replicated as follows:

  • set pot for ADC2 channel 11 to max with measured Vin = 3V3.
  • set pot for ADC2 channel 13 to mid-way with measured Vin = 1V5
  • while turning pot for ADC2 channel 11 from max to min, note the following:
    • ADC2 channel 11 Vin drops from 3V3 to 0V as expected
    • ADC2 channel 13 Vin drops from 1V5 to 0V. This is not what I expected & cannot explain.


The following observations are noted:

  • Varying the pot for ADC2 channel 13 has no effect on the Vin of ADC2 channel 11.
  • If the conversion sequence is swapped, channel 13 now affects Vin of channel 11 - i.e. the ADC channel conversion sequence order is relevant.
  • A similar experiment reveals the same issue with ADC1 channels 11 and 13. 
  • The effect is confined to a single ADC, i.e. ADC1 channels do not affect ADC2 channels, no matter what sequence is used.


The solution to the problem was to increase the ADC sampling time (thus accommodating the high source impedance) by:

  • ADC clock set to 1MHz
  • ADC channel sample time set to 601.5 cycles

It is noted that with these settings the problem still occurs, but it is limited to a few mV of change.


So my question is: how does the Vin of one ADC channel affect the Vin of the next channel in the conversion sequence (on the same ADC)?