Hi to all the community members,
I would like to ask for some help with the ADC peripheral present on the STM32L0xx MCUs.
I found on this community site the following question (STM32 multichannel), which helped me a little, but I would
nevertheless like to ask some further questions.
I'm trying to write a generic device driver (based on the HAL ADC driver) for the ADC peripheral, capable of adding
and/or removing at run time the ADC channels on which conversions are performed. The device driver will use the
interrupt mechanism (for the moment no DMA transfers and no low-power modes are involved).
The user will be able to add a channel by specifying its ADC channel number and its sampling time (chosen from the
allowed values defined in the HAL stm32l0xx_hal_adc.h header file).
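To give an idea of the bookkeeping part of the driver, here is the kind of run-time channel registry I have in mind (all names here are my own, not HAL ones; the L0 ADC channels are numbered 0..18, so a 32-bit mask is enough to remember which ones are currently in the sequence):

```c
#include <stdint.h>
#include <stdbool.h>

#define ADC_MAX_CHANNEL 18u

static uint32_t g_active_mask;                   /* bit n set => channel n is in the sequence */
static uint8_t  g_sampling[ADC_MAX_CHANNEL + 1]; /* requested sampling-time code per channel  */

/* Register a channel and the sampling-time code the user asked for. */
bool adc_drv_add_channel(uint8_t channel, uint8_t sampling_code)
{
    if (channel > ADC_MAX_CHANNEL)
        return false;
    g_active_mask |= (1u << channel);
    g_sampling[channel] = sampling_code;
    return true;
}

/* Drop a previously registered channel from the sequence. */
bool adc_drv_remove_channel(uint8_t channel)
{
    if (channel > ADC_MAX_CHANNEL || !(g_active_mask & (1u << channel)))
        return false;
    g_active_mask &= ~(1u << channel);
    return true;
}
```

The mask would then be translated into HAL_ADC_ConfigChannel() calls (Rank = ADC_RANK_CHANNEL_NUMBER to add a channel, ADC_RANK_NONE to remove it, if I understand the L0 HAL correctly).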
Here are the questions I would like to pose:
- When the sequence of channels to convert contains channels with different sampling times, is single conversion
mode with discontinuous mode enabled the only possible mode to use?
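To make this question concrete, this is roughly the init configuration I am experimenting with (a fragment only, assuming a populated ADC_HandleTypeDef named hadc; field names are taken from stm32l0xx_hal_adc.h, and note that SamplingTime sits in ADC_InitTypeDef, i.e. it looks common to the whole sequence, which is exactly why I ask):

```c
/* Fragment only -- assumes 'hadc' has already been filled in elsewhere. */
hadc.Init.ContinuousConvMode    = DISABLE;                  /* single conversion mode  */
hadc.Init.DiscontinuousConvMode = ENABLE;                   /* one channel per trigger */
hadc.Init.SamplingTime          = ADC_SAMPLETIME_7CYCLES_5; /* common to all channels  */
if (HAL_ADC_Init(&hadc) != HAL_OK)
{
    /* error handling here */
}
```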
- Following on from the previous point, is a call to the HAL_ADC_Init() function (made inside HAL_ADC_ConvCpltCallback() after checking that the EOC flag is set) the only way to change the sampling time for the next channel the sequencer will convert?
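To show exactly what I mean, this is the kind of callback I currently imagine (fragment only, built on HAL names; NEXT_SAMPLING_TIME is a placeholder for the sampling-time code my driver would store for the next channel, and I have not verified that re-initializing from the callback like this is legal):

```c
/* Fragment -- inside my driver. NEXT_SAMPLING_TIME is a placeholder, see above. */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    uint32_t raw = HAL_ADC_GetValue(hadc);        /* EOC is set, read the data      */
    /* ... store 'raw' for the channel just converted ... */

    hadc->Init.SamplingTime = NEXT_SAMPLING_TIME; /* sampling time of next channel  */
    HAL_ADC_Init(hadc);                           /* is this really the only way?   */
    (void)raw;
}
```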
- Inside the HAL_ADC_ConvCpltCallback() interrupt callback function, how can I know which ADC channel the interrupt fired for? Should I rely solely on the fact that every channel added to the sequence will be converted in rank order? And if so, what happens when the sampling time of one channel is shorter than another's by some factor (for example, a sampling time of 7.5 ADC cycles for channel 3 and 71.5 cycles for channel 7)?
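Regarding the "which channel" question, my current idea is to keep a conversion index in the driver (incremented on every EOC, reset at end of sequence) and map it back to a channel number by walking the active-channel mask, relying on the assumption that the L0 sequencer in forward scan direction converts active channels in ascending channel-number order. A self-contained sketch (names are mine):

```c
#include <stdint.h>

/* Map the i-th conversion of the sequence (i = 0, 1, 2, ...) back to the ADC
 * channel number, assuming the sequencer scans the active channels in
 * ascending order (forward scan direction). Returns -1 if i is out of range. */
int adc_drv_rank_to_channel(uint32_t active_mask, unsigned int i)
{
    for (unsigned int ch = 0; ch < 32; ++ch) {
        if (active_mask & (1u << ch)) {
            if (i == 0)
                return (int)ch;
            --i;
        }
    }
    return -1;
}
```

With channels 3 and 7 active, index 0 maps to channel 3 and index 1 to channel 7.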
- When adding or removing a channel at run time, do I have to perform the add/remove operation (via a call to the HAL_ADC_ConfigChannel() function) at the end of the sequence conversion, when the ADSTART bit is low (assuming that no continuous mode is selected)?
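For this last point, my plan is a tiny guard that checks ADSTART before touching the configuration. I wrote it against a plain pointer so I can unit-test it on the host with a fake register (the bit position, bit 2 of ADC_CR, is my reading of the reference manual, so please correct me if it is wrong):

```c
#include <stdint.h>
#include <stdbool.h>

#define ADC_CR_ADSTART_BIT (1u << 2) /* ADSTART: bit 2 of ADC_CR (my reading of the RM) */

/* True when no sequence conversion is ongoing, i.e. when it should be safe to
 * call HAL_ADC_ConfigChannel(). Takes the CR register by pointer so the
 * function can be exercised on the host with a fake register. */
bool adc_drv_sequencer_idle(const volatile uint32_t *adc_cr)
{
    return (*adc_cr & ADC_CR_ADSTART_BIT) == 0u;
}
```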
Thanks in advance to everyone who takes the time to respond to these questions.