2018-03-21 05:07 PM
Hello,
I'm using the SPL libraries for the STM32F103RB on a Nucleo board. I was writing and testing a first low-level ADC driver.
After configuring the individual channels of the ADC regular group in scan mode, plus the DMA with the correct values, the conversions seem to be working. I am using ADC1 channels 9 through 15, so I used:
ADC_RegularChannelConfig(ADC1, ADC_Channel_9, 1, ADC_SampleTime_55Cycles5);
/* ... the same call for channels 10 through 14, with ranks 2 through 6 ... */
ADC_RegularChannelConfig(ADC1, ADC_Channel_15, 7, ADC_SampleTime_55Cycles5);
The DMA is configured as:
#define ADC1_DR_Address ((uint32_t)0x4001244C) /* ADC1->DR */
uint16_t regularConvData_Tab[7];
DMA_InitTypeDef DMA_InitStructure;
DMA_DeInit(DMA1_Channel1);
DMA_InitStructure.DMA_PeripheralBaseAddr = ADC1_DR_Address;
DMA_InitStructure.DMA_MemoryBaseAddr = (uint32_t)regularConvData_Tab;
DMA_InitStructure.DMA_DIR = DMA_DIR_PeripheralSRC;
DMA_InitStructure.DMA_BufferSize = 7;
DMA_InitStructure.DMA_PeripheralInc = DMA_PeripheralInc_Disable;
DMA_InitStructure.DMA_MemoryInc = DMA_MemoryInc_Enable;
DMA_InitStructure.DMA_PeripheralDataSize = DMA_PeripheralDataSize_HalfWord;
DMA_InitStructure.DMA_MemoryDataSize = DMA_MemoryDataSize_HalfWord;
DMA_InitStructure.DMA_Mode = DMA_Mode_Circular;
DMA_InitStructure.DMA_Priority = DMA_Priority_High;
DMA_InitStructure.DMA_M2M = DMA_M2M_Disable;
DMA_Init(DMA1_Channel1, &DMA_InitStructure);
In fact, when I watch regularConvData_Tab in the debugger while varying the voltages at the input channels, I can see that the conversions are done properly.
But something else weird happens:
I varied the voltage at the input pins one by one, and I observed that channel 12's end-of-conversion value was transferred and stored at the first memory address, channel 13's at the second, channel 14's at the third, channel 15's at the fourth, channel 9's at the fifth, channel 10's at the sixth and channel 11's at the last. I saw this by looking into the regularConvData_Tab[index] array.
The DMA transfer seems to start at channel 12's end of conversion instead of channel 9's, even though channel 9 was programmed as rank 1 in the conversion sequence.
index   channel in conversion sequence   content found in regularConvData_Tab[index]
0       chn 9                            chn 12
1       chn 10                           chn 13
2       chn 11                           chn 14
3       chn 12                           chn 15
4       chn 13                           chn 9
5       chn 14                           chn 10
6       chn 15                           chn 11
Does anybody have an explanation for this behaviour?
I have read the ADC and DMA chapters in the datasheet in depth, but as far as I can see both the DMA and the ADC are correctly configured, and I don't understand this kind of transfer offset.
How can channel 9 finish its conversion and be transferred by DMA after channel 12 or channel 15 do, for example?
2018-03-21 05:43 PM
Make sure not to initialize the DMA until after you've done the calibration steps.
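With the SPL on the F1 the calibration sequence looks roughly like this (a sketch, not your exact code; the point is that DMA_Init()/DMA_Cmd() come after it):
/* the ADC must be powered on (ADON) before calibrating */
ADC_Cmd(ADC1, ENABLE);
/* reset the calibration registers, then run the calibration */
ADC_ResetCalibration(ADC1);
while (ADC_GetResetCalibrationStatus(ADC1) == SET);
ADC_StartCalibration(ADC1);
while (ADC_GetCalibrationStatus(ADC1) == SET);
/* only now: DMA_Init(), DMA_Cmd(), ADC_DMACmd(), software start */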
2018-03-22 05:27 AM
Hi Clive, thanks for the quick answer.
I was already doing the calibration steps before the DMA initialization, and the channel transfers still showed the offset in the sequence. But today I found an example where a user divided the system clock by 6 for the ADC, and I tried the same since I have the same 72 MHz system frequency. Now the DMA starts transferring in the correct sequence. This makes me suspect some kind of timing conflict between the DMA and ADC modules.
I was also thinking that the ADC might start converting before the DMA is enabled, so I changed the order of the initializations and enables. Now I init and enable the DMA struct first, and after that I init the ADC struct, enable it and start the conversions. Now it transfers with the correct channel sequence.
So with this setup the calibration steps are done after the ADC init, and hence after the DMA init. But now it's working.
If it's not the proper way, I don't know why it's working.
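For reference, the order that ended up working for me looks roughly like this (a sketch assuming 72 MHz PCLK2 and the SPL v3.x functions; the init structs are the same as above):
RCC_ADCCLKConfig(RCC_PCLK2_Div6); /* 72 MHz / 6 = 12 MHz, below the 14 MHz ADC limit */
/* 1) DMA first: init and enable the channel */
DMA_Init(DMA1_Channel1, &DMA_InitStructure);
DMA_Cmd(DMA1_Channel1, ENABLE);
/* 2) then the ADC: init, channel ranks, DMA requests, power on */
ADC_Init(ADC1, &ADC_InitStructure);
/* ... the seven ADC_RegularChannelConfig() calls from above ... */
ADC_DMACmd(ADC1, ENABLE);
ADC_Cmd(ADC1, ENABLE);
/* 3) calibration, now after the ADC (and DMA) init */
ADC_ResetCalibration(ADC1);
while (ADC_GetResetCalibrationStatus(ADC1) == SET);
ADC_StartCalibration(ADC1);
while (ADC_GetCalibrationStatus(ADC1) == SET);
/* 4) start the regular group scan */
ADC_SoftwareStartConvCmd(ADC1, ENABLE);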
2018-07-15 09:05 PM
Eugenia,
I have been having the same problem. It began working correctly when I delayed setting the CONT and DMA bits in CR2 until the calibration, followed by the DMA setup and enabling, was complete. The final step was writing the ADC_CR2_ADON bit for the second time, which starts the conversions.
Earlier, when the data was shifted, the CONT and DMA bits were set before the calibration and DMA setup. The calibration steps in my code have several short delays where the Ref Manual says there is a hardware delay. Changing the duration of those delays resulted in different shifts in the data, which suggests that a scan counter was running during the calibration. (However, setting the SCAN bit in CR1 early or late didn't seem to matter, though the number of channels to scan was set early, so it could be a case where the scan counter always runs but isn't used unless the SCAN bit in CR1 is set.) When the setting of those bits was moved to the end of the setup, the amount of delay in the wait loops had no effect and the data was always stored in the expected locations.
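At the register level, the sequence that works for me is roughly this (a sketch using the CMSIS bit names, not my exact code):
ADC1->CR2 |= ADC_CR2_ADON; /* first ADON: power up the ADC, then wait tSTAB */
ADC1->CR2 |= ADC_CR2_RSTCAL; /* reset the calibration registers */
while (ADC1->CR2 & ADC_CR2_RSTCAL);
ADC1->CR2 |= ADC_CR2_CAL; /* run the calibration */
while (ADC1->CR2 & ADC_CR2_CAL);
/* ... DMA channel setup and enable go here ... */
ADC1->CR2 |= ADC_CR2_CONT | ADC_CR2_DMA; /* set CONT and DMA only now */
ADC1->CR2 |= ADC_CR2_ADON; /* second ADON write: starts the conversions */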
My guess is that in your case the divide by 6 that resulted in the correct storage slowed the clocking so that the setup sequence completed before the scan counter had changed.