
How can I reduce the ADC interrupt entry time?

michael_nuaa
Associate II

The project uses an STM32F103C8T6 MCU for control. We use the ADC peripheral for current sampling, and we need the time from the interrupt firing to the interrupt handler executing to be less than 2 µs. With the code below it takes about 6 µs. How can I reduce it to 2 µs?

void ADC_Conf(void)
{
  ADC_InitTypeDef  ADC_InitStructure;
  GPIO_InitTypeDef GPIO_InitStructure;
  NVIC_InitTypeDef NVIC_InitStructure;

  RCC_APB2PeriphClockCmd(RCC_APB2Periph_ADC1, ENABLE);

  /* PA2 as analog input (ADC1 channel 2) */
  GPIO_StructInit(&GPIO_InitStructure);
  GPIO_InitStructure.GPIO_Pin  = GPIO_Pin_2;
  GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AIN;
  GPIO_Init(GPIOA, &GPIO_InitStructure);

  /* Single conversion of one regular channel, triggered by TIM4 CC4 */
  ADC_DeInit(ADC1);
  ADC_StructInit(&ADC_InitStructure);
  ADC_InitStructure.ADC_Mode               = ADC_Mode_Independent;
  ADC_InitStructure.ADC_ScanConvMode       = DISABLE;
  ADC_InitStructure.ADC_ContinuousConvMode = DISABLE;
  ADC_InitStructure.ADC_ExternalTrigConv   = ADC_ExternalTrigConv_T4_CC4;
  ADC_InitStructure.ADC_DataAlign          = ADC_DataAlign_Left;
  ADC_InitStructure.ADC_NbrOfChannel       = 1;
  ADC_Init(ADC1, &ADC_InitStructure);

  ADC_RegularChannelConfig(ADC1, ADC_Channel_2, 1, ADC_SampleTime_13Cycles5);

  NVIC_InitStructure.NVIC_IRQChannel                   = ADC1_2_IRQn;
  NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 2; // ADC interrupt has the highest priority
  NVIC_InitStructure.NVIC_IRQChannelSubPriority        = 0;
  NVIC_InitStructure.NVIC_IRQChannelCmd                = ENABLE;
  NVIC_Init(&NVIC_InitStructure);

  ADC_ClearFlag(ADC1, ADC_FLAG_EOC);
  ADC_ITConfig(ADC1, ADC_IT_EOC, ENABLE);
  ADC_ExternalTrigConvCmd(ADC1, ENABLE);
  ADC_Cmd(ADC1, ENABLE);
}

void ADC1_2_IRQHandler(void)
{
  if (ADC1->SR & 0x02)                        /* EOC flag set? */
  {
      ADC1->SR = ~(uint32_t)0x02;             /* clear EOC */
      adc.adc_samples[0] = _IQ16toIQ(ADC_GetConversionValue(ADC1));
  }
}

S.Ma
Principal

Check the compiler optimization options, and maybe do the _IQ16toIQ conversion outside the interrupt.
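A minimal sketch of the second point, assuming the converted value is only needed later in the main loop (adc_raw, adc_ready and process_adc are hypothetical helpers, not names from the original code):

/* Sketch: keep the ISR minimal, store the raw sample, defer _IQ16toIQ to the main loop. */
volatile uint16_t adc_raw;
volatile uint8_t  adc_ready = 0;

void ADC1_2_IRQHandler(void)
{
  adc_raw   = (uint16_t)ADC1->DR;   /* reading DR also clears the EOC flag */
  adc_ready = 1;
}

/* called from the main loop */
void process_adc(void)
{
  if (adc_ready)
  {
    adc_ready = 0;
    adc.adc_samples[0] = _IQ16toIQ(adc_raw);   /* conversion moved out of the ISR */
  }
}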

In debug mode, look at the assembly code to understand what's going on.

What is the interrupt priority relative to the others, and what is the SYSCLK frequency in MHz?

michael_nuaa
Associate II

From the reference manual, section 11.6, "Channel-by-channel programmable sample time":

The ADC samples the input voltage for a number of ADC_CLK cycles which can be modified using the SMP[2:0] bits in the ADC_SMPR1 and ADC_SMPR2 registers. Each channel can be sampled with a different sample time.

The total conversion time is calculated as follows:

Tconv = Sampling time + 12.5 cycles

Example: with an ADCCLK = 14 MHz and a sampling time of 1.5 cycles:

Tconv = 1.5 + 12.5 = 14 cycles = 1 µs

According to 11.6, with my settings (13.5-cycle sample time, 12 MHz ADC clock):

Tconv = 13.5 + 12.5 = 26 cycles / 12 MHz = 2.17 µs

Uwe Bonnes
Principal III

Putting the interrupt routine into RAM may also spare you the Flash access (wait-state) time.
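For example, with GCC and a linker script that provides a RAM-resident code section copied by the startup code (that section name and setup are an assumption about the toolchain, not something stated in this thread), the handler could be placed like this:

/* Sketch: place the handler in SRAM, assuming the linker script defines
   a .ramfunc section that is loaded into RAM at startup. */
__attribute__((section(".ramfunc"))) void ADC1_2_IRQHandler(void)
{
  if (ADC1->SR & ADC_SR_EOC)
  {
    adc.adc_samples[0] = _IQ16toIQ((uint16_t)ADC1->DR);  /* reading DR clears EOC */
  }
}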

AvaTar
Lead

Avoid the interrupt and use DMA. Collect a number of samples and process them when the DMA transfer-complete (TC) interrupt fires.

Or, if you need the current value for cycle-by-cycle control, you can try the ADC continuous mode and read the ADC DR register directly. Not sure if the jitter is acceptable for your application.
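With the Standard Peripheral Library already used above, a circular-DMA setup could look roughly like the sketch below (the buffer length and helper name are assumptions; ADC1 is served by DMA1 channel 1 on the F103):

/* Sketch: ADC1 regular conversions streamed into a circular buffer via DMA1 channel 1. */
#define ADC_BUF_LEN 8                                /* sketch value */
static volatile uint16_t adc_buf[ADC_BUF_LEN];

void ADC_DMA_Conf(void)                              /* hypothetical helper */
{
  DMA_InitTypeDef DMA_InitStructure;

  RCC_AHBPeriphClockCmd(RCC_AHBPeriph_DMA1, ENABLE);

  DMA_DeInit(DMA1_Channel1);
  DMA_InitStructure.DMA_PeripheralBaseAddr = (uint32_t)&ADC1->DR;
  DMA_InitStructure.DMA_MemoryBaseAddr     = (uint32_t)adc_buf;
  DMA_InitStructure.DMA_DIR                = DMA_DIR_PeripheralSRC;
  DMA_InitStructure.DMA_BufferSize         = ADC_BUF_LEN;
  DMA_InitStructure.DMA_PeripheralInc      = DMA_PeripheralInc_Disable;
  DMA_InitStructure.DMA_MemoryInc          = DMA_MemoryInc_Enable;
  DMA_InitStructure.DMA_PeripheralDataSize = DMA_PeripheralDataSize_HalfWord;
  DMA_InitStructure.DMA_MemoryDataSize     = DMA_MemoryDataSize_HalfWord;
  DMA_InitStructure.DMA_Mode               = DMA_Mode_Circular;
  DMA_InitStructure.DMA_Priority           = DMA_Priority_High;
  DMA_InitStructure.DMA_M2M                = DMA_M2M_Disable;
  DMA_Init(DMA1_Channel1, &DMA_InitStructure);

  DMA_ITConfig(DMA1_Channel1, DMA_IT_TC, ENABLE);    /* transfer-complete interrupt */
  DMA_Cmd(DMA1_Channel1, ENABLE);

  ADC_DMACmd(ADC1, ENABLE);                          /* ADC requests DMA on each EOC */
  /* also enable DMA1_Channel1_IRQn in the NVIC to receive the TC interrupt */
}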

S.Ma
Principal

Interrupt vector in RAM as well as the ISR.

DMA for ADC regular channels usually reduces the ISR frequency and latency.

Reading the ADC data register clears the status flag. If the ADC is the only interrupt source, you can skip reading the SR and just read the DR.

Put both the ISRs and the vector table in RAM, and as much other code as you can afford as well.

Avoid subroutine calls. Check the code that is actually generated and the optimization levels.
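A sketch of moving the vector table to SRAM on the Cortex-M3 (the entry count and 512-byte alignment are assumptions; verify them against the startup file for this exact device):

/* Sketch: copy the vector table into SRAM and point VTOR at the copy. */
#include <string.h>

#define VECTOR_COUNT 76   /* assumed upper bound; check the device's startup code */

static uint32_t ram_vectors[VECTOR_COUNT] __attribute__((aligned(512)));

void remap_vectors_to_sram(void)          /* hypothetical helper */
{
  memcpy(ram_vectors, (const void *)SCB->VTOR, sizeof(ram_vectors));
  __disable_irq();
  SCB->VTOR = (uint32_t)ram_vectors;      /* vector fetches now come from SRAM */
  __DSB();
  __enable_irq();
}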

michael_nuaa
Associate II

To @KIC8462852 EPIC204278916:

The SYSCLK frequency is 72 MHz.

I did a further test. First, I triggered the ADC by software and measured the time by setting a GPIO pin; it was 2.45 µs.

Second, I used the ExternalTrigConv_T4_CC4 signal to trigger the ADC and measured the time from TIM4->CNT == TIM4->CCR4 to entering the interrupt; it was 3.48 µs.
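The kind of pin instrumentation used for such a measurement might look like the sketch below (PB0/PB1 and the helper name are placeholders; the actual test pins are not stated in the thread):

/* Sketch: raise one pin at the software trigger and another at ISR entry,
   then measure the delay between the two edges on a scope. */
static inline void trigger_and_mark(void)
{
  GPIOB->BSRR = GPIO_BSRR_BS0;                 /* PB0 high: conversion triggered */
  ADC_SoftwareStartConvCmd(ADC1, ENABLE);      /* software-triggered conversion */
}

void ADC1_2_IRQHandler(void)
{
  GPIOB->BSRR = GPIO_BSRR_BS1;                 /* PB1 high: interrupt entered */
  adc.adc_samples[0] = _IQ16toIQ(ADC_GetConversionValue(ADC1));
  GPIOB->BRR  = GPIO_BRR_BR0 | GPIO_BRR_BR1;   /* both pins low again */
}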

michael_nuaa
Associate II

Also, if I use high optimization, the 3.48 µs is shortened to 3.28 µs.

S.Ma
Principal

Are you using the interrupt to de-interlace the ADC array samples? Or is no DMA channel available? Or are you trying to save RAM?

Look at the ASM code in the debugger.

How is the array declared in C?

As clive said, you don't need to clear the SR bit; it's done automatically by reading the DR.