UDP packets make my MCU reset

parisa
Senior

Hello,

I wrote this code to send 16 bytes to the PC every 100 µs (STM32F10x):

int32_t socket;
uint8_t *sendbuf;

void Time3(void) {
    TIM_TimeBaseInitTypeDef TimeStruct;
    NVIC_InitTypeDef nvicStructure;

    NVIC_PriorityGroupConfig(NVIC_PriorityGroup_3);
    nvicStructure.NVIC_IRQChannel = TIM3_IRQn;
    nvicStructure.NVIC_IRQChannelPreemptionPriority = 1;
    nvicStructure.NVIC_IRQChannelSubPriority = 0;
    nvicStructure.NVIC_IRQChannelCmd = ENABLE;
    NVIC_Init(&nvicStructure);

    RCC_APB1PeriphClockCmd(RCC_APB1Periph_TIM3, ENABLE);
    TimeStruct.TIM_Prescaler = 35;                /* 72 MHz / 36 = 2 MHz tick */
    TimeStruct.TIM_Period = 200;                  /* ~100 us (200 - 1 gives exactly 100 us) */
    TimeStruct.TIM_ClockDivision = TIM_CKD_DIV1;
    TimeStruct.TIM_CounterMode = TIM_CounterMode_Up;
    TIM_TimeBaseInit(TIM3, &TimeStruct);
    TIM_Cmd(TIM3, ENABLE);
    TIM_ITConfig(TIM3, TIM_IT_Update, ENABLE);
}
 
/*###########################################
##############Main Function##################
###########################################*/ 
 
void SetSCK(void) {
    socket = netUDP_GetSocket(udp_cb_func);
    if (socket >= 0) {
        netUDP_Open(socket, 0);
        netUDP_SetOption(socket, netUDP_OptionTTL, 20);
    }
}
 
 
int main(void) {
    CLOCK();
    GPIO_SetBits(GPIOA, GPIO_Pin_5);
    LongDelay();
    netInitialize();
    LongDelay();
    Time3();
    GPIO_ResetBits(GPIOA, GPIO_Pin_5);

    do {
        osDelay(500U);
        netIF_GetOption(NET_IF_CLASS_ETH | 0, netIF_OptionIP4_Address,
                        (uint8_t *)&addr, sizeof(addr));
    } while (addr == 0U);

    SetSCK();
    Watch();

    while (1) {
        if (Alarm == 1) {
            Alarm = 0;
            Ethers();
        }

        if (CheckAlarm > 2) {
            netUDP_Close(socket);
            netUDP_ReleaseSocket(socket);
            SetSCK();
            CheckAlarm = 0;
        }

        IWDG_ReloadCounter();
    }
}
 
void Ethers(void) {
    NET_ADDR addrUDP = { NET_ADDR_IP4, 5022, 192, 168, 1, 10 };
    if (socket >= 0) {
        CheckAlarm = 0;
        Data();
        sendbuf = netUDP_GetBuffer(Ether.Size);
        memcpy(sendbuf, MSG, Ether.Size);
        State = netUDP_Send(socket, &addrUDP, sendbuf, Ether.Size);
    }
}

void TIM3_IRQHandler(void) {
    CheckAlarm = CheckAlarm + 1;
    Alarm = 1;
    TIM_ClearITPendingBit(TIM3, TIM_IT_Update);
}
 
					
 
void Watch(void) {
    NVIC_InitTypeDef nvicStructure;
    NVIC_PriorityGroupConfig(NVIC_PriorityGroup_3);
    nvicStructure.NVIC_IRQChannel = WWDG_IRQn;
    nvicStructure.NVIC_IRQChannelPreemptionPriority = 2;
    nvicStructure.NVIC_IRQChannelSubPriority = 2;
    nvicStructure.NVIC_IRQChannelCmd = ENABLE;
    NVIC_Init(&nvicStructure);

    /* Note: the WWDG lines above are not needed for the IWDG, which
     * runs from the independent LSI oscillator and has no interrupt. */
    RCC_APB1PeriphClockCmd(RCC_APB1Periph_WWDG, ENABLE);
    IWDG_WriteAccessCmd(IWDG_WriteAccess_Enable); /* required before writing PR/RLR */
    IWDG_SetPrescaler(IWDG_Prescaler_32);
    IWDG_SetReload(200);
    IWDG_Enable();
}
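A side note on the timer setup: assuming TIM3 runs from the usual 72 MHz timer clock on a full-speed STM32F10x (an assumption; the actual clock tree matters), the update rate is timer_clk / ((prescaler + 1) * (period + 1)), so TIM_Period = 200 gives 100.5 µs rather than exactly 100 µs. A small host-side sketch of the arithmetic:

```c
#include <assert.h>
#include <stdint.h>

/* Update rate of a general-purpose STM32 timer: the hardware divides
 * the timer clock by (prescaler + 1) and counts (period + 1) ticks per
 * update event, so both register values are "desired count minus 1". */
uint32_t tim_update_hz(uint32_t timer_clk_hz, uint32_t prescaler, uint32_t period)
{
    return timer_clk_hz / ((prescaler + 1u) * (period + 1u));
}
```

With prescaler 35 the tick is 2 MHz, so a period register of 199 gives exactly 10 kHz (100 µs), while 200 gives about 9950 Hz.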
 
 
 
 

After a few seconds or minutes my MCU gets reset (I receive correct data before and after the reset). It's driving me crazy and I can't understand why.

I call SetSCK() again when the socket has been failing (socket < 0) for 300 µs. What is my mistake?

In addition, in some situations, while I am receiving correct data, my MCU doesn't respond to ping (I have set the IP in the Net_Config_ETH file).


6 REPLIES
TDK
Guru

The most likely scenario is that the IWDG is resetting the chip because IWDG_ReloadCounter() isn't being called in time. It looks like you're using blocking UDP calls.

Verify this is the case by checking reset flags.
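For example (a sketch, not the poster's code): with the StdPeriph library you can check RCC_GetFlagStatus(RCC_FLAG_IWDGRST) early in main() and then call RCC_ClearFlag(). The flags live in RCC_CSR, so decoding one is just a bit test (bit positions per the STM32F10x reference manual RM0008):

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Reset-cause flags in RCC_CSR on the STM32F10x (RM0008):
 * bit 26 PINRSTF, 27 PORRSTF, 28 SFTRSTF, 29 IWDGRSTF, 30 WWDGRSTF. */
#define CSR_IWDGRSTF (1u << 29)
#define CSR_WWDGRSTF (1u << 30)

/* True if the last reset was caused by the independent watchdog. */
bool reset_was_iwdg(uint32_t rcc_csr)
{
    return (rcc_csr & CSR_IWDGRSTF) != 0u;
}
```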

If you feel a post has answered your question, please click "Accept as Solution".
parisa
Senior

Yes, you are right. But I can't understand how this happens: with IWDG_Prescaler_32 (f = 40000 / 32 = 1250 Hz) and IWDG_SetReload(200), it takes more than 160 ms for the watchdog to expire, and IWDG_ReloadCounter() is called in the while loop, which should refresh the watchdog well within 160 ms. What is my mistake?

From my point of view, the loop above refreshes the watchdog much faster than every 160 ms, because it sends 16 bytes every 100 µs.
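The 160 ms figure checks out for a nominal LSI, but LSI on the F1 is an uncalibrated RC oscillator (specified roughly 30-60 kHz, not exactly 40 kHz). A sketch of the timeout arithmetic, assuming the counter decrements at lsi_hz / prescaler and times out after reload + 1 ticks:

```c
#include <assert.h>
#include <stdint.h>

/* Approximate IWDG timeout: the down-counter runs at lsi_hz / prescaler
 * and resets the MCU after (reload + 1) ticks.  LSI is uncalibrated, so
 * the real timeout can differ substantially from the nominal value. */
uint32_t iwdg_timeout_us(uint32_t lsi_hz, uint32_t prescaler, uint32_t reload)
{
    return (uint32_t)((uint64_t)(reload + 1u) * prescaler * 1000000u / lsi_hz);
}
```

At the nominal 40 kHz this gives about 160.8 ms; with a fast LSI at 60 kHz it shrinks to about 107 ms. So the refresh margin is smaller than it looks, though the real culprit here is one loop iteration occasionally taking far longer than expected.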

TDK
Guru

Probably your loop takes more than 160ms to execute. With blocking UDP functions, it could take many seconds in some cases.

Disable IWDG, or set it to reset after 10s or something, or improve your main loop so it doesn't take as long.
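For the 10 s option (a sketch, assuming the nominal 40 kHz LSI): with the largest prescaler, 256, the required reload value still fits in the 12-bit IWDG reload register (max 0x0FFF = 4095):

```c
#include <assert.h>
#include <stdint.h>

/* Reload value for a target IWDG timeout at a given LSI frequency
 * and prescaler.  Must fit in the 12-bit reload register (<= 4095). */
uint32_t iwdg_reload_for(uint32_t lsi_hz, uint32_t prescaler, uint32_t timeout_ms)
{
    return (uint32_t)((uint64_t)timeout_ms * lsi_hz / (1000u * prescaler)) - 1u;
}
```

On target that would be roughly IWDG_SetPrescaler(IWDG_Prescaler_256); IWDG_SetReload(1561); for a nominal 10 s.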

parisa
Senior

Thanks for your comment.

But I need to understand more: when my UDP sender sends every 100 µs with DMA, why does this loop take more than 160 ms? The heaviest function (Ethers) doesn't do anything special.

parisa
Senior

I disabled the watchdog and added a pin toggle to measure the delay between the commands.

GPIO_SetBits(GPIOC, GPIO_Pin_10);

if (Alarm == 1) {
    Alarm = 0;
    Ethers();
}

if (CheckAlarm > 2) {
    netUDP_Close(socket);
    netUDP_ReleaseSocket(socket);
    SetSCK();
    CheckAlarm = 0;
}

GPIO_ResetBits(GPIOC, GPIO_Pin_10);

I see that the maximum delay (pin high to pin low) is about 50 µs (1 µs < t < 50 µs). It is very strange.
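A finer-grained alternative to the pin toggle (a sketch, assuming a Cortex-M3 with the CMSIS DWT cycle counter enabled via CoreDebug->DEMCR and DWT->CTRL): sample DWT->CYCCNT before and after the loop body, then subtract. Unsigned 32-bit subtraction gives the correct delta even when the counter wraps between samples, which a scope on a pin can miss if the long iteration is rare:

```c
#include <assert.h>
#include <stdint.h>

/* Delta between two samples of a free-running 32-bit cycle counter
 * (e.g. DWT->CYCCNT): modulo-2^32 subtraction handles wrap-around. */
uint32_t cycles_elapsed(uint32_t start, uint32_t end)
{
    return end - start;
}

/* Convert a cycle count to microseconds at a given core clock. */
uint32_t cycles_to_us(uint32_t cycles, uint32_t core_hz)
{
    return (uint32_t)((uint64_t)cycles * 1000000u / core_hz);
}
```

Keeping a running maximum of the per-iteration delta would reveal the occasional iteration that blocks for hundreds of milliseconds, which a 50 µs typical reading doesn't rule out.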

parisa
Senior

I can't figure it out.