Setting Clock causing skip in code

steves1
Associate
Posted on January 02, 2006 at 07:24

Setting Clock causing skip in code

1 REPLY
steves1
Associate
Posted on December 28, 2005 at 17:51

I have some software which first initializes the clocks and then starts running. The software is as follows:


 // 
 // Configure the clock. The external clock runs at 4 MHz, and we want a speed of 32 MHz, so we will run 
 // the system off of the PLL1 clock. CLK2 will run at 2 MHz, so we need a multiplier of 16. To get this, 
 // we multiply by 16 and divide by 1. We will run all systems at the default clock, which has no divider. 
 // 
 RCCU_MCLKConfig (RCCU_DEFAULT); 
 RCCU_FCLKConfig (RCCU_DEFAULT); 
 RCCU_PCLKConfig (RCCU_DEFAULT); 
 RCCU_PLL1Config (RCCU_PLL1_Mul_16, RCCU_Div_1); 
 // 
 // configure the device to run using the PLL clock. 
 // 
 RCCU_RCLKSourceConfig (RCCU_PLL1_Output); 
 // 
 // initialize the free packet queue 
 // 
 packet_queue_init(); 

This code worked on a development board, but it failed on a board we engineered based on the development board design. While debugging to figure out what happened, I found that the processor skipped (!) over the packet_queue_init function, so there were no available packets to use, which is why it wasn't working. The really strange part is that if I single-step over the clock code, it works fine, but if I run from the beginning, my function is skipped. I put a delay loop in front of my function call, and that seems to work around the problem, but I don't understand why this is happening. Is there something undocumented about setting the clock source where you are supposed to wait a specific time before continuing? I'm somewhat alarmed that the processor arbitrarily skipped the function call.
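
For reference, the workaround I have in place right now is just a crude busy-wait before the queue init. What I suspect the proper fix would be is to poll the PLL lock flag before switching RCLK over to the PLL output, assuming RCCU_FlagStatus() and the RCCU_PLL1_LOCK flag are the right names in this library version (check the RCCU header; this sketch is untested):

 // 
 // Current workaround: crude busy-wait so the PLL has time to settle. 
 // volatile keeps the compiler from optimizing the loop away. 
 // 
 volatile unsigned int i; 
 for (i = 0; i < 10000; i++); 
 // 
 // Suspected proper fix (untested): configure PLL1, then spin until it 
 // reports lock before selecting it as the RCLK source. 
 // RCCU_FlagStatus()/RCCU_PLL1_LOCK are assumed names from the STR71x 
 // library headers. 
 // 
 RCCU_PLL1Config (RCCU_PLL1_Mul_16, RCCU_Div_1); 
 while (RCCU_FlagStatus (RCCU_PLL1_LOCK) == RESET) 
 { 
     // wait for PLL1 to lock 
 } 
 RCCU_RCLKSourceConfig (RCCU_PLL1_Output); 
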