
Debugging Ethernet

Question asked by dumaresq.jonathan on Sep 14, 2012
Latest reply on Sep 3, 2013 by spinucci.joseph
Hi,

I have a new board based on the STM32F407, wired in RMII mode to a KSZ8031RNL PHY.

I would like to port the STM32F4 eval board sample to my board.

I have defined RMII_MODE.

Here is the pin mapping of the PHY:

        ETH_MDIO -------------------------> PA2
        ETH_MDC --------------------------> PC1
        ETH_MII_RX_CLK/ETH_RMII_REF_CLK---> PA1
        ETH_MII_RX_DV/ETH_RMII_CRS_DV ----> PA7
        ETH_MII_RXD0/ETH_RMII_RXD0 -------> PC4
        ETH_MII_RXD1/ETH_RMII_RXD1 -------> PC5
        ETH_MII_TX_EN/ETH_RMII_TX_EN -----> PB11
        ETH_MII_TXD0/ETH_RMII_TXD0 -------> PB12
        ETH_MII_TXD1/ETH_RMII_TXD1 -------> PB13


I have changed the GPIO init to:

  /* Configure PA1, PA2 and PA7 */
  GPIO_InitStructure.GPIO_Pin = GPIO_Pin_1 | GPIO_Pin_2 | GPIO_Pin_7;
  GPIO_Init(GPIOA, &GPIO_InitStructure);
  GPIO_PinAFConfig(GPIOA, GPIO_PinSource1, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOA, GPIO_PinSource2, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOA, GPIO_PinSource7, GPIO_AF_ETH);


  /* Configure PC1, PC4 and PC5 */
  GPIO_InitStructure.GPIO_Pin = GPIO_Pin_1 | GPIO_Pin_4 | GPIO_Pin_5;
  GPIO_Init(GPIOC, &GPIO_InitStructure);
  GPIO_PinAFConfig(GPIOC, GPIO_PinSource1, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOC, GPIO_PinSource4, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOC, GPIO_PinSource5, GPIO_AF_ETH);
                                
  /* Configure PB11, PB12 and PB13 */
  GPIO_InitStructure.GPIO_Pin =  GPIO_Pin_11 | GPIO_Pin_12 | GPIO_Pin_13;
  GPIO_Init(GPIOB, &GPIO_InitStructure);
  GPIO_PinAFConfig(GPIOB, GPIO_PinSource11, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOB, GPIO_PinSource12, GPIO_AF_ETH);
  GPIO_PinAFConfig(GPIOB, GPIO_PinSource13, GPIO_AF_ETH);
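One thing worth double-checking: the snippet above only sets GPIO_Pin before each GPIO_Init() call, so it relies on GPIO_InitStructure already carrying the alternate-function settings. As a minimal sketch (mirroring the settings the ST F4 eval example uses, and assuming the GPIOA/B/C clocks were enabled earlier with RCC_AHB1PeriphClockCmd()), the structure should be filled like this before those calls:

```c
/* Sketch: fields the SPL init structure needs before the GPIO_Init()
 * calls above. These values mirror the ST F4 eval Ethernet example.
 * Assumes RCC_AHB1PeriphClockCmd() already enabled GPIOA/B/C clocks. */
GPIO_InitTypeDef GPIO_InitStructure;

GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz; /* RMII pins need high speed */
GPIO_InitStructure.GPIO_Mode  = GPIO_Mode_AF;      /* alternate function mode   */
GPIO_InitStructure.GPIO_OType = GPIO_OType_PP;     /* push-pull                 */
GPIO_InitStructure.GPIO_PuPd  = GPIO_PuPd_NOPULL;
```

If GPIO_Mode is left at its previous value (e.g. from an earlier non-AF init), the pins never reach the ETH peripheral even though GPIO_PinAFConfig() was called.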

Here is the relevant part of my BSP init:

void ETH_BSP_Config(void)
{
  /* Configure the GPIO ports for ethernet pins */
  ETH_GPIO_Config();
  
  /* Config NVIC for Ethernet */
  ETH_NVIC_Config();


  /* Configure the Ethernet MAC/DMA */
  ETH_MACDMA_Config();
}

In ETH_MACDMA_Config:

  /* Reset ETHERNET on AHB Bus */
  ETH_DeInit();


  /* Software reset */
  ETH_SoftwareReset();


  /* Wait for software reset */
  while (ETH_GetSoftwareResetStatus() == SET);


  /* Enable ETHERNET clock  */
  RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_ETH_MAC | RCC_AHB1Periph_ETH_MAC_Tx |
                         RCC_AHB1Periph_ETH_MAC_Rx, ENABLE);
.......
}

If I put the RCC_AHB1PeriphClockCmd() call before the while (ETH_GetSoftwareResetStatus() == SET); line, I loop there forever. I saw somewhere on this forum that the MAC clock should not be enabled before resetting it. I don't know why the ST sample does it this way.
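For what it's worth, the hang when the clocks are enabled first is commonly reported when the RMII/MII selection in SYSCFG has not been made before the software reset, or when the 50 MHz REF_CLK is not actually reaching the MAC: the SWR bit in ETH_DMABMR only self-clears once all of the MAC's clock domains (HCLK plus the RX/TX clocks derived from REF_CLK) are running. A sketch of an ordering that is commonly reported to work with the StdPeriph library (all calls below are standard SPL functions, but this is an assumption to test, not the official sequence):

```c
/* Sketch of an init ordering that avoids the SWR hang.
 * Assumes the KSZ8031's 50 MHz REF_CLK is already toggling on PA1. */

/* 1. Enable SYSCFG and select RMII BEFORE touching the MAC */
RCC_APB2PeriphClockCmd(RCC_APB2Periph_SYSCFG, ENABLE);
SYSCFG_ETH_MediaInterfaceConfig(SYSCFG_ETH_MediaInterface_RMII);

/* 2. Enable the MAC/Tx/Rx clocks BEFORE the software reset */
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_ETH_MAC |
                       RCC_AHB1Periph_ETH_MAC_Tx |
                       RCC_AHB1Periph_ETH_MAC_Rx, ENABLE);

/* 3. Only now can the reset complete: the SWR bit self-clears
 *    only when HCLK and a valid RMII REF_CLK are both present */
ETH_DeInit();
ETH_SoftwareReset();
while (ETH_GetSoftwareResetStatus() == SET);
```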


Now, in ETH_Init, I always loop here:

  if(ETH_InitStruct->ETH_AutoNegotiation != ETH_AutoNegotiation_Disable)
  {  
    /* We wait for linked status... */
    do
    {
      timeout++;
    } while (!(ETH_ReadPHYRegister(PHYAddress, PHY_BSR) & PHY_Linked_Status) && (timeout < PHY_READ_TO));
}

I have changed PHYAddress to match the real PHY address (0x00).
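To confirm the address really is 0x00 (on the KSZ8031 it depends on strap pins at power-up), a sketch that scans all 32 MDIO addresses and reads the PHY identifier registers can help. Register 2 (PHYID1) holds the vendor OUI bits; Micrel parts commonly report 0x0022 there, but check the KSZ8031RNL datasheet for the exact value. ETH_ReadPHYRegister() is the same SPL call that ETH_Init() already uses; the printf() is just placeholder output:

```c
/* Sketch: scan the MDIO bus to locate the PHY.
 * Reading 0xFFFF (or 0x0000) on every address usually means MDIO/MDC
 * are not wired or clocked correctly, not that the PHY is absent. */
uint16_t id1, id2;
uint32_t addr;

for (addr = 0; addr < 32; addr++)
{
  id1 = ETH_ReadPHYRegister(addr, 2);  /* PHY Identifier 1 (OUI bits) */
  id2 = ETH_ReadPHYRegister(addr, 3);  /* PHY Identifier 2 (model/rev) */
  if (id1 != 0xFFFF && id1 != 0x0000)
  {
    printf("PHY found at address %lu: ID1=0x%04X ID2=0x%04X\r\n",
           (unsigned long)addr, id1, id2);
  }
}
```

This separates the two failure modes: if the scan finds nothing, the problem is on the MDIO/MDC side (PA2/PC1); if it finds the PHY but PHY_BSR never reports link, the problem is on the RMII data or link side.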

I am trying to debug this. This is the first time I have debugged a PHY!

I have looked at some pins with a scope and I can confirm the following.

I have a 50 MHz clock on the REF_CLK pin.
When a packet is received from the network (normal traffic), I see RXD0 and RXD1 toggling. But when I look at RMII_TX_EN, RMII_TXD0 and RMII_TXD1, nothing changes on those pins. It looks like they are tied low.

I'm probably missing something there; for now I have no idea what!

Is there a technique (like a step-by-step procedure) to debug this?

Regards,

Jonathan
