
I2C slave no ack when not in debugging mode.

laserfaser
Associate II

Dear Community.

Short Rant (sorry, need to let off some steam):

I wonder why there are not more questions about I2C slave mode here. What started as a "cool, I will use the HAL, greater people than me have written these libs" has now turned into my worst nightmare.

I will try to keep it short: the HAL seems to be useless when trying to make an I2C slave work. I still had to read the reference manual, the errata, and every question you can find by googling "i2c, slave, not working, no address match, only works with debugger, slave sends back address to master as first byte, ..."

I have already contacted a priest to come with holy water; I am really at the point where I have no clue what else I can do. I used to use PICs (Microchip) as controllers, but I heard so many good things about the STM32 that I thought: let's do the next project with an STM32F103R8T. Now I am this close to just blaming the hardware as garbage.

My problem now:

I am trying to make an I2C slave that returns a fixed number of bytes when the master addresses it with a read (7 address bits + 1 read bit). In debug mode it works; in a normal run, it doesn't.

I have thrown out almost all of the HAL functionality and wrote most of the I2C code myself. I only use the HAL to initialize everything; from there on (in the interrupts) I do things myself.

The problem is: now it (kind of) works, and I get the expected bytes on the bus from my slave to the master, BUT ONLY WITH THE DEBUGGER CONNECTED. When it is not connected, my slave simply does not ACK the addressing frame coming from the master.

main.c:

int main(void)
{

  /* USER CODE BEGIN 1 */

  /* USER CODE END 1 */

  /* MCU Configuration--------------------------------------------------------*/

  /* Reset of all peripherals, Initializes the Flash interface and the Systick. */
  HAL_Init();

  /* USER CODE BEGIN Init */

  /* USER CODE END Init */

  /* Configure the system clock */
  SystemClock_Config();

  /* USER CODE BEGIN SysInit */

  /* USER CODE END SysInit */

  /* Initialize all configured peripherals */
  MX_GPIO_Init();
  MX_SPI1_Init();
  MX_TIM3_Init();
  MX_I2C2_Init();
  MX_TIM1_Init();
  /* USER CODE BEGIN 2 */
  //__HAL_TIM_SET_AUTORELOAD(&htim1,100000); //100.000 µsec
  //HAL_TIM_Base_Start_IT(&htim1);
  HAL_Delay(5);
  HAL_I2C_EnableListen_IT(&hi2c2);



  /* USER CODE END 2 */

  /* Infinite loop */
  /* USER CODE BEGIN WHILE */
  while (1)
  {
    uint8_t a = 0;
    a++;
    uint8_t b = a;
    /* USER CODE END WHILE */

    /* USER CODE BEGIN 3 */
  }
  /* USER CODE END 3 */
}

I shortened the EnableListen_IT function so that it does no state-related stuff but only sets the PE bit, sets up the slave to ACK incoming messages, and enables the corresponding interrupts:

HAL_StatusTypeDef HAL_I2C_EnableListen_IT(I2C_HandleTypeDef *hi2c)
{


    /* Check if the I2C is already enabled */

      /* Enable I2C peripheral */
      __HAL_I2C_ENABLE(hi2c);


    /* Enable Address Acknowledge */
    SET_BIT(hi2c->Instance->CR1, I2C_CR1_ACK);

    /* Enable EVT and ERR interrupt */
    __HAL_I2C_ENABLE_IT(hi2c, I2C_IT_EVT | I2C_IT_ERR);

    return HAL_OK;

}

The two interrupts immediately jump to my own code:

void I2C2_EV_IRQHandler(void)
{
  /* USER CODE BEGIN I2C2_EV_IRQn 0 */
  JOJO_InterruptHandler(&hi2c2);
  /* USER CODE END I2C2_EV_IRQn 0 */
  //HAL_I2C_EV_IRQHandler(&hi2c2);
  /* USER CODE BEGIN I2C2_EV_IRQn 1 */

  /* USER CODE END I2C2_EV_IRQn 1 */
}

/**
  * @brief This function handles I2C2 error interrupt.
  */
void I2C2_ER_IRQHandler(void)
{
  /* USER CODE BEGIN I2C2_ER_IRQn 0 */
  uint16_t sr1 = hi2c2.Instance->SR1;
  hi2c2.Instance->SR1 = ~(1 << 10) & sr1; /* clear AF (SR1 bit 10) */
//  hi2c2.Instance->CR1 = 1 << 15;
//  hi2c2.Instance->CR1 = 1 << 0;
  /* USER CODE END I2C2_ER_IRQn 0 */
  //HAL_I2C_ER_IRQHandler(&hi2c2);
  /* USER CODE BEGIN I2C2_ER_IRQn 1 */

  /* USER CODE END I2C2_ER_IRQn 1 */
}

In the error interrupt I only reset the AF flag (which is not actually an error; it only signals that the master "has had enough").

My actual slave routine (which is called on every event interrupt) checks whether the last frame from the master was an addressing call or just clocking out bits from the slave. I keep some debugging variables in there (to build a history of interrupts and a "log" of what happened in previous calls to the interrupt routine, so I can do a post-mortem).

void JOJO_InterruptHandler(I2C_HandleTypeDef *hi2c)
{
  uint32_t sr1itflags;
  volatile uint32_t sr2itflags;
  sr1itflags = hi2c->Instance->SR1;
  sr2itflags = hi2c->Instance->SR2;

//  if(sr1itflags = )
//  sr1itflags = hi2c->Instance->SR1;
//  sr1itflags = hi2c->Instance->SR1;
//  sr1itflags = hi2c->Instance->SR1;
  call_count++;
  count++;
  debug_arr[count]=0xFF;
  uint8_t already_sent = 0;
  if(count > 22){
	  //sr1itflags = 5;
  }

  if(IS_BIT_SET(sr1itflags,ADDR_FLAG)){
	  //Slave has just been addressed. Load Data to the DR reg.

	  //sr2itflags = hi2c->Instance->SR2; //read SR2 after new Data has been written to DR to clear ADDR-Flag

	  //sr2itflags = hi2c->Instance->SR2;
	  hi2c->Instance->DR = call_count;
	  already_sent = 1;
	  //sr1itflags = hi2c->Instance->SR1;
	  count++;
	  debug_arr[count] = 1;
	  //return;

  }
  if(IS_BIT_SET(sr1itflags, TXE_FLAG)){
	  if(already_sent == 0){
		  hi2c->Instance->DR = call_count; //write data to DR reg as it has been loaded to shift reg
		  already_sent = 1;
	  }
	  count++;
	  debug_arr[count] = 2;
  }
  if(IS_BIT_SET(sr1itflags, BTF_FLAG)){
	  //hi2c->Instance->DR = 0b11001100;
	  count++;
	  debug_arr[count] = 3;
  }
  if(IS_BIT_SET(sr1itflags, AF_FLAG)){
	  //TODO: Communication done.
	  count++;
	  debug_arr[count] = 4;
	  hi2c->Instance->SR1 = 1 << AF_FLAG;
  }
  if(IS_BIT_SET(sr1itflags, OVF_FLAG)){
	  count++;
	  debug_arr[count] = 5;


  }
  if(count > 60){
	  count ++;
	  count --;
  }
  return;

}

I first read SR1 and then SR2 to reset the ADDR flag; then, if the ADDR or TxE flag was set, I load the DR register (all the count++ and debug_arr stuff is just for debugging).

When running the whole thing in debug mode it seems to work as expected: first an interrupt with ADDR and TxE set occurs; after that, only TxE and BTF are set in the following calls to the interrupt routine. After the master has read all the bytes, it NACKs and I get an AF error; as I said, I reset the AF flag and everything works as expected.

All of this is done with clock stretching ENABLED. When I disable it, nothing works, not even in debug.

WHY DOES IT ONLY WORK IN DEBUG? WHEN RUNNING NORMALLY, THE SLAVE DOES NOT ACK THE ADDRESSING FRAME.
