HAL_I2C_Mem_Write_DMA sends only one byte

iforce2d
Associate III

The blocking version (HAL_I2C_Mem_Write) works nicely; now I'm trying to use HAL_I2C_Mem_Write_DMA. Here are the settings as given in the configuration tool.

[attached: three CubeMX screenshots showing the I2C2 and DMA configuration settings]

The resulting code generated by the configurator (I removed some comments for brevity):

static void MX_DMA_Init(void) 
{
  __HAL_RCC_DMA1_CLK_ENABLE();
  HAL_NVIC_SetPriority(DMA1_Channel4_IRQn, 0, 0);
  HAL_NVIC_EnableIRQ(DMA1_Channel4_IRQn);
}
 
static void MX_I2C2_Init(void)
{
  hi2c2.Instance = I2C2;
  hi2c2.Init.ClockSpeed = 400000;
  hi2c2.Init.DutyCycle = I2C_DUTYCYCLE_2;
  hi2c2.Init.OwnAddress1 = 0;
  hi2c2.Init.AddressingMode = I2C_ADDRESSINGMODE_7BIT;
  hi2c2.Init.DualAddressMode = I2C_DUALADDRESS_DISABLE;
  hi2c2.Init.OwnAddress2 = 0;
  hi2c2.Init.GeneralCallMode = I2C_GENERALCALL_DISABLE;
  hi2c2.Init.NoStretchMode = I2C_NOSTRETCH_DISABLE;
  if (HAL_I2C_Init(&hi2c2) != HAL_OK)
  {
    Error_Handler();
  }
}
 
void HAL_I2C_MspInit(I2C_HandleTypeDef* hi2c)
{
  GPIO_InitTypeDef GPIO_InitStruct = {0};
  if(hi2c->Instance==I2C2)
  {  
    __HAL_RCC_GPIOB_CLK_ENABLE();
    /**I2C2 GPIO Configuration    
    PB10     ------> I2C2_SCL
    PB11     ------> I2C2_SDA 
    */
    GPIO_InitStruct.Pin = GPIO_PIN_10|GPIO_PIN_11;
    GPIO_InitStruct.Mode = GPIO_MODE_AF_OD;
    GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_HIGH;
    HAL_GPIO_Init(GPIOB, &GPIO_InitStruct);
 
    /* Peripheral clock enable */
    __HAL_RCC_I2C2_CLK_ENABLE();
  
    /* I2C2 DMA Init */
    /* I2C2_TX Init */
    hdma_i2c2_tx.Instance = DMA1_Channel4;
    hdma_i2c2_tx.Init.Direction = DMA_MEMORY_TO_PERIPH;
    hdma_i2c2_tx.Init.PeriphInc = DMA_PINC_DISABLE;
    hdma_i2c2_tx.Init.MemInc = DMA_MINC_ENABLE;
    hdma_i2c2_tx.Init.PeriphDataAlignment = DMA_PDATAALIGN_BYTE;
    hdma_i2c2_tx.Init.MemDataAlignment = DMA_MDATAALIGN_BYTE;
    hdma_i2c2_tx.Init.Mode = DMA_NORMAL;
    hdma_i2c2_tx.Init.Priority = DMA_PRIORITY_LOW;
    if (HAL_DMA_Init(&hdma_i2c2_tx) != HAL_OK)
    {
      Error_Handler();
    }
 
    __HAL_LINKDMA(hi2c,hdmatx,hdma_i2c2_tx);
 
    /* I2C2 interrupt Init */
    HAL_NVIC_SetPriority(I2C2_EV_IRQn, 0, 0);
    HAL_NVIC_EnableIRQ(I2C2_EV_IRQn);
  }
 
}

The code where I'm calling this from:

void SSD1306_BlitLineBuffer(uint8_t row) {
 
    if ( row >= (SSD1306_SCREEN_HEIGHT / SSD1306_LINEBUFFER_HEIGHT) )
        return;
 
    SSD1306_WriteCommand(0xB0 + row);  // set page start address
    SSD1306_WriteCommand(0x00);        // lower column start address = 0
    SSD1306_WriteCommand(0x10);        // higher column start address = 0
 
    //HAL_I2C_Mem_Write(&SSD1306_I2C_BUS, SSD1306_ADDRESS, 0x40, 1, ssd1306Bytes, SSD1306_BUFFER_SIZE, HAL_MAX_DELAY);
 
    if ( HAL_OK != HAL_I2C_Mem_Write_DMA(&SSD1306_I2C_BUS, SSD1306_ADDRESS, 0x40, 1, ssd1306Bytes, SSD1306_BUFFER_SIZE) ) {
        Error_Handler();
    }
}
 
void HAL_I2C_MemTxCpltCallback(I2C_HandleTypeDef *hi2c) {
 
}

The commented out non-DMA call works fine.

The buffer size is 128 bytes.

When I place a breakpoint in the callback, the breakpoint is successfully triggered.
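For reference, a minimal sketch of how that completion callback could signal the main loop (the blitDone flag is an assumed name, not part of the original code):

```c
static volatile uint8_t blitDone = 1;  /* 1 = bus idle, safe to start a blit */

void HAL_I2C_MemTxCpltCallback(I2C_HandleTypeDef *hi2c) {
    if (hi2c->Instance == I2C2)
        blitDone = 1;   /* only set a flag in interrupt context; do work in the main loop */
}
```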

HAL_I2C_Mem_Write_DMA returns HAL_OK. When I step inside it, this is the state immediately before it returns:

[attached: debugger screenshot of the I2C handle state just before the function returns]

Note that XferCount starts as 128.

Here is the view from my logic analyzer, first the result from the blocking (non-DMA) call:

[attached: logic analyzer capture of the blocking transfer]

Now the result from the DMA version:

[attached: logic analyzer capture of the DMA transfer]

Only the MemAddress (0x40) is sent. The data buffer seems to be completely ignored.

When I place a breakpoint in the callback, XferCount has become zero, and the State and Mode are 'ready' and 'none' respectively. As far as I know this is what should be expected from normal operation.

[attached: debugger screenshot of the I2C handle state inside the callback]

The only other thing running is a pretty simple timer (no channels enabled) which is used to trigger this I2C call every 20 ms. When measured with the blocking version, the data transfer takes 3.2 ms, so there should be plenty of spare time.
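Since the transfer is started from a 20 ms timer tick, a defensive guard like the following avoids ever restarting a transfer that is still in flight (a sketch; SSD1306_TimerTick and currentRow are assumed names):

```c
static uint8_t currentRow = 0;

void SSD1306_TimerTick(void)               /* called from the 20 ms timer */
{
    /* HAL_I2C_GetState() reports HAL_I2C_STATE_READY once the previous
       DMA transfer (and its callback) has fully completed. */
    if (HAL_I2C_GetState(&SSD1306_I2C_BUS) != HAL_I2C_STATE_READY)
        return;                            /* previous transfer still busy */

    SSD1306_BlitLineBuffer(currentRow);
    currentRow = (currentRow + 1) % (SSD1306_SCREEN_HEIGHT / SSD1306_LINEBUFFER_HEIGHT);
}
```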

So I have no clue why the buffer itself is not being sent. Any suggestions would be helpful.

ACCEPTED SOLUTION
iforce2d
Associate III

Okay, I finally figured this out. The configuration tool was generating calls like this:

  MX_I2C2_Init();
  MX_DMA_Init();

...but these need to be in the opposite order, with the DMA one first. So all I had to do was reverse them.
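In other words, the corrected order looks like this (a sketch of the generated main(); the comments give my understanding of the mechanism, which I haven't verified against the reference manual):

```c
/* DMA must be initialised first: MX_I2C2_Init() runs HAL_I2C_MspInit(),
   which calls HAL_DMA_Init() and __HAL_LINKDMA(). If MX_DMA_Init() hasn't
   run yet, the DMA1 clock is still off, so HAL_DMA_Init()'s writes to the
   channel registers are lost and the channel never gets configured. */
MX_DMA_Init();   /* __HAL_RCC_DMA1_CLK_ENABLE() + DMA1_Channel4 IRQ */
MX_I2C2_Init();  /* HAL_I2C_MspInit() can now configure DMA1_Channel4 */
```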

I think the problem is that I first configured I2C without any DMA, and then later came back to enable the DMA setting. As you enable more features, the tool simply appends any newly required calls to the end of the generated list.

Unfortunately, the configuration tool also kept re-arranging these calls into the wrong order every time I regenerated code after this. The fix seems to be:

  1. completely disable the I2C peripheral (and I suppose others using DMA too...)
  2. generate code (both calls will be removed from the generated code)
  3. enable the I2C again with DMA request, interrupt etc
  4. generate code again

When the I2C and DMA are enabled and generated together, the two calls will be in the correct order and will remain so in future code generations.

So I lost six hours here, but probably still less time than if I didn't have the configuration tool at all. It would be nice to see this problem fixed sometime, though.


5 REPLIES

Which STM32?

Debug as your own code. Read out and check/post content of I2C and DMA registers.

JW


AFAIK this was fixed around the beginning of this year. What version of CubeMX are you using?

JW

Thanks again Jan. I was using 1.1.0 that I downloaded late last year. After trying 1.4.0 I can confirm that this issue is already fixed.

Thanks for confirming this.

JW