
STM32F446RE UART Tx DMA: The tx buffer gets mixed up

LCho.1
Associate II

Hello all,

I am very lost and would appreciate any advice. My end goal is to send an ADC buffer using HAL_UART_Transmit_DMA.

But I am stuck on transmitting a simple test array whose values run in order from 240 to 289.

#define ADC_BUF_LEN 50

uint16_t adc_buf[ADC_BUF_LEN];

for (uint16_t i = 0; i < ADC_BUF_LEN; i++)
{
    adc_buf[i] = 240 + i;
}

I am using HAL_UART_Transmit_DMA to transmit the data (the size argument is in bytes here, so the 50 uint16_t values are ADC_BUF_LEN*2 = 100 bytes).

HAL_UART_Transmit_DMA(&huart2, (uint8_t *)&adc_buf[0], ADC_BUF_LEN*2);

The PC receives the buffer and decodes the raw data.

Interestingly, the values themselves are correct, but their order gets mixed up on the receiving side of the UART.

The first number is the counter for the order in which the value was received; the second number is the decoded raw value:

counter : value
 0 : 267   <----- as you can see, it starts from 267, not 240
 1 : 268
 2 : 269
 3 : 270
 4 : 271
 5 : 272
 6 : 273
 7 : 274
 8 : 275
 9 : 276
10 : 277
11 : 278
12 : 279
13 : 280
14 : 281
15 : 282
16 : 283
17 : 284
18 : 285
19 : 286
20 : 287
21 : 288
22 : 289
23 : 511   <--------- instead of 240 it's 511
24 : 241   <---- after half of the buffer is completed it goes back to 241
25 : 242
26 : 243
27 : 244
28 : 245
29 : 246
30 : 247
31 : 248
32 : 249
33 : 250
34 : 251
35 : 252
36 : 253
37 : 254
38 : 255
39 : 0     <---- 256 expected
40 : 257
41 : 258
42 : 259
43 : 260
44 : 261
45 : 262
46 : 263
47 : 264
48 : 265
49 : 266

The only missing number is 240; the rest of the data is correct. Any advice on how I can fix it?

I am also decoding as big-endian. However, it seems the STM32 stores its data in little-endian.
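For reference, the Cortex-M4 in the STM32F446 is little-endian, so a 16-bit value such as 240 (0x00F0) goes out over the UART as the byte 0xF0 followed by 0x00. A tiny Python sketch of decoding that byte pair both ways (just an illustration, not my actual receiver code):

import struct

raw = b'\xf0\x00'                      # how 240 (0x00F0) leaves the STM32: low byte first

print(struct.unpack('<H', raw)[0])     # little-endian -> 240
print(struct.unpack('>H', raw)[0])     # big-endian    -> 61440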

#define ADC_BUF_LEN 50
uint16_t adc_buf[ADC_BUF_LEN];
/* USER CODE BEGIN 2 */

  for (uint16_t i = 0; i < ADC_BUF_LEN; i++)
  {
    adc_buf[i] = 240 + i;
  }
  HAL_UART_Transmit_DMA(&huart2, (uint8_t *)&adc_buf[0], ADC_BUF_LEN*2);

/* USER CODE END 2 */
....
 
static void MX_USART2_UART_Init(void)
{
 
  /* USER CODE BEGIN USART2_Init 0 */
 
  /* USER CODE END USART2_Init 0 */
 
  /* USER CODE BEGIN USART2_Init 1 */
 
  /* USER CODE END USART2_Init 1 */
  huart2.Instance = USART2;
  huart2.Init.BaudRate = 115200;
  huart2.Init.WordLength = UART_WORDLENGTH_8B;
  huart2.Init.StopBits = UART_STOPBITS_1;
  huart2.Init.Parity = UART_PARITY_NONE;
  huart2.Init.Mode = UART_MODE_TX_RX;
  huart2.Init.HwFlowCtl = UART_HWCONTROL_NONE;
  huart2.Init.OverSampling = UART_OVERSAMPLING_16;
  if (HAL_UART_Init(&huart2) != HAL_OK)
  {
    Error_Handler();
  }
  /* USER CODE BEGIN USART2_Init 2 */
 
  /* USER CODE END USART2_Init 2 */
 
}
 
/**
  * Enable DMA controller clock
  */
static void MX_DMA_Init(void)
{
 
  /* DMA controller clock enable */
  __HAL_RCC_DMA1_CLK_ENABLE();
  __HAL_RCC_DMA2_CLK_ENABLE();
 
  /* DMA interrupt init */
  /* DMA1_Stream6_IRQn interrupt configuration */
  HAL_NVIC_SetPriority(DMA1_Stream6_IRQn, 0, 0);
  HAL_NVIC_EnableIRQ(DMA1_Stream6_IRQn);
  /* DMA2_Stream0_IRQn interrupt configuration */
  HAL_NVIC_SetPriority(DMA2_Stream0_IRQn, 0, 0);
  HAL_NVIC_EnableIRQ(DMA2_Stream0_IRQn);
 
}

I would really appreciate any help with this issue. I am a newbie, so it would be very helpful if you could point me to a reference if there is anything I should be aware of.

3 REPLIES
KnarfB
Principal III

How do you read and decode the chars on the PC side? Sending arbitrary bytes through a serial line can cause issues with "intelligent" software or drivers. I avoid that when I can and use printable ASCII characters where possible. Yes, that adds some overhead, but it eases debugging with a terminal program.
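For example, if the firmware were changed to print each value as ASCII text ending in a newline, the PC side could be roughly this simple (a sketch assuming pyserial; the port name is only a placeholder):

import serial

ser = serial.Serial('/dev/ttyACM0', 115200, timeout=1)        # placeholder port name

while True:
    line = ser.readline().decode('ascii', errors='replace').strip()
    if line:
        print(int(line))                                       # e.g. "240" -> 240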

hth

KnarfB

LCho.1
Associate II

Thank you for your answer!

Could you elaborate on what you mean by "arbitrary bytes"? Is it because I did not set the bytes' size?

I am using Python to decode on the PC side

The code reads the binary data with

data = serialPort.readline()

and the received data looks like

b'\x01\x0b\x01\x0c\x01\r\x01\x0e\x01\x0f\x01\x10\x01\x11\x01\x12\x01\x13\x01\x14\x01\x15\x01\x16\x01\x17\x01\x18\x01\x19\x01\x1a\x01\x1b\x01\x1c\x01\x1d\x01\x1e\x01\x1f\x01 \x01!\x01\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x01\x01\x01\x02\x01\x03\x01\x04\x01\x05\x01\x06\x01\x07\x01\x08\x01\t\x01\n'

Then I decode each pair of bytes with

value = ((data[ind+1] & 0xff) << 8) + (data[ind] & 0xff)

while looping over every other index.
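In other words, the loop is roughly this (a simplified sketch of my decoding, assuming data holds the whole binary frame):

values = []
for ind in range(0, len(data) - 1, 2):
    value = ((data[ind + 1] & 0xff) << 8) + (data[ind] & 0xff)   # second byte treated as the high byte
    values.append(value)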

How would I fix it using a terminal program, then?

LCho.1
Associate II

One more question,

Value 256, or any multiple of 256 (512, 768, ...), seems to come out wrong: after 255 (0xFF in hex),

the next value should be 256 (0x100 in hex), but the leading 1 gets dropped (as you can see at counter 39) and it prints 0.

How can I fix this problem? (If I change the decoding to little-endian, it solves this case, but then the other values get scrambled.)
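One thing that might be related (just a guess on my part): 266 is 0x010A, and its low byte 0x0A is the newline character, so serialPort.readline() could stop in the middle of that value when reading raw binary. A small sketch that shows the byte pattern and instead reads a fixed byte count with read() (the port name and frame length are assumptions):

import struct
import serial

print(struct.pack('<H', 266))                  # b'\n\x01' -- the 0x0A byte is '\n'

ser = serial.Serial('/dev/ttyACM0', 115200, timeout=2)   # placeholder port name
raw = ser.read(50 * 2)                         # 50 uint16_t values = 100 bytes
if len(raw) == 100:
    values = struct.unpack('<50H', raw)        # decode all 50 values as little-endian
    print(values)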