
UART synchronisation issue with LCM GUI display on NUCLEO-H743ZI2 board

sushma
Associate III

Dear Team,

I am interfacing the NUCLEO-H743ZI2 board with an LCM display (GUI) over UART, using the interrupt method.

I can transmit data to the LCM and it is displayed properly on the LCM display.

Now I am trying to send commands from the LCM display to the STM32 controller.

I pressed the button on the LCM display many times and observed that sometimes the event is triggered and sometimes it is not. I think there is a synchronisation problem when receiving from the LCM display on the STM32 controller. While debugging I can see the events are triggered only sometimes.

I tried changing the system clock to different values, but it did not work. I also tried increasing and decreasing the delays at different places, which did not work either.

I tried copying the receive buffer into another buffer and then clearing the receive buffer, but this did not work either.

Please find the receive function below for reference:

 

extern UART_HandleTypeDef huart2;   /* UART connected to the LCM display */
extern uint8_t data_u[100];         /* buffer filled by HAL_UART_Receive_IT() */
uint8_t j, max;

/* Returns 1 and copies the next received byte into *data,
   or returns 0 if no byte is currently available. */
uint8_t ReadSerialByte(uint8_t *data)
{
    if (max == 0) {
        /* number of bytes received so far in the current transfer */
        max = huart2.RxXferSize - huart2.RxXferCount;
        if (max == 0) return 0;
    }
    if (j >= max) {
        /* all buffered bytes consumed: reset and re-arm reception */
        j = 0;
        max = 0;
        HAL_UART_Receive_IT(&huart2, data_u, 100);
        return 0;
    }
    *data = data_u[j];
    j++;
    return 1;
} /* ReadSerialByte */

 

The LCM document states the following:

your serial receive function should timeout after 0.1s (= SERIAL_ATOMIC_TIMEOUT defined in "config.master.h").

All other timeouts must be multiples of this (see "config.master.h" again):

 

#define SERIAL_ATOMIC_TIMEOUT 1   /*!< in tenth of seconds (default 0.1s); the
                                       timeout of the serial port should be
                                       configured to the same time (0.1s) */
#define SERIAL_NORMAL_TIMEOUT 20  /*!< in tenth of seconds (default 2s), must
                                       be a multiple of the
                                       SERIAL_ATOMIC_TIMEOUT */
#define SERIAL_FLASH_TIMEOUT  600 /*!< in tenth of seconds (default 60s), must
                                       be a multiple of the
                                       SERIAL_ATOMIC_TIMEOUT */

 

For every "normal" command you should receive a response package after 2s (= SERIAL_NORMAL_TIMEOUT).

You have to wait for the response before sending the next command.

 

Please guide me on how to resolve the synchronisation issue using the above LCM information.

Do I need to use a timer of 0.1 s for the receive function? If yes, could you please help me understand how to write it in code?

 

ACCEPTED SOLUTION
sushma
Associate III

Dear All,

Thank you so much for your guidance.

I tried the polling method, receiving each byte with a 100 ms timeout, instead of receiving with the interrupt method. It worked for me.
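
For reference, a minimal sketch of that polling approach (assuming the same huart2 handle and ReadSerialByte() name as in my earlier code):

/* Minimal sketch of the polling approach: each call blocks for at most 100 ms,
   matching SERIAL_ATOMIC_TIMEOUT = 0.1 s. */
extern UART_HandleTypeDef huart2;

uint8_t ReadSerialByte(uint8_t *data)
{
    /* HAL_UART_Receive() returns HAL_OK only if one byte arrives within 100 ms */
    if (HAL_UART_Receive(&huart2, data, 1, 100) == HAL_OK)
        return 1;   /* byte received */
    return 0;       /* timeout, no byte available */
}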

Now, I do not have synchronisation issues.

 

Regards,

Sushma

 

 


13 REPLIES

How have you verified that the LCM is actually transmitting?

Have you checked your baud rates?

Have you checked the STM32 UART's status flags - are you getting errors (e.g., framing, overrun)?

 

Please use this button to properly post source code:

[screenshot of the forum's code-posting button]

 

 

Dear Andrew,

Thank you for the reply.

Yes, I have verified it on an oscilloscope and it is transmitting.

I set the baud rate for both the LCM and the STM32 to 115200 bits/s.

I am not receiving any errors.


@sushma wrote:

I have verified it on an oscilloscope and it is transmitting.


Have you verified that it is transmitting valid data?

 


@sushma wrote:

I set the baud rate for both the LCM and the STM32 to 115200 bits/s.


And have you verified the actual speeds on the wires?

 

Pavel A.
Evangelist III

You use RX with interrupts, HAL_UART_Receive_IT - but the completion callback function is nowhere in your code.

Do you know how to use HAL_UART_Receive_IT correctly? Need some example?
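
For illustration, a minimal sketch of one common pattern (single-byte reception, re-armed in the callback; the huart2 handle name is an assumption):

extern UART_HandleTypeDef huart2;

volatile uint8_t rx_byte;        /* last byte received */
volatile uint8_t rx_ready = 0;   /* set in the callback, cleared by the consumer */

void StartReception(void)
{
    /* arm reception of a single byte; the callback fires when it has arrived */
    HAL_UART_Receive_IT(&huart2, (uint8_t *)&rx_byte, 1);
}

/* called by the HAL from the UART IRQ when the requested data has been received */
void HAL_UART_RxCpltCallback(UART_HandleTypeDef *huart)
{
    if (huart == &huart2) {
        rx_ready = 1;
        /* re-arm immediately so the next byte is not missed */
        HAL_UART_Receive_IT(&huart2, (uint8_t *)&rx_byte, 1);
    }
}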

Do I need to use timer for 0.1sec to receive function. 

No, you only need to detect the absence of a response within the specified time (~2 s). The normal 1 ms HAL tick can be used for this.
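
A rough sketch of that check using HAL_GetTick() (the WaitForResponseByte() helper name is hypothetical):

/* Wait up to timeout_ms (e.g. 2000 ms = SERIAL_NORMAL_TIMEOUT) for a response byte,
   using the 1 ms HAL tick; ReadSerialByte() is the reader from the original post. */
uint8_t WaitForResponseByte(uint8_t *data, uint32_t timeout_ms)
{
    uint32_t start = HAL_GetTick();

    while ((HAL_GetTick() - start) < timeout_ms) {
        if (ReadSerialByte(data))
            return 1;    /* response byte arrived in time */
    }
    return 0;            /* no response within the timeout: treat the command as failed */
}

/* usage: if (!WaitForResponseByte(&b, 2000)) { report the missing response } */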

TDK
Guru

If you want UART to capture until idle, use the HAL_UARTEx_ReceiveToIdle_* functions instead.
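
A minimal sketch of the interrupt variant (buffer size and names are assumptions):

extern UART_HandleTypeDef huart2;

uint8_t rx_buf[100];
volatile uint16_t rx_len = 0;    /* length of the last received burst */

void StartReceiveToIdle(void)
{
    /* reception ends when the buffer is full OR the RX line goes idle */
    HAL_UARTEx_ReceiveToIdle_IT(&huart2, rx_buf, sizeof(rx_buf));
}

/* called when the transfer completes or an idle line is detected;
   Size is the number of bytes actually received */
void HAL_UARTEx_RxEventCallback(UART_HandleTypeDef *huart, uint16_t Size)
{
    if (huart == &huart2) {
        rx_len = Size;                                                 /* hand off to the main loop */
        HAL_UARTEx_ReceiveToIdle_IT(&huart2, rx_buf, sizeof(rx_buf));  /* re-arm for the next burst */
    }
}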


ST HAL and UART interrupt schemes are extremely clunky, inefficient, and don't fit neatly into many processing paradigms.

If you're dealing with other devices, e.g. modems, GNSS, etc., which can constantly or randomly stream data of varying length and periodicity, it makes a lot of sense to implement your own ring/FIFO buffering, with enough depth to cover your processing bandwidth and abilities. Then the interrupts need only deal with sending/receiving bytes really efficiently, and processing can be deferred to main processing tasks or loops independently of the interfaces. This also has the benefit of being highly portable and platform-agnostic. A rough sketch of such a ring buffer is below.
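
Rough sketch of a single-producer/single-consumer ring buffer (sizes and names arbitrary; the UART RX ISR would call RxRingPut() for each received byte):

#define RX_RING_SIZE 256                     /* power of two, sized for the expected burst rate */

static volatile uint8_t ring[RX_RING_SIZE];
static volatile uint16_t head, tail;         /* head: written by the ISR, tail: read by the main loop */

/* called from the UART ISR for every received byte */
void RxRingPut(uint8_t byte)
{
    uint16_t next = (head + 1) & (RX_RING_SIZE - 1);
    if (next != tail) {                      /* drop the byte if the ring is full */
        ring[head] = byte;
        head = next;
    }
}

/* called from the main loop; returns 1 if a byte was available */
int RxRingGet(uint8_t *byte)
{
    if (tail == head)
        return 0;                            /* empty */
    *byte = ring[tail];
    tail = (tail + 1) & (RX_RING_SIZE - 1);
    return 1;
}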


Yes, I know how HAL_UART_Receive_IT works. I have written the callback function, but it is empty, because in the ReadSerialByte() code above I re-enable the interrupt after all the bytes have been read.

I have looked at it on the oscilloscope; the data is binary (I think the bits are correct) and it is transmitting properly. But I did not check the actual speeds on the wires.

Could you please help me understand how to check the actual speeds on the wires?

@Pavel A. 

Could you please guide me on where I should add the 1 ms clock? Should I add it in main(), receive(), or transmit()?