
Behind FreeRTOS and TouchGFX Interaction

Andi
Associate II

I am trying to debug a touch problem, but I cannot see how FreeRTOS calls these functions periodically. See the call-stack picture. In the code the HAL methods are all virtual, and I can't find the place where they are called. It seems something is hidden, either in the source or in my understanding.

[Attachment 0690X00000ArjkAQAR.png: call-stack screenshot]

It would be very helpful to get a description of the concept behind the bindings between the graphics library and the operating system.

Martin KJELDSEN
Chief III

You're right that the core HAL of TouchGFX is closed source. I'll give you an overview related to your stack trace, and you can tell me if you need more:

  1. The main event loop, HAL::taskEntry(), waits for a VSYNC semaphore through OSWrappers::waitForVsync(). The OSWrappers interface is something every OS port must implement, even "No OS".
  2. Once the VSYNC semaphore is signaled by an external event (VSYNC, tearing-effect (TE) signal, etc.), the HAL "ticks" and renders its current frame.
  3. While doing its tick, the HAL also samples touch, if such a controller is configured. This is what you're seeing in STM32F7TouchController::sampleTouch().

What is the issue you're seeing?

/Martin

Andi
Associate II

Thank you for the answer.

I came across this while debugging a problem with the button event. See my other post, “The display is not reacting on touch“.

I tried to understand the library, but I was confused because I did not know that code was hidden this way, and I could not find any description of it.

In the end it cost me a lot of time.

I assume we are talking about the touchgfx-float-abi-hard library? Is that right?

For similar situations it would be very useful to have an overview of the method names inside the library.

A description of the library and how it interacts with the customized part of the code, like you gave in your first reply, would also be very helpful.

I assume there are more dependencies than just the button event.

Martin KJELDSEN
Chief III

I replied to your other post 5 days ago, but you haven't replied back :) Can we start there?

Thanks.

/Martin

Andi
Associate II

My question about the library is more general, for understanding. I believe that information would be worth having. The time I have spent on this burden could keep other customers from using this great solution. Therefore I recommend a tutorial or webinar specific to this topic.

We can switch to solving the button problem. According to my timetable, I don't have to work on it for the next 2 to 3 weeks.

Martin KJELDSEN
Chief III

This documentation doesn't exist yet, but it will soon: we're working on something else that will make it easier for people to do custom ports, so an "Understanding TouchGFX" article will make sense.

I assumed you were asking here because of the issues you had in the other thread, so I thought it would be more relevant to work on that than to explain the call flow in TouchGFX. For the button issue, there is no dependency other than providing a custom touch controller with a sample() function. I really want you to try what I suggested in the other post; it may make sense to you then.

Describing the internals of the HAL is not simple, so I'm not really sure I want to do that here: I may not hit the spot you're after, and it would take a long time. In the end, the exposed parts should be very simple for the user to interact with, be it screen definitions (Designer) or supplying a touch controller.

It has nothing to do with FreeRTOS. You can try running without an RTOS if you want to.

The flow I specified is the major driver behind TouchGFX and the one you should worry about. If you're ticking TouchGFX, your touch controller is being sampled. You showed that it returns true, I believe, indicating a touch, and you mentioned that the coordinates seemed correct.

What you should do now is go to that other post, implement handleClickEvent() for the view, and place a box wherever the touch controller thinks you're pressing. That will give you an idea of what's happening. Maybe the touch coordinates are inverted, offset, or just plain wrong, which would mean your buttons never activate.

/Martin