2019-05-10 08:20 AM
Hi All,
Our board has a touch screen with a resolution of 480 x 272, surrounded by a touch-sensitive area containing 11 touch-sensitive buttons. The touch controller handles both the screen and the touch buttons, and we have working C code handling this. We also have 64 Mb of SDRAM with a 16-bit data interface. The on-board micro is the STM32F429. So far all the code is written in C using Keil uVision.
Is handling our custom touch buttons just a matter of creating a custom callback for the touch interface handler? If not, please suggest how it might be handled.
The aim is to speed up GUI development and reuse our existing non-GUI code.
Kind Regards
MikeZ
Solved! Go to Solution.
2019-05-13 12:53 AM
Hi @kweisi50,
The Touch Controller for the display is different from the "custom" buttons which should just be treated as any other peripheral input. I'm just going to outline a general solution here and you can tell me if it doesn't make sense to you.
Every VSYNC from your display, the application gets ticked. During that tick, TouchGFX samples the touch controller for new coordinates. It also ticks other elements of the application, such as Views (via a virtual method you can override if you're interested) and the Model, which is the heart of your application.
What we usually do when sampling other peripherals alongside the GUI task is to have a separate OS task, with some priority, that samples these custom buttons and sends a message through an OS message queue, which is then checked in Model::tick(). From there you decide what to do with this value (or several values, if multiple buttons can be pressed at the same time). You can pass it to the active Screen (View/Presenter) through the ModelListener interface.
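The pattern above can be sketched like this. It's a minimal, desktop-runnable illustration, not a drop-in implementation: ButtonEvent, hardwareButtonChanged, and the plain std::queue standing in for the RTOS message queue are all hypothetical names I've chosen for the example; only Model, ModelListener, and Model::tick() mirror the shape of the real TouchGFX classes.

```cpp
#include <cstdint>
#include <queue>

// Hypothetical event carrying the state of one of the 11 hardware buttons.
struct ButtonEvent { uint8_t buttonId; bool pressed; };

// Stand-in for the OS message queue; in the real system the button-sampling
// task would post into a FreeRTOS/CMSIS queue instead, and Model::tick()
// would poll it non-blocking.
static std::queue<ButtonEvent> buttonQueue;

// Mirrors the ModelListener idea: the active Presenter implements this,
// so the Model never needs to know which Screen is currently shown.
struct ModelListener {
    virtual void hardwareButtonChanged(uint8_t id, bool pressed) = 0;
    virtual ~ModelListener() = default;
};

class Model {
public:
    void bind(ModelListener* l) { listener = l; }

    // Called once per frame by the framework; drain everything the
    // sampling task has queued since the last tick and forward it.
    void tick() {
        while (!buttonQueue.empty()) {
            ButtonEvent e = buttonQueue.front();
            buttonQueue.pop();
            if (listener) listener->hardwareButtonChanged(e.buttonId, e.pressed);
        }
    }

private:
    ModelListener* listener = nullptr;
};
```

The point of routing everything through Model::tick() is that button events arrive synchronously with the GUI frame, so the active View/Presenter never races the rendering.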
Check out one of the stickies in this subforum for a YouTube video and info on how to integrate peripheral data into your TouchGFX application.
Let me know!
/Martin
2019-05-11 01:11 AM
Sounds about right, if your display manager can handle that. Otherwise you may need to mess with the scene graph (or whatever it's called in your system).
2019-05-13 03:01 AM
Hi Martin/All,
Thanks for your response. If I could ask one other question: since I had already created windows for my application using another package before realising its limitations in the iteration cycle, I thought I would start there with TouchGFX.
I started with an empty window, set the size, then placed a background image. That was fine. Next I placed a button and a slider widget, but I found I could not resize them. How do I do this?
Should I have started another topic?
Kind Regards
MikeZ
2019-05-13 03:59 AM
Hi again,
Starting a new topic would probably be best, for traceability's sake.
Thanks.
/Martin
2019-05-13 04:05 AM
I renamed the topic to be a bit more concise now that we know more. I hope you don't mind.
/Martin