
TouchGFX Resources for Building Simulator Application

WrenchInTheWorks
Associate II

Hello wonderful ST Community!

Before I get into the question, a bit of background on this project. I'm creating a button-panel HID gaming device for the space sim Elite Dangerous. Using its somewhat limited API, I'm hoping to feed info about the game state back to a 9-inch screen on the HID device through USB CDC running as a composite device alongside HID. Before I start getting into hardware dev, I want to create a sim of the GUI to test two things:

- size of display needed to show what I want to show

- getting data out of the game (will be done with a USB CDC app/driver/python (I have not quite got this far) running on the host computer)

It would be amazing to test these two aspects using the TouchGFX simulator. I know it's theoretically possible to send data through some medium such as sockets, but how does this work in practice?

Is there some good documentation around how the simulator works and how I can feed data into my sims?

I've been investigating using VS Code since TouchGFX already has compatibility, but how does the simulator work differently from running on an MCU? I noted the lack of an RTOS in the simulator even though it references the same GUI files.

 

Cheers,

Michael 

1 ACCEPTED SOLUTION

GaetanGodart
ST Employee

Hello @WrenchInTheWorks ,

 

I am unsure how you want to get the input, but once you figure that out, make sure to include your files in the simulator Makefile like so:

ADDITIONAL_SOURCES += myOwnLib/src/myLib.cpp
ADDITIONAL_INCLUDE_PATHS += myOwnLib/include
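As a concrete illustration of what such an added file might contain, here is a minimal sketch of a game-state parser for the use case in the question. The file and function names follow the placeholder paths in the Makefile lines above, and the `key=value;key=value` line protocol is an invented assumption, not anything Elite Dangerous or TouchGFX defines:

```cpp
// myOwnLib/src/myLib.cpp -- placeholder name taken from the Makefile lines above.
// Parses one line of a hypothetical "key=value;key=value" game-state protocol
// so GUI code (on the simulator or the target) can query individual fields.
#include <map>
#include <sstream>
#include <string>

std::map<std::string, std::string> parseGameState(const std::string& line)
{
    std::map<std::string, std::string> fields;
    std::istringstream stream(line);
    std::string pair;
    while (std::getline(stream, pair, ';'))   // split the line on ';'
    {
        const std::size_t eq = pair.find('=');
        if (eq != std::string::npos)          // silently skip malformed pairs
        {
            fields[pair.substr(0, eq)] = pair.substr(eq + 1);
        }
    }
    return fields;
}
```

Because the file is plain C++ with no hardware dependencies, the same parser can be compiled into both the simulator and the MCU firmware; only the transport (socket on the PC, USB CDC on the device) differs.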

 

You can use the FrontendApplication to simulate data input if you are OK with that.

 


how does the simulator work differently to running on an MCU

Not very differently: an MCU is basically the same as a CPU; you just tell it which operations to perform. In both cases, you ask to fill a framebuffer based on the TouchGFX framework. Then the MCU transfers the framebuffer to the display, while the simulator displays a window using SDL.
Similarly, the display sends touch coordinates through I2C, while the SDL window sends mouse clicks.

 

We have designed a few games; you can find them in the "Demo" section, but I am not sure TouchGFX is the right tool if you do not plan on putting your game on a microcontroller.

 

Regards,

Gaetan Godart
Software engineer at ST (TouchGFX)


6 REPLIES 6

Good day @GaetanGodart,

Apologies, I did not convey my system very well, so I have made a diagram: EliteHID_Digram.jpg

Python is a placeholder in the final system for now, until I figure out how to do it nicely, but I will use Python in the test system. So if I understand correctly, I can just treat the simulator as a C++ program and work with it similarly to how I would usually modify files when running on an MCU? How does that work with FreeRTOS? If I add a new file that will act as the link between the simulator and Python, do I need to put it into a FreeRTOS thread? My experience with C and C++ is unfortunately limited to embedded systems.

 

Cheers,

Michael

 

Hello @WrenchInTheWorks ,

 

No problem!

 

In the final system, why do you want to run the game on a PC? Since an STM32 is basically a mini PC (MCU), you could run the controls, game and display on a single chip if the game is simple enough.

 


So if I understand correctly I can just treat the simulator as a C++ program

The simulator is an executable file, but every TouchGFX project has a different simulator executable, because the simulator just asks the computer to render that specific TouchGFX application and display it in an SDL window.

 


work with it similar to how I would usually modify files as if it were running on an MCU?

Yes, you can add files as you want, so you can write custom code that runs your game, renders the image, etc.
But then you have to fill the TouchGFX framebuffer with the data and just use the simulator to display that framebuffer in an SDL window. Again, TouchGFX is used to add elements, render them in an efficient manner optimized for embedded use, and store the image in a framebuffer; displaying the framebuffer is then up to the user.
But then you miss the point of TouchGFX; you are not really using TouchGFX at that point. You could just read the SDL documentation and display a few pixels in a window yourself; it is not that hard.

 


How does that work with FreeRTOS?

If you use the simulator, it is the computer that runs the program, so the computer already has its own resource management system. Actually, you don't even need FreeRTOS to run TouchGFX on an MCU.
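To make this concrete: on the host, a plain `std::thread` can play the role a FreeRTOS task would play on the MCU. A minimal sketch, assuming a receiver thread (standing in for a socket or serial read loop; all names here are illustrative, not TouchGFX API) that publishes the newest value through an atomic which the periodic GUI tick simply polls:

```cpp
// Sketch: host-side stand-in for a FreeRTOS task. The receiver thread updates
// shared state; the GUI tick (e.g. the application's periodic tick) polls it.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<int> latestFuelLevel{0};   // shared state, written by the receiver

// Stand-in for a socket/serial read loop feeding data into the simulator.
void receiverLoop()
{
    for (int sample = 1; sample <= 5; ++sample)
    {
        latestFuelLevel.store(sample);   // publish the newest value
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
}

// Stand-in for the periodic GUI tick: read whatever value is newest.
int pollFuelLevel()
{
    return latestFuelLevel.load();
}
```

Polling an atomic from the tick avoids locking the GUI thread on I/O, which mirrors how data would typically be handed from an interrupt or RTOS task to the GUI task on the real MCU.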

 

You can also fetch the keyboard input in TouchGFX (it is very easy when running the simulator).

 

For the test system, you can either design your game with the TouchGFX framework (which is not designed for that, but it is possible), like we did for the knight-vs-zombie and ninja-vs-robot games (we also made a 2048 game and others):

GaetanGodart_0-1747813140026.png

or just don't use TouchGFX: design your game however you want and render a window on your computer screen using a tool actually suited to that.

 

Then, for the final system, you can use TouchGFX on the lower STM32U5 that (I assume) has to show the game on a screen. To do that, you would have to send the image X times per second to the STM32U5 and then show it on the display you choose.
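It is worth checking the link budget before committing to streaming full frames. A rough calculation, where the resolution, color depth and frame rate are assumptions for illustration (the thread never states the actual panel specs):

```cpp
// Rough link-budget check for streaming full frames to the display MCU.
// Resolution, color depth and frame rate are assumed values, not from the thread.
constexpr long long width           = 800;  // assumed panel width (pixels)
constexpr long long height          = 480;  // assumed panel height (pixels)
constexpr long long bytesPerPixel   = 2;    // RGB565
constexpr long long framesPerSecond = 30;

constexpr long long bytesPerFrame = width * height * bytesPerPixel;      // 768,000 bytes
constexpr long long bitsPerSecond = bytesPerFrame * framesPerSecond * 8; // 184,320,000 bit/s
```

At roughly 184 Mbit/s under these assumptions, full-frame streaming is far beyond what full-speed USB CDC (about 12 Mbit/s) can carry, which is one practical argument for rendering on the MCU with TouchGFX and sending only compact game-state values over the CDC link instead.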

 

 

Regards,

Gaetan Godart
Software engineer at ST (TouchGFX)

Hello @WrenchInTheWorks ,

 

Have you been able to move forward on your project?

 

Regards,

Gaetan Godart
Software engineer at ST (TouchGFX)

Hello @GaetanGodart,

Thank you for checking in! We're still not quite on the same page about the project, but that's OK. I'm building a control device similar to the below image, with the screen running TouchGFX; Elite Dangerous is a fully fledged PC game that my device will interface to and display info from using a limited game API.

WrenchInTheWorks_0-1749029454011.jpeg

 

This project being for my startup means my work on it is erratic; between poor health, my engineering work and study, I don't get much time! I'm currently setting up the project in VS and working on the game interface. I've also been looking at selecting an MCU. It's a two-part project, with the other part being a 6DOF joystick, so my time has been split.

I'm happy to close this topic if you don't want too many open forum posts floating around; otherwise, I'm equally happy to keep posting updates for other engineers trying similar ideas. Being my own generic startup, IP is not too much of a problem yet.

I did want to thank you (in particular) and the rest of the TouchGFX team. I first started learning TouchGFX for work a few months ago, and your activity on the forum has been indispensable for my learning. I've actively begun trying (with success, though I doubt I had much to do with the change) to shift my university's curriculum to use more ST tools, ICs and MCUs through mentoring other students and talking with lecturers about what is possible with the STM32 ecosystem.

Taking the time to talk with me, a student, even though you might not have known that part, might make more business for ST in the long run, since I've become quite a large proponent of ST in teaching and work!

Thank you again! 

Cheers,
Michael

 

 

Hello @WrenchInTheWorks ,

 

Thank you, that means a lot! :)

You can choose the comment that helped you the most as "best answer" if you want, and you can keep posting on this thread afterwards, because selecting a comment as "best answer" doesn't close the subject.

Based on the image, it looks like you only need one MCU: it will get the button clicks as interrupts, and it will also control the screen. Regarding the API, I am not sure if you can access it through the MCU or if you have to go through the computer.

For the test system on the simulator, you can add "key pressed" interactions for each of the physical buttons you want to have later on. So you are basically mapping in your head: "the side button of the joystick is the key 'L' on the keyboard", "the 3 bottom buttons of the joystick are G, H and J".
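Rather than keeping that mapping in your head, it can live in a small table in the simulator build. A minimal sketch, using the example keys and button names from this reply (the map and function names are illustrative, not part of TouchGFX):

```cpp
// Sketch: "keyboard key stands in for a physical button" mapping for the
// simulator build. Keys and button names follow the examples in the reply.
#include <map>
#include <string>

const std::map<char, std::string> simulatorKeyMap = {
    {'L', "joystick side button"},
    {'G', "joystick bottom button 1"},
    {'H', "joystick bottom button 2"},
    {'J', "joystick bottom button 3"},
};

// Returns the physical control a simulator key press represents,
// or an empty string if the key is unmapped.
std::string physicalButtonFor(char key)
{
    const auto it = simulatorKeyMap.find(key);
    return it != simulatorKeyMap.end() ? it->second : std::string{};
}
```

Centralizing the mapping in one table means the key-to-button assignments can change without touching the GUI logic, and the same button names can later be bound to real GPIO interrupts on the target.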

 

Regards,

Gaetan Godart
Software engineer at ST (TouchGFX)