
TouchGFX with the Hardware Button

Azeem
Associate II

Hi,

I am new to the TouchGFX library and trying to make it work with the STM32F429-DISC1 board. The touch display is working fine and I don't see any issues with it. I am now trying to get the hardware button to work with the TouchGFX library.

On the board there is a push button and I want to make it work with the screens. I know there is a workshop available on the hardware button and TouchGFX interfacing.

Is there any example or other resource available for using the push button or other hardware buttons with the TouchGFX library?

I would appreciate your help.

Thanks

Azeem

12 REPLIES
Azeem
Associate II

Hi,

I was literally reading the same post. Would it please be possible to share an example of hardware interaction with the TouchGFX library?

MM..1
Chief III

What don't you understand about this example?

Azeem
Associate II

Well, I am trying to get it to work. If I am successful, I will let you know; otherwise I will have to come back here. Thanks.

Michael K
Senior III

The "proper" way to do this is to check your hardware state in the model tick() function, and if it changes, call a virtual modelListener function. Any screen you want to respond to the button press should implement this function in its presenter. When the presenter method is called, get the data from the model and call an appropriate method on the view.

For example, let's say you want to turn a box a certain color if the button is pressed.

In Model.cpp:

bool buttonStatus = false; // this should be a private variable in Model.hpp, with a public accessor like getButtonStatus();
 
void Model::tick(){
   bool currentButtonStatus = checkButtonHardware(); // implement this on your own
   if(currentButtonStatus != buttonStatus){
      buttonStatus = currentButtonStatus;
      modelListener->buttonChanged();
   }
}
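
For completeness, here is a minimal sketch of what the matching declarations in Model.hpp could look like. The member and accessor names are only examples, and the generated Model class already contains the constructor, bind() and tick(); only the button-related parts are new.

class ModelListener;
 
class Model
{
public:
   Model();
   void bind(ModelListener* listener) { modelListener = listener; }
   void tick();
 
   bool getButtonStatus() const { return buttonStatus; } // public accessor used by the presenter
 
protected:
   ModelListener* modelListener;
 
private:
   bool buttonStatus = false;  // last known state of the hardware button
   bool checkButtonHardware(); // reads the pin, see further down
};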

In ModelListener.hpp:

virtual void buttonChanged() {}

In Screen1Presenter.cpp (of course create the prototype in the .hpp as well; a sketch follows the code):

void Screen1Presenter::buttonChanged(){
   if(model->getButtonStatus() == true){
      view.setBoxColor(BoxColorEnum::BLUE);
   } else {
      view.setBoxColor(BoxColorEnum::RED);
   }
}
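
A sketch of the corresponding Screen1Presenter.hpp. The generated presenter already derives from touchgfx::Presenter and ModelListener; only the buttonChanged override is new, so keep the rest of the generated class as it is.

class Screen1Presenter : public touchgfx::Presenter, public ModelListener
{
public:
   Screen1Presenter(Screen1View& v) : view(v) {}
 
   virtual void buttonChanged(); // overrides the empty default in ModelListener
 
private:
   Screen1View& view;
};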

Finally, in Screen1View.cpp (again, create the prototypes in the .hpp; a sketch follows the code):

enum class BoxColorEnum{ RED, BLUE, ... };   // in hpp file
 
#include <touchgfx/Color.hpp>
 
void Screen1View::setBoxColor(BoxColorEnum color){
   switch(color){
      case BoxColorEnum::RED:
         box1.setColor(touchgfx::Color::getColorFrom24BitRGB(255, 0, 0));
      break;
      //... and so on
   }
   box1.invalidate();
}
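
And a sketch of how those pieces fit into Screen1View.hpp. The include paths follow the layout the Designer generates (adjust them to your screen name), box1 is assumed to be a Box widget placed in the Designer so it already exists in Screen1ViewBase, and only the additions relevant here are shown; keep the generated members such as setupScreen()/tearDownScreen().

#include <gui_generated/screen1_screen/Screen1ViewBase.hpp>
#include <gui/screen1_screen/Screen1Presenter.hpp>
 
enum class BoxColorEnum { RED, BLUE };
 
class Screen1View : public Screen1ViewBase
{
public:
   void setBoxColor(BoxColorEnum color);
};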

Your checkButtonHardware function can either check the pin directly (e.g. HAL_GPIO_ReadPin(GPIOx, GPIO_Pin)) or it can receive an event from the RTOS if your hardware is processed in another task. The important thing is that your state is stored in the model, the modelListener is informed of changes, interested presenters decide what view functions to call depending on the model state, and the view is only concerned with changing the visuals.
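
As a hedged sketch: on the STM32F429 Discovery the blue user button is normally on PA0 and is active high (verify against your board's schematic), so a direct polling implementation could be as simple as the following, assuming the pin and GPIO clock were already configured (e.g. by CubeMX).

#include "stm32f4xx_hal.h"
 
// Model.cpp -- sketch only
bool Model::checkButtonHardware()
{
   return HAL_GPIO_ReadPin(GPIOA, GPIO_PIN_0) == GPIO_PIN_SET; // pressed = SET on this board
}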

Beyond this, if setting a variable with the state of a hardware pin is already outside your knowledge, the TouchGFX forum is not the place to ask those questions. TouchGFX is a graphics framework only. Just because you can put UIs on discovery boards without much fiddling, it doesn't mean that all the work is done for you. You are responsible for implementing your own hardware routines, and more basic STM32 tutorials are where you could acquire this knowledge.

Disclaimer: code should be used as a reference only, I've not validated or compiled it.

Embedded UI/UX Consulting: cadenza.design

This example perfectly demonstrates the chaos of handling hardware in the Model. What exactly is model->getButtonStatus()?

bool buttonStatus = false; // this should be a private variable in Model.hpp, with a public accessor like getButtonStatus();

https://classes.mst.edu/compsci1570/mutators.htm

Embedded UI/UX Consulting: cadenza.design

I know this, but I mean somebody needs a full example, not only half of the code.

And in my example, when you need to change a color, you create an Interaction for it in the Designer, triggered by the hardware button, then call it directly in the Model tick without all the ballast around it.

You can use more than one interaction in sequence, and you can also check what you are doing in the simulator.
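
For reference, the Designer's hardware-button trigger is fed by a ButtonController that the TouchGFX HAL samples every tick. A minimal sketch, assuming the touchgfx::ButtonController interface and the user button on PA0 (again, verify against your board; the class name is only an example):

#include <touchgfx/hal/ButtonController.hpp>
#include "stm32f4xx_hal.h"
 
class DiscoButtonController : public touchgfx::ButtonController
{
public:
   virtual void init() {} // GPIO init is assumed to be done elsewhere (e.g. CubeMX)
 
   // Called by the TouchGFX HAL every tick; report a key while the button is pressed
   virtual bool sample(uint8_t& key)
   {
      if (HAL_GPIO_ReadPin(GPIOA, GPIO_PIN_0) == GPIO_PIN_SET)
      {
         key = 1; // must match the key value chosen in the Designer interaction
         return true;
      }
      return false;
   }
};

The controller is registered once during HAL setup with touchgfx::HAL::getInstance()->setButtonController(&controllerInstance); the hardware-button trigger in the Designer then fires whenever that key value is reported.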

I think the scope of my example and explanation was more than adequate to explain the general principles. As professionals we need to be able to interpolate and extrapolate available information to suit our needs.

It sounds like you are getting defensive, and I'm not sure why. Perhaps because I used the word "proper"? I used it in quotation marks to imply that this is the way that ST and TouchGFX have documented it. If this was offensive to you, I'm sorry, and I'll add a note to my comment explaining as much. Use whatever way you want if it accomplishes your goal. However...

Here's the thing about all the "ballast" you refer to, which I assume is the tedium of creating all these methods in different places. It doesn't matter in small projects, but once you start making multi-screen projects with real hardware interfacing (i.e. with multiple IO/analog/serial etc.), a non-standard approach like the one you described breaks down fast. To prove my point, pretend you want to show the status of 15 buttons by changing box colors. Try making and maintaining 15 interactions! Then try testing them! Then try going beyond the small pieces of functionality the interactions provide. What if I wanted to change the color of two boxes for each button? Now I need 30 interactions!

In my experience, the interactions are useful as a tool to help UI designers (who may not be the same people who know or program the C++ backend!) quickly experiment with simple animations or functionality.

Embedded UI/UX Consulting: cadenza.design