Feature request: improve hardware button sampling and reading in both target and simulator

Right now hardware buttons work the following way:

The generated handleKeyEvent() method of the screen's view base checks whether the sampled button value matches the trigger configured in the Designer.

On target hardware, the TouchGFX engine calls the sample() method of your button controller (derived from touchgfx::ButtonController).

In the simulator, the SDL_TEXTINPUT event is used. If you hold a key, the event fires on the rising edge of the key press and then re-occurs periodically while the key is held, because keyboard events repeat.
Using SDL_KEYUP would make more sense, since touch button widgets also produce their pressed event on the falling edge of the button press (button release).

You cannot read the values the engine gets from the target hardware or simulator directly in the model. You would have to pass them from view to presenter to model, and do that for every screen, which is tedious.
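To illustrate the plumbing involved, here is a minimal sketch of that per-screen forwarding chain. All names (MyView, MyPresenter, MyModel, setLastKey) are made up for illustration, not generated code; only handleKeyEvent(uint8_t) mirrors the real TouchGFX view hook.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative model: just stores the last key it was told about.
struct MyModel
{
    void setLastKey(uint8_t key) { lastKey = key; }
    uint8_t getLastKey() const { return lastKey; }
    uint8_t lastKey = 0;
};

// Presenter hop: exists only to relay the value to the model.
struct MyPresenter
{
    explicit MyPresenter(MyModel& m) : model(m) {}
    void keyPressed(uint8_t key) { model.setLastKey(key); }
    MyModel& model;
};

// View hop: handleKeyEvent mirrors Screen::handleKeyEvent(uint8_t).
struct MyView
{
    explicit MyView(MyPresenter& p) : presenter(p) {}
    void handleKeyEvent(uint8_t key) { presenter.keyPressed(key); }
    MyPresenter& presenter;
};
```

Every screen needs all three hops, which is exactly the tedium described above.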
Alternatively, you can retrieve the HAL instance and sample again:

uint8_t key = 0;
touchgfx::HAL* hal = touchgfx::HAL::getInstance();
hal->getButtonController()->sample(key);

The problem is that this won't return identical values: it samples the buttons again, and they could have changed in the meantime. It also breaks the design where touch and button events are sampled at well-defined points in time. Finally, it doesn't work in the simulator at all, because the simulator's ButtonController is null.
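One workaround today is to decorate the real controller and cache whatever the engine sampled, so the model can read back the exact same value without touching the hardware again. This is a hypothetical sketch, not current TouchGFX API; ButtonController is reduced to a minimal stand-in so the example is self-contained, and CachingButtonController and lastSample() are invented names.

```cpp
#include <cassert>
#include <cstdint>

// Minimal stand-in for touchgfx::ButtonController's sampling interface.
struct ButtonController
{
    virtual ~ButtonController() {}
    virtual bool sample(uint8_t& key) = 0;
};

// Hypothetical decorator: remembers the value the engine sampled so other
// code can read the identical value later without re-sampling the pins.
class CachingButtonController : public ButtonController
{
public:
    explicit CachingButtonController(ButtonController& real) : inner(real) {}

    // Called by the engine once per tick; caches the result.
    bool sample(uint8_t& key) override
    {
        pressed = inner.sample(lastKey);
        key = lastKey;
        return pressed;
    }

    // Read the cached value (e.g. from the model) without sampling again.
    bool lastSample(uint8_t& key) const
    {
        key = lastKey;
        return pressed;
    }

private:
    ButtonController& inner;
    uint8_t lastKey = 0;
    bool pressed = false;
};

// Fake hardware for demonstration: always reports key '1' pressed.
struct FakeButtons : ButtonController
{
    bool sample(uint8_t& key) override { key = '1'; return true; }
};
```

You would register the decorator with the HAL instead of the raw controller; the engine's sampling flow stays untouched.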

My suggestion is the following:

  1. Add a setting in TouchGFX Designer to choose whether the hardware button trigger fires on press or release, and use the matching SDL_KEYDOWN or SDL_KEYUP event as the trigger in the simulator.
  2. Use a ButtonController in the simulator as well.
  3. Add methods to the touchgfx::ButtonController class: read(), which returns the sampled values without modifying them, and readRising()/readFalling(), which return the values that changed between calls of sample().
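The semantics proposed in point 3 could look like the following sketch. These methods do not exist in touchgfx::ButtonController today; the class keeps the current and previous sampled value so the level and both edges can be read back at any time without re-sampling.

```cpp
#include <cassert>
#include <cstdint>

// Sketch of the proposed API (names from point 3 above, not real TouchGFX).
class EdgeAwareButtonController
{
public:
    // Engine-facing sampling hook, analogous to sample(uint8_t&).
    // A raw value of 0 means "no key pressed".
    bool sample(uint8_t raw)
    {
        previous = current;
        current = raw;
        return current != 0;
    }

    // Level: the value from the most recent sample, unmodified.
    uint8_t read() const { return current; }

    // Rising edge: key that went from released to pressed between samples.
    uint8_t readRising() const
    {
        return (previous == 0 && current != 0) ? current : 0;
    }

    // Falling edge: key that went from pressed to released between samples.
    uint8_t readFalling() const
    {
        return (previous != 0 && current == 0) ? previous : 0;
    }

private:
    uint8_t current = 0;
    uint8_t previous = 0;
};
```

Because the reads are side-effect free, the model could call them at any point in the tick without disturbing the engine's sampling flow.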

 

Additionally, I personally use only digit characters for hardware button values, e.g. 0x31 ('1'), because those can be generated by both the target hardware and the simulator. You cannot type ASCII character 0x01 in the simulator, so it confused me that such characters are even an option in the Designer.
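The point is that one comparison then works on both platforms. The key choice ('1') and the helper name are this post's convention, nothing mandated by TouchGFX:

```cpp
#include <cassert>
#include <cstdint>

// Matches both a GPIO sampler on target and a typed key in the simulator.
inline bool isStartKey(uint8_t key)
{
    return key == '1'; // ASCII 0x31, typeable in the simulator
}
```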

Last but not least, it would be nice if hardware buttons could trigger button presses of button widgets. Right now we have a project where we achieve this by generating fake touch events to trick the TouchGFX engine. These projects have physical start and stop buttons because the screen is capacitive touch, so that operators can use them with gloves. The workaround is tedious, but the advantage is that hardware and widget button presses both play the button animation and trigger identical behavior through the same event handler. Other projects use resistive touch, where a physical button is not needed.
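A mock of that fake-event trick is sketched below. Real code would use touchgfx::ClickEvent and the widget's handleClickEvent(); here ClickEvent and Button are stubbed stand-ins so the sketch is self-contained. The point is that the hardware key path ends in the same handler the touch path uses, so the widget animates and its callback fires identically.

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for touchgfx::ClickEvent: just the press/release type.
struct ClickEvent
{
    enum Type { PRESSED, RELEASED };
    Type type;
};

// Stand-in for a Button widget reacting to click events.
struct Button
{
    bool pressedState = false;
    int clicks = 0;
    void handleClickEvent(const ClickEvent& e)
    {
        if (e.type == ClickEvent::PRESSED)
        {
            pressedState = true;   // pressed bitmap would be shown here
        }
        else if (pressedState)
        {
            pressedState = false;  // released bitmap restored
            ++clicks;              // the widget's normal action callback
        }
    }
};

// View-level key handler forwarding a hardware key as a click pair.
inline void forwardKeyToButton(uint8_t key, Button& startButton)
{
    if (key == '1') // the physical start button
    {
        startButton.handleClickEvent(ClickEvent{ClickEvent::PRESSED});
        startButton.handleClickEvent(ClickEvent{ClickEvent::RELEASED});
    }
}
```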
