
Controlling the MIXIN Draggable at run-time

JimFouch
Associate III

I'm looking to create a user interface that will allow my customers to modify their displays at run time, but I don't want dragging enabled by default. I'd like the users to first toggle an edit button before they're allowed to modify the position of controls/widgets.

I see that the Draggable behavior is actually a different class (a mixin) that is chosen at design time.

Also, would there be a way to allow users to choose from a list of widgets at run-time?

I'm trying to make a highly user-customizable UI, and it may not be easy. For example, I may have 2-3 widgets that display one piece of data; I'd like to have all three widgets included at run time, but allow the user to choose between them.

3 REPLIES
HP
Senior III

A possible solution could be to have them all draggable but lock them by other means.

I'm not that familiar with the Draggable class, but maybe you could implement a trigger mechanism so that even though the element uses the class, the dragging can still be toggled on and off. Maybe it's already in there?
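
For what it's worth, a minimal sketch of that idea could look like the code below. The wrapper class and setDragEnabled() are my own names, not part of the TouchGFX API; Draggable<T> and handleDragEvent() are real TouchGFX calls.

#include <touchgfx/mixins/Draggable.hpp>
#include <touchgfx/events/DragEvent.hpp>

// Hypothetical wrapper: drags are only honored while edit mode is on.
template <class T>
class ToggleableDraggable : public touchgfx::Draggable<T>
{
public:
    ToggleableDraggable() : dragEnabled(false) {}

    void setDragEnabled(bool enabled)
    {
        dragEnabled = enabled;
    }

    virtual void handleDragEvent(const touchgfx::DragEvent& evt)
    {
        if (dragEnabled)
        {
            // Let the normal Draggable behavior move the widget.
            touchgfx::Draggable<T>::handleDragEvent(evt);
        }
        // Otherwise swallow the drag so the widget stays put.
    }

private:
    bool dragEnabled;
};

// Usage: declare gauges as ToggleableDraggable<touchgfx::Image> and flip
// setDragEnabled(true/false) when the user toggles the edit button.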

JimFouch
Associate III

Thanks for your reply.

I guess a brief explanation of my device would help. It will be a device that reads and calculates several sensor values. For each of these values there will be one or more ways of representing it on one or more SwipeContainer pages. Think of something like an RPM sensor: one user may want to see it as a needle gauge, another as a bar graph, yet another as just a simple number. I want to create a few dozen gauges in total. Users will be able to edit the layout and representation of the underlying sensor data.

I've done some experimenting and found that you can move a widget from one SwipeContainer Page to another in code.
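
In case it helps, the re-parenting I got working looks roughly like this (page1, page2 and rpmGauge are placeholder names from my test project, not Designer-generated code):

// Move rpmGauge from one SwipeContainer page to another at runtime.
// Container::remove()/add(), setXY() and invalidate() are standard TouchGFX calls.
page1.remove(rpmGauge);   // detach from its current parent
page2.add(rpmGauge);      // attach to the other page
rpmGauge.setXY(20, 40);   // position relative to the new parent
page1.invalidate();       // redraw the area the widget left
page2.invalidate();       // redraw the area it now occupies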

Ideally, I'd like the widgets to be loaded from a configuration that the user can control in Edit Mode. Each widget would need to be loaded dynamically, but it also needs to get updates from the Model class, and so far I have only been able to get that to work with a widget that is added at design time, since the code needs a reference to that widget instance to update it.
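
The pattern I'm imagining for the Model updates is to keep pointers to whatever gauges were created at run time and fan each new value out in the View's setter. Everything below is a sketch with made-up names, and it assumes each TextAreaWithOneWildcard was already given a TypedText and bound to its buffer with setWildcard() when it was created:

#include <touchgfx/widgets/TextAreaWithWildcard.hpp>
#include <touchgfx/Unicode.hpp>

// A small pool of dynamically placed numeric gauges (sketch only).
static const int MAX_RPM_GAUGES = 4;
touchgfx::TextAreaWithOneWildcard* rpmGauges[MAX_RPM_GAUGES];
touchgfx::Unicode::UnicodeChar rpmBuffers[MAX_RPM_GAUGES][8];
int numRpmGauges = 0;

// Called by the Presenter whenever the Model reports a new RPM value;
// fans the value out to every gauge instance that displays it.
void updateRpm(int rpm)
{
    for (int i = 0; i < numRpmGauges; ++i)
    {
        touchgfx::Unicode::snprintf(rpmBuffers[i], 8, "%d", rpm);
        rpmGauges[i]->invalidate();   // redraw with the new value
    }
}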

I'm just so new with TouchGFX and still getting my legs under me.

I have been able to place a widget off the screen and then make it visible from code behind a button.
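
That part is simple enough; the button callback is roughly as follows (needleGauge is just a name from my test screen):

// Reveal a widget that was parked off-screen or hidden.
// moveTo(), setVisible() and invalidate() are standard TouchGFX calls.
void Screen1View::onShowGaugePressed()
{
    needleGauge.moveTo(120, 80);   // bring it onto the visible area
    needleGauge.setVisible(true);
    needleGauge.invalidate();      // request a redraw of its new area
}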

My desired interface would allow a user to use multiple instances of the same widget on one or more SwipeContainer pages.

Basically, I'd be exposing the power of the TouchGFX Designer to a user at run time. After editing the pages, they could save the configuration to something like an SD card, so the config could be different for each owner of my device.
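
My rough idea for that saved layout is just a flat record per widget instance, written to the SD card and read back at startup. The field names below are my own, and the actual file I/O would depend on the board's filesystem (e.g. FatFS), so this is only a sketch:

#include <stdint.h>

// One entry in the user's layout configuration (sketch, not generated code).
struct GaugeConfig
{
    uint8_t widgetType;   // 0 = needle gauge, 1 = bar graph, 2 = numeric
    uint8_t sensorId;     // which sensor value this gauge displays
    uint8_t pageIndex;    // which SwipeContainer page it lives on
    int16_t x;            // position within that page
    int16_t y;
};

// At startup the view reads an array of GaugeConfig records from the SD card
// and creates/positions one widget per record; in Edit Mode it writes the
// updated array back out.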

My other option is to do the editing on a connected PC and push the layout to the device from there. But having it directly on the device would be far more useful.

Well, it certainly is easier to set up the project so that you use the Designer to control the layout, but I get why you wouldn't want that.

The Designer normally generates the structure of the code: files that are compiled later, either by the simulator or by, for example, CubeIDE.

An option would still be to keep the different objects outside the screen or just invisible, but that doesn't really answer your needs.