
Touch Calibration

jimmii
Senior II

Hi,

Following situation:

I have a 480x272 display in portrait mode.

(1) In TouchGFX Designer, the 0/0 point is at the top left, with the x-axis horizontal to the right and the y-axis vertical down.

(2) The touch coordinates expected by the TouchGFX framework have 0/0 at the top right, with the x-axis vertical down and the y-axis horizontal to the left.

(3) My touch controller reports 0/0 at the bottom left, with the x-axis vertical up and the y-axis horizontal to the right.

So now I can't figure out how this translates to the calibration class.

- The calibration points: Are they based on (1), (2) or (3)?

- The measured points for the calibration matrix: Are they based on (1), (2) or (3)?

- The measured points for the "sampleTouch" function: Are they based on (1), (2) or (3)?

Thanks for your help.

/jimmii

Martin KJELDSEN
Chief III

You don't need to use the Calibration class; it's mostly for resistive touch screens.

If the touch coordinates do not follow the native orientation of your display, you can simply recalculate what you return to the TouchGFX engine inside sampleTouch(). And you don't need to set PORTRAIT mode, because your display is already natively portrait. So:

* If you need to rotate an application in software using setDisplayOrientation, and your touch coordinates follow the native screen orientation, TouchGFX will convert the touch coordinates for you (rotate 90 degrees, or no rotation).

* If you need to calibrate points due to inaccuracies in your ADC, use the calibration points.

* If you need to offset the touch points because they're somehow not aligned with your native display orientation (this is a bit weird?), then you simply do the math inside sampleTouch() using the width and height of the display, e.g. if you want to mirror or flip a coordinate.

For instance, if you were to use the Mirror Widget I've shared here a few times (because TouchGFX does not support 180-degree rotation), you would have to recalculate the touch coordinates inside sampleTouch(), because TouchGFX does not really know anything about the 180-degree orientation; it's manual pixel magic.
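To make that concrete, here is a minimal sketch of such a recalculation inside a TouchController, assuming the usual TouchGFX TouchController interface (init() / sampleTouch(int32_t&, int32_t&)). The class name, readRawTouch() driver call and the dimension values are purely illustrative, not from this thread:

#include <touchgfx/platform/driver/touch/TouchController.hpp>

class MyTouchController : public touchgfx::TouchController
{
public:
    virtual void init() {}

    virtual bool sampleTouch(int32_t& x, int32_t& y)
    {
        int32_t rawX, rawY;
        if (!readRawTouch(rawX, rawY))   // hypothetical call into your touch driver
        {
            return false;                // no touch this tick
        }
        // Mirror both axes so the returned point matches the native display
        // orientation TouchGFX expects (the "180 degree" case).
        x = (NATIVE_WIDTH - 1) - rawX;
        y = (NATIVE_HEIGHT - 1) - rawY;
        return true;
    }

private:
    // Placeholder values; replace with your native display resolution.
    static const int32_t NATIVE_WIDTH  = 272;
    static const int32_t NATIVE_HEIGHT = 480;

    bool readRawTouch(int32_t& rawX, int32_t& rawY);   // your touch driver
};

The only point of the sketch is that all the axis-flipping happens in sampleTouch(), before TouchGFX ever sees the coordinates.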

/Martin

Thanks @Martin KJELDSEN

I am using a resistive touch screen.

As of right now, the display is working just fine, with setDisplayOrientation and so on.

I did a manual 2-point calibration, and inside sampleTouch() I have to convert the coordinates as follows:

State->TouchX = 480 - Touch.Coor[X];
State->TouchY = 272 - Touch.Coor[Y];

With this configuration, everything works.
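For readers following along, a manual 2-point calibration like the one mentioned above usually boils down to a per-axis linear mapping between two measured raw samples and their known screen positions; a minimal sketch, with purely illustrative names:

// Map a raw touch reading onto pixels for one axis, given two calibration
// samples: rawAtRef1/rawAtRef2 were measured while touching the known
// screen positions refPixel1/refPixel2.
static int32_t calibrateAxis(int32_t raw,
                             int32_t rawAtRef1, int32_t rawAtRef2,
                             int32_t refPixel1, int32_t refPixel2)
{
    // Linear interpolation between the two calibration samples
    return refPixel1 + ((raw - rawAtRef1) * (refPixel2 - refPixel1)) / (rawAtRef2 - rawAtRef1);
}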

But if I want to use the Calibration class, I don't know which coordinate system, (1), (2) or (3), the process has to be aligned to.

Are the calibration points based on (1) or (2)? Why are they even different?

/jimmii

When you say everything works, you mean that it works, but without calibration, so touch is probably still a bit off?

It might be a problem to use both "calibration" methods at the same time, i.e. offsetting the touch coordinates AND adjusting them according to the calibration matrix. What are your experiences with touch accuracy as it is now?

/Martin

Yes, the touch is off, so without calibration the touch points don't match.

When trying the calibration class, I switch off "my" calibration, so only one method is active at that time.

It would work with my 2-point calibration, but I wanted to try "your" 3-point calibration.

/jimmii

I think you should try this:

Since the calibration matrix in TouchGFX is only applied AFTER the coordinates are returned to the HAL, you should offset the calibration matrix you've made, since you're also offsetting the touch points. Does that make sense? Otherwise the inaccuracy for a specific point on the screen won't be properly offset.
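A sketch of what that could look like with the TouchGFX TouchCalibration class, assuming its three-point setCalibrationMatrix(ref, scr) interface: the reference points are the positions drawn on screen (display coordinates), and the measured points are what your already-converted sampleTouch() reports when those positions are touched. The point values below are placeholders for a 272x480 portrait display:

#include <touchgfx/hal/Types.hpp>
#include <touchgfx/transforms/TouchCalibration.hpp>

// Positions shown on screen during calibration (display coordinates).
static touchgfx::Point referencePoints[3] = { { 20, 20 }, { 250, 130 }, { 60, 440 } };

// Corresponding values returned by sampleTouch(), i.e. AFTER the same
// flip/offset applied there; fill these in at calibration time.
static touchgfx::Point measuredPoints[3] = { { 0, 0 }, { 0, 0 }, { 0, 0 } };

void applyTouchCalibration()
{
    // TouchGFX applies this matrix to the coordinates returned by sampleTouch().
    touchgfx::TouchCalibration::setCalibrationMatrix(referencePoints, measuredPoints);
}

Because the matrix is built from points expressed in the same coordinate space that sampleTouch() returns, the offset and the calibration stay consistent with each other.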