2021-04-30 09:36 AM
Hi,
I am trying to understand the graph "precision" as explained here:
"If for example the level of precision is set to 0.1, each data point added to the DynamicGraph, will be multiplied by 10 internally"
I am using a histogram and giving the data points as floats.
I set the Precision to 1 and then to 0.1, but did not see any difference in the displayed value.
My data point is 20.0f.
I thought it would display a value of 200 with a precision set to 0.1.
Is the precision only used internally and not for display purposes?
Thanks for clarifying the use of this feature.
Franck.
2021-05-03 01:12 AM
Hello franck23,
Yes, the precision is applied internally, as described in the link you shared: the graph scales each data point by the inverse of the precision and stores it as an integer.
To see the effect on the display, you need to adapt your value range.
For example, with a graph range from 0 to 180 and a precision of 0.01, you will never see a difference, because one pixel on the axis covers much more than 0.01.
But with a range from 0 to 2 and a precision of 0.1, you will see that the curve is resolved more finely than with a precision of 1.
Hope that clarifies your doubts.
/Alexandre
2021-05-03 01:17 AM
Hi Alexandre,
Thanks for the clarification.
It makes sense to me now.