Doug Hay wrote:

> I'm actually passing a variable that the debugger says is 300.0
> (with 18 decimals turned on).

That does not guarantee that the actual variable data is exactly 300.0
(and your results would imply that it is not). The debugger is likely
just rounding the value when displaying it as a string.

> I did try passing 300.0 and that worked correctly.

Exactly. All the more reason to suspect your variable data does not

contain exactly what you think it does. Where is the variable getting

its value from to begin with? Double check that logic, it is not as

accurate as you think.

> _Time = 300.0; // double, is actually already set based on the
> racer's lap times, but debugger tells me it's 300.0

As it should be, because it is exactly 300.0, not 299.999999999.

> I kind of followed what Remy said, I changed the debugger setting
> "Floating Point" and the value was "299.999999999999".

There you go then. There is a big difference between being exactly 300.0
and being approximately 299.99999999999. Floating-point types are not
exact. This is why you can't compare floating-point values using the
'==' operator, for instance. You have to use epsilon comparisons instead
to account for floating-point inaccuracies.

> My race hardware only provides 3 decimal places

Why not get rid of the decimal altogether? Take a lesson from the
Currency class. It avoids floating-point errors in monetary calculations
by using integer math instead of floating-point math. You could do the
same in your code. If your hardware is only accurate to 3 decimal
places, then multiply your values by 1000, do the integer math, and
divide the result by 1000.

--

Remy Lebeau (TeamB)
