See this link. Algebraically, 1 10 div 10 mult should equal 1, but the realities of computers mean that is not always the case. 0.1 cannot be represented exactly in binary floating point; it's like trying to write 1/3 in decimal (i.e. 0.3333...). Worse yet, the actual value you end up with for 0.1 can vary between computers, programs, and languages depending on all sorts of factors, some down at the hardware level.
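If you want to see this for yourself, here is a quick sketch in Python (purely illustrative; the same thing happens in any language that uses IEEE 754 doubles):

```python
from decimal import Decimal

# Show the exact binary value the computer actually stores for 0.1.
# It is the nearest representable double, not 0.1 itself.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625

# Output rounding usually hides the error, but it leaks into arithmetic:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```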
By using integer math, the value you get will always be predictable and the same on all computers, hardware, etc. It might not always be "correct", but at least it's deterministically incorrect. You can compensate for that by doing a little fixed-point math: scale everything up by a fixed factor, do the arithmetic in integers, then scale back down at the end. For example, 1 1000 mult 10 div 10 mult 1000 div will always give you the correct answer of 1, even on different hardware, different languages, different drivers, etc.
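The same idea as a sketch in Python (the scale factor of 1000 is arbitrary; pick whatever precision you need):

```python
SCALE = 1000  # fixed-point scale factor: 3 decimal digits of precision

# Work entirely in scaled integers: 1 becomes 1000, 0.1 becomes 100, etc.
# Integer multiplication and division are exact and deterministic,
# so every machine, language, and driver computes the same result.
value = 1 * SCALE      # 1000
value = value // 10    # 100   ("0.1" in fixed point)
value = value * 10     # 1000  (back to "1" in fixed point)

print(value // SCALE)  # 1 -- exactly, everywhere
```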