To get two decimal places you would usually do this:
Perhaps you typed it as 10 by mistake?
Just do the two-decimal-place rounding at the end, because if you do a calculation using two numbers that were each rounded to two decimal places, the result can become a full floating-point number again. So, the line in your script could become:
var mLvl1Qu5Equal = Math.round((mLvl1Qu5RanNu1 - mLvl1Qu5RanNu2) * 100) / 100;
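For anyone wanting to try that line in isolation, here is a runnable sketch with made-up sample values (the `mLvl1Qu5RanNu1`/`mLvl1Qu5RanNu2` variables are random numbers in the original quiz script; the values below are assumptions for illustration):

```javascript
// Hypothetical sample values standing in for the quiz's random numbers.
var mLvl1Qu5RanNu1 = 7.3333;
var mLvl1Qu5RanNu2 = 2.1111;

// Subtract first, then round the final result to two decimal places.
var mLvl1Qu5Equal = Math.round((mLvl1Qu5RanNu1 - mLvl1Qu5RanNu2) * 100) / 100;
console.log(mLvl1Qu5Equal); // 5.22
```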
One problem with the official method, which is almost identical to what we've been talking about, is that int() rounds down. It acts like Math.floor(), and Jack wants the nearest decimal place, hence Math.round().
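To see the difference in rounding mode, compare the two on the same value (sample numbers are mine, not from the thread):

```javascript
// Math.floor always rounds down; Math.round goes to the nearest integer.
console.log(Math.floor(2.7));  // 2
console.log(Math.round(2.7));  // 3

// Applied to the two-decimal-place trick, floor would truncate:
console.log(Math.floor(267.9) / 100); // 2.67
console.log(Math.round(267.9) / 100); // 2.68
```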
Thanks Colin Holgate, worked a treat.
For future reference for anyone else: if you want to display the value to the user, trailing 0's don't appear. To fix this, put .toFixed(2) on the end of the variable you want to display. This will force it to display 2 digits after the dot.
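A quick sketch of that display fix. One thing worth knowing: `toFixed(2)` returns a string, so use it only for display, not for further arithmetic (the `score` variable is just an example name):

```javascript
// toFixed(2) pads with trailing zeros, but the result is a string.
var score = 2.5;
console.log(score);                   // 2.5  (trailing zero dropped)
console.log(score.toFixed(2));        // 2.50
console.log(typeof score.toFixed(2)); // string
```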
Thanks robdillon for the link to the guide