I'm trying to get one of my maths equations to be rounded to 2 decimal places, but when I do this, it rounds it too much.
The code I've been experimenting with:
var decimalPlace1: Number = Math.round(mLvl1Qu5RanNu1 * 10) / 100;
var decimalPlace2: Number = Math.round(mLvl1Qu5RanNu2 * 10) / 100;
var mLvl1Qu5Equal = decimalPlace1 - decimalPlace2;
var mLvl1Qu5EqualDecimal = Math.round(mLvl1Qu5Equal * 10) / 100;
I got it working without the code in bold, but for some reason it's stopped working again.
I need to check the answer of mLvl1Qu5Equal against what the user enters.
To get two decimal places you would usually do this:
var decimalPlace1: Number = Math.round(mLvl1Qu5RanNu1 * 100) / 100;
You're multiplying by 10 but dividing by 100 — perhaps you typed it as 10 by mistake?
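To see why the mismatch matters, here's a minimal sketch in JavaScript (Math.round behaves the same way in ActionScript 3); the value 3.14159 is just a stand-in:

```javascript
// Multiplying by 10 but dividing by 100 shifts the result down a decimal place:
const wrong = Math.round(3.14159 * 10) / 100;   // 31 / 100  -> 0.31
const right = Math.round(3.14159 * 100) / 100;  // 314 / 100 -> 3.14
console.log(wrong, right);
```

With matching factors of 100 the value keeps its magnitude and is simply rounded to two decimal places.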
Just do the rounding to two decimal places at the end, because if you do a calculation using two 2-decimal-place numbers, the result can become a full floating-point number again. So your entire script could become:
var mLvl1Qu5Equal = Math.round((mLvl1Qu5RanNu1 - mLvl1Qu5RanNu2) * 100) / 100;
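The round-once-at-the-end approach can be sketched in JavaScript (Math.round works the same in ActionScript 3); a and b are stand-ins for the two random numbers:

```javascript
const a = 5.678, b = 1.234;
// Subtract first, then round a single time at the end:
const result = Math.round((a - b) * 100) / 100;
console.log(result); // 4.44
// Rounding each operand first doesn't guarantee a clean result: the
// subtraction of two 2-decimal-place numbers (e.g. 1.24 - 1.23) can
// reintroduce a long floating-point tail, which is why the final
// rounding belongs after the arithmetic.
```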
Thanks Colin Holgate, worked a treat.
For future reference for anyone else: when you display the value to the user, trailing 0's don't appear. To fix this, put .toFixed(2) on the end of the variable you want to display. This will force it to display 2 digits after the decimal point.
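A quick sketch of the display fix, in JavaScript (Number.toFixed exists in ActionScript 3 as well):

```javascript
const answer = Math.round(4.5 * 100) / 100;
console.log(String(answer));     // "4.5"  — the trailing zero is dropped
console.log(answer.toFixed(2));  // "4.50" — always 2 digits after the dot
// Note: toFixed() returns a String, so use it for display only,
// not for further arithmetic or for checking the user's answer.
```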
Thanks robdillon for the link to the guide
One problem with the official method, which is almost identical to what we've been discussing, is that int() rounds down. For positive numbers it acts like Math.floor(), while Jack wants the nearest decimal place, hence Math.round().
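The difference is easy to see in a JavaScript sketch (its Math functions mirror ActionScript 3's; 2.719 is just a stand-in value):

```javascript
// Truncation drops the fraction, so the second decimal place
// never rounds up, even when the next digit is 5 or more:
console.log(Math.floor(2.719 * 100) / 100); // 2.71 (rounds down)
console.log(Math.round(2.719 * 100) / 100); // 2.72 (nearest)
```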