I noticed that .NET has some funky/unintuitive behavior when it comes to decimals and trailing zeros:

    0m == 0.000m      // true
    0.1m == 0.1000m   // true
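To make the funky part concrete, here's a minimal sketch (my own console snippet, not from the original) showing that == compares only the numeric value, while ToString() and decimal.GetBits preserve the trailing zeros via the scale stored in the flags word:

    using System;

    class Demo
    {
        static void Main()
        {
            Console.WriteLine(0.1m == 0.1000m);     // True: == ignores the scale
            Console.WriteLine(0.1m.ToString());     // "0.1"
            Console.WriteLine(0.1000m.ToString());  // "0.1000": the trailing zeros survive

            // The scale (digits after the decimal point) lives in bits 16-23
            // of the flags element returned by decimal.GetBits.
            Console.WriteLine((decimal.GetBits(0.1m)[3] >> 16) & 0xFF);    // 1
            Console.WriteLine((decimal.GetBits(0.1000m)[3] >> 16) & 0xFF); // 4
        }
    }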
Here's the workaround I came up with. I don't like it much, but it works (for some range of values, at least):
    static decimal Normalize(decimal value)
    {
        long div = 1;

        // Multiply the fractional part away, remembering the power of ten.
        while (value - decimal.Truncate(value) != 0)
        {
            div *= 10;
            value *= 10;
        }

        // Round-trip through long to discard the scale, then divide back down.
        // The cast also strips trailing zeros from integral values like 1.00m,
        // but it overflows once the shifted value no longer fits in a long.
        return (decimal)(long)value / div;
    }
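A quick usage sketch (the values are mine, chosen to hit the interesting cases):

    Console.WriteLine(Normalize(0.1000m)); // 0.1
    Console.WriteLine(Normalize(1.00m));   // 1
    Console.WriteLine(Normalize(0.000m));  // 0

Note that the round-trip through long caps the usable range: the value, after being multiplied up to clear the fraction, must still fit in a long, which is where the "for some range of values" caveat comes from.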