Is there a reason that a C# System.Decimal remembers the number of trailing zeros it was entered with? See the following example:
public void DoSomething()
{
    decimal dec1 = 0.5M;
    decimal dec2 = 0.50M;

    Console.WriteLine(dec1);            // Output: 0.5
    Console.WriteLine(dec2);            // Output: 0.50
    Console.WriteLine(dec1 == dec2);    // Output: True
}
Decimals represent fixed-precision decimal values. The literal 0.50M has the two-decimal-place precision embedded in it, so the decimal variable it creates remembers that it is a two-decimal-place value. This behaviour is entirely by design.
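As a rough sketch of what "remembers" means here (the class and variable names below are just illustrative): a System.Decimal is a 96-bit integer plus a sign and a power-of-ten scale factor, and decimal.GetBits exposes that scale, which is where the trailing zero is kept.

using System;

class ScaleSketch
{
    static void Main()
    {
        // decimal.GetBits returns four ints; the scale factor (number of
        // digits after the decimal point) sits in bits 16-23 of the last one.
        int[] oneDigit = decimal.GetBits(0.5M);
        int[] twoDigits = decimal.GetBits(0.50M);

        Console.WriteLine((oneDigit[3] >> 16) & 0xFF);   // 1  (stored as 5 with scale 1)
        Console.WriteLine((twoDigits[3] >> 16) & 0xFF);  // 2  (stored as 50 with scale 2)
    }
}

So 0.5M is stored as 5 with a scale of 1, while 0.50M is stored as 50 with a scale of 2, and formatting simply prints that stored scale back out.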
The comparison, however, is an exact numerical equality check on the values, so trailing zeroes do not affect the outcome.
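A minimal sketch of that point (the method name is illustrative, and it assumes a using System; directive): the two values print differently but compare equal through every numeric comparison.

public void CompareScales()
{
    decimal oneDigit = 0.5M;
    decimal twoDigits = 0.50M;

    Console.WriteLine(oneDigit == twoDigits);          // True  - exact numeric equality
    Console.WriteLine(oneDigit.Equals(twoDigits));     // True
    Console.WriteLine(oneDigit.CompareTo(twoDigits));  // 0
    Console.WriteLine(oneDigit.ToString());            // "0.5"  - the scale only shows up in formatting
    Console.WriteLine(twoDigits.ToString());           // "0.50"
}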