Is there a reason that a C# System.Decimal remembers the number of trailing zeros it was entered with? See the following example:
public void DoSomething()
{
    Console.WriteLine(0.5M);  // prints "0.5"
    Console.WriteLine(0.50M); // prints "0.50"
}
I think it was done to provide a better internal representation for numeric values retrieved from databases. Database engines have a long history of storing numbers in a decimal format (avoiding rounding errors) with an explicit specification for the number of digits in the value.

Compare the SQL Server decimal and numeric column types, for example.
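To add a bit of detail on the mechanism: a System.Decimal stores its value as a 96-bit integer plus a scale (a power of ten held in the flags word), so 0.50 is stored as 50 with scale 2 while 0.5 is 5 with scale 1. Equality ignores the scale, but ToString uses it, which is why the trailing zero survives. A small sketch using Decimal.GetBits (the scale sits in bits 16-23 of the fourth element):

using System;

class ScaleDemo
{
    static void Main()
    {
        // Extract the scale (power-of-ten exponent) from the flags element.
        static int Scale(decimal d) => (decimal.GetBits(d)[3] >> 16) & 0xFF;

        Console.WriteLine(Scale(0.5M));    // 1
        Console.WriteLine(Scale(0.50M));   // 2
        Console.WriteLine(0.5M == 0.50M);  // True: comparison ignores scale
    }
}

This mirrors how SQL decimal/numeric columns carry an explicit scale alongside the digits, so round-tripping such a value through System.Decimal can preserve it.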