Could someone explain the difference between Decimal and decimal in C#?
More generally, what is the difference between the lower-case type keywords and their corresponding types in the System namespace?
decimal, int, and string are all just shorthand aliases to make things easier/prettier for you. The runtime doesn't really know what a "decimal" is, but it does know System.Decimal, so when you compile your code, decimal simply turns into System.Decimal.

Try looking at some code where all the types are fully qualified, then at some code that uses the aliases; I think most programmers will prefer the more compact aliases and find them easier to read. I also suspect the aliases are partly a throwback to C/C++, to make the transition easier.
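As a quick sketch of what that means in practice (the class and variable names here are just for illustration), the alias and the fully qualified name resolve to exactly the same type and can be mixed freely:

```csharp
using System;

class AliasDemo
{
    static void Main()
    {
        // "decimal" is the C# keyword alias for System.Decimal;
        // both declarations below use exactly the same type.
        decimal a = 1.5m;
        System.Decimal b = 1.5m;

        // typeof confirms they are the identical runtime type.
        Console.WriteLine(typeof(decimal) == typeof(System.Decimal)); // True

        // Assigning one to the other involves no conversion at all.
        b = a;
        Console.WriteLine(a + b); // 3.0
    }
}
```

The same holds for the other aliases, e.g. int is System.Int32 and string is System.String.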