Why is decimal not a primitive type?
Console.WriteLine(typeof(decimal).IsPrimitive);
outputs false, even though decimal is a built-in value type with its own C# keyword.
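A quick way to see it (a small sketch; the values in the comments are what IsPrimitive reports for each type):
Console.WriteLine(typeof(int).IsPrimitive);       // True
Console.WriteLine(typeof(float).IsPrimitive);     // True
Console.WriteLine(typeof(decimal).IsPrimitive);   // False
Console.WriteLine(typeof(string).IsPrimitive);    // False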
Consider the example below:
int i = 5;
float f = 1.3f;
decimal d = 10;
string s = "test";
If you attach a debugger and inspect the native instructions generated for these assignments, you will see that int and float, being primitive types, each take a single instruction to perform the assignment, whereas decimal and string, being non-primitive types, take more than one native instruction to perform it.
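For reference, here is roughly what the compiler does with those assignments. The IL in the comments is what a disassembler such as ildasm typically shows (a sketch, not an exact dump), and it mirrors the native-instruction difference described above:
int i = 5;        // ldc.i4.5 - a single instruction, the constant is loaded directly
float f = 1.3f;   // ldc.r4 1.3 - a single instruction, the constant is loaded directly
decimal d = 10;   // ldc.i4.s 10
                  // newobj System.Decimal::.ctor(int32)
                  // there is no IL instruction that loads a decimal constant,
                  // so the value has to be built by calling Decimal's constructor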
Although not a direct answer, the documentation for IsPrimitive lists what the primitive types are:
http://msdn.microsoft.com/en-us/library/system.type.isprimitive.aspx
A similar question was asked here:
http://bytes.com/topic/c-sharp/answers/233001-typeof-decimal-isprimitive-false-bug-feature
Answer quoted from Jon Skeet:
The CLR doesn't need to have any intrinsic knowledge about the decimal type - it treats it just as another value type which happens to have overloaded operators. There are no IL instructions to operate directly on decimals, for instance.
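For example, this is what that difference looks like for addition (a sketch; the IL shown in the comments is what a disassembler typically reports):
int a = 2, b = 3;
int intSum = a + b;       // compiles to the IL "add" instruction - the CLR handles int natively

decimal x = 2m, y = 3m;
decimal decSum = x + y;   // compiles to a call to System.Decimal::op_Addition(decimal, decimal),
                          // i.e. an ordinary method call on an ordinary value type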
To me, it seems as though decimal is a type that must exist for a language/runtime wanting to be CLS/CLI-compliant (and is hence termed "primitive" because it is a base type with keyword support), but the actual implementation does not require it to be truly "primitive" (as in the CLR doesn't think it is a primitive data type).
Decimal is a 128-bit data type, which cannot be represented natively by computer hardware. For example, a 64-bit architecture generally has integer and addressing registers that are 64 bits wide, allowing direct support for 64-bit data types and addresses.
Wikipedia says that
Depending on the language and its implementation, primitive data types may or may not have a one-to-one correspondence with objects in the computer's memory. However, one usually expects operations on basic primitive data types to be the fastest language constructs there are.
In the case of decimal, it is just a composite data type that uses integers internally, so its performance is slower than that of data types that map directly onto the hardware (int, double, etc.).
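You can see that composite, integer-based layout directly with decimal.GetBits, which returns the four 32-bit integers (a 96-bit magnitude plus a flags word holding the sign and the scale) that make up the 128 bits:
decimal d = 123.45m;
int[] parts = decimal.GetBits(d);             // [low, mid, high, flags]
Console.WriteLine(string.Join(", ", parts));  // the flags element encodes the sign and the decimal scale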