For a char [], I can easily get its length by:
char a[] = "aaaaa";
int length = sizeof(a)/sizeof(char); // length = 6
However, the same approach does not work for a char * (that case is covered further down).
So the thing with the sizeof operator is that it returns the amount of storage needed, in bytes, to store its operand.
The amount of storage needed to store a char is always 1 byte, so sizeof(char) will always return 1.
char a[] = "aaaaa";
int len1 = sizeof(a)/sizeof(char); // len1 = 6
int len2 = sizeof(a);              // len2 = 6
The result is the same for both len1 and len2 because dividing by 1 does not change anything.
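For comparison, here is a quick sketch with an int array (the array b and its values are just made up for illustration), where dividing by the element size is what actually turns bytes into an element count:

#include <iostream>

int main() {
    int b[5] = {1, 2, 3, 4, 5};
    std::cout << sizeof(b) << '\n';               // typically 20 on a system with 4-byte int
    std::cout << sizeof(b)/sizeof(b[0]) << '\n';  // 5, whatever size int actually is
}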
The reason both len1 and len2 come out as 6 is the string termination char '\0', which is itself a char and therefore adds one more to the count. That is why the length is 6 instead of the 5 you were expecting.
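If you want the 5 you were expecting, strlen from <cstring> counts the characters up to, but not including, the terminating '\0'. A minimal sketch:

#include <cstring>
#include <iostream>

int main() {
    char a[] = "aaaaa";
    std::cout << std::strlen(a) << '\n';  // 5: characters before the terminating '\0'
    std::cout << sizeof(a) << '\n';       // 6: the whole array, including the '\0'
}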
char *a = new char[10];
int length = sizeof(a)/sizeof(char);
You already mentioned that the length turns out to be 4 here, which is correct. Again, the sizeof operator returns the amount of storage for its operand, and in this case the operand is the pointer a itself, not the buffer it points to. A pointer requires 4 bytes of storage on your system, so the length is 4, since you are presumably compiling a 32-bit binary. If you built a 64-bit binary, the outcome would be 8.
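A small sketch of that difference, assuming a typical platform where a pointer is 4 bytes in a 32-bit build and 8 bytes in a 64-bit build; for a heap-allocated, '\0'-terminated buffer you have to fall back on strlen, or keep track of the allocation size yourself (the strcpy call is only there to give the buffer some content):

#include <cstring>
#include <iostream>

int main() {
    char *a = new char[10];
    std::strcpy(a, "aaaaa");              // put a '\0'-terminated string into the buffer

    std::cout << sizeof(a) << '\n';       // size of the pointer: 4 or 8, never 10
    std::cout << std::strlen(a) << '\n';  // 5: counts characters up to the '\0'
    // The allocation size (10) cannot be recovered from the pointer; you have to track it yourself.

    delete[] a;
}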
This explanation might already have been given here; I just want to share my two cents.