I have to implement a method that writes a byte to an ostream object. Let's just call this ostream object strobj. I also have an int, sumInt, and I need to write its least significant byte to the stream. How do I get the least significant byte of an int?
Your int consists of several bytes (most likely 2, 4, or 8). Similar to the concept of the least significant bit, the least significant byte is the byte that carries the least weight in the value of the entire integer. Depending on the endianness of your system, it may be the first or the last byte in memory.
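If you want to see which byte comes first in memory on your machine, you can inspect the raw bytes of an int directly. A minimal sketch (the value 0x01020304 is just chosen so every byte is distinct):

#include <cstdint>
#include <cstring>
#include <iostream>

int main() {
    std::uint32_t value = 0x01020304;         // b3=0x01 ... b0=0x04
    unsigned char bytes[sizeof value];
    std::memcpy(bytes, &value, sizeof value); // copy the raw byte layout
    if (bytes[0] == 0x04)
        std::cout << "little-endian: least significant byte comes first\n";
    else
        std::cout << "big-endian: most significant byte comes first\n";
}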
To extract the least significant byte, bitwise-AND the number with one byte's worth of 1s, i.e. 255:
int LSB = (someInteger & 0xFF);
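For instance, a self-contained version of that one-liner (someInteger here is an arbitrary example value):

#include <iostream>

int main() {
    int someInteger = 0x12345678;
    int LSB = (someInteger & 0xFF); // keep only the lowest 8 bits
    std::cout << std::hex << LSB << "\n"; // prints 78
}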
The more significant bits of an int are the ones that contribute more to its value; the least significant bits contribute less. You can normally obtain the least significant byte by ANDing the integer with 255 (a byte with all bits set to 1):
int lsb = sumInt & 255;
When you look at a decimal number, say 507, the least significant digit would be the 7. Changing it to a 6 or an 8 would change the overall number a lot less than changing the 5.
When you look at a date, say May 14, 2013, the least significant part is the day (14) in terms of chronological ordering.
When you look at an unsigned 32-bit int (4 bytes), where the value of the integer is (256^3)*b3 + (256^2)*b2 + 256*b1 + b0 for the 4 bytes b0, b1, b2, b3, the least significant byte is b0.
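To see that identity hold, here is a sketch that splits a value into b0..b3 with shifts and masks and reassembles it with the weights above (the value 0xDEADBEEF is arbitrary):

#include <cstdint>
#include <iostream>

int main() {
    std::uint32_t v = 0xDEADBEEF;
    std::uint32_t b0 = v & 0xFF;          // least significant byte
    std::uint32_t b1 = (v >> 8) & 0xFF;
    std::uint32_t b2 = (v >> 16) & 0xFF;
    std::uint32_t b3 = (v >> 24) & 0xFF;  // most significant byte
    // Reassemble using the weights 256^3, 256^2, 256, 1:
    std::uint32_t rebuilt = 256u*256u*256u*b3 + 256u*256u*b2 + 256u*b1 + b0;
    std::cout << std::boolalpha << (rebuilt == v) << "\n"; // true
}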
You can get the least significant byte from your int sumInt by doing char c = sumInt & 0xFF; as others have suggested.
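Putting it together with the original goal of writing that byte to an ostream, here is a sketch under the assumption that strobj is any std::ostream (writeLowByte is a hypothetical helper name, and std::cout stands in for strobj so the example runs as-is):

#include <iostream>

// Hypothetical helper: writes the least significant byte of sumInt to the stream.
void writeLowByte(std::ostream& strobj, int sumInt) {
    strobj.put(static_cast<char>(sumInt & 0xFF)); // put() writes a single byte
}

int main() {
    writeLowByte(std::cout, 0x41); // writes one byte, 0x41 ('A')
}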
Imagine a 32-bit integer, where each bit can be 0 or 1, say:
11101010101010101010101010110110
                        ^^^^^^^^
The least significant byte is the eight bits at the right-hand side (marked above).
It's probably easier to understand if you take a decimal example: given the number 291023, the '3' is the least significant digit, because changing it has the least effect on the overall number.
To get the least significant byte, just bitwise-AND the larger int with 0xFF hex (255 decimal, since 1+2+4+8+16+32+64+128 = 255). That will clear out all the more significant bits and preserve the 8 least significant bits:
int x = ...whatever...;
unsigned char least_significant_bits = x & 255;
The effect is like this:
11101010101010101010101010110110 // x
00000000000000000000000011111111 // 255
result:
00000000000000000000000010110110 // x & 255
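You can reproduce this picture with std::bitset, which prints the binary form directly (a minimal sketch; x is the same example value as above):

#include <bitset>
#include <iostream>

int main() {
    unsigned int x = 0b11101010101010101010101010110110; // C++14 binary literal
    std::cout << std::bitset<32>(x)       << " // x\n";
    std::cout << std::bitset<32>(255)     << " // 255\n";
    std::cout << std::bitset<32>(x & 255) << " // x & 255\n";
}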