// The following code prints the letter pairs aA bB cC dD eE ... jJ.
class UpCase {
    public static void main(String args[]) {
        char ch;
        for(int i = 0; i < 10; i++) {
            ch = (char) ('a' + i);
            System.out.print(ch);
            ch = (char) (ch & 65503); // AND with ~32 clears bit 5 -> uppercase
            System.out.print(ch + " ");
        }
    }
}
Second piece of code (Showbits):
The code converts a decimal number to binary. The algorithm relies on bit manipulation, mainly the bitwise AND (&) operator.
Consider the number 123 = 01111011 and 128 = 10000000. AND-ing them together yields zero or a non-zero result, depending on whether the 1 in 128 lines up with a 1 or a 0 in 123.
10000000
& 01111011
----------
00000000
In this case the answer is 0, so the first (most significant) bit is 0.
Moving forward, we take 64 = 01000000 and AND it with 123. Notice that the 1 has shifted one place to the right.
01000000
& 01111011
----------
01000000
AND-ing with 123 produces a non-zero result this time, so the second bit is 1. This procedure is repeated for each remaining bit position.
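Since the Showbits code itself isn't shown above, here is a minimal sketch of the procedure just described: walk a mask from the highest bit down to the lowest, AND it with the value, and emit 1 for a non-zero result and 0 otherwise. The class and method names are my own, not from the original program.

```java
class ShowBits {
    // Build the binary string of 'value' using the top 'bits' bit positions.
    static String toBinary(int value, int bits) {
        StringBuilder sb = new StringBuilder();
        // Start the mask at the highest requested bit and shift it right each step.
        for (int mask = 1 << (bits - 1); mask != 0; mask >>>= 1) {
            // A non-zero AND means the bit under the mask is set.
            sb.append((value & mask) != 0 ? '1' : '0');
        }
        return sb.toString();
    }

    public static void main(String args[]) {
        System.out.println(ShowBits.toBinary(123, 8)); // prints 01111011
        System.out.println(ShowBits.toBinary(128, 8)); // prints 10000000
    }
}
```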
First piece of code (UpCase):
Here 65503 is the bitwise negation of 32 in a 16-bit representation:
32 = 0000 0000 0010 0000
~32 = 1111 1111 1101 1111
Essentially, AND-ing with the negation of 32 clears the single bit whose value is 32. A lowercase ASCII letter is exactly its uppercase counterpart plus 32, and that bit is always set in a lowercase letter, so clearing it has the same effect as subtracting 32, which converts the letter to uppercase.
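A quick way to convince yourself of the masking trick, pulled out into a small helper (the class and method names here are illustrative, not part of the original code):

```java
class MaskDemo {
    // AND with 65503 (~32 in 16 bits) clears bit 5, the only bit in which
    // lowercase and uppercase ASCII letters differ.
    static char toUpper(char ch) {
        return (char) (ch & 65503); // same as ch & ~32
    }

    public static void main(String args[]) {
        System.out.println(MaskDemo.toUpper('a')); // prints A
        System.out.println(MaskDemo.toUpper('m')); // prints M
    }
}
```

Note that this only works because bit 5 is guaranteed to be set in a lowercase letter; a plain AND is not subtraction in general.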