I've been studying C# and ran across some familiar ground from my old work in C++. I never understood the reason for bitwise operators in a real application. I've never used them.
As to 'how they work': bitwise operations are among the lowest-level operations CPUs support, and in fact some bitwise operations, like NAND and NOR, are universal - you can build any operation at all out of a sufficiently large set of NAND gates. This may seem academic, but if you look at how things like adders are implemented in hardware, that's often basically what they boil down to.
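To make that concrete, here's a small C# sketch (illustrative only; the Nand/Add helper names are mine) that builds AND, OR, XOR and finally integer addition out of nothing but NAND and shifts, which is essentially a software ripple-carry adder:

```csharp
using System;

class BitwiseAdder
{
    // NAND is "universal": every other boolean operation can be built from it.
    static uint Nand(uint a, uint b) => ~(a & b);

    // AND, OR and XOR expressed purely in terms of NAND.
    static uint And(uint a, uint b) => Nand(Nand(a, b), Nand(a, b));
    static uint Or(uint a, uint b)  => Nand(Nand(a, a), Nand(b, b));
    static uint Xor(uint a, uint b) => Nand(Nand(a, Nand(a, b)), Nand(b, Nand(a, b)));

    // Addition built from XOR (sum ignoring carries) and AND (carry bits),
    // repeated until no carries remain - a ripple-carry adder in software.
    static uint Add(uint a, uint b)
    {
        while (b != 0)
        {
            uint carry = And(a, b) << 1; // positions where a carry is generated
            a = Xor(a, b);               // sum without the carries
            b = carry;                   // feed the carries back in
        }
        return a;
    }

    static void Main()
    {
        Console.WriteLine(Add(23, 19)); // prints 42
    }
}
```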
As to the 'point': in a lot of higher-level applications there is, of course, not much use for bit ops, but at the lowest levels of a system they are incredibly important. Just off the top of my head, things that would be very difficult to write without bit operations include device drivers, cryptographic software, error correction systems like RAID5 or erasure codes, checksums like CRC, video decoding software, memory allocators, and compression software.
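As a small taste of what that kind of code looks like, here's a hypothetical C# snippet that unpacks and re-packs the channels of a 32-bit ARGB pixel with shifts and masks, the sort of thing image and video code does constantly:

```csharp
using System;

class PixelUnpack
{
    static void Main()
    {
        uint argb = 0x80FF7F3F; // alpha, red, green, blue packed into one 32-bit value

        // Shift the channel you want down to the low byte, then mask off the rest.
        byte a = (byte)((argb >> 24) & 0xFF); // 0x80
        byte r = (byte)((argb >> 16) & 0xFF); // 0xFF
        byte g = (byte)((argb >> 8)  & 0xFF); // 0x7F
        byte b = (byte)( argb        & 0xFF); // 0x3F

        Console.WriteLine($"A={a:X2} R={r:X2} G={g:X2} B={b:X2}");

        // Re-packing goes the other way: shift each byte into place and OR them together.
        uint repacked = ((uint)a << 24) | ((uint)r << 16) | ((uint)g << 8) | b;
        Console.WriteLine(repacked == argb); // True
    }
}
```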
They are also useful for maintaining large sets of integers efficiently, for instance in the fd_set used by the common select syscall, or when solving certain search/optimization problems. A sketch of that idea follows below.
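For illustration, here's a rough C# sketch of the idea behind fd_set: track membership of small integers with one bit each, so a set of 1024 values fits in just 32 machine words (the BitSet class and its method names are just for this example):

```csharp
using System;

// A minimal bit set along the lines of fd_set: integers 0..capacity-1 each get one bit.
class BitSet
{
    private readonly uint[] words;

    public BitSet(int capacity) => words = new uint[(capacity + 31) / 32];

    // word index = n / 32, bit within that word = n % 32
    public void Add(int n)      => words[n >> 5] |= 1u << (n & 31);
    public void Remove(int n)   => words[n >> 5] &= ~(1u << (n & 31));
    public bool Contains(int n) => (words[n >> 5] & (1u << (n & 31))) != 0;
}

class Demo
{
    static void Main()
    {
        var set = new BitSet(1024);
        set.Add(3);
        set.Add(700);
        Console.WriteLine(set.Contains(3));   // True
        Console.WriteLine(set.Contains(4));   // False
        set.Remove(3);
        Console.WriteLine(set.Contains(3));   // False
    }
}
```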
Take a look at the source of an MPEG4 decoder, cryptography library, or operating system kernel sometime and you'll see many, many examples of bit operations being used.