I am confused about how a bit vector would work here (I'm not too familiar with bit vectors). Here is the given code. Could someone please walk me through it?
Line 1: public static boolean isUniqueChars(String str) {
Line 2:     int checker = 0;
Line 3:     for (int i = 0; i < str.length(); ++i) {
Line 4:         int val = str.charAt(i) - 'a';
Line 5:         if ((checker & (1 << val)) > 0) return false;
Line 6:         checker |= (1 << val);
Line 7:     }
Line 8:     return true;
Line 9: }
Here is the way I understood it, using JavaScript. Assume the input is var inputChar = "abca"; // find whether inputChar has all unique characters
Line 4: int val = str.charAt(i) - 'a';
The line above starts from the character code of the first character in inputChar, which is 'a'. 'a' is 97 in ASCII, and 97 in binary is 1100001.
In JavaScript, for example, "a".charCodeAt().toString(2) returns "1100001".
checker = 0; // all 32 bits are 0: 00000000000000000000000000000000
checker = 1100001 | checker; // checker becomes 1100001 (in 32-bit form: 00000000000000000000000001100001)
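To see the problem concretely, here is a small sketch (my own example, not from the book) of what OR-ing the raw character code would do:

```javascript
// Sketch: OR-ing the raw char code into the mask sets several bits at once,
// which defeats the one-bit-per-character idea.
var checker = 0;
var code = "a".charCodeAt(0);     // 97
checker = checker | code;         // checker is now 97 = 0b1100001
console.log(checker.toString(2)); // "1100001" -- three bits set (bits 0, 5, 6)
```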
But I want my bitmask (int checker) to set only one bit per character, and 1100001 has three bits set.
Line 4: int val = str.charAt(i) - 'a';
This is where the line above comes in handy: I always subtract 97 (the ASCII value of 'a').
val = 0; // 97 - 97, which is 'a' - 'a'
val = 1; // 98 - 97, which is 'b' - 'a'
val = 2; // 99 - 97, which is 'c' - 'a'
Now let's use val, which has been rebased to start at 0, so 1 << val sets exactly one bit.
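A quick sketch (my own, assuming lowercase letters) showing that each character now maps to its own single-bit mask:

```javascript
// Each lowercase letter maps to a distinct single-bit mask.
var masks = ["a", "b", "c"].map(function (ch) {
  var val = ch.charCodeAt(0) - "a".charCodeAt(0); // 0, 1, 2
  return 1 << val;                                // 0b001, 0b010, 0b100
});
console.log(masks); // [1, 2, 4]
```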
Line 5 and Line 6 are well explained in @Ivan's answer.
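Putting the whole thing together, here is a JavaScript sketch of the same algorithm (assuming, like the Java version, that the input contains only lowercase a-z):

```javascript
function isUniqueChars(str) {
  var checker = 0; // 32-bit mask; bit k means "letter k has been seen"
  for (var i = 0; i < str.length; i++) {
    var val = str.charCodeAt(i) - "a".charCodeAt(0); // 0 for 'a' ... 25 for 'z'
    if ((checker & (1 << val)) > 0) return false;    // bit already set: duplicate
    checker |= (1 << val);                           // mark this letter as seen
  }
  return true;
}

console.log(isUniqueChars("abca")); // false -- 'a' repeats
console.log(isUniqueChars("abc"));  // true
```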