bit-manipulation

PHP bitwise left shift by 32 places problem and bad results with large-number arithmetic operations

大憨熊 submitted on 2019-12-01 00:14:15
I have the following problems. First: I am trying to do a 32-place bitwise left shift on a large number, and for some reason the number is always returned as-is. For example: echo(516103988<<32); // echoes 516103988 Because shifting the bits to the left one place is the equivalent of multiplying by 2, I tried multiplying the number by 2^32, and it works: it returns 2216649749795176448. Second: I have to add 9379 to the number from the above point: printf('%0.0f', 2216649749795176448 + 9379); // prints 2216649749795185920 Should print: 2216649749795185827 PHP integer precision is limited to
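A rough C sketch of the same pitfall, assuming a 32-bit int and a 64-bit integer type to widen into: shifting a 32-bit value by 32 or more places is undefined (on many x86 builds the count is taken modulo 32, so the value comes back unchanged, the same symptom as the PHP snippet), while widening first gives the expected result. The numbers below are taken from the question.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t narrow = 516103988u;

    /* narrow << 32 would be undefined behaviour for a 32-bit type;
       widening to 64 bits first makes the shift well defined. */
    uint64_t wide = (uint64_t)narrow << 32;
    printf("%llu\n", (unsigned long long)wide);            /* 2216649749795176448 */

    /* Adding the 9379 offset from the question stays exact in a 64-bit integer. */
    printf("%llu\n", (unsigned long long)(wide + 9379u));  /* 2216649749795185827 */
    return 0;
}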

Why is this union's size 2 with bit-fields?

狂风中的少年 submitted on 2019-11-30 23:56:16
I am working with Turbo C on Windows, where char takes one byte. Now my problem is with the below union. union a { unsigned char c:2; }b; void main() { printf("%d",sizeof(b)); // or even sizeof(union a) } This program is printing output as 2, whereas the union should be taking only 1 byte. Why is it so? For a struct it is fine, giving 1 byte, but this union is working inappropriately. And one more thing: how do I access these bit fields? scanf("%d",&b.c); // even scanf("%x",b.c); is not working, because we cannot take the address of bits. So we have to use another variable like below: int x; scanf("%d",&x); b.c=x;
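A small self-contained sketch of the points above; the exact sizeof printed depends on the compiler (Turbo C pads the bit-field up to its 16-bit int, most modern compilers print 1):

#include <stdio.h>

union a {
    unsigned char c : 2;   /* a 2-bit field inside the union */
} b;

int main(void)
{
    /* sizeof reports whatever the compiler allocates for the union. */
    printf("sizeof(union a) = %u\n", (unsigned)sizeof(union a));

    /* Bit-fields have no address, so scanf cannot write into them directly.
       Read into an ordinary int and assign; the value is truncated to 2 bits. */
    int x = 0;
    if (scanf("%d", &x) == 1)
        b.c = (unsigned char)x;    /* only the low 2 bits survive */

    printf("b.c = %u\n", (unsigned)b.c);
    return 0;
}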

How to bitwise-compare a string

拟墨画扇 submitted on 2019-11-30 23:37:45
I am working on a function that takes in a series of permission strings, each less than 255 characters, and assigns them to an entity. Each string assigned is unique, but there are so many that dropping them into an array, serializing them, pushing them into a database, and later pulling them out and de-serializing them (or recalculating from a query on every load) has been causing delay problems, especially with inherited permissions. So I was thinking of taking the string, generating a mask from it, then OR'ing it into the permissions glob. As more permissions are added continue to OR them to
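A minimal C sketch of the OR-into-a-mask idea; the flag names (PERM_READ and friends) are invented here for illustration, not taken from the question:

#include <stdio.h>
#include <stdint.h>

/* Hypothetical permission flags - each permission gets its own bit. */
enum {
    PERM_READ    = 1u << 0,
    PERM_WRITE   = 1u << 1,
    PERM_EXECUTE = 1u << 2,
    PERM_DELETE  = 1u << 3
};

int main(void)
{
    uint32_t granted = 0;

    /* Granting a permission is a single OR into the accumulated mask. */
    granted |= PERM_READ;
    granted |= PERM_WRITE;

    /* Checking a permission is a single AND, no string comparison needed. */
    if (granted & PERM_WRITE)
        printf("write access granted\n");
    if (!(granted & PERM_EXECUTE))
        printf("execute access denied\n");

    return 0;
}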

Using a bitwise & inside an if statement

邮差的信 submitted on 2019-11-30 23:23:34
Question: In C, I can write an if-statement if (firstInt & 1), but when I try to do the same in Java, the compiler tells me "incompatible types" and says I need a boolean instead of an int. Is there any way to write that C code in Java? Answer 1: Any of the following should work for you: if ((firstInt & 1) != 0) if ((firstInt & 1) > 0) if ((firstInt & 1) == 1) Answer 2: In C an integer expression can be used implicitly as a boolean expression (although I would argue that it is a bad idea), where zero is false

Fill with a variable number of ones

懵懂的女人 submitted on 2019-11-30 23:19:37
What's the best way to fill a variable with an unknown (at compile time) number of ones? For example, let's say: int n = 5; int b = fillwithones(5); Now b contains 11111 (in binary). I can't just hard-code int b = 31 because n is not known ahead of time (in my application). I could do something like this: int b = pow(2, n) - 1 But using pow seems very wasteful. Thanks! You can use a left shift and then subtract 1: unsigned int b = (1U << n) - 1U; // Broken down into steps // 1 = 00000001b // 1 << 5 = 00100000b // (1 << 5) - 1 = 00011111b The reason this works is that 1 shifted left n times is the
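A complete version of that idiom as a C sketch, with a guard for the edge case where n equals the type width (shifting a 32-bit 1 by 32 places is undefined); this assumes unsigned int is 32 bits:

#include <stdio.h>

static unsigned int fill_with_ones(unsigned int n)
{
    if (n >= 32)
        return ~0u;              /* all 32 bits set */
    return (1u << n) - 1u;       /* n low bits set */
}

int main(void)
{
    printf("%u\n", fill_with_ones(5));   /* 31, i.e. 11111 in binary */
    printf("%x\n", fill_with_ones(32));  /* ffffffff */
    return 0;
}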

Tagging/Encoding Pointers

心不动则不痛 submitted on 2019-11-30 22:57:19
I need a way to tag a pointer as being either part of set x or part of set y (i.e. the tag has only 2 'states'), which means one can assume untagged = x and tagged = y. Currently I'm looking at using bitwise XOR to do this: ptr ^ magic = encoded_ptr encoded_ptr ^ magic = ptr but I'm stumped at how to determine whether the pointer is tagged in the first place. I'm using this to mark which pools nodes in a linked list come from, so that when they are delinked, they can go back to the correct parents. Update Just to make it clear to all those people suggesting to store the flag in extra data members,
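One common alternative to the XOR-magic approach (not what the question proposes, but it answers the "how do I tell if it's tagged" problem): since malloc'd nodes are aligned to at least 2 bytes, the pointer's lowest bit is always zero and can carry the x/y flag. A rough sketch:

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

static void *tag_y(void *p)       { return (void *)((uintptr_t)p | 1u); }
static int   is_tagged_y(void *p) { return (int)((uintptr_t)p & 1u); }
static void *untag(void *p)       { return (void *)((uintptr_t)p & ~(uintptr_t)1); }

int main(void)
{
    int *node = malloc(sizeof *node);
    void *stored = tag_y(node);                 /* mark as belonging to pool y */

    printf("tagged as y: %d\n", is_tagged_y(stored));                 /* 1 */
    printf("same pointer back: %d\n", untag(stored) == (void *)node); /* 1 */

    free(node);
    return 0;
}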

Reading characters on a bit level

前提是你 submitted on 2019-11-30 22:14:01
I would like to be able to enter a character from the keyboard and display the binary code for that key in the format 00000001, for example. Furthermore, I would also like to read the bits in a way that allows me to output whether they are true or false, e.g. 01010101 = false,true,false,true,false,true,false,true I would post an idea of how I have tried to do it myself, but I have absolutely no idea; I'm still experimenting with C and this is my first taste of programming at such a low level. Thank you. This code is C89: /* we need this to use exit */ #include <stdlib.h> /* we need this to use
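A short sketch of one way to do this in C: read a single character and print its bits most-significant first, once as 0/1 and once as the true/false list from the question.

#include <stdio.h>
#include <limits.h>   /* CHAR_BIT */

int main(void)
{
    int ch = getchar();
    if (ch == EOF)
        return 1;

    /* Walk the bits from most to least significant and print each as 0/1... */
    for (int i = CHAR_BIT - 1; i >= 0; --i)
        putchar(((ch >> i) & 1) ? '1' : '0');
    putchar('\n');

    /* ...and again as a comma-separated true/false list. */
    for (int i = CHAR_BIT - 1; i >= 0; --i)
        printf("%s%s", ((ch >> i) & 1) ? "true" : "false", i ? "," : "\n");

    return 0;
}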

Using logical bitshift for RGB values

删除回忆录丶 submitted on 2019-11-30 22:11:21
I'm a bit naive when it comes to bitwise logic and I have what is probably a simple question... basically, if I have this (it is ActionScript but can apply in many languages): var color:uint = myObject.color; var red:uint = color >>> 16; var green:uint = color >>> 8 & 0xFF; var blue:uint = color & 0xFF; I was wondering what exactly the `& 0xFF` is doing to green and blue. I understand what an AND operation does, but why is it needed (or a good idea) here? The source for this code was here: http://alexgblog.com/?p=680 Appreciate the tips. Alex In RGB you have 8 bits for Red, 8 bits for Green and 8
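A small C sketch of the same extraction, assuming the colour is packed as 0xRRGGBB in the low 24 bits (as in the ActionScript snippet); the comments show what & 0xFF strips away:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t color = 0x1188FF;               /* example 0xRRGGBB value */

    unsigned red   = color >> 16;            /* nothing sits above the red byte
                                                in 0xRRGGBB, so no mask needed */
    unsigned green = (color >> 8) & 0xFF;    /* without & 0xFF this would be 0x1188,
                                                i.e. the red bits would still be there */
    unsigned blue  = color & 0xFF;           /* mask off the red and green bits */

    printf("r=%02X g=%02X b=%02X\n", red, green, blue);   /* r=11 g=88 b=FF */
    return 0;
}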

Is there a practical limit to the size of bit masks?

人盡茶涼 submitted on 2019-11-30 22:10:07
There's a common way to store multiple values in one variable, by using a bitmask. For example, if a user has read, write and execute privileges on an item, that can be converted to a single number by saying read = 4 (2^2), write = 2 (2^1), execute = 1 (2^0) and then adding them together to get 7. I use this technique in several web applications, where I'd usually store the variable in a field and give it a type of MEDIUMINT or whatever, depending on the number of different values. What I'm interested in is whether or not there is a practical limit to the number of values you can store like
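As a rough C illustration of the arithmetic in that first paragraph: each flag costs one bit, so the ceiling per column is simply its bit width (the byte sizes below are the usual MySQL widths; check your own schema).

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t perms = 0;
    perms |= 1u << 2;   /* read    = 4 */
    perms |= 1u << 1;   /* write   = 2 */
    perms |= 1u << 0;   /* execute = 1 */
    printf("stored value: %llu\n", (unsigned long long)perms);   /* 7 */

    /* One bit per flag: a 3-byte MEDIUMINT holds 24 flags, an 8-byte BIGINT 64. */
    printf("flags per MEDIUMINT: %d, per BIGINT: %d\n", 3 * 8, 8 * 8);
    return 0;
}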

Performance of integer and bitwise operations on GPU

萝らか妹 submitted on 2019-11-30 22:06:44
Question: Though GPUs are intended for use with floating-point data types, I'd be interested in how fast a GPU can process bitwise operations. These are the fastest possible on a CPU, but does a GPU emulate bitwise operations or are they computed fully in hardware? I'm planning to use them inside shader programs written in GLSL. I'd also suppose that if bitwise operations have full performance, integer data types should too, but I need confirmation on that. To be more precise, the targeted versions are