Converting a decimal to a 16-bit binary using unsigned char and without strings [closed]

Submitted by 这一生的挚爱 on 2020-01-06 06:40:21

Question


My code works if I declare operand1 and operand2 as int. With unsigned char, operand1 does not work. Can you help me?

int ALU(unsigned char operand1, unsigned char operand2)
{
    printf("Enter Operand 1 (in decimal): ");
    scanf("%d", &operand1);
    printf("\nEnter Operand 2 (in decimal): ");
    scanf("%d", &operand2);

    char bin16_1[] = "0000000000000000";
    int pos;
    for (pos = 16; pos >= 0; --pos)
    {
        if (operand1 % 2)
            bin16_1[pos] = '1';
        operand1 /= 2;
    }
    printf("\n\nBinary Equivalence of Operand 1: %s", bin16_1);

If I input 4096 or 512 or 65536 as decimal, the output is 0000 0000 0000 0000, which is wrong.


Answer 1:


#include <string.h>   /* strlen */
#include <stdint.h>   /* uint16_t */

/* Reverse a NUL-terminated string in place and return it. */
char *reverse(char *str)
{
    size_t len = strlen(str);

    for (size_t pos = 0; pos < len / 2; pos++)
    {
        char tmp = str[pos];

        str[pos] = str[len - 1 - pos];
        str[len - 1 - pos] = tmp;
    }
    return str;
}

/* Write the binary representation of value into buff, which must hold
 * at least 17 chars (16 digits plus the terminating NUL). If pad is
 * nonzero, the result is left-padded with '0' to the full 16 digits. */
char *toBin(char *buff, uint16_t value, int pad)
{
    size_t nbits = 16;
    char *work_buff = buff;

    do
    {
        *work_buff++ = value & 1 ? '1' : '0';   /* emit the low bit */
        value >>= 1;
        nbits--;
    } while (value);

    if (pad)
    {
        while (nbits--)   /* fill the remaining positions with '0' */
        {
            *work_buff++ = '0';
        }
    }
    *work_buff = 0;   /* NUL-terminate */

    /* Digits were emitted LSB first, so reverse to get MSB first. */
    return reverse(buff);
}
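
For completeness, a minimal sketch of how these helpers might be called (the driver below is not part of the original answer; the buffer size of 17 accounts for 16 digits plus the terminating NUL):

#include <stdio.h>

int main(void)
{
    char buff[17];   /* 16 binary digits plus the terminating NUL */

    printf("%s\n", toBin(buff, 4096, 1));   /* padded:   0001000000000000 */
    printf("%s\n", toBin(buff, 4096, 0));   /* unpadded: 1000000000000 */
    return 0;
}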



Answer 2:


Your solution does not work because an unsigned char is typically only 8 bits wide: it can hold values from 0 to 255, so inputs such as 512, 4096, or 65536 do not fit. 4096 truncated to 8 bits is 0, which is why you see all zeros. (Passing the address of an unsigned char to scanf with "%d" is also undefined behavior, since "%d" expects an int *.)

Instead of checking the modulo and dividing the input by 2, you can shift the value right and mask out one bit at a time, starting with the most significant bit. Code below:

Proper solution:

for (int i = 7; i >= 0; i--) {
    /* bit i of the byte, MSB first, stored as the character '0' or '1' */
    bin16_1[7 - i] = ((operand1 >> i) & 1) ? '1' : '0';
}

Explanation:

With every iteration, one bit of the byte is read, starting with the most significant, by shifting the value right and masking it with 1; the bits are stored MSB-first in the first eight characters of the buffer.

For example, assume the input value is 128, which is 1000 0000 in binary. Shifting it right by 7 gives 0000 0001, and 0000 0001 & 1 == 1, so the most significant bit is 1. That is the first character written. The remaining iterations all produce 0.
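
Since the question asks for 16 bits, here is a minimal sketch of the same shift-and-mask idea widened to a uint16_t (this driver is not from the original answer; the variable names are illustrative):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    char bin16[17];          /* 16 digits plus the terminating NUL */
    uint16_t value = 4096;   /* wide enough for 0..65535, unlike unsigned char */

    for (int i = 15; i >= 0; i--)
        bin16[15 - i] = ((value >> i) & 1) ? '1' : '0';   /* MSB first */
    bin16[16] = '\0';

    printf("%s\n", bin16);   /* prints 0001000000000000 */
    return 0;
}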



Source: https://stackoverflow.com/questions/50367681/converting-a-decimal-to-a-16-bit-binary-using-unsigned-char-and-without-string
