C program to convert input binary number to decimal number
The code works fine for the inputs 10001000 and 101100, for which the outputs are 136 and 44 respectively, but it fails for 11111111111 (eleven '1's).
Among several other issues with your code - you're trying to interpret the string "11111111111" (eleven '1's) as an integer. However, the `int` type on your machine uses 4 bytes, and the highest number it can represent is 2^31 - 1 (about 2.1 billion). The number 11,111,111,111 is higher than 2^33. So - you get signed integer overflow, which is undefined behavior in C.
Try parsing your input as a string, not as a huge number...
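Here's a minimal sketch of that string-based approach; the function name `bin_to_decimal` and the buffer size are illustrative choices, not taken from your code:

```c
#include <stdio.h>

/* Convert a string of '0'/'1' characters to its decimal value.
 * long long safely holds up to 63 binary digits. */
long long bin_to_decimal(const char *bits)
{
    long long value = 0;
    for (size_t i = 0; bits[i] != '\0'; i++) {
        if (bits[i] != '0' && bits[i] != '1')
            return -1; /* not a valid binary digit */
        value = value * 2 + (bits[i] - '0');
    }
    return value;
}

int main(void)
{
    char input[64]; /* room for 63 digits plus the terminator */

    /* %63s reads the digits as text, so the value never has
     * to fit in an int during input */
    if (scanf("%63s", input) != 1) {
        fprintf(stderr, "failed to read input\n");
        return 1;
    }
    printf("%lld\n", bin_to_decimal(input));
    return 0;
}
```

Since the digits are processed one character at a time, the input "11111111111" correctly yields 2047 instead of overflowing.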
But - next time, please:

- Reduce your example to the minimum: drop the `number_of_conversions` loop (ask about a single conversion, not several) - you could have demonstrated your issue with just a single conversion.
- Check the return value of `scanf()` - it can fail, you know. A sketch of that check follows.
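For that second point, something along these lines (reusing your `number_of_conversions` variable for illustration):

```c
int number_of_conversions;

/* scanf() returns the number of items successfully converted;
 * anything other than 1 here means the read failed. */
if (scanf("%d", &number_of_conversions) != 1) {
    fprintf(stderr, "invalid input\n");
    return 1;
}
```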