Question
I'm trying to convert a couple of binary strings back to int. However, it doesn't convert all my binary strings; it throws a java.lang.NumberFormatException. Here is my test code with 3 binary strings:
public class Bin {
    public static void main(String[] args) {
        String binaryString;
        binaryString = Integer.toBinaryString(~0);
        //binaryString = Integer.toBinaryString(~1);
        //binaryString = "1010";
        int base = 2;
        int decimal = Integer.parseInt(binaryString, base);
        System.out.println("INPUT=" + binaryString + " decimal=" + decimal);
    }
}
If I convert "1010" it works great, but when I try to convert either of the other two I get the exception. Can someone explain to me why this is?
Cheers
Answer 1:
From http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/Integer.html#toBinaryString(int): the toBinaryString() method converts its input into the binary representation of the "unsigned integer value [that] is the argument plus 2^32 if the argument is negative".

From http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/Integer.html#parseInt(java.lang.String,%20int): the parseInt() method throws NumberFormatException if "the value represented by the string is not a value of type int".

Note that both ~0 and ~1 are negative (-1 and -2 respectively), so they will be converted to the binary representations of 2^32 - 1 and 2^32 - 2 respectively, neither of which can be represented as a value of type int, hence the NumberFormatException that you are seeing.
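One way around this (a sketch, assuming Java 8 or later is available) is Integer.parseUnsignedInt, which accepts the full 32-bit unsigned range and stores the bits back into a signed int, so the negative values round-trip:

```java
public class UnsignedParse {
    public static void main(String[] args) {
        for (int num : new int[] {~0, ~1}) {
            String binaryString = Integer.toBinaryString(num);
            // parseUnsignedInt (Java 8+) accepts values up to 2^32 - 1 and
            // reinterprets the 32 bits as a signed int, recovering -1 and -2.
            int decimal = Integer.parseUnsignedInt(binaryString, 2);
            System.out.println("INPUT=" + binaryString + " decimal=" + decimal);
        }
    }
}
```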
Answer 2:
As explained above, Integer.toBinaryString() treats ~0 and ~1 as unsigned values, so the results exceed Integer.MAX_VALUE. You could parse with long and convert back to int, as below.
int base = 2;
for (int num : new int[] {~0, ~1}) {
    String binaryString = Integer.toBinaryString(num);
    long decimal = Long.parseLong(binaryString, base);
    System.out.println("INPUT=" + binaryString + " decimal=" + (int) decimal);
}
Answer 3:
The bits for "~0" are 11111111111111111111111111111111 (32 1's). Normally, this represents the number -1. The bits for "~1" are 11111111111111111111111111111110 (31 1's followed by a zero). Normally, this represents the number -2.
I tried "01111111111111111111111111111111" (a 0 and 31 1's), which represents the largest signed integer, with parseInt and there was no error. But with "10000000000000000000000000000000", which represents the smallest signed integer, the error appeared again.

The parseInt method expects a "-" sign in the input to indicate that a negative number is desired. Without one, it detects that the value overflows int and throws the NumberFormatException.
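That boundary behavior can be sketched with the same strings described above:

```java
public class ParseBoundary {
    public static void main(String[] args) {
        // A 0 followed by 31 ones: Integer.MAX_VALUE, parses without error.
        System.out.println(Integer.parseInt("01111111111111111111111111111111", 2));
        // parseInt wants an explicit minus sign for negative values:
        // "-" followed by 1 and 31 zeros is Integer.MIN_VALUE, also accepted.
        System.out.println(Integer.parseInt("-10000000000000000000000000000000", 2));
        // Without the sign, the same bit pattern overflows int:
        try {
            Integer.parseInt("10000000000000000000000000000000", 2);
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException, as expected");
        }
    }
}
```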
Source: https://stackoverflow.com/questions/14883428/java-convert-binary-string-to-int