I'm importing a large text file, 17 million digits long, and I'm using this code:
import java.io.BufferedReader;
import java.io.FileReader;
import java.math.BigInteger;

BufferedReader reader = new BufferedReader(new FileReader("test2.txt"));
String line = reader.readLine();
System.out.println("Done");
BigInteger num = new BigInteger(line); // this is the slow step
System.out.println("Done Again");
It loads the file pretty much instantly and prints 'Done', but it takes a long time (about an hour) for the String to be converted into a BigInteger. Is there anything I can do to speed this up and load the number quickly?
As an optimization, since BigInteger is Serializable, you could save it to a binary file once and speed up subsequent loads considerably. Loading a serialized object should be much faster than parsing a huge string every time. Use ObjectOutputStream to save your BigInteger and ObjectInputStream to read it back in.
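A minimal sketch of that approach (the file names and the one-time conversion step are assumptions; the first run still pays the full parsing cost):

import java.io.*;
import java.math.BigInteger;

public class BigIntCache {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // One-time conversion: parse the decimal text (slow) and serialize the result.
        try (BufferedReader reader = new BufferedReader(new FileReader("test2.txt"));
             ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("num.ser"))) {
            BigInteger num = new BigInteger(reader.readLine()); // slow radix conversion
            out.writeObject(num);
        }

        // Subsequent runs: deserialize directly, skipping the string parsing entirely.
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("num.ser"))) {
            BigInteger num = (BigInteger) in.readObject();
            System.out.println("Loaded " + num.bitLength() + " bits");
        }
    }
}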
It is slow because new BigInteger(String) is doing a radix conversion from decimal to binary, which is O(N²). Nothing you can do about that.
You could save either the object itself, via serialization, or the byte array it is stored in, via BigInteger.toByteArray(). Either will load essentially instantaneously.
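A minimal sketch of the toByteArray() route, assuming the decimal text is parsed once and the file names are placeholders:

import java.math.BigInteger;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class BigIntBytes {
    public static void main(String[] args) throws Exception {
        Path textFile = Paths.get("test2.txt");
        Path binFile = Paths.get("num.bin");

        // One-time conversion: parse the decimal digits (slow), then dump the raw two's-complement bytes.
        BigInteger num = new BigInteger(new String(Files.readAllBytes(textFile)).trim());
        Files.write(binFile, num.toByteArray());

        // Later loads: reconstruct directly from the bytes, no radix conversion needed.
        BigInteger reloaded = new BigInteger(Files.readAllBytes(binFile));
        System.out.println(reloaded.equals(num)); // true
    }
}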
As the comments have indicated, your code is slow because you're attempting to load a number with a lot of digits.
If you're unsatisfied with the performance of Java's BigInteger implementation, then I suggest you look elsewhere. This library claims to have a BigInteger that outperforms Java's implementation (note it may not speed up loading the number, but it should improve multiplication and division performance).
Source: https://stackoverflow.com/questions/44013214/how-can-i-quickly-load-a-large-txt-file-into-biginteger