Been looking around for a little while now and I'm a bit confused on this issue. I want to be able to take an input stream and read it concurrently in segments.
A good approach might instead be to have a single reader that reads chunks and hands each chunk off to a worker thread from a thread pool. Given that these will be inserted into a database, the inserts will be by far the slowest part compared to reading the input, so a single thread should suffice for reading.
Below is an example that hands off processing of each line from System.in to a worker thread. Database inserts perform much better when a large number of them are grouped into a single transaction, so passing in a batch of, say, 1000 lines would be better than passing in a single line as in the example.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class Main {

    public static class Worker implements Runnable {

        private final String line;

        public Worker(String line) {
            this.line = line;
        }

        @Override
        public void run() {
            // Process line here, e.g. insert it into the database.
            System.out.println("Processing line: " + line);
        }
    }

    public static void main(String[] args) throws IOException {
        // Create worker thread pool.
        ExecutorService service = Executors.newFixedThreadPool(4);

        BufferedReader buffer = new BufferedReader(new InputStreamReader(System.in));
        String line;

        // Read each line and hand it off to a worker thread for processing.
        while ((line = buffer.readLine()) != null) {
            service.execute(new Worker(line));
        }

        // Stop accepting new tasks and let queued work finish; without this
        // the non-daemon pool threads keep the JVM alive after main returns.
        service.shutdown();
    }
}
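To show what the batching variant mentioned above might look like, here is a sketch that accumulates lines into groups before handing each group to a worker. `BatchWorker`, `submitBatches`, and `BATCH_SIZE` are illustrative names of my own, and the per-batch transaction is only sketched in a comment — a real worker would do its JDBC work there.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

public class BatchingMain {

    // Worker that handles one batch of lines. A real implementation would
    // open a connection, disable auto-commit, addBatch() each row,
    // executeBatch(), then commit -- one transaction per batch.
    public static class BatchWorker implements Runnable {
        private final List<String> batch;

        public BatchWorker(List<String> batch) {
            this.batch = batch;
        }

        @Override
        public void run() {
            System.out.println("Inserting batch of " + batch.size() + " lines");
        }
    }

    // Read lines and pass each full batch (and the final partial batch)
    // to the given sink. Factored out so the grouping logic is testable.
    static void submitBatches(BufferedReader reader, int batchSize,
                              Consumer<List<String>> sink) throws IOException {
        List<String> batch = new ArrayList<>(batchSize);
        String line;
        while ((line = reader.readLine()) != null) {
            batch.add(line);
            if (batch.size() == batchSize) {
                sink.accept(batch);
                batch = new ArrayList<>(batchSize); // start a fresh batch
            }
        }
        if (!batch.isEmpty()) {
            sink.accept(batch); // flush the final partial batch
        }
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        final int BATCH_SIZE = 1000; // tune to your database and row size
        ExecutorService service = Executors.newFixedThreadPool(4);
        BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));

        submitBatches(reader, BATCH_SIZE, batch -> service.execute(new BatchWorker(batch)));

        // Wait for the queued inserts to finish before exiting.
        service.shutdown();
        service.awaitTermination(1, TimeUnit.HOURS);
    }
}
```

Passing a fresh `ArrayList` to each worker (rather than reusing one) matters here: the worker thread owns its batch outright, so no synchronization is needed between reader and workers beyond the executor's own queue.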