My problem statement: read a CSV file with 10 million records and store it in a database in as little time as possible.
I implemented it using a simple multi-threaded approach.
Here is how I solved the problem.
Read the file and split it into chunk files using a BufferedReader/Writer and FileChannel (one of the fastest ways to do file I/O in Java; Spring Batch's file readers rely on buffered I/O as well). I implemented this so that it runs before the job starts (although it could also be executed as a job step using a method invoker). A sketch of the splitting step is shown below.
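Here is a minimal sketch of the splitting step, assuming a plain line-oriented CSV with a single header row that should be repeated in every chunk. The paths, class name, and chunk size are placeholders, not the original implementation.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CsvSplitter {

    // Rows per chunk file; tune this to your data size and thread count.
    private static final int LINES_PER_CHUNK = 500_000;

    public static void split(Path source, Path targetDir) throws IOException {
        Files.createDirectories(targetDir);
        try (BufferedReader reader = Files.newBufferedReader(source, StandardCharsets.UTF_8)) {
            String header = reader.readLine();        // keep the header to repeat in each chunk
            int lineCount = 0;
            int chunkIndex = 0;
            BufferedWriter writer = newChunkWriter(targetDir, chunkIndex, header);
            String line;
            while ((line = reader.readLine()) != null) {
                if (lineCount == LINES_PER_CHUNK) {   // current chunk is full, roll over to the next file
                    writer.close();
                    chunkIndex++;
                    lineCount = 0;
                    writer = newChunkWriter(targetDir, chunkIndex, header);
                }
                writer.write(line);
                writer.newLine();
                lineCount++;
            }
            writer.close();
        }
    }

    private static BufferedWriter newChunkWriter(Path dir, int index, String header) throws IOException {
        BufferedWriter writer = Files.newBufferedWriter(
                dir.resolve("chunk-" + index + ".csv"), StandardCharsets.UTF_8);
        writer.write(header);
        writer.newLine();
        return writer;
    }

    public static void main(String[] args) throws IOException {
        split(Paths.get("input/big-file.csv"), Paths.get("input/chunks"));
    }
}
```

Each chunk file can then be processed independently by its own reader thread or partition, which is what makes the multi-threaded load fast.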
Start the job with the directory containing the chunk files as a job parameter.
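A sketch of launching the job with the directory as a job parameter, assuming Spring Batch; the bean name csvToDbJob and the parameter key inputDirectory are illustrative, not from the original code.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.stereotype.Component;

@Component
public class CsvJobRunner {

    private final JobLauncher jobLauncher;
    private final Job csvToDbJob; // the job whose steps read the chunk files and write to the DB

    public CsvJobRunner(JobLauncher jobLauncher, Job csvToDbJob) {
        this.jobLauncher = jobLauncher;
        this.csvToDbJob = csvToDbJob;
    }

    public JobExecution run(String chunkDirectory) throws Exception {
        JobParameters params = new JobParametersBuilder()
                // the step can read this via @Value("#{jobParameters['inputDirectory']}")
                .addString("inputDirectory", chunkDirectory)
                // a timestamp keeps each run's parameters unique so the job can be relaunched
                .addLong("startedAt", System.currentTimeMillis())
                .toJobParameters();
        return jobLauncher.run(csvToDbJob, params);
    }
}
```

Passing the directory as a job parameter (rather than hard-coding it) lets the same job definition be reused for any input location and keeps each run identifiable in the job repository.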