I am learning Spring Batch and was able to create a simple single-step application (GitHub repo link).
This application contains a job which does the following:
1. Reads persons from a CSV file
2. Lowercases their names
3. Saves them into a database
Now I want to learn the partitioning feature, so I added the following partitioner:
@Component
public class MyPartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> map = new HashMap<>(gridSize);
        for (int k = 0; k < gridSize; k++) {
            ExecutionContext context = new ExecutionContext();
            context.putString("keyName", "key_" + k); // depends on what logic you want to use to split
            map.put("PARTITION_KEY" + k, context);
        }
        return map;
    }
}
and my config looks like this:
@Bean
public Job job() {
    return jobBuilderFactory.get("myJob")
            .incrementer(new RunIdIncrementer())
            .flow(demoPartitionStep())
            .end()
            .build();
}

private Step demoPartitionStep() {
    return stepBuilderFactory.get("demoPartitionStep")
            .partitioner("demoPartitionStep", myPartitioner)
            .gridSize(21)
            .step(csvToDataBaseStep())
            .taskExecutor(jobTaskExecutor())
            .build();
}

private Step csvToDataBaseStep() {
    return stepBuilderFactory.get("csvToDatabaseStep")
            .<Person, Person>chunk(10)
            .reader(csvPersonReader())
            .processor(toLowerCasePersonProcessor)
            .writer(dbPersonWriter)
            .build();
}

public FlatFileItemReader csvPersonReader() {
    return new FlatFileItemReaderBuilder()
            .name("csvPersonReader")
            .resource(csvResource)
            .delimited()
            .names(new String[]{"firstName", "lastName"})
            .fieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }})
            .build();
}

@Bean
public TaskExecutor jobTaskExecutor() {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    // there are 21 sites currently hence we have 21 threads
    taskExecutor.setMaxPoolSize(30);
    taskExecutor.setCorePoolSize(25);
    taskExecutor.setThreadGroupName("custom-executor");
    taskExecutor.afterPropertiesSet();
    return taskExecutor;
}
And when I start the application I see the following in the log:
2019-08-05 19:25:22.303 ERROR 24100 --- [bTaskExecutor-2] o.s.batch.core.step.AbstractStep : Encountered an error executing step csvToDatabaseStep in job myJob
org.springframework.batch.item.file.NonTransientFlatFileException: Unable to read from resource: [class path resource [users.csv]]
at org.springframework.batch.item.file.FlatFileItemReader.readLine(FlatFileItemReader.java:220) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.item.file.FlatFileItemReader.doRead(FlatFileItemReader.java:173) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:92) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:94) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:161) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:119) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:113) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:69) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:407) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:331) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140) ~[spring-tx-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:273) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:82) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:258) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:203) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:139) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:136) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Caused by: java.io.IOException: Stream closed
at java.io.BufferedReader.ensureOpen(BufferedReader.java:122) ~[na:1.8.0_111]
at java.io.BufferedReader.readLine(BufferedReader.java:317) ~[na:1.8.0_111]
at java.io.BufferedReader.readLine(BufferedReader.java:389) ~[na:1.8.0_111]
at org.springframework.batch.item.file.FlatFileItemReader.readLine(FlatFileItemReader.java:201) ~[spring-batch-infrastructure-4.1.2.RELEASE.jar:4.1.2.RELEASE]
... 26 common frames omitted
and the following:
2019-08-05 19:25:22.319 ERROR 24100 --- [ main] o.s.batch.core.step.AbstractStep : Encountered an error executing step demoPartitionStep in job myJob
org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step
at org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:112) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:203) ~[spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:68) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:67) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:169) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:136) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:313) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:144) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50) [spring-core-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:137) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_111]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_111]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_111]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_111]
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343) [spring-aop-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198) [spring-aop-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) [spring-aop-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127) [spring-batch-core-4.1.2.RELEASE.jar:4.1.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) [spring-aop-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212) [spring-aop-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at com.sun.proxy.$Proxy77.run(Unknown Source) [na:na]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.execute(JobLauncherCommandLineRunner.java:206) [spring-boot-autoconfigure-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.executeLocalJobs(JobLauncherCommandLineRunner.java:180) [spring-boot-autoconfigure-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.launchJobFromProperties(JobLauncherCommandLineRunner.java:167) [spring-boot-autoconfigure-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.run(JobLauncherCommandLineRunner.java:162) [spring-boot-autoconfigure-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:779) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:763) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:318) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1213) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1202) [spring-boot-2.1.6.RELEASE.jar:2.1.6.RELEASE]
at spring.boot.hello.world.MyApplication.main(MyApplication.java:9) [main/:na]
I also noticed that some of the data is successfully inserted into the database. My CSV file contains 500 persons, but the SQL table may end up with 210 rows, or 332, or even 600 (I saw that when I set the chunk size to 100).
How do I implement partitioning correctly? What am I doing wrong?
Update:
I tried marking csvPersonReader with @StepScope and the errors disappeared, but now rowCountInDatabaseTable = gridSize * rowCountInCsvFile.
I am still looking for a solution.
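For reference, marking the reader step-scoped looks roughly like this (only the annotations are the point here; the builder part is unchanged from the config above):

@Bean
@StepScope
public FlatFileItemReader<Person> csvPersonReader() {
    return new FlatFileItemReaderBuilder<Person>()
            .name("csvPersonReader")
            .resource(csvResource)
            .delimited()
            .names(new String[]{"firstName", "lastName"})
            .fieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }})
            .build();
}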
Actually, I haven't used the partitioner myself, but I might be able to give you some hints.
The first exception you had (Stream closed) was due to the fact that the same reader instance was used for every slave step. The first slave step to finish closed the reader, and from that moment on the other slave steps tried to read from a closed stream.
You fixed that with the @StepScope annotation, which was the right approach to solve this problem.
The problem with the partitioner approach is that you are responsible for partitioning the data you read.
What you did was simply create a partitioner that produced gridSize (21 in your case) identical partitions, and every one of these slave steps read the whole file. Hence, your DB contained every entry in the file 21 times.
What you should do is configure every step instance with appropriate "context properties". Based on these context properties, each step instance knows which lines it should process (for instance a start line and a line count, as sketched below). Alternatively, you could split the original file into one file per partition, each with a different name, and give every instance its own filename.
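A minimal sketch of the first idea, assuming a step-scoped reader that takes its line range from the step execution context. The keys startLine/linesToRead, the class name RangePartitioner and the hard-coded total line count are assumptions for illustration, not something from your repo:

// Partitioner: assign each partition a line range instead of just a key name.
@Component
public class RangePartitioner implements Partitioner {

    private static final int TOTAL_LINES = 500; // assumption: number of data lines in users.csv

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> map = new HashMap<>(gridSize);
        int linesPerPartition = (TOTAL_LINES + gridSize - 1) / gridSize;
        for (int k = 0; k < gridSize; k++) {
            ExecutionContext context = new ExecutionContext();
            context.putInt("startLine", k * linesPerPartition);
            context.putInt("linesToRead", linesPerPartition);
            map.put("partition" + k, context);
        }
        return map;
    }
}

// Step-scoped reader: one instance per partition, positioned on its own range.
@Bean
@StepScope
public FlatFileItemReader<Person> csvPersonReader(
        @Value("#{stepExecutionContext['startLine']}") int startLine,
        @Value("#{stepExecutionContext['linesToRead']}") int linesToRead) {
    return new FlatFileItemReaderBuilder<Person>()
            .name("csvPersonReader")
            .resource(csvResource)
            .delimited()
            .names(new String[]{"firstName", "lastName"})
            .fieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }})
            .linesToSkip(startLine)      // skip the lines belonging to earlier partitions
            .maxItemCount(linesToRead)   // stop after this partition's share
            .build();
}

The slave step (csvToDataBaseStep) would then inject this reader bean (Spring creates a scoped proxy for it) instead of calling the method directly.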
I found two examples where this is explained, one where you read from a DB and one where you read from files:
https://www.mkyong.com/spring-batch/spring-batch-partitioning-example/