Question
I am new to Spring Batch and couldn't figure out how to do this.
Basically I have two Spring Batch configurations, and their jobs have to run independently of each other: when I request execute_job1, the job defined in BatchConfig1 has to execute, and when I request execute_job2, the job defined in BatchConfig2 has to execute. How can I do this?
Controller
@RestController
public class JobExecutionController {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    @RequestMapping("/execute_job1")
    @ResponseBody
    public void executeBatchJob1() {
    }

    @RequestMapping("/execute_job2")
    @ResponseBody
    public void executeBatchJob2() {
    }
}
BatchConfig1
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Step stepOne() {
        return steps.get("stepOne")
                .tasklet(new MyTaskOne())
                .build();
    }

    @Bean
    public Step stepTwo() {
        return steps.get("stepTwo")
                .tasklet(new MyTaskTwo())
                .build();
    }

    @Bean
    public Job demoJob() {
        return jobs.get("exportUserJob1")
                .incrementer(new RunIdIncrementer())
                .start(stepOne())
                .next(stepTwo())
                .build();
    }
}
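MyTaskOne and MyTaskTwo are referenced above but not included in the question; they are assumed to be plain Tasklet implementations along these lines (class names taken from the snippet above, the body is purely illustrative):

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class MyTaskOne implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // whatever work step one is supposed to do goes here
        System.out.println("MyTaskOne running...");
        return RepeatStatus.FINISHED;
    }
}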
BatchConfig2:
@Configuration
@EnableBatchProcessing
public class BatchConfig2 {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Bean
    public JdbcCursorItemReader<User> reader() {
        JdbcCursorItemReader<User> reader = new JdbcCursorItemReader<User>();
        reader.setDataSource(dataSource);
        reader.setSql("SELECT id,name FROM user");
        reader.setRowMapper(new UserRowMapper());
        return reader;
    }

    public class UserRowMapper implements RowMapper<User> {
        @Override
        public User mapRow(ResultSet rs, int rowNum) throws SQLException {
            User user = new User();
            user.setId(rs.getInt("id"));
            user.setName(rs.getString("name"));
            return user;
        }
    }

    @Bean
    public UserItemProcessor processor() {
        return new UserItemProcessor();
    }

    @Bean
    public FlatFileItemWriter<User> writer() {
        FlatFileItemWriter<User> writer = new FlatFileItemWriter<User>();
        writer.setResource(new ClassPathResource("users.csv"));
        writer.setLineAggregator(new DelimitedLineAggregator<User>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<User>() {
                    {
                        setNames(new String[] { "id", "name" });
                    }
                });
            }
        });
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<User, User>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    public Job exportUserJob() {
        return jobBuilderFactory.get("exportUserJob2")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }
}
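Likewise, the User domain class and UserItemProcessor are not shown in the question; UserItemProcessor is assumed to be an ordinary ItemProcessor, for example a simple pass-through like this (illustrative only):

import org.springframework.batch.item.ItemProcessor;

public class UserItemProcessor implements ItemProcessor<User, User> {

    @Override
    public User process(User user) {
        // transform or filter the User here; this sketch just passes it through
        return user;
    }
}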
Answer 1:
It seems like you can make use of multi-threaded steps or parallel steps.
Check out the docs: https://docs.spring.io/spring-batch/4.0.x/reference/html/scalability.html#scalability
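For example, a "parallel steps" job wraps existing steps in flows and runs them on separate threads. A minimal sketch, assuming it is added to the BatchConfig class from the question (job and flow names are illustrative; it needs imports for FlowBuilder, Flow, SimpleFlow and SimpleAsyncTaskExecutor):

@Bean
public Job parallelStepsJob() {
    // wrap the two existing steps in flows and run them concurrently
    Flow flow1 = new FlowBuilder<SimpleFlow>("flow1").start(stepOne()).build();
    Flow flow2 = new FlowBuilder<SimpleFlow>("flow2").start(stepTwo()).build();
    return jobs.get("parallelStepsJob")
            .start(flow1)
            .split(new SimpleAsyncTaskExecutor()) // each flow gets its own thread
            .add(flow2)
            .end()
            .build();
}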
Answer 2:
You can run a job with the JobLauncher. The controller code:
@RestController
public class JobExecutionController {

    // plain SLF4J logger; the original snippet used an undeclared "log" (e.g. Lombok's @Slf4j)
    private static final Logger log = LoggerFactory.getLogger(JobExecutionController.class);

    private final JobLauncher jobLauncher;
    private final Job demoJob;
    private final Job exportUserJob;

    public JobExecutionController(JobLauncher jobLauncher,
                                  @Qualifier("demoJob") Job demoJob,
                                  @Qualifier("exportUserJob") Job exportUserJob) {
        this.jobLauncher = jobLauncher;
        this.demoJob = demoJob;
        this.exportUserJob = exportUserJob;
    }

    @RequestMapping("/execute_job1")
    @ResponseBody
    public void executeBatchJob1() {
        try {
            JobExecution jobExecution = jobLauncher.run(demoJob, generateJobParameter());
            log.info("Job started in thread: " + jobExecution.getJobParameters().getString("JobThread"));
        } catch (JobExecutionAlreadyRunningException | JobRestartException
                | JobParametersInvalidException | JobInstanceAlreadyCompleteException e) {
            log.error("Something went wrong during job execution", e);
        }
    }

    @RequestMapping("/execute_job2")
    @ResponseBody
    public void executeBatchJob2() {
        try {
            JobExecution jobExecution = jobLauncher.run(exportUserJob, generateJobParameter());
            log.info("Job started in thread: " + jobExecution.getJobParameters().getString("JobThread"));
        } catch (JobExecutionAlreadyRunningException | JobRestartException
                | JobParametersInvalidException | JobInstanceAlreadyCompleteException e) {
            log.error("Something went wrong during job execution", e);
        }
    }

    private JobParameters generateJobParameter() {
        Map<String, JobParameter> parameters = new HashMap<>();
        parameters.put("Job start time", new JobParameter(Instant.now().toEpochMilli()));
        parameters.put("JobThread", new JobParameter(Thread.currentThread().getId()));
        return new JobParameters(parameters);
    }
}
To prevent your jobs from being started automatically at application startup, add the following Spring Batch property to application.properties: spring.batch.job.enabled=false
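Note also that the default JobLauncher created by @EnableBatchProcessing runs jobs synchronously, so each REST call blocks until its job finishes. If the two endpoints should return immediately and the jobs should run in parallel, one option (not part of the original answer) is to define an asynchronous launcher in one of the @Configuration classes and inject it into the controller by name, since @EnableBatchProcessing already exposes a synchronous jobLauncher bean. A rough sketch:

@Bean
public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
    SimpleJobLauncher launcher = new SimpleJobLauncher();
    launcher.setJobRepository(jobRepository);
    // hand each launched job off to its own thread instead of the web request thread
    launcher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    launcher.afterPropertiesSet();
    return launcher;
}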
Source: https://stackoverflow.com/questions/56791056/spring-batch-run-multiple-jobs-in-parallel