spring-batch

How to move files to archive and error folders after processing

别来无恙 · submitted on 2021-01-01 06:59:35
Question: The job runs once and, in a single step, tries to process all the files available in a source folder. It then needs to move the processed files, and the files that were tried but failed, from the source folder into subsequent folders (/_archived, /_faild). What is the best way, using Spring Batch, to move successfully processed files into the archive folder and unsuccessful files into the error folder? Answer 1: You can add a separate tasklet, or use the JobExecutionListener.afterJob hook, to move the files. Below is a sample example for
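The move itself can be sketched with plain java.nio.file, independent of Spring. The helper below is a hypothetical illustration (the class name and folder names are assumptions); in a real job you would call it from a Tasklet or from JobExecutionListener.afterJob, choosing the subfolder per file based on whether processing succeeded:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class FileArchiver {

    // Moves a processed file into a sibling subfolder (e.g. "_archived" or
    // "_faild"), creating the target folder on first use.
    // Returns the file's new location.
    public static Path moveTo(Path file, String subfolder) throws IOException {
        Path targetDir = file.getParent().resolve(subfolder);
        Files.createDirectories(targetDir);
        return Files.move(file, targetDir.resolve(file.getFileName()),
                StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Calling moveTo(file, "_archived") after a successful run and moveTo(file, "_faild") after a failure keeps the source folder clean for the next execution.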

Start a spring batch job when already within a transaction

杀马特。学长 韩版系。学妹 · submitted on 2020-12-29 07:14:04
Question: I have a simple Spring service that (among other tasks) starts a Spring Batch job with the following code: @Autowired private JobRegistry jobRegistry; @Autowired private JobLauncher jobLauncher; public void startMyJob() { Job job = jobRegistry.getJob("myJobName"); JobParameters jobParameters = new JobParametersBuilder().toJobParameters(); jobLauncher.run(job, jobParameters); } This works fine as long as no transaction is active when the service method is called. However, with an active
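One common workaround (an assumption here, since the answer is truncated above) is to make sure the launching method runs outside the caller's transaction, because the JobRepository needs to demarcate its own transactions when creating the JobExecution. A sketch:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Service
public class JobStarter {

    @Autowired private JobRegistry jobRegistry;
    @Autowired private JobLauncher jobLauncher;

    // NOT_SUPPORTED suspends any transaction the caller has open for the
    // duration of this method, so the JobRepository is free to manage its own.
    @Transactional(propagation = Propagation.NOT_SUPPORTED)
    public void startMyJob() throws Exception {
        Job job = jobRegistry.getJob("myJobName");
        JobParameters jobParameters = new JobParametersBuilder().toJobParameters();
        jobLauncher.run(job, jobParameters);
    }
}
```

Note that the annotation only takes effect when the method is invoked through the Spring proxy, not via self-invocation from within the same bean; launching asynchronously through a TaskExecutor-backed JobLauncher is another way to escape the caller's transaction.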

How to halt a Spring Batch job from a StepExecutionListener afterStep method?

别等时光非礼了梦想. · submitted on 2020-12-29 04:54:10
Question: I have successfully halted a job from a beforeStep method like this: public class FirstListener implements StepExecutionListener { @Override public void beforeStep(StepExecution stepExecution) { boolean shouldRun = shouldJobRun(); if (!shouldRun) { // listeners will still work, but any other step logic (reader, processor, writer) will not happen stepExecution.setTerminateOnly(); stepExecution.setExitStatus(new ExitStatus("STOPPED", "Job should not be run right now.")); LOGGER.warn(duplicate
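For halting from afterStep rather than beforeStep, one common pattern (a sketch under the assumption that you control the job flow definition) is to return a custom ExitStatus from afterStep and let the flow branch on it:

```java
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

public class HaltAfterStepListener implements StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // nothing to do before the step
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        if (shouldHalt(stepExecution)) {
            // The status returned here overrides the step's own exit status,
            // so the job flow can branch on "HALT".
            return new ExitStatus("HALT", "Stopping the job after this step.");
        }
        return stepExecution.getExitStatus();
    }

    // Hypothetical condition; substitute your own check.
    private boolean shouldHalt(StepExecution stepExecution) {
        return stepExecution.getWriteCount() == 0;
    }
}
```

The job definition would then map the custom status explicitly, e.g. start(step1).on("HALT").end().from(step1).on("*").to(step2), so that "HALT" ends the job while every other status continues.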

How can you restart a failed spring batch job and let it pick up where it left off?

限于喜欢 · submitted on 2020-12-29 03:08:42
Question: According to the Spring Batch documentation, restarting a job is supported out of the box, but I cannot get it to start from where it left off. For example, if my step processed 10 records, it should start processing at record 11 when I restart it. In practice this doesn't happen: it reads from the beginning and reprocesses everything. Does anybody have a Java-config-based configuration of a simple job that reads a delimited file and writes the content to a db table that can be restarted from
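Restart only resumes mid-file if two conditions hold: the reader saves its state (saveState(true), the default, together with a name under which the state is stored), and the job is relaunched with the identical JobParameters so it counts as the same JobInstance. A sketch, assuming Spring Batch 5 style builders and a hypothetical Person bean with firstName/lastName properties:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;
import org.springframework.transaction.PlatformTransactionManager;

public class RestartableJobConfig {

    @Bean
    public FlatFileItemReader<Person> reader() {
        return new FlatFileItemReaderBuilder<Person>()
                .name("personReader")   // required: the key under which state is saved
                .resource(new FileSystemResource("people.csv"))
                .delimited()
                .names("firstName", "lastName")
                .targetType(Person.class)
                .saveState(true)        // default, shown explicitly: persists the read count
                .build();
    }

    @Bean
    public Step step1(JobRepository jobRepository,
                      PlatformTransactionManager txManager,
                      FlatFileItemReader<Person> reader,
                      JdbcBatchItemWriter<Person> writer) {
        return new StepBuilder("step1", jobRepository)
                .<Person, Person>chunk(10, txManager)
                .reader(reader)
                .writer(writer)
                .build();
    }
}
```

A frequent cause of "it starts from the beginning" is adding a fresh unique parameter (e.g. a timestamp) on every launch: that creates a new JobInstance each time, so the saved state is never consulted. Restart the failed execution with the same parameters instead.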

Processing huge data with spring batch partitioning

孤人 · submitted on 2020-12-28 07:53:30
Question: I am implementing a Spring Batch job for processing millions of records in a DB table using a partitioning approach, as follows: fetch the unique partitioning codes from the table in a partitioner and set them in the execution context; then create a chunk step with a reader, processor and writer to process the records for a particular partition code. Is this approach proper, or is there a better approach for a situation like this? Since some partition codes can have more records than others, those with
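The partitioner described in the question can be sketched as follows (class and key names are hypothetical; the codes are assumed to have been fetched up front, e.g. with a JdbcTemplate):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class CodePartitioner implements Partitioner {

    private final List<String> codes; // unique partition codes from the table

    public CodePartitioner(List<String> codes) {
        this.codes = codes;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        // One execution context per code; a step-scoped reader can then pick
        // up its code via #{stepExecutionContext['partitionCode']}.
        Map<String, ExecutionContext> partitions = new HashMap<>();
        int i = 0;
        for (String code : codes) {
            ExecutionContext ctx = new ExecutionContext();
            ctx.putString("partitionCode", code);
            partitions.put("partition" + i++, ctx);
        }
        return partitions;
    }
}
```

Note that gridSize is ignored here because the split is driven by the codes themselves; to mitigate the skew the question mentions, the heavy codes could be split further (e.g. by id range) into several contexts, while the worker thread pool size caps how many partitions actually run concurrently.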
