spring-batch

How to read csv lines chunked by id-column with Spring-Batch?

拟墨画扇 submitted on 2019-11-28 02:13:39
I'm using Spring Batch to read a CSV file, format the content and write it to a database, like:

StepBuilder<T, T> builder = stepBuilderFactory.get("step")
    .<T, T>chunk(100)
    .reader(flatFileItemReader)
    .processor(processor)
    .writer(jpaItemWriter);

The CSV contains an ID column. How can I modify the reader to base the chunks on that ID? Example:

#id, #value
1, first
1000, second
1001, second
1005, second

In this case the first chunk would read only the first line, then commit, and then continue. Is it possible to apply chunking by a value in the file? I did the same using a custom CompletionPolicy and …
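The question is cut off, but one way to drive chunk boundaries from the data (a sketch, not the asker's actual code) is a CompletionPolicy that also acts as the reader: it peeks at the next line and closes the chunk when the ID changes. CsvRow and its getId() accessor are hypothetical stand-ins for whatever type the FlatFileItemReader maps each line to.

import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;
import org.springframework.batch.repeat.RepeatContext;
import org.springframework.batch.repeat.policy.CompletionPolicySupport;

public class IdChangeCompletionPolicy extends CompletionPolicySupport implements ItemReader<CsvRow> {

    private final SingleItemPeekableItemReader<CsvRow> delegate;
    private CsvRow current;

    public IdChangeCompletionPolicy(ItemReader<CsvRow> flatFileItemReader) {
        this.delegate = new SingleItemPeekableItemReader<>();
        this.delegate.setDelegate(flatFileItemReader);
    }

    @Override
    public CsvRow read() throws Exception {
        // remember the item just handed to the chunk so it can be compared with the next one
        current = delegate.read();
        return current;
    }

    @Override
    public boolean isComplete(RepeatContext context) {
        try {
            CsvRow next = delegate.peek();
            // close the chunk when the input is exhausted or the next row carries a different id
            return next == null || current == null || !next.getId().equals(current.getId());
        }
        catch (Exception e) {
            return true; // stop the chunk on a read problem and let the step surface the failure
        }
    }
}

The same instance would then be registered twice on the step, e.g. stepBuilderFactory.get("step").<CsvRow, CsvRow>chunk(policy).reader(policy)…, since the step builder accepts a CompletionPolicy instead of a fixed chunk size; because the policy itself is not an ItemStream, the underlying FlatFileItemReader would also be registered via .stream(...) so it gets opened and closed.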

Make a spring-batch job exit with non-zero code if an exception is thrown

北城以北 submitted on 2019-11-28 02:09:46
Question: I'm trying to fix a spring-batch job which is launched from a shell script. The script then checks the process exit code to determine whether the job has succeeded. Java, however, exits with 0 even if the program ended with an exception, unless System.exit was specifically called with a different code, so the script always reports success. Is there a way to make spring-batch return a non-zero code on failure? To be clear, I'm not talking about the ExitStatus or BatchStatus, but the actual exit code of the process.
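One common approach (a sketch assuming the job is launched from a Spring Boot main class, here a made-up BatchApplication) is to feed Spring Boot's exit-code generators back into System.exit, so a failed job execution becomes a non-zero process exit code:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class BatchApplication {

    public static void main(String[] args) {
        // SpringApplication.exit() consults the registered ExitCodeGenerators
        // (Spring Boot's batch auto-configuration maps a FAILED job execution
        // to a non-zero code) and System.exit() propagates it to the shell.
        System.exit(SpringApplication.exit(SpringApplication.run(BatchApplication.class, args)));
    }
}

When the job is launched through Spring Batch's CommandLineJobRunner instead, that class already translates the batch status into a process exit code via its SystemExiter, so no extra code is needed there.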

Using Spring Batch to execute jobs in parallel

∥☆過路亽.° submitted on 2019-11-28 02:02:47
Question: I have a use case as follows:
1) There is a parent job which has multiple child jobs.
2) All child jobs should be executed in parallel.
3) The parent job should wait until all child jobs are done.
4) Once the child jobs are done, control returns to the master.
5) The master job is completed.
6) If any of the child jobs throws an exception, control should still return to the master job.
Is this possible using Spring Batch? EDIT: I am not looking to execute multiple steps of a job in parallel, but multiple child jobs of …
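One way to get this shape (a sketch assuming Java config and two already-defined child Job beans, here called childJobA and childJobB) is to wrap each child job in a JobStep and run the wrapping steps through a split flow: the parent only completes once both branches have finished, and a failing child marks the parent FAILED instead of aborting it mid-flight.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.builder.FlowBuilder;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.batch.core.job.flow.support.SimpleFlow;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class ParentJobConfig {

    @Bean
    public Job parentJob(JobBuilderFactory jobs, StepBuilderFactory steps,
                         JobLauncher jobLauncher, Job childJobA, Job childJobB) {

        // Wrap each child job in a JobStep so it runs as a step of the parent job.
        Step runChildA = steps.get("runChildA").job(childJobA).launcher(jobLauncher).build();
        Step runChildB = steps.get("runChildB").job(childJobB).launcher(jobLauncher).build();

        Flow flowA = new FlowBuilder<SimpleFlow>("flowA").start(runChildA).build();
        Flow flowB = new FlowBuilder<SimpleFlow>("flowB").start(runChildB).build();

        // The split runs both flows on separate threads; the parent waits for both.
        Flow parallel = new FlowBuilder<SimpleFlow>("parallelChildJobs")
                .split(new SimpleAsyncTaskExecutor())
                .add(flowA, flowB)
                .build();

        return jobs.get("parentJob").start(parallel).end().build();
    }
}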

How to dynamically create FTP adapters in Spring Integration?

喜夏-厌秋 submitted on 2019-11-28 01:37:23
Thanks for your attention. I use Spring Integration in my project, and I want to retrieve input files from multiple FTP servers with different addresses, as shown in the image below. How can I dynamically create inbound adapters to poll and retrieve files from those servers?

Gary Russell: See the dynamic-ftp sample. While it only covers the outbound side, there are links in the README to discussions about what needs to be done on the inbound side (put each adapter in a child context that sends messages to a channel in the main context). Also see my answer to a similar question for multiple IMAP mail adapters …
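The linked sample and answer work by spinning up a child application context per server. As an alternative to that child-context approach, newer Spring Integration versions offer IntegrationFlowContext in the Java DSL for registering flows at runtime; the following is only a sketch (host, directories and channel name are made up), registering one polling FTP inbound adapter per server, all feeding a shared channel in the main context:

import java.io.File;

import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.context.IntegrationFlowContext;
import org.springframework.integration.ftp.dsl.Ftp;
import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;
import org.springframework.stereotype.Component;

@Component
public class DynamicFtpAdapterRegistrar {

    private final IntegrationFlowContext flowContext;

    public DynamicFtpAdapterRegistrar(IntegrationFlowContext flowContext) {
        this.flowContext = flowContext;
    }

    // Call once per FTP server, e.g. while iterating over a list of server configs.
    public void register(String id, String host, int port, String user, String password) {
        DefaultFtpSessionFactory sessionFactory = new DefaultFtpSessionFactory();
        sessionFactory.setHost(host);
        sessionFactory.setPort(port);
        sessionFactory.setUsername(user);
        sessionFactory.setPassword(password);

        IntegrationFlow flow = IntegrationFlows
                .from(Ftp.inboundAdapter(sessionFactory)
                                .remoteDirectory("/inbound")                 // illustrative
                                .localDirectory(new File("target/ftp/" + id)),
                        e -> e.poller(Pollers.fixedDelay(5000)))
                .channel("ftpFilesChannel")   // shared channel in the main context
                .get();

        flowContext.registration(flow).id(id).register();
    }
}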

Spring Batch: how do I return a custom job ExitStatus from a StepListener to decide the next step?

痞子三分冷 submitted on 2019-11-28 01:34:54
Question: The issue is this: I have a Spring Batch job with multiple steps. Based on step one, I have to decide the next steps. Can I set the status in step1's passTasklet based on a job parameter, so that I can set the ExitStatus to a custom value and use it in the job definition file to decide which step to go to next? Example:

<job id="conditionalStepLogicJob">
  <step id="step1">
    <tasklet ref="passTasklet"/>
    <next on="BABY" to="step2a"/>
    <stop on="KID" to="step2b"/>
    <next on="*" to="step3"/>
  </step>
  <step id= …
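A StepExecutionListener attached to step1 can translate a job parameter into a custom ExitStatus that the <next on="..."> transitions then match on. This is only a sketch; the "audience" parameter name is made up, and the BABY/KID values mirror the snippet above:

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;

public class AudienceExitStatusListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // a hypothetical job parameter decides which branch the job should take
        String audience = stepExecution.getJobParameters().getString("audience");
        if ("baby".equalsIgnoreCase(audience)) {
            return new ExitStatus("BABY");
        }
        if ("kid".equalsIgnoreCase(audience)) {
            return new ExitStatus("KID");
        }
        // returning null keeps the step's own exit status, so "*" matches step3
        return null;
    }
}

The listener would be registered on step1 with a <listeners><listener ref="audienceExitStatusListener"/></listeners> element inside the step definition.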

Making an item reader return a list instead of a single object - Spring Batch

大憨熊 submitted on 2019-11-28 01:21:29
Question: How do I make an item reader in Spring Batch deliver a list instead of a single object? I have searched around; some answers say to modify the item reader to return a list of objects and to change the item processor to accept a list as input. How do I code such an item reader?

Answer 1: Take a look at the official Spring Batch documentation for ItemReader:

public interface ItemReader<T> {
    T read() throws Exception, UnexpectedInputException, ParseException;
}

// so it is as easy as public …
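Building on that interface, a reader can be typed as ItemReader<List<T>> and aggregate items from a delegate single-item reader. A sketch (the delegate and batch size are illustrative, and the class is deliberately not named like Spring's own ListItemReader to avoid confusion):

import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.ItemReader;

public class AggregatingItemReader<T> implements ItemReader<List<T>> {

    private final ItemReader<T> delegate;
    private final int batchSize;

    public AggregatingItemReader(ItemReader<T> delegate, int batchSize) {
        this.delegate = delegate;
        this.batchSize = batchSize;
    }

    @Override
    public List<T> read() throws Exception {
        List<T> items = new ArrayList<>();
        for (int i = 0; i < batchSize; i++) {
            T item = delegate.read();
            if (item == null) {          // delegate exhausted
                break;
            }
            items.add(item);
        }
        // returning null signals "no more input" to the step
        return items.isEmpty() ? null : items;
    }
}

The step would then be declared as chunk<List<T>, List<T>> and the processor/writer adjusted to accept lists, as the answer suggests.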

Spring Batch JUnit test for multiple jobs

▼魔方 西西 submitted on 2019-11-28 00:49:36
Question: I have two jobs configured in one context file:

<batch:job id="JobA" restartable="true">
  <batch:step id="abc">
    <batch:tasklet>
      <batch:chunk reader="reader" writer="writer" processor="processor" />
    </batch:tasklet>
  </batch:step>
</batch:job>
<batch:job id="JobB" restartable="true">
  <batch:step id="abc">
    <batch:tasklet>
      <batch:chunk reader="reader" writer="writer" processor="processor" />
    </batch:tasklet>
  </batch:step>
</batch:job>

When I am unit testing JobA using …
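The usual catch here is that JobLauncherTestUtils autowires a single Job, which is ambiguous when two Job beans live in the same context. A sketch of one workaround: build the utility by hand in the test and set the job under test explicitly (the context file location is made up):

import static org.junit.Assert.assertEquals;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:batch-context.xml") // hypothetical location of the file above
public class JobATest {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    @Qualifier("JobA")
    private Job jobA;

    private JobLauncherTestUtils jobLauncherTestUtils;

    @Before
    public void setUp() {
        // Built by hand instead of declared as a bean, so its autowired Job setter
        // never has to choose between JobA and JobB.
        jobLauncherTestUtils = new JobLauncherTestUtils();
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(jobA);
    }

    @Test
    public void jobACompletes() throws Exception {
        JobExecution execution = jobLauncherTestUtils.launchJob();
        assertEquals(BatchStatus.COMPLETED, execution.getStatus());
    }
}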

How to skip blank lines in CSV using FlatFileItemReader and chunks

落爺英雄遲暮 submitted on 2019-11-28 00:15:39
Question: I am processing CSV files using FlatFileItemReader. Sometimes I get blank lines within the input file, and when that happens the whole step stops. I want to skip those lines and proceed normally. I tried to add an exception handler to the step in order to catch the exception instead of having the whole step stop:

@Bean
public Step processSnidUploadedFileStep() {
    return stepBuilderFactory.get("processSnidFileStep")
        .<MyDTO, MyDTO>chunk(numOfProcessingChunksPerFile)
        .reader(snidFileReader …
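Rather than an exception handler on the step, one common fix is a custom RecordSeparatorPolicy on the FlatFileItemReader itself, so blank lines never reach the line tokenizer. A sketch:

import org.springframework.batch.item.file.separator.SimpleRecordSeparatorPolicy;

public class BlankLineRecordSeparatorPolicy extends SimpleRecordSeparatorPolicy {

    @Override
    public boolean isEndOfRecord(String line) {
        // a blank line never terminates a record on its own, so the reader keeps going
        return line.trim().length() != 0 && super.isEndOfRecord(line);
    }

    @Override
    public String postProcess(String record) {
        // drop records that are still blank after assembly
        if (record == null || record.trim().length() == 0) {
            return null;
        }
        return super.postProcess(record);
    }
}

It would be plugged in with snidFileReader.setRecordSeparatorPolicy(new BlankLineRecordSeparatorPolicy()) when the reader bean is built.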

Spring @Transactional and JDBC autoCommit

笑着哭i submitted on 2019-11-27 23:39:49
Question: In my actual application, I have a DBCP connection pool which doesn't have JDBC autoCommit=false set; it seems to use the default autoCommit=true. This is probably a mistake, but I'd like to understand the impact of changing this parameter. I am using:
- Spring with the @Transactional annotation
- Spring Batch with JDBC readers and writers, plus some custom tasklets using JdbcTemplate
I would like to know whether Spring sets autoCommit=false on the current connection if it is in the context of a transaction.
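For reference, Spring's DataSourceTransactionManager switches a connection to autoCommit=false when it begins a transaction and restores the previous setting afterwards, so the pool-level default mainly matters for connections used outside any transaction. A sketch of turning the default off at the pool level (shown with commons-dbcp2 and an illustrative URL and credentials; the original DBCP BasicDataSource exposes the same setter):

import javax.sql.DataSource;

import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;

@Configuration
public class DataSourceConfig {

    @Bean
    public DataSource dataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setUrl("jdbc:postgresql://localhost:5432/app");  // illustrative
        ds.setUsername("app");
        ds.setPassword("secret");
        // connections handed out by the pool start with auto-commit disabled
        ds.setDefaultAutoCommit(false);
        return ds;
    }

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}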

Spring Batch Framework - Auto create Batch Table

故事扮演 submitted on 2019-11-27 23:32:25
I just created a batch job using the Spring Batch framework, but I don't have database privileges to run CREATE SQL. When I try to run the batch job I hit an error while the framework tries to create the BATCH_JOB_INSTANCE table. I tried to disable the initializer:

<jdbc:initialize-database data-source="dataSource" enabled="false">
    ...
</jdbc:initialize-database>

But after that I still hit the error:

org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is java.sql …
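Disabling the initializer only stops Spring from running the DDL; the BATCH_* metadata tables still have to exist, so the usual fix is to have a DBA run the matching schema-*.sql script shipped inside spring-batch-core. If that is not possible, one workaround (a sketch, assuming Java config on Spring Batch 3.x/4.x where DefaultBatchConfigurer exists) is to fall back to the in-memory Map-based job repository by not handing the batch configurer a DataSource, at the cost of losing persisted job metadata and restartability:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class InMemoryBatchConfig extends DefaultBatchConfigurer {

    @Override
    public void setDataSource(DataSource dataSource) {
        // intentionally empty: without a DataSource the configurer creates a
        // Map-based JobRepository, so no BATCH_* tables are required
    }
}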