spring-batch

MultiResourceItemReader with a custom delegate keeps reading the same file

Question: Hello there, Java wizards. I am having a lot of trouble getting into Spring Batch, so straight to the point for now: I need to process all files (XMLs) from a folder and then write them back, with a small addition. The catch is that I want to preserve the input filename. The solution I have for this is a MultiResourceItemReader that delegates to a custom ItemReader, which in turn calls a StaxEventItemReader and returns a custom item holding the unmarshalled XML and the filename.
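
A minimal sketch of that delegate pattern is below; the item type XmlItem and the wiring are illustrative assumptions, not the poster's code. The two details that typically cause "keeps reading the same file" are marked in the comments: the delegate must implement ResourceAwareItemReaderItemStream so MultiResourceItemReader can hand it each new resource, and read() must return null at the end of every file so the outer reader advances.

```java
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.file.ResourceAwareItemReaderItemStream;
import org.springframework.batch.item.xml.StaxEventItemReader;
import org.springframework.core.io.Resource;

// Illustrative item type: the unmarshalled fragment plus the file it came from.
class XmlItem {
    final Object payload;
    final String filename;
    XmlItem(Object payload, String filename) {
        this.payload = payload;
        this.filename = filename;
    }
}

public class FilenameAwareReader implements ResourceAwareItemReaderItemStream<XmlItem> {

    private final StaxEventItemReader<Object> staxReader;
    private Resource currentResource;

    public FilenameAwareReader(StaxEventItemReader<Object> staxReader) {
        this.staxReader = staxReader;
    }

    @Override
    public void setResource(Resource resource) {
        // MultiResourceItemReader calls this once per file; forwarding it to
        // the inner reader is what makes reading move on to the next file.
        this.currentResource = resource;
        staxReader.setResource(resource);
    }

    @Override
    public XmlItem read() throws Exception {
        Object fragment = staxReader.read();
        // Returning null marks end-of-resource; without it the outer reader
        // keeps re-reading the same file.
        return (fragment == null) ? null
                : new XmlItem(fragment, currentResource.getFilename());
    }

    @Override
    public void open(ExecutionContext ctx) { staxReader.open(ctx); }

    @Override
    public void update(ExecutionContext ctx) { staxReader.update(ctx); }

    @Override
    public void close() { staxReader.close(); }
}
```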

How to configure the mentioned use case using Spring Batch parallel step split flow?

Question: I want to implement this use case; I have 3 flows:

    <split id="split1" task-executor="taskExecutor">
        <flow>
            <step id="step1" parent="s1" next="step2"/>
            <step id="step2" parent="s2"/>
        </flow>
        <flow>
            <step id="step3" parent="s3"/>
        </flow>
        <flow>
            <step id="step4" parent="s4"/>
        </flow>
        <flow>
            <step id="step5" parent="s5"/>
        </flow>
    </split>
    <split id="split2" task-executor="taskExecutor">
        <flow>
            <step id="step6" parent="s6"/>
            <step id="step7" parent="s7"/>
        </flow>
        <flow>
            <step id="step8" parent="s8
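
The question is truncated above, but the <split> construct runs its <flow> children in parallel on the given task executor, while steps inside each flow stay sequential. As an illustration only (not the poster's code; the step beans are assumed defined elsewhere), the split1 structure could be expressed in Java config like this:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.job.builder.FlowBuilder;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.context.annotation.Bean;
import org.springframework.core.task.TaskExecutor;

public class SplitConfig {

    @Bean
    public Job job(JobBuilderFactory jobs, TaskExecutor taskExecutor,
                   Step step1, Step step2, Step step3, Step step4, Step step5) {
        // Each <flow> becomes a Flow; steps inside a flow run sequentially.
        Flow f1 = new FlowBuilder<Flow>("f1").start(step1).next(step2).build();
        Flow f2 = new FlowBuilder<Flow>("f2").start(step3).build();
        Flow f3 = new FlowBuilder<Flow>("f3").start(step4).build();
        Flow f4 = new FlowBuilder<Flow>("f4").start(step5).build();

        // The split runs all four flows concurrently on the task executor.
        Flow split1 = new FlowBuilder<Flow>("split1")
                .split(taskExecutor)
                .add(f1, f2, f3, f4)
                .build();

        // A second split would follow the same pattern, chained with .next(...).
        return jobs.get("job").start(split1).end().build();
    }
}
```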

How to avoid duplicate document indexing in Lucene 6.0

Question: I am creating a Lucene index from values fetched from a database. I have set the index OpenMode to OpenMode.CREATE_OR_APPEND. The index creation step is part of a Spring Batch job. My understanding is that when I run the job for the first time, indexing might take a while, but when I rerun the job for the same unchanged source data it should be fast, because the documents are already there and no update or insert has to be performed. But in my case, subsequent indexing attempts for the same unchanged source data gets
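
The question is cut off above, but the usual explanation for this symptom is that OpenMode.CREATE_OR_APPEND only controls how the index is opened: IndexWriter.addDocument() still appends unconditionally, so re-running the job over the same rows duplicates documents. The standard remedy is IndexWriter.updateDocument() keyed on a unique term; a sketch, where the "id" field name is an assumption:

```java
import java.io.IOException;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;

public class UpsertIndexer {

    // Replaces any existing document with the same key instead of appending
    // a duplicate; with addDocument() every rerun would add new copies.
    public void index(IndexWriter writer, String id, Document doc) throws IOException {
        // The key must be indexed untokenized (StringField) so the Term
        // lookup matches the stored value exactly.
        doc.add(new StringField("id", id, Field.Store.YES));
        writer.updateDocument(new Term("id", id), doc);
    }
}
```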

How to insert data into multiple tables through ItemWriter

Question: How do I insert data into multiple tables through an ItemWriter? The ItemWriter gets its input from an ItemReader that selects data from multiple tables, and it should accomplish this in a single step. Can somebody help?

Answer 1: You can use Spring Batch's CompositeItemWriter:

    <chunk reader="myReader" writer="compositeWriter" />

The composite writer, as seen by your step, is no different from any other writer; look at the chunk definition above.

    <bean id="compositeWriter" class="org.springframework.batch.item.support
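
The answer's bean definition is truncated above. As an illustration only, here is a sketch of the same composite-writer idea in Java config; the item type MyItem, the table names, and the SQL are invented for the example:

```java
import java.util.Arrays;

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.support.CompositeItemWriter;
import org.springframework.context.annotation.Bean;

// Invented item type; getters are required for beanMapped() binding.
class MyItem {
    private Long id;
    private String name;
    private String value;
    public Long getId() { return id; }
    public String getName() { return name; }
    public String getValue() { return value; }
}

public class WriterConfig {

    @Bean
    public CompositeItemWriter<MyItem> compositeWriter(DataSource dataSource) throws Exception {
        JdbcBatchItemWriter<MyItem> tableAWriter = new JdbcBatchItemWriterBuilder<MyItem>()
                .dataSource(dataSource)
                .sql("INSERT INTO table_a (id, name) VALUES (:id, :name)")
                .beanMapped()
                .build();

        JdbcBatchItemWriter<MyItem> tableBWriter = new JdbcBatchItemWriterBuilder<MyItem>()
                .dataSource(dataSource)
                .sql("INSERT INTO table_b (id, value) VALUES (:id, :value)")
                .beanMapped()
                .build();

        // The delegates are not beans themselves, so initialize them explicitly.
        tableAWriter.afterPropertiesSet();
        tableBWriter.afterPropertiesSet();

        // Each delegate receives every chunk of items, in the order listed,
        // so one pass over the chunk writes to both tables.
        CompositeItemWriter<MyItem> writer = new CompositeItemWriter<>();
        writer.setDelegates(Arrays.asList(tableAWriter, tableBWriter));
        return writer;
    }
}
```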

Spring Cloud Data Flow with Spring Batch job - scaling considerations

Question: We are currently evaluating a shift from Spring Batch + Batch Admin to a Spring Cloud based infrastructure. Our main challenges/questions:

1. As part of the monolithic design of the Spring Batch jobs, we fetch some general MD and aggregate it into a common data structure that many jobs use in order to run in a more optimized way. Is the nature of SCDF Tasks going to be a problem in our case? Should we reconsider shifting to Streams, and how could that be done?

2. One of the

Spring Batch - Executing multiple instances of a job at same time

Question: I have a clarification: is it possible for us to run multiple instances of a job at the same time? Currently we have a single instance of a job at any given time. If it is possible, please let me know how to do it.

Answer 1: Yes, you can. Spring Batch distinguishes job instances based on their JobParameters, so if you always pass different JobParameters to the same job, you will have multiple instances of the same job running. A simple way is just to add a UUID parameter to each request to start a job. Example
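
The answer's example is cut off above; a minimal sketch of the UUID idea (job and launcher wiring assumed) might look like this:

```java
import java.util.UUID;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class UniqueLauncher {

    // A random UUID makes the JobParameters unique per request, so each
    // launch creates a new JobInstance that can run concurrently.
    public void launch(JobLauncher jobLauncher, Job job) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("runId", UUID.randomUUID().toString())
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}
```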

JobParameters from Spring Batch

Question: I am trying to inject job parameters into a custom ItemReader. I have reviewed all of the Stack Overflow notes on the subject (for example: How to get access to job parameters from ItemReader, in Spring Batch?), and I see this is a common pain point that is mostly unresolved. I am hoping that a Spring guru (@Michael Minella, anyone?) will see this and have some insight. I have got as far as determining that the job parameters are available in about one out of 10 runs, even with no code or configuration
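
For reference, the mechanism that is supposed to make this work is late binding via @StepScope; whether it explains the poster's intermittent behavior is unknown. A sketch, in which MyCustomReader and the parameter name "inputFile" are illustrative:

```java
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;

// Illustrative custom reader that takes the parameter in its constructor.
class MyCustomReader implements ItemReader<String> {
    private final String inputFile;
    MyCustomReader(String inputFile) { this.inputFile = inputFile; }
    @Override
    public String read() {
        return null; // real reading logic goes here; null ends the step
    }
}

public class ReaderConfig {

    // @StepScope defers bean creation until the step actually runs, which
    // is the point at which the jobParameters SpEL context exists.
    @Bean
    @StepScope
    public ItemReader<String> reader(
            @Value("#{jobParameters['inputFile']}") String inputFile) {
        return new MyCustomReader(inputFile);
    }
}
```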

How can I resolve this SQLTransactionRollbackException with Hsqldb in Spring Batch?

Question: I'm working on a Spring Batch application that needs to execute jobs periodically. Here's a fragment of my configuration file that sets up the in-memory (HSQLDB) database used for transaction handling.

    @Bean
    public SimpleJobLauncher simpleJobLauncher() {
        SimpleJobLauncher jl = new SimpleJobLauncher();
        try {
            jl.setJobRepository(jobRepository());
        } catch (Exception e) {
            System.err.println("Failed to create job repository");
        }
        return jl;
    }

    @Bean
    public JobRepositoryFactoryBean
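
The question is truncated, so the actual trigger is unknown; one frequently reported source of SQLTransactionRollbackException with HSQLDB is the SERIALIZABLE isolation level that JobRepositoryFactoryBean applies by default when creating job executions. A sketch of the commonly suggested adjustment (an assumption, not a confirmed fix for this post):

```java
import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.transaction.PlatformTransactionManager;

public class RepositoryConfig {

    @Bean
    public JobRepository jobRepository(DataSource dataSource,
            PlatformTransactionManager transactionManager) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        // Default is ISOLATION_SERIALIZABLE; HSQLDB may abort such
        // transactions when several launches touch the repository at once.
        factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```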

Spring Batch accessing job parameter inside step

Question: I have the following Spring Batch job config:

    @Configuration
    @EnableBatchProcessing
    public class JobConfig {

        @Autowired
        private JobBuilderFactory jobBuilderFactory;

        @Autowired
        private StepBuilderFactory stepBuilderFactory;

        @Bean
        public Job job() {
            return jobBuilderFactory.get("job")
                    .flow(stepA()).on("FAILED").to(stepC())
                    .from(stepA()).on("*").to(stepB()).next(stepC())
                    .end().build();
        }

        @Bean
        public Step stepA() {
            return stepBuilderFactory.get("stepA").tasklet(new RandomFailTasket("stepA"))
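
On the title question, accessing a job parameter inside a step: one common approach is reading it from the ChunkContext in a tasklet. A sketch, with the parameter name "myParam" invented for illustration:

```java
import java.util.Map;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;

public class StepConfig {

    @Bean
    public Step stepWithParam(StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("stepWithParam")
                .tasklet((contribution, chunkContext) -> {
                    // Job parameters are exposed on the step context at runtime.
                    Map<String, Object> params =
                            chunkContext.getStepContext().getJobParameters();
                    String myParam = (String) params.get("myParam");
                    System.out.println("myParam = " + myParam);
                    return RepeatStatus.FINISHED;
                })
                .build();
    }
}
```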

BeanDefinitionParsingException: Configuration: The element [step2] is unreachable

Question: I had a Spring Batch job similar to this one:

    <batch:job id="job">
        <batch:step id="step1">
        ...
        </batch:step>
        <batch:step id="step2">
        ...
        </batch:step>
    </batch:job>

and when I tried to execute the job I got: BeanDefinitionParsingException: Configuration problem: The element [step2] is unreachable

Answer 1: The problem is that the next attribute is missing from step1:

    <batch:step id="step1" next="step2">

Source: https://stackoverflow.com/questions/20289814/beandefinitionparsingexception-configuration-the
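
Putting the answer's fix back into the question's job definition gives (ellipses as in the original):

```xml
<batch:job id="job">
    <!-- Without next="step2", nothing transitions to step2, so the parser
         flags it as unreachable. -->
    <batch:step id="step1" next="step2">
        ...
    </batch:step>
    <batch:step id="step2">
        ...
    </batch:step>
</batch:job>
```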