spring-batch

How to send a custom object as a Job Parameter in Spring Batch?

跟風遠走 submitted on 2019-12-09 13:07:33
Question: I have a requirement to send a custom object to a Spring Batch job, where this object is used continuously by the ItemProcessor for the business requirement. How can we send a custom object from outside into the job context? This object changes from job to job and is generated at runtime depending on the business case. How can I send this as a job parameter? Or is there any way that I can set this object on the respective job? Can overriding Spring's JobParameter help me in any way? Or are …
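JobParameters only accept String, Long, Double and Date values, so a custom object cannot travel through them directly. One common workaround (a minimal sketch — the `PayloadRegistry` class and the key name are hypothetical, not part of Spring Batch) is to register the runtime object under a String key before launching the job, pass only that key as a job parameter, and have a step-scoped processor resolve the object from the registry:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical holder: the launcher registers the runtime object under a key,
// and only the String key travels through JobParameters.
public class PayloadRegistry {
    private static final Map<String, Object> PAYLOADS = new ConcurrentHashMap<>();

    public static void register(String key, Object payload) {
        PAYLOADS.put(key, payload);
    }

    // Called from a step-scoped ItemProcessor with the injected key,
    // e.g. @Value("#{jobParameters['payloadKey']}") String key
    public static Object resolve(String key) {
        return PAYLOADS.get(key);
    }

    // Clean up (e.g. in a JobExecutionListener's afterJob) to avoid leaks.
    public static void release(String key) {
        PAYLOADS.remove(key);
    }
}
```

The launcher side would then pass only the key, e.g. `new JobParametersBuilder().addString("payloadKey", key).toJobParameters()`; an alternative, if the object is small and serializable, is to serialize it to a JSON String parameter and deserialize it in the processor.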

Not running DDL scripts with Spring Batch + Spring Boot + SQL Server application

╄→гoц情女王★ submitted on 2019-12-09 04:04:27
My project has a requirement where the user uploads a CSV file which has to be pushed to a SQL Server database. I am following the basic example below to load a CSV file into a SQL Server database: https://github.com/michaelcgood/Spring-Batch-CSV-Example. I am running this repo with a changed datasource: here we use SQL Server instead of an in-memory DB. Here is the addition to the POM file: <dependency> <groupId>com.microsoft.sqlserver</groupId> <artifactId>sqljdbc4</artifactId> <version>4.0</version> </dependency> Additions to the application.properties file: spring.datasource.url=jdbc:sqlserver:/ …
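With an embedded database, Spring Boot creates the Spring Batch metadata tables (BATCH_JOB_INSTANCE and friends) automatically, but against an external SQL Server it will not run the DDL unless asked. A sketch of the relevant property (key names per Spring Boot's documented `spring.batch.*` settings; the exact key depends on your Boot version):

```properties
# Spring Boot up to 2.4: create the BATCH_* metadata tables on startup
spring.batch.initialize-schema=always

# Spring Boot 2.5+ renamed the key:
# spring.batch.jdbc.initialize-schema=always
```

Alternatively, the vendor-specific DDL scripts shipped inside spring-batch-core (e.g. `schema-sqlserver.sql`) can be run against the database by hand.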

How to terminate Step within a Spring Batch Split Flow with a Decider

僤鯓⒐⒋嵵緔 submitted on 2019-12-09 03:43:48
Question: I've happened upon the following design defect in Spring Batch. A Step must have a next attribute unless it is the last Step, or the last Step of a split flow. A decision block must handle all cases returned by the Decider. Because of this, in a split flow where the final Step would not have a next attribute, if there is a Decider guarding it, then it must have a next attribute. So it shouldn't have that attribute, but it also needs it: catch-22. Example: <!-- Process parallel steps --> <split id= …
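One workaround for the catch-22 is to make sure the decider never guards the literal last element of the flow: route every terminating outcome to a do-nothing step, which, being the last step of the flow, legitimately needs no next attribute. A sketch in the same XML style (the step, decider and bean names are made up for illustration):

```xml
<split id="parallelWork" next="afterSplit">
    <flow>
        <step id="stepA" next="decideA"/>
        <decision id="decideA" decider="myDecider">
            <next on="CONTINUE" to="realStep"/>
            <!-- every other decider outcome falls through to a no-op step -->
            <next on="*" to="noopStep"/>
        </decision>
        <step id="realStep" next="noopStep"/>
        <!-- last step of the flow: no next attribute required -->
        <step id="noopStep">
            <tasklet ref="noopTasklet"/>
        </step>
    </flow>
</split>
```

The cost is one extra no-op tasklet bean, but it satisfies both rules: the decider handles all outcomes, and only the genuine last step omits next.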

Moving processed files on remote (S)FTP using Java DSL

我们两清 submitted on 2019-12-09 01:06:16
Question: I'm trying to move files on a remote SFTP server once the batch has successfully processed them, using Spring Integration and the Java DSL. What would be the best way to achieve that? Adding a step to the batch to move the remote files? Or using the FTP outbound gateway and providing the MV command? I tend to prefer the second solution and let the batch focus on the logic only, but I'm having a hard time implementing it with the Java DSL. I've read http://docs.spring.io/spring-integration/reference/html/ftp.html#ftp …
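For the gateway route, Spring Integration's Java DSL exposes the MV command through `Sftp.outboundGateway`. A configuration sketch, assuming the message payload carries the remote source path as a String (the channel name, bean wiring and the target-directory expression are illustrative assumptions, untested against a real server):

```java
@Bean
public IntegrationFlow moveProcessedFiles(SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory) {
    return IntegrationFlows.from("moveFileChannel")
            // MV renames (moves) the remote file; the third argument is a SpEL
            // expression yielding the source path on the remote server.
            .handle(Sftp.outboundGateway(sftpSessionFactory,
                        AbstractRemoteFileOutboundGateway.Command.MV,
                        "payload")
                    // target path for the move, also a SpEL expression
                    .renameExpression("'/processed/' + payload.substring(payload.lastIndexOf('/') + 1)"))
            .get();
}
```

The batch can then stay focused on the processing logic and simply send the remote path to `moveFileChannel` (e.g. via a `@MessagingGateway` or from a step listener) when a file completes.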

Unexpected behavior in Spring partition when using synchronized

a 夏天 submitted on 2019-12-09 00:28:58
Question: I am using Spring Batch and partitioning to do parallel processing, with Hibernate and Spring Data JPA for the DB. For the partitioned step, the reader, processor and writer have step scope, so I can inject the partition key and range (from–to) into them. Now in the processor I have one synchronized method and expected this method to run only once at a time, but that is not the case. I set it to have 10 partitions, and all 10 item readers read the right partitioned range. The problem comes with the item processor. The code below has …
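The likely explanation is that with step scope, each of the 10 partitions gets its own processor instance, and a `synchronized` instance method locks only `this` — so the ten copies never block each other. A plain-Java sketch of the difference (class and method names are illustrative): synchronizing on a shared static monitor serializes the critical section across instances, which is what a step-scoped partitioned processor needs.

```java
public class PartitionedCounter {
    // One monitor shared by every instance of the class.
    private static final Object SHARED_LOCK = new Object();
    private static long total = 0;

    // Locks only this instance: ten step-scoped copies can run it concurrently,
    // so the unguarded increment below is a data race across partitions.
    public synchronized void unsafeAcrossInstances() {
        total++;
    }

    // Locks the class-wide monitor: serialized even across distinct instances.
    public void safeAcrossInstances() {
        synchronized (SHARED_LOCK) {
            total++;
        }
    }

    public static long total() {
        synchronized (SHARED_LOCK) {
            return total;
        }
    }

    public static void reset() {
        synchronized (SHARED_LOCK) {
            total = 0;
        }
    }
}
```

Alternatives with the same effect: make the lock an injected singleton bean, or declare the guarded logic in a singleton-scoped collaborator that the step-scoped processors all delegate to.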

Spring Batch Item Reader is executing only once

喜欢而已 submitted on 2019-12-09 00:15:07
Question: Trying to implement Spring Batch, but facing a strange problem: our ItemReader class is executing only once. Here are the details. If we have 1000 rows in the DB, our item reader fetches 1000 rows from the DB and passes the list to the ItemWriter. The ItemWriter successfully deletes all items. Now the ItemReader again tries to fetch the data from the DB, but does not find any, hence returns NULL, so execution stops. But we have configured the batch to be executed with the Quartz scheduler, which is every minute. Now if we insert, let's say, …
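Within one execution, a reader returning null is the normal end-of-data signal, so the step completing is expected. The part that usually trips people up with a Quartz trigger is that a JobInstance is identified by the job name plus its identifying JobParameters: relaunching with the exact same parameters refers to an already-completed instance and will not run again. A common fix (a sketch — `jobLauncher` and `job` are assumed to be injected Spring beans) is to add a unique parameter per trigger fire:

```java
// Inside the Quartz-triggered launch method:
JobParameters params = new JobParametersBuilder()
        .addDate("runTime", new Date())   // unique value per firing -> new JobInstance
        .toJobParameters();
jobLauncher.run(job, params);
```

With a fresh JobInstance every minute, the reader is re-created (or re-opened) and queries the table again, picking up any rows inserted since the previous run.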

Spring batch pause/resume vs stop/restart

三世轮回 submitted on 2019-12-08 21:05:39
I am new to Spring Batch and have some questions regarding pause/resume. After reading the Spring Batch documentation, there don't seem to be any built-in pause or resume functions. However, there is this use case I found on the main site: http://docs.spring.io/spring-batch/2.0.x/cases/pause.html There are no sample codes provided; is there any place I could find these samples? In Spring Batch, I understand there are built-in stop and restart functions. Could I use these as a form of pause and resume? Or is there a better way of doing it? Stop/restart is essentially pause and resume.
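In practice stop/restart behaves much like pause/resume: a stop request lets the in-flight chunk finish and persists the step's ExecutionContext, and restarting the same job picks up from the last committed position. A sketch using JobOperator (the `executionId` would come from your own bookkeeping, e.g. `jobOperator.getRunningExecutions(jobName)`):

```java
// jobOperator is an injected org.springframework.batch.core.launch.JobOperator.
// "Pause": graceful stop, honored at the next chunk boundary.
jobOperator.stop(executionId);

// "Resume": restart the stopped execution; a restartable job continues
// from the last committed chunk rather than from the beginning.
jobOperator.restart(executionId);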

Spring Batch SkipListener not called when exception occurs in reader

ε祈祈猫儿з submitted on 2019-12-08 21:04:04
Question: This is my step configuration. My skip listener's onSkipInWrite() method is called properly, but onSkipInRead() is not getting called. I found this by deliberately throwing a NullPointerException from my reader. <step id="callService" next="writeUsersAndResources"> <tasklet allow-start-if-complete="true"> <chunk reader="Reader" writer="Writer" commit-interval="10" skip-limit="10"> <skippable-exception-classes> <include class="java.lang.Exception" /> </skippable-exception-classes> </chunk> …
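Two things are worth checking here. First, skip callbacks are not invoked at the moment the exception is thrown: Spring Batch defers them to the end of the chunk, just before the transaction commits, so an earlier failure can make onSkipInRead look uncalled. Second, the listener must be registered on the tasklet, not merely exist as a bean. A sketch extending the configuration above (the `skipListener` bean name is an assumption):

```xml
<step id="callService" next="writeUsersAndResources">
    <tasklet allow-start-if-complete="true">
        <chunk reader="Reader" writer="Writer" commit-interval="10" skip-limit="10">
            <skippable-exception-classes>
                <include class="java.lang.Exception" />
            </skippable-exception-classes>
        </chunk>
        <listeners>
            <listener ref="skipListener" />
        </listeners>
    </tasklet>
</step>
```

It can also help to implement `SkipListener<T, S>` with the correct generic types (or extend `SkipListenerSupport`), since a mismatched listener signature can silently miss callbacks.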

Does the Spring Batch job repository with a resourceless transaction manager keep state in memory forever?

梦想的初衷 submitted on 2019-12-08 20:44:26
Running latest Spring 4.1.0 and Spring Batch 3.0.1. Using <bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean"> <property name="transactionManager" ref="transactionManager" /> </bean> <bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/> I have a job that executes every few seconds; it's a very basic ETL job: it checks if the DB has some data, transforms it and pushes it to another system. If there is nothing to be done, it tries again on the next run. I have noticed that memory consumption keeps going up on …
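The short answer is yes: the Map-based job repository holds every JobInstance, JobExecution and StepExecution in in-memory maps and never evicts them, so a job firing every few seconds accumulates metadata indefinitely. Two usual remedies: switch to a JDBC-backed repository (even over an embedded H2/HSQL database), or periodically wipe the maps via the factory bean's clear() method. A sketch of the latter (the scheduling period and the injected `repositoryFactory` reference are assumptions):

```java
// repositoryFactory is the MapJobRepositoryFactoryBean declared above,
// injected by reference into a housekeeping bean.
@Scheduled(fixedDelay = 3_600_000)
public void purgeBatchMetadata() {
    // Drops every JobInstance/JobExecution held in the in-memory maps.
    // Only safe to call while no job execution is in flight.
    repositoryFactory.clear();
}
```

Note that clearing also discards restart metadata, which is usually acceptable for a fire-and-forget ETL like the one described.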
