spring-batch

Multiple Spring Batch jobs executing concurrently causing deadlocks in the Spring Batch metadata tables

Posted by 大城市里の小女人 on 2020-01-10 19:32:09

Question: We have multiple Spring Batch jobs, each running in its own Java instance using the CommandLineJobRunner. All of the jobs are started simultaneously, only read/write flat files, and update the same Spring Batch metadata hosted in SQL Server. The only database involved is the Spring Batch metadata database. When the multiple jobs are started simultaneously we get SQL deadlock exceptions. A more detailed stack trace can be found below. From the database perspective we can see that the deadlock…
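By default the Spring Batch JobRepository creates job executions under SERIALIZABLE isolation, which is a common source of deadlocks when several JVMs launch jobs against the same metadata schema at once. A frequently used mitigation is to relax the isolation level used for the create operations; a sketch, assuming `dataSource` and `transactionManager` bean names:

```xml
<!-- Sketch: JobRepositoryFactoryBean defaults to ISOLATION_SERIALIZABLE for
     job creation; relaxing it is a common fix for concurrent launches.
     The dataSource/transactionManager bean names are assumptions. -->
<bean id="jobRepository"
      class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <property name="transactionManager" ref="transactionManager"/>
    <property name="isolationLevelForCreate" value="ISOLATION_READ_COMMITTED"/>
</bean>
```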

How to use Classifier with ClassifierCompositeItemWriter?

Posted by 我只是一个虾纸丫 on 2020-01-10 06:08:08

Question: I am having trouble implementing a ClassifierCompositeItemWriter… I am reading a basic CSV file and I want to write the records to a database. Depending on the data (Name + Name1), I either write with a simple ItemWriter or use a composite ItemWriter (that writes to two different tables)… This is my ClassifierCompositeItemWriter (see the error message below): public ClassifierCompositeItemWriter<MyObject> classifierCompositeItemWriter() { ClassifierCompositeItemWriter<MyObject> writer = new…
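For reference, the classifier can usually be supplied as a lambda. A minimal sketch, assuming hypothetical `simpleWriter()` and `compositeWriter()` beans and that `MyObject` exposes `getName()`/`getName1()`:

```java
@Bean
public ClassifierCompositeItemWriter<MyObject> classifierCompositeItemWriter() {
    ClassifierCompositeItemWriter<MyObject> writer = new ClassifierCompositeItemWriter<>();
    // Route each item: matching names go to the simple writer, everything
    // else to the composite writer targeting two tables. Both delegate
    // writer beans are assumptions, not from the question.
    writer.setClassifier((Classifier<MyObject, ItemWriter<? super MyObject>>) item ->
            item.getName().equals(item.getName1()) ? simpleWriter() : compositeWriter());
    return writer;
}
```

Note that delegates used through a classifier are not opened automatically; if a delegate is an ItemStream (e.g. a file writer), it must also be registered as a stream on the step.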

Access @JobScope bean in spring batch with partitioned step

Posted by 回眸只為那壹抹淺笑 on 2020-01-10 04:23:45

Question: Is there a way to access a bean defined as @JobScope in a partitioned step? We defined an HTTP client bean as @JobScope since it is unique per job but created dynamically, and we need it in the slave steps to issue POST requests. When we autowire everything we get: Error creating bean with name 'scopedTarget.captureErpStockTasklet': Scope 'step' is not active for the current thread; consider defining a scoped proxy for this bean if you intend to refer to it from a singleton; nested exception is…
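One commonly cited workaround is to avoid resolving a @JobScope proxy on the worker threads at all: either register the JobExecution on each worker thread via JobSynchronizationManager, or declare the bean as @StepScope, which is active on partition worker threads. A sketch of the latter, where the bean name, wrapper type, and execution-context key are all assumptions:

```java
@Bean
@StepScope  // step scope is active on partition worker threads
public HttpClientWrapper erpHttpClient(
        @Value("#{jobExecutionContext['erpBaseUrl']}") String baseUrl) {
    // HttpClientWrapper is a hypothetical stand-in for the dynamically
    // created HTTP client described in the question.
    return new HttpClientWrapper(baseUrl);
}
```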

Location and UseCase for AggregateItemReader

Posted by [亡魂溺海] on 2020-01-07 05:45:06

Question: The appendix here lists a reader, AggregateItemReader, but I am not able to find it in any of the Spring Batch jar files. I am using Spring Batch with Spring Boot, version 3.0.7. The API doc indicates it is in the samples package. Currently I am using a JdbcPagingItemReader in my project, but I want the reader to return a List or Collection of objects to the processor instead of a single object. This is needed so I can process those objects in bulk instead of one by one. The pagination requirement is…
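AggregateItemReader ships only with the spring-batch-samples module, not with the core jars, so it is often simpler to write a small delegating reader. The aggregation it performs can be sketched in plain Java; here the delegate's `ItemReader.read()` contract (return null at end of input) is modeled with a `Supplier` so the sketch stays dependency-free:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Sketch of the aggregation AggregateItemReader performs: wrap a per-item
// read() and hand the processor fixed-size lists instead of single items.
public class ListChunker<T> {
    private final Supplier<T> delegate; // returns null when input is exhausted
    private final int chunkSize;

    public ListChunker(Supplier<T> delegate, int chunkSize) {
        this.delegate = delegate;
        this.chunkSize = chunkSize;
    }

    /** Returns up to chunkSize items, or null once the delegate is exhausted. */
    public List<T> read() {
        List<T> items = new ArrayList<>();
        T item;
        while (items.size() < chunkSize && (item = delegate.get()) != null) {
            items.add(item);
        }
        return items.isEmpty() ? null : items;
    }
}
```

The same pattern works as an `ItemReader<List<T>>` wrapping the existing JdbcPagingItemReader, with the step's item type becoming `List<T>`.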

Invoking Stored Procedure using Spring JdbcBatchItemWriter

Posted by 倖福魔咒の on 2020-01-06 12:53:09

Question: I would like to execute a stored procedure using Spring's JdbcBatchItemWriter. My current code looks like: <bean id="xyzWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter"> ...... <property name="sql" value="update abc where x=:paramX" /> ...... </bean> I would like to replace this update SQL query with a stored procedure call, and I would like to handle it in the XML file itself. Any help is really appreciated. Thanks.

Answer 1: Did you try running the stored procedure through the JdbcBatchItemWriter?
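Whether this works depends on the JDBC driver, since JdbcBatchItemWriter simply hands the configured SQL to a batched named-parameter update; on SQL Server an EXEC statement is commonly used. A sketch, where the procedure name and parameter are placeholders:

```xml
<bean id="xyzWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="dataSource" ref="dataSource"/>
    <!-- The writer executes whatever statement it is given, so a procedure
         call can replace the update; my_stored_proc is hypothetical. -->
    <property name="sql" value="EXEC my_stored_proc :paramX"/>
    <property name="itemSqlParameterSourceProvider">
        <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider"/>
    </property>
</bean>
```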

Access job parameter in ItemReader (spring-batch using grails)

Posted by 孤者浪人 on 2020-01-06 07:53:33

Question: I am launching a job from my service and the code looks like: def jobParameters = new JobParametersBuilder() .addDate('toDate',toDate) .addDate('fromDate',fromDate) def jobEx = jobLauncher.run(billProcessJob,jobParameters.toJobParameters()) It executes successfully, but I need to access the above job parameters in my ItemReader. My ItemReader looks like: class MyDomainMapper implements FieldSetMapper { def mapLine(FieldSet fs) { if(!fs) { return null } log.debug('Record:'+fs); // printing the…
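Rather than reading parameters inside mapLine, the usual approach is late binding: declare the mapper (or the reader) as a step-scoped bean and inject the parameters with SpEL. A Java-config sketch, where the setters on MyDomainMapper are assumptions:

```java
@Bean
@StepScope
public MyDomainMapper myDomainMapper(
        @Value("#{jobParameters['fromDate']}") Date fromDate,
        @Value("#{jobParameters['toDate']}") Date toDate) {
    // Spring resolves the SpEL expressions against the running job's
    // parameters because the bean is step-scoped.
    MyDomainMapper mapper = new MyDomainMapper();
    mapper.setFromDate(fromDate);
    mapper.setToDate(toDate);
    return mapper;
}
```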

Reading CSV file with Spring batch and map to Domain objects based on the the first field then inserting in DB tables

Posted by 北城以北 on 2020-01-06 06:18:45

Question: I can read and insert into the DB if the CSV file has only lines starting with ORD, but if the file is mixed with different row types, as shown below, I can do the mapping; the problem is the type of object I use for the chunk, as it takes only one class. I want to map each row and then insert it into the DB based on the first three characters: if the row starts with ORD, insert into the Order table; if it starts with DET, insert into the Detail table. My code below shows…
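This routing is what PatternMatchingCompositeLineMapper is for: it selects a tokenizer and a FieldSetMapper by line prefix, and the chunk's item type can then be a common supertype (or Object). A sketch, where the tokenizer and mapper helper methods are assumptions:

```java
@Bean
public PatternMatchingCompositeLineMapper<Object> lineMapper() {
    PatternMatchingCompositeLineMapper<Object> mapper = new PatternMatchingCompositeLineMapper<>();

    // ORD* and DET* lines get their own tokenizers and field-set mappers;
    // the helper methods below are hypothetical, not from the question.
    Map<String, LineTokenizer> tokenizers = new HashMap<>();
    tokenizers.put("ORD*", orderTokenizer());
    tokenizers.put("DET*", detailTokenizer());
    mapper.setTokenizers(tokenizers);

    Map<String, FieldSetMapper<Object>> mappers = new HashMap<>();
    mappers.put("ORD*", orderFieldSetMapper());
    mappers.put("DET*", detailFieldSetMapper());
    mapper.setFieldSetMappers(mappers);
    return mapper;
}
```

A ClassifierCompositeItemWriter (or an instanceof check in the writer) can then send Order and Detail items to their respective tables.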

Spring batch rollbacks the inserts

Posted by 混江龙づ霸主 on 2020-01-06 05:43:05

Question: I configured the following in my project: <batch:no-rollback-exception-classes> <batch:include class="java.sql.SQLException"/> <batch:include class="org.springframework.dao.DuplicateKeyException"/> <batch:include class="java.sql.SQLIntegrityConstraintViolationException"/> </batch:no-rollback-exception-classes> While loading the file I have duplicate records, but since org.springframework.dao.DuplicateKeyException is configured under no-rollback-exception-classes, Spring Batch should not…
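For context, no-rollback-exception-classes is normally paired with skippable-exception-classes; on its own it does not make the step ignore the failed item. A sketch of the combined chunk configuration, where the reader/writer bean names and limits are placeholders:

```xml
<batch:chunk reader="reader" writer="writer" commit-interval="10" skip-limit="100">
    <batch:skippable-exception-classes>
        <batch:include class="org.springframework.dao.DuplicateKeyException"/>
    </batch:skippable-exception-classes>
    <batch:no-rollback-exception-classes>
        <batch:include class="org.springframework.dao.DuplicateKeyException"/>
    </batch:no-rollback-exception-classes>
</batch:chunk>
```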

Apache POI 4.0.1 super slow getting started … 15 minutes or more. What is wrong?

Posted by 别来无恙 on 2020-01-06 05:33:07

Question: It takes 15 minutes or more for POI to initialize its first workbook on Java 8 on Windows 10, in a Tomcat 8 instance. Based on interrupting the process in the debugger and looking at the stack, it is spending the time in the classloader, driven by XMLBeans. Edit: This has the feel of a classloader issue because when I implemented the workaround for the POI library (below), other classes started exhibiting the same issue. Edit: The stack trace looks most similar to this bug: https://bugs.java.com…

Dataflow Tasks are not working with Spring Batch

Posted by 前提是你 on 2020-01-06 04:42:16

Question: I have a Spring Batch job that is also a Data Flow task. When I run the job everything seems OK; in Tasks > Executions I can see that the task finished successfully. On the other hand, when I go to the Jobs tab I get this error (on the command line): java.lang.NullPointerException: null at org.springframework.cloud.dataflow.server.service.impl.DefaultTaskJobService.getTaskJobExecution(DefaultTaskJobService.java:240) ~[spring-cloud-dataflow-server-core-1.2.2.RELEASE.jar!/:1.2.2.RELEASE] at org…