spring-batch

How to process multiple lines with Spring Batch?

☆樱花仙子☆ submitted on 2019-12-24 07:05:58
Question: My data is like this:

UserId,UserData
1,data11
1,data12
2,data21
3,data31

The question is how I can make the Spring Batch ItemReader read multiple lines and map them to an object like Map<userId, List<userData>>. Thanks.

Answer 1: Steps to follow: create a custom ItemWriter class by implementing ItemWriter, where we implement the logic to store the User objects in a Map<String, List<String>>, because a single user can have multiple associated data rows. package com.spring.batch.domain; import java.util…
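A minimal sketch of that aggregating writer, assuming the List-based ItemWriter signature from Spring Batch 4 and a simple User domain class with userId and userData properties (the class and method names below are illustrative, not taken verbatim from the original answer):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.batch.item.ItemWriter;

// Collects every chunk of User items into one map keyed by user id, so a user
// that appears on several input lines ends up with a list of all its data values.
public class UserMapItemWriter implements ItemWriter<User> {

    private final Map<String, List<String>> userData = new HashMap<>();

    @Override
    public void write(List<? extends User> items) throws Exception {
        for (User user : items) {
            userData.computeIfAbsent(user.getUserId(), id -> new ArrayList<>())
                    .add(user.getUserData());
        }
    }

    public Map<String, List<String>> getUserData() {
        return userData;
    }
}

// Minimal domain class assumed from the UserId,UserData columns in the question.
class User {
    private String userId;
    private String userData;
    public String getUserId() { return userId; }
    public void setUserId(String userId) { this.userId = userId; }
    public String getUserData() { return userData; }
    public void setUserData(String userData) { this.userData = userData; }
}

With this approach the reader can stay a plain line-by-line FlatFileItemReader; the grouping into Map<userId, List<userData>> happens entirely in the writer.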

Spring Batch output: force CR-LF as line separator

倖福魔咒の submitted on 2019-12-24 03:53:28
Question: I am trying to force my Spring Batch job to always write files with CR-LF [EDIT] as the line separator, irrespective of the underlying system. I was trying to use setLineSeparator of FlatFileItemWriter:

<bean id="myFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
    <property name="lineSeparator">
        <value>\r\n</value>
    </property>
</bean>

But it always generates the file with "\r\n" as a literal string. I am not sure how to unescape this. I looked at the source code of FlatFileItemWriter; there…
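The \r\n inside the XML <value> element is passed through as two literal characters; XML does no backslash unescaping. Two common ways around it are the XML character references &#13;&#10; in the value, or setting the separator from Java, where the compiler interprets the escape. A minimal Java-config sketch (the output path and pass-through aggregator are placeholders):

import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class CrLfWriterConfig {

    @Bean
    public FlatFileItemWriter<String> myFileWriter() {
        FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource("output/out.txt")); // hypothetical output location
        writer.setLineAggregator(new PassThroughLineAggregator<>());
        // "\r\n" in Java source is a real carriage return + line feed,
        // unlike the literal backslash sequence taken from the XML value.
        writer.setLineSeparator("\r\n");
        return writer;
    }
}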

JdbcPagingItemReader in Spring Batch is not giving correct results

狂风中的少年 submitted on 2019-12-24 03:43:34
Question: I am facing an issue where the records returned by the query and the paging configuration I made give an incorrect number of records. Is the pagination config incorrect? Pagination returns fewer records. Query equivalent of the paging config:

select * from SOME_TABLE where CLIENT_FILE_NM = 'process_abc.20150617024850' AND TXN_ID IS NOT NULL AND SOME_DATA IS NOT NULL order by CREATE_DT ASC;

Paging config:

<bean id="postItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
    <property…
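A frequent cause of missing rows with JdbcPagingItemReader is a non-unique sort key: each page is fetched with a "sort key greater than the last value seen" condition, so rows that share the same CREATE_DT as the last row of a page can be silently skipped. A sketch of the reader built around a unique sort column instead (Java config; whether TXN_ID is actually unique, and the page size, are assumptions):

import java.util.Collections;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.Order;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.ColumnMapRowMapper;

@Configuration
public class PagingReaderConfig {

    @Bean
    @StepScope
    public JdbcPagingItemReader<Map<String, Object>> postItemReader(DataSource dataSource) throws Exception {
        SqlPagingQueryProviderFactoryBean provider = new SqlPagingQueryProviderFactoryBean();
        provider.setDataSource(dataSource);
        provider.setSelectClause("select *");
        provider.setFromClause("from SOME_TABLE");
        provider.setWhereClause("where CLIENT_FILE_NM = :fileName and TXN_ID is not null and SOME_DATA is not null");
        // Sorting on a unique column keeps page boundaries from dropping rows
        // that share the same CREATE_DT value.
        provider.setSortKeys(Collections.singletonMap("TXN_ID", Order.ASCENDING));

        JdbcPagingItemReader<Map<String, Object>> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setQueryProvider(provider.getObject());
        reader.setParameterValues(Collections.singletonMap("fileName", "process_abc.20150617024850"));
        reader.setRowMapper(new ColumnMapRowMapper());
        reader.setPageSize(10);
        return reader;
    }
}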

Spring Batch: process one 10 GB line into separate items

一笑奈何 submitted on 2019-12-24 03:38:06
Question: I have implemented a Spring Batch job to process a 10 GB file with one item per line, and it works great! But the client just changed it to a 10 GB file with one single line containing all the items, separated by size, where the size is the first 4 digits of the text, which on top of it all is in EBCDIC, so I need to translate that as well. An example would be:

0020aaaaaaaaaaaaaaaaaaa0010xxxxxxxxx0021aaaaaaaaaaaaaaaaaaay0009xxxxxxxx1aaaaaaa

and so on for 10 GB worth of text. I have a couple…
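One way to attack this (a sketch, not the solution from the thread): a custom ItemReader that streams the file, reads the 4-digit length prefix, then reads that many bytes and decodes them as one item. The EBCDIC code page (Cp1047 here) and whether the prefix counts itself are assumptions that depend on the client's format:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.Charset;

import org.springframework.batch.item.ItemReader;

// Cuts one huge length-prefixed line into items without ever loading it whole.
// Stream closing (e.g. via ItemStream callbacks) is omitted for brevity.
public class LengthPrefixedEbcdicItemReader implements ItemReader<String> {

    private static final Charset EBCDIC = Charset.forName("Cp1047"); // assumed code page

    private final InputStream in;

    public LengthPrefixedEbcdicItemReader(String path) throws IOException {
        this.in = new BufferedInputStream(new FileInputStream(path));
    }

    @Override
    public String read() throws Exception {
        byte[] prefix = readFully(4);
        if (prefix == null) {
            return null; // end of file ends the step
        }
        // Assumption: the 4 digits give the length of the record body only.
        int length = Integer.parseInt(new String(prefix, EBCDIC));
        byte[] body = readFully(length);
        if (body == null) {
            throw new IOException("File ended right after a length prefix");
        }
        return new String(body, EBCDIC);
    }

    private byte[] readFully(int n) throws IOException {
        byte[] buf = new byte[n];
        int off = 0;
        while (off < n) {
            int read = in.read(buf, off, n - off);
            if (read < 0) {
                if (off == 0) {
                    return null; // clean end of input
                }
                throw new IOException("File ended in the middle of a record");
            }
            off += read;
        }
        return buf;
    }
}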

NonTransientFlatFileException in Spring Batch

蹲街弑〆低调 submitted on 2019-12-24 03:29:59
Question: I was trying to read a CSV file with 100 records and process them in batches of 10 records at a time. Everything works fine, but after processing all the records I get org.springframework.batch.item.file.NonTransientFlatFileException: Unable to read from resource: [class path resource [csv/input/VMwareImport.csv]], and the root cause is org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step. Below is my job XML: <beans xmlns="http…
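With a partitioned file step, one thing worth checking is that the reader is step-scoped so each partition gets its own instance and its own slice of the file; a single shared reader can be closed by one partition while another is still reading, which can surface as "Unable to read from resource". A sketch of a step-scoped reader driven by values a Partitioner would put into the step ExecutionContext (the startLine/endLine key names are assumptions and must match your partitioner):

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
public class PartitionedCsvReaderConfig {

    @Bean
    @StepScope
    public FlatFileItemReader<String> partitionedCsvReader(
            @Value("#{stepExecutionContext['startLine']}") Integer startLine,
            @Value("#{stepExecutionContext['endLine']}") Integer endLine) {
        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        reader.setResource(new ClassPathResource("csv/input/VMwareImport.csv"));
        reader.setLineMapper(new PassThroughLineMapper());
        // Each partition reads only its own range of item numbers.
        reader.setCurrentItemCount(startLine);
        reader.setMaxItemCount(endLine);
        return reader;
    }
}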

Infinite loop in DB2 JDBC driver

笑着哭i submitted on 2019-12-24 03:27:31
Question: I'm using Spring Batch and DB2 with the JDBC v9.5 FP0 driver. Sometimes, in any step where the process reads from the database, I get the following error:

org.springframework.batch.core.step.AbstractStep execute
Encountered an error executing the step
java.lang.StackOverflowError
    at java.util.HashMap.getEntry(Unknown Source)
    at java.util.HashMap.get(Unknown Source)
    at com.ibm.websphere.rsadapter.DB2DataStoreHelper.findMappingClass(DB2DataStoreHelper.java:529)
    at com.ibm.websphere.rsadapter.DB2DataStoreHelper…

Spring Batch - DatabaseType not found for product name: [Informix Dynamic Server]

本小妞迷上赌 submitted on 2019-12-24 02:45:18
Question: I want to implement Spring Batch with Spring Boot. Since we are using an Informix database, I'm running into the following exception when my Spring Boot app starts up. Configuration:

@Bean
public DataSource dataSource() throws SQLException {
    BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName(dataSourceProperties.getDriverClassName());
    dataSource.setUrl(dataSourceProperties.getDbUrl());
    dataSource.setUsername(dataSourceProperties.getDbUsername());
    dataSource…
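The failure comes from Spring Batch trying to detect the database type from the JDBC metadata product name, and "Informix Dynamic Server" is not in its DatabaseType list. One way around it (a sketch, assuming Spring Batch 4's DefaultBatchConfigurer is in use; the chosen type string "db2" is an assumption, and the job repository SQL must actually run on Informix, possibly against hand-created metadata tables) is to build the JobRepository yourself and set the type explicitly:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.stereotype.Component;

// Replaces the auto-detection that fails for "Informix Dynamic Server".
@Component
public class InformixBatchConfigurer extends DefaultBatchConfigurer {

    private final DataSource dataSource;

    public InformixBatchConfigurer(DataSource dataSource) {
        super(dataSource);
        this.dataSource = dataSource;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        // Assumption: a supported type whose repository SQL Informix accepts.
        factory.setDatabaseType("db2");
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}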