spring-batch

Can we make Lucene IndexWriter serializable for ExecutionContext of Spring Batch?

雨燕双飞 submitted on 2019-12-25 07:59:09
Question: This question is related to another SO question of mine. To keep an IndexWriter open for the duration of a partitioned step, I thought of adding the IndexWriter to the ExecutionContext of the partitioner and then closing it in a StepExecutionListenerSupport's afterStep(StepExecution stepExecution) method. The challenge I am facing with this approach is that the ExecutionContext needs objects to be serializable. In light of these two questions, Q1 and Q2, it doesn't seem feasible, because I can't add a no-arg constructor in…
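One common workaround (a sketch, not Spring Batch API): keep the non-serializable resource out of the ExecutionContext entirely, and store only a serializable String key that points into a JVM-local registry. The class and key names below are made up for illustration; in the real job, the partitioner would register the IndexWriter and afterStep() would remove and close it.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical registry: holds non-serializable resources (e.g. a Lucene
// IndexWriter) in a JVM-local map, so only the String key ever needs to
// go into the ExecutionContext.
public class ResourceRegistry {
    private static final Map<String, Object> RESOURCES = new ConcurrentHashMap<>();

    public static void register(String key, Object resource) {
        RESOURCES.put(key, resource);
    }

    @SuppressWarnings("unchecked")
    public static <T> T lookup(String key) {
        return (T) RESOURCES.get(key);
    }

    // In afterStep(): remove the resource here, then close it.
    public static Object remove(String key) {
        return RESOURCES.remove(key);
    }
}
```

Note this only works when the partitioned step runs in the same JVM; remote partitioning would need a different approach.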

Spring Batch — StaxItemReader Jaxb to read multiple nested xml nodes

给你一囗甜甜゛ submitted on 2019-12-25 07:46:34
Question: I am trying to read an XML file that has nested XML nodes, using StaxEventItemReader and Jaxb2Marshaller. Somewhere I am doing something wrong, which results in an exception. I will provide a sample XML file here. File.xml: <File> <FileDate>05/28/2016</FileDate> <RecordCount>75</RecordCount> <Transaction> <RecordID>1</RecordID> <MemberDetails> <Id>A2334549</Id> <MemberDemoData> <SubID>89890734548557</SubID> <MemberSuffix>01</MemberSuffix> <SSN>XXXXX</SSN> <CategoryCode>B</CategoryCode> <Gender>F</Gender>…
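To see how the nesting behaves independently of Spring Batch and JAXB, here is a minimal plain-JDK StAX sketch that walks the same element names as the sample file and collects the text of every <Id> inside <MemberDetails>. It only demonstrates cursor-based traversal of nested nodes; the real fix for StaxEventItemReader would additionally involve setFragmentRootElementName and the JAXB class bindings.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

// Minimal StAX sketch (no Spring Batch): collects the text of every <Id>
// element nested inside <MemberDetails>. Element names follow the sample
// file in the question.
public class NestedIdReader {
    public static List<String> readIds(String xml) throws Exception {
        List<String> ids = new ArrayList<>();
        XMLStreamReader r = XMLInputFactory.newFactory()
                .createXMLStreamReader(new StringReader(xml));
        boolean inMemberDetails = false;
        while (r.hasNext()) {
            int event = r.next();
            if (event == XMLStreamConstants.START_ELEMENT) {
                if ("MemberDetails".equals(r.getLocalName())) {
                    inMemberDetails = true;
                } else if (inMemberDetails && "Id".equals(r.getLocalName())) {
                    ids.add(r.getElementText());
                }
            } else if (event == XMLStreamConstants.END_ELEMENT
                    && "MemberDetails".equals(r.getLocalName())) {
                inMemberDetails = false;
            }
        }
        return ids;
    }
}
```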

One ItemReader, 2 SQL Query, jdbcTemplate?

拟墨画扇 submitted on 2019-12-25 07:46:23
Question: I have a requirement where I am reading from the database with two different queries. Each query has its own SQL. The SQLs are similar and go after the same set of tables for the most part, with minor differences. I wanted to check whether I can have two SQLs in one ItemReader, or whether using a JdbcTemplate is possible. Any ideas or sample code? Answer 1: In the event that you want to 're-use' an existing JdbcCursorItemReader (or one of the other Spring Batch Jdbc*ItemReaders), you can switch the SQL…
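The alternative to switching SQL is a composite reader that drains one delegate, then the next. The sketch below shows that mechanism in plain Java, with Iterators standing in for the two Jdbc*ItemReaders (which would each carry their own SQL); the class name is made up.

```java
import java.util.Arrays;
import java.util.Iterator;

// Sketch of the "composite reader" idea: read() exhausts the first
// delegate, then moves on to the next, returning null when all are done
// (the same contract as a Spring Batch ItemReader).
public class SequentialReader<T> {
    private final Iterator<Iterator<T>> delegates;
    private Iterator<T> current;

    @SafeVarargs
    public SequentialReader(Iterator<T>... readers) {
        this.delegates = Arrays.asList(readers).iterator();
        this.current = delegates.hasNext() ? delegates.next() : null;
    }

    public T read() {
        while (current != null) {
            if (current.hasNext()) return current.next();
            current = delegates.hasNext() ? delegates.next() : null;
        }
        return null; // all delegates exhausted
    }
}
```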

For second page onwards, JdbcPagingItemReader is not putting values automatically for sortkey placeholder

↘锁芯ラ submitted on 2019-12-25 07:26:28
Question: I am using JdbcPagingItemReader as below: @Bean public ItemReader<RemittanceVO> reader() { JdbcPagingItemReader<RemittanceVO> reader = new JdbcPagingItemReader<RemittanceVO>(); reader.setDataSource(dataSource); reader.setRowMapper(new RemittanceRowMapper()); reader.setQueryProvider(queryProvider); reader.setPageSize(100); return reader; } @Bean public PagingQueryProvider queryProvider() throws Exception { SqlPagingQueryProviderFactoryBean queryProviderBean = new…
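For context on the sort-key placeholder: JdbcPagingItemReader uses keyset pagination, not OFFSET. Page one runs "… ORDER BY key" limited to the page size; every later page runs "… WHERE key > :lastKey ORDER BY key", so the reader must feed the last key of the previous page back in as the placeholder value. This sketch shows that mechanism over an in-memory sorted list of keys (the class name is made up):

```java
import java.util.List;
import java.util.stream.Collectors;

// Keyset-pagination sketch: what JdbcPagingItemReader's generated SQL
// does. lastKey == null models the first page (no WHERE clause); any
// other value models "WHERE key > :lastKey".
public class KeysetPager {
    public static List<Integer> nextPage(List<Integer> sortedKeys,
                                         Integer lastKey, int pageSize) {
        return sortedKeys.stream()
                .filter(k -> lastKey == null || k > lastKey)
                .limit(pageSize)
                .collect(Collectors.toList());
    }
}
```

If the placeholder is not filled from page two onward, a common cause is a sort key that is not unique, or a reader bean whose state is not being carried between pages.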

How to post on multiple queues using single job/ JMSwriter in spring batch

谁都会走 submitted on 2019-12-25 07:19:07
Question: I am a newbie at Spring Batch and have recently started using it. I have a requirement where I need to post/write the messages read from each DB record to different queues using a single job. I have to use a reader to read the messages from the DB and a processor to decide which queue to post each one to. So my question is: can I use a single JMS writer to post the messages to different queues, given that I have to use a single job and DB reader? Thanks in advance. Answer 1: As far as I know, the JMS writer does not support this (it…
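The usual pattern for this requirement is a classifier-based composite writer (Spring Batch ships ClassifierCompositeItemWriter for exactly this). The sketch below shows the routing mechanism in plain Java: one classifier function picks a destination per item, so a single write() call fans out to several queues. The queue names and classifier rule are made up.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Sketch of the ClassifierCompositeItemWriter idea: the classifier picks
// the target "queue" per item; in the real job each queue would be a
// JmsItemWriter configured with its own destination.
public class RoutingWriter<T> {
    private final Function<T, String> classifier;
    private final Map<String, List<T>> queues = new HashMap<>();

    public RoutingWriter(Function<T, String> classifier) {
        this.classifier = classifier;
    }

    public void write(List<T> items) {
        for (T item : items) {
            queues.computeIfAbsent(classifier.apply(item), q -> new ArrayList<>())
                  .add(item);
        }
    }

    public List<T> queue(String name) {
        return queues.getOrDefault(name, List.of());
    }
}
```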

java.lang.ClassCastException: java.lang.String cannot be cast to com.common.batch.model.Customer - Spring Batch CompositeItemReader and Writer

倖福魔咒の submitted on 2019-12-25 06:24:00
Question: I am developing a Spring Batch CompositeItemReader and Writer example. In this program I am trying to read data from two MySQL tables and write it to a single XML file. When trying to do that, I see the following error: java.lang.ClassCastException: java.lang.String cannot be cast to com.common.batch.model.Customer at com.common.batch.processor.CustomerProcessor.process(CustomerProcessor.java:1) at org.springframework.batch.core.step.item.SimpleChunkProcessor.doProcess…
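That stack trace usually means one of the composite's delegate readers hands raw Strings to a processor whose generic type is Customer; the cast then blows up inside process(). A defensive type check makes the mismatch explicit at the boundary. This is a sketch, not the asker's code: Customer is a stand-in for the real model class, and the String-to-Customer mapping rule is invented.

```java
// Sketch: accept either an already-mapped Customer or a raw String, and
// fail loudly on anything else instead of letting a blind cast throw
// ClassCastException deep inside the chunk processor.
public class CustomerProcessor {
    public static class Customer {
        public final String name;
        public Customer(String name) { this.name = name; }
    }

    public static Customer process(Object item) {
        if (item instanceof Customer) {
            return (Customer) item;                 // reader already mapped it
        }
        if (item instanceof String) {
            return new Customer((String) item);     // map the raw line (assumed rule)
        }
        throw new IllegalArgumentException("Unexpected item type: " + item.getClass());
    }
}
```

The real fix is to give each delegate reader a RowMapper so both emit Customer objects before the processor runs.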

Spring Batch moving file after processing

一曲冷凌霜 submitted on 2019-12-25 04:57:05
Question: I am new to Spring Batch. I have to read multiple (delimited) files from a folder and load them into the DB, which I did. My issue is that after each file is processed, I have to move the file to a processed folder, and write error records to an error folder. For example, suppose I process the file below (abc.txt), one of several: D|hello1|123 D|hello2|three - Error D|hello3|123 I know that the 2nd record is an error. Now I have to write the error record to an error file (abc-error.txt) in the error folder and proceed with…
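The move itself is plain java.nio.file work; in Spring Batch it would typically live in a StepExecutionListener's afterStep(), choosing the target folder from the step's exit status. This is a sketch under those assumptions; the folder names "processed" and "error" are invented.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch of the post-processing move: relocate the input file into a
// sibling "processed" or "error" directory, creating it if needed.
public class FileMover {
    public static Path moveAfterStep(Path file, boolean hadErrors) throws IOException {
        Path targetDir = file.resolveSibling(hadErrors ? "error" : "processed");
        Files.createDirectories(targetDir);
        return Files.move(file, targetDir.resolve(file.getFileName()),
                StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Writing the bad records to abc-error.txt is a separate concern, usually handled with a skip listener that appends each skipped line to the error file.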

org.springframework.batch.item.ItemStreamException: Failed to initialize the reader

对着背影说爱祢 submitted on 2019-12-25 04:32:11
Question: I'm trying to parse a flat file containing multiple records. For parsing I'm using the FlatFileItemReader class. While parsing I got this error: Jan 14, 2016 4:37:45 PM org.springframework.batch.core.step.AbstractStep execute SEVERE: Encountered an error executing the step org.springframework.batch.item.ItemStreamException: Failed to initialize the reader at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open(AbstractItemCountingItemStreamItemReader.java:142) at org…
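"Failed to initialize the reader" from open() very often means the input resource does not exist, the path is wrong, or the file is not readable; the nested cause in the full stack trace says which. A plain pre-check like the sketch below (a hypothetical helper, not Spring Batch API) surfaces that cause before the step runs:

```java
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: fail fast with a readable message if the flat file the reader
// is pointed at is missing or unreadable, instead of letting
// FlatFileItemReader.open() wrap it in an ItemStreamException.
public class ResourceCheck {
    public static void requireReadable(Path input) {
        if (!Files.isReadable(input)) {
            throw new IllegalStateException("Input not readable: " + input.toAbsolutePath());
        }
    }
}
```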

Spring Batch dynamic Flow/Job construction

隐身守侯 submitted on 2019-12-25 04:21:52
Question: I'm currently using Spring Batch to run a job that processes a file, does some work on each line, and writes the output to another file. This was developed in a 'core' product, but now (as always) we have some client-specific requirements that mandate the inclusion of some extra steps in the job. I've been able to do a proof of concept where I use common Spring features to 'replace' the job with another one that has the extra steps, either by using distinct names for the job (if we…
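One way to avoid replacing the whole job is to assemble the step list conditionally at configuration time. The sketch below shows that shape in plain Java, with string transformers standing in for Step beans; in real Spring Batch the same decision would drive which Steps get wired into the JobBuilder/FlowBuilder chain. The step contents and flag name are invented.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of client-specific job assembly: a core pipeline plus optional
// extra steps chosen by a configuration flag.
public class DynamicPipeline {
    public static List<UnaryOperator<String>> build(boolean clientExtras) {
        List<UnaryOperator<String>> steps = new ArrayList<>();
        steps.add(String::trim);                            // core step 1
        if (clientExtras) {
            steps.add(String::toUpperCase);                 // client-only step
        }
        steps.add(s -> s + "!");                            // core step 2
        return steps;
    }

    public static String run(List<UnaryOperator<String>> steps, String in) {
        for (UnaryOperator<String> step : steps) in = step.apply(in);
        return in;
    }
}
```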

Spring Batch - Resource must not be null

只愿长相守 submitted on 2019-12-25 04:13:30
Question: I'm new to Spring Batch. I am working through a simple Spring Batch application (http://spring.io/guides/gs/batch-processing/) and am trying to convert it to use Oracle as the storage mechanism. The problem I'm running into is below: INFO 22152 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=importUserJob]] failed unexpectedly and fatally with the following parameters: [{run.id=7, -spring.output.ansi.enabled=always}] java.lang.IllegalStateException: Failed to execute…
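A frequent cause of this kind of startup failure when switching the guide from the embedded database to Oracle is that the Batch metadata tables were never created: spring-batch-core ships DDL scripts per database (schema-oracle10g.sql among them), and Spring Boot can run them at startup. A possible application.properties sketch, assuming Spring Boot 2.x (later versions use spring.batch.jdbc.initialize-schema instead):

```properties
# Hypothetical datasource settings; URL, user, and password are placeholders.
spring.datasource.url=jdbc:oracle:thin:@localhost:1521:XE
spring.datasource.username=batch_user
spring.datasource.password=secret

# Let Boot create the Spring Batch metadata tables from the bundled Oracle DDL.
spring.batch.initialize-schema=always
```

Alternatively, run the bundled DDL script against the Oracle schema by hand and leave schema initialization off.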