spring-batch

Read a flat file and write to multiple writers, each writing a different object

别等时光非礼了梦想 · Submitted on 2019-12-12 06:47:24
Question: I have a requirement to read a flat file and create a TRADE Java object. The processor should then create three different Java objects based on the TRADE object, and I have to write those three objects to three different XML files. Simply put, I want one read, three processors, and writes to multiple XML files based on the data read. I have tried CompositeItemWriter, but it passes the same object to every writer, whereas I have three different objects: consumer, envelope, and deliveryOrder. Sample input:
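
A sketch of one way around CompositeItemWriter's limitation, using the Spring Batch 4 ItemWriter signature: a single custom writer derives the three objects from each Trade and fans them out to three pre-configured StaxEventItemWriter delegates. The from(...) factory methods on the asker's domain classes are hypothetical placeholders for the real mapping logic.

    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.batch.item.xml.StaxEventItemWriter;

    public class TradeFanOutWriter implements ItemWriter<Trade> {

        private StaxEventItemWriter<Consumer> consumerWriter;
        private StaxEventItemWriter<Envelope> envelopeWriter;
        private StaxEventItemWriter<DeliveryOrder> deliveryOrderWriter;

        @Override
        public void write(List<? extends Trade> trades) throws Exception {
            List<Consumer> consumers = new ArrayList<>();
            List<Envelope> envelopes = new ArrayList<>();
            List<DeliveryOrder> orders = new ArrayList<>();
            for (Trade trade : trades) {
                // Mapping logic is application-specific; from(...) stands in
                // for however the three objects are derived from a Trade.
                consumers.add(Consumer.from(trade));
                envelopes.add(Envelope.from(trade));
                orders.add(DeliveryOrder.from(trade));
            }
            // Each delegate writes its own object type to its own XML file.
            consumerWriter.write(consumers);
            envelopeWriter.write(envelopes);
            deliveryOrderWriter.write(orders);
        }

        // Setters for the three delegates omitted for brevity.
    }

Note that the three delegates must also be registered as streams on the step so their open/update/close callbacks run.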

Spring Batch: commit interval changing according to data in the input file

我与影子孤独终老i · Submitted on 2019-12-12 05:38:06
Question: Currently my batch has a static commit-interval (1000). I was asked to change it so that it commits whenever the data in my flat file changes. So I need a reader that reads lines from the flat file and, once it notices that the data has changed, processes the lines read so far and writes them to the database. I tried a completionPolicy as follows: ReceptionCompletionPolicy.java: public class ReceptionCompletionPolicy extends SingleItemPeekableItemReader<ReceptionLineFieldHelper> implements
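
A sketch of one way to finish this, assuming ReceptionLineFieldHelper has a hypothetical getKey() accessor for the field whose change should trigger a commit. The class is both the step's reader and its CompletionPolicy: read() peeks at the next line and flags the chunk complete when the key changes.

    import org.springframework.batch.item.support.SingleItemPeekableItemReader;
    import org.springframework.batch.repeat.CompletionPolicy;
    import org.springframework.batch.repeat.RepeatContext;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.batch.repeat.context.RepeatContextSupport;

    public class ReceptionCompletionPolicy
            extends SingleItemPeekableItemReader<ReceptionLineFieldHelper>
            implements CompletionPolicy {

        private boolean keyChanged;

        @Override
        public ReceptionLineFieldHelper read() throws Exception {
            ReceptionLineFieldHelper current = super.read();
            ReceptionLineFieldHelper next = super.peek();
            // Complete the chunk when the next line carries a new key,
            // or when the file is exhausted.
            keyChanged = current != null
                    && (next == null || !next.getKey().equals(current.getKey()));
            return current;
        }

        @Override
        public boolean isComplete(RepeatContext context, RepeatStatus result) {
            // Also stop when the reader signals end of data.
            return result == null || !result.isContinuable() || isComplete(context);
        }

        @Override
        public boolean isComplete(RepeatContext context) {
            return keyChanged;
        }

        @Override
        public RepeatContext start(RepeatContext parent) {
            return new RepeatContextSupport(parent);
        }

        @Override
        public void update(RepeatContext context) {
        }
    }

Register the same bean as the step's reader and pass it to chunk(...) in place of the fixed commit-interval; the original FlatFileItemReader becomes the delegate set via setDelegate().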

How to apply partitioned count for MultiResourceItemReader?

╄→尐↘猪︶ㄣ · Submitted on 2019-12-12 05:25:24
Question: I have a file of 50K records. It takes close to 40 minutes to insert into the DB, so I thought of partitioning the step so that the 50K records are split between 10 threads (via gridSize), with each thread processing 1000 records in parallel. All the forums show examples using JdbcPagingItemReader with the partitioned count set via the execution context. Since I am using MultiResourceItemReader, how can I set the partition count (startingIndex and endingIndex - refer to the code
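
For reference, a minimal sketch of a range Partitioner: it splits an assumed total of itemCount records into gridSize slices and stores each slice's bounds under the hypothetical keys startingIndex and endingIndex.

    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;

    public class RangePartitioner implements Partitioner {

        private final int itemCount; // e.g. 50_000

        public RangePartitioner(int itemCount) {
            this.itemCount = itemCount;
        }

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            int range = itemCount / gridSize;
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putInt("startingIndex", i * range);
                // The last slice absorbs any remainder from the division.
                context.putInt("endingIndex",
                        (i == gridSize - 1) ? itemCount : (i + 1) * range);
                partitions.put("partition" + i, context);
            }
            return partitions;
        }
    }

A step-scoped reader can then pick the bounds up with @Value("#{stepExecutionContext['startingIndex']}") and map them onto setCurrentItemCount(startingIndex) and setMaxItemCount(endingIndex), which the item-counting readers underneath MultiResourceItemReader's delegates expose.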

Dynamic SQL query with JdbcCursorItemReader

冷暖自知 · Submitted on 2019-12-12 05:15:36
Question: I'm using Java configuration (Spring Boot) for Spring Batch. I have a list of employee IDs, and for each ID I need to run a query (like below) and then process the data: select * from history where employee_id = ? I understand we can use reader.setPreparedStatementSetter to set the parameter in the SQL dynamically. However, I'm not sure how to repeat the batch process for each employee ID in the list. Even if I mark reader() as @StepScope, the reader is called only once.
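
One way to repeat the read per ID is a partitioned step with one partition per employee. A minimal sketch, assuming a partitioner has already put each ID into its partition's ExecutionContext under the hypothetical key "employeeId", and that History is the mapped row class:

    import javax.sql.DataSource;

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.item.database.JdbcCursorItemReader;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;

    public class HistoryReaderConfig {

        @Bean
        @StepScope
        public JdbcCursorItemReader<History> historyReader(
                DataSource dataSource,
                @Value("#{stepExecutionContext['employeeId']}") Long employeeId) {
            JdbcCursorItemReader<History> reader = new JdbcCursorItemReader<>();
            reader.setDataSource(dataSource);
            reader.setSql("select * from history where employee_id = ?");
            // Bind the ID injected from this partition's execution context;
            // a fresh reader instance is created for every partition.
            reader.setPreparedStatementSetter(ps -> ps.setLong(1, employeeId));
            reader.setRowMapper(new BeanPropertyRowMapper<>(History.class));
            return reader;
        }
    }

Because the bean is step-scoped, "the reader is called only once" no longer applies: each partition gets its own reader bound to its own ID.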

Should I keep Lucene IndexWriter open for entire indexing or close after each document addition?

℡╲_俬逩灬. · Submitted on 2019-12-12 05:05:32
Question: Does closing the Lucene IndexWriter after each document addition slow down my indexing process? I imagine that closing and reopening the index writer will slow indexing down, or is that not true for Lucene? Basically, I have a Lucene indexer step in a Spring Batch job and I create the indices in an ItemProcessor. The indexer step is a partitioned step, and I create the IndexWriter when the ItemProcessor is created and keep it open until step completion. @Bean @StepScope public ItemProcessor<InputVO,OutputVO>
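
A sketch of the keep-it-open pattern described above, assuming Lucene 5+ APIs and a hypothetical index path: the writer is opened once in beforeStep and closed once in afterStep, so each partition pays the open/close cost once rather than per document.

    import java.io.IOException;
    import java.nio.file.Paths;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.store.FSDirectory;
    import org.springframework.batch.core.ExitStatus;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.StepExecutionListener;
    import org.springframework.batch.item.ItemProcessor;

    public class LuceneIndexingProcessor
            implements ItemProcessor<InputVO, OutputVO>, StepExecutionListener {

        private IndexWriter indexWriter;

        @Override
        public void beforeStep(StepExecution stepExecution) {
            try {
                // In a partitioned step, give each partition its own directory
                // (or share one writer), since an index takes a single write lock.
                FSDirectory dir = FSDirectory.open(Paths.get("/tmp/index"));
                indexWriter = new IndexWriter(dir,
                        new IndexWriterConfig(new StandardAnalyzer()));
            } catch (IOException e) {
                throw new IllegalStateException("Could not open index", e);
            }
        }

        @Override
        public OutputVO process(InputVO item) throws Exception {
            Document doc = new Document();
            // ... populate fields from item (application-specific) ...
            indexWriter.addDocument(doc);
            return new OutputVO();
        }

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            try {
                indexWriter.close(); // commits and releases the lock once
            } catch (IOException e) {
                throw new IllegalStateException("Could not close index", e);
            }
            return stepExecution.getExitStatus();
        }
    }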

Parse and load huge XML using the Spring Batch framework

∥☆過路亽.° · Submitted on 2019-12-12 04:57:59
Question: I want to parse a huge file of XML data and insert the records into SQL Server 2008. The database connection and data insertion are managed by a product framework: I read the XML data, validate it, create an object, and pass it to the framework for insertion. I am planning to divide the XML file into small files, parse them using parser threads, and run the load in parallel using different threads. I am new to the Spring framework. How can I do this using Spring Batch? Do I need to divide the file myself in Spring Batch? Will I be able
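
Manual file splitting is usually unnecessary: StaxEventItemReader streams the XML fragment by fragment, so the whole file never sits in memory. A minimal sketch, assuming a hypothetical <record> element bound to a JAXB-annotated RecordVo class:

    import org.springframework.batch.item.xml.StaxEventItemReader;
    import org.springframework.core.io.FileSystemResource;
    import org.springframework.oxm.jaxb.Jaxb2Marshaller;

    public class HugeXmlReaderFactory {

        public static StaxEventItemReader<RecordVo> reader() {
            Jaxb2Marshaller unmarshaller = new Jaxb2Marshaller();
            unmarshaller.setClassesToBeBound(RecordVo.class); // needs @XmlRootElement

            StaxEventItemReader<RecordVo> reader = new StaxEventItemReader<>();
            reader.setResource(new FileSystemResource("data/huge.xml"));
            // Each occurrence of this element becomes one item in a chunk.
            reader.setFragmentRootElementName("record");
            reader.setUnmarshaller(unmarshaller);
            return reader;
        }
    }

Register it as the step's reader so Spring Batch handles open()/close(); the writer can then delegate each chunk to the product framework, and parallelism comes from a multi-threaded step or partitioning rather than hand-rolled threads.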

How to ignore namespace when unmarshalling XML file

浪尽此生 · Submitted on 2019-12-12 04:06:57
Question: When the source file has the namespace xmlns="http://schemas.alcatel.com/iptv/singtel", this error is raised: Encountered an error executing step sma-updstbparams.processfile in job sma-updstbparams org.springframework.oxm.UnmarshallingFailureException: JAXB unmarshalling exception; nested exception is javax.xml.bind.UnmarshalException - with linked exception: [com.sun.istack.internal.SAXParseException2; lineNumber: 3; columnNumber: 24; unexpected element (uri:"http://schemas.alcatel
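
Rather than ignoring the namespace, the usual fix is to declare it so JAXB expects it. A minimal sketch, assuming the bound classes live in a hypothetical package com.example.singtel: a package-info.java next to them maps every element in the package to the document's default namespace.

    // package-info.java (lives in the same package as the JAXB classes)
    @javax.xml.bind.annotation.XmlSchema(
            namespace = "http://schemas.alcatel.com/iptv/singtel",
            elementFormDefault = javax.xml.bind.annotation.XmlNsForm.QUALIFIED)
    package com.example.singtel; // hypothetical package of the bound classes

If the file is read through StaxEventItemReader, the fragment root element name also accepts the {uri}name form, e.g. setFragmentRootElementName("{http://schemas.alcatel.com/iptv/singtel}record"), with "record" standing in for the actual fragment element.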

Spring Batch: ItemReader with Scrollable Resultset

二次信任 · Submitted on 2019-12-12 03:58:07
Question: I have a batch job in which I read more than 1 million records from a database, accessing those records through a scrollable ResultSet. Now I am converting that job to Spring Batch, and the scrollable ResultSet won't work in this situation. I have tried it, but after reading the records in the first chunk the ResultSet closes, and when the batch tries to access it in the next step it throws the exception "can not operate on closed result set". I am new to Spring Batch. Can anybody please help me with how I can implement
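
The idiomatic replacement is JdbcCursorItemReader, which keeps one forward-only cursor open for the whole step and streams rows chunk by chunk. A minimal sketch, with big_table and BigTableRow as hypothetical names:

    import javax.sql.DataSource;

    import org.springframework.batch.item.database.JdbcCursorItemReader;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;

    public class BigTableReaderFactory {

        public static JdbcCursorItemReader<BigTableRow> reader(DataSource dataSource) {
            JdbcCursorItemReader<BigTableRow> reader = new JdbcCursorItemReader<>();
            reader.setDataSource(dataSource);
            reader.setSql("select * from big_table");
            // Hint to the driver to stream rather than buffer a million rows.
            reader.setFetchSize(1000);
            reader.setRowMapper(new BeanPropertyRowMapper<>(BigTableRow.class));
            return reader;
        }
    }

The key structural change is that read, process, and write must all live in one chunk-oriented step; trying to hand an open ResultSet from one step to the next is what produces the "closed result set" error.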

I want to convert Excel to XML

南楼画角 · Submitted on 2019-12-12 03:41:41
Question: I want to convert Excel to XML. Can anyone tell me how to proceed with the mapper? My code:

    public class WorkGroupReader implements ItemReader<List<String>> {
        WorkGroupMapper linemapper;
        public void setLinemapper(WorkGroupMapper linemapper) {
            this.linemapper = linemapper;
        }
        @Override
        public List<String> read() throws Exception, UnexpectedInputException, ParseException,
                NonTransientResourceException, BiffException, IOException {
            String FilePath = "E:\\ide-workspaces\\OmniFeed\\input\\WorkGroup.xls"
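
For reference, a sketch of how the truncated read() might be completed with the JExcelApi (jxl) classes the BiffException import suggests. WorkGroupMapper is the asker's own class, so this version simply returns the raw row contents for it to map:

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;

    import jxl.Cell;
    import jxl.Sheet;
    import jxl.Workbook;
    import org.springframework.batch.item.ItemReader;

    public class WorkGroupReader implements ItemReader<List<String>> {

        private Sheet sheet;
        private int currentRow;

        @Override
        public List<String> read() throws Exception {
            if (sheet == null) {
                // Open the workbook lazily on the first read (a real reader
                // would also close it once the sheet is exhausted).
                Workbook workbook = Workbook.getWorkbook(
                        new File("E:\\ide-workspaces\\OmniFeed\\input\\WorkGroup.xls"));
                sheet = workbook.getSheet(0);
            }
            if (currentRow >= sheet.getRows()) {
                return null; // signals end of input to Spring Batch
            }
            List<String> row = new ArrayList<>();
            for (Cell cell : sheet.getRow(currentRow)) {
                row.add(cell.getContents());
            }
            currentRow++;
            return row; // or hand the row to linemapper here
        }
    }

The mapped objects can then go to a StaxEventItemWriter to produce the XML side of the conversion.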

Spring Batch Chunk processing

纵饮孤独 · Submitted on 2019-12-12 03:37:40
Question: When processing a step with chunk processing (specifying a commit-interval) in Spring Batch, is there a way to know inside the writer when all the records in a file have been read and processed? My idea was to pass the collection of records read from the file to the ExecutionContext once all the records have been read. Please help. Answer 1: I don't know if one of the pre-built CompletionPolicy implementations does what you want, but if none does you can write a custom CompletionPolicy that marks a chunk as
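
Continuing that idea, a sketch of an alternative that avoids a custom policy altogether: the writer accumulates items across chunks, and a StepExecutionListener fires after the last chunk, which is the point at which the whole file is known to be processed. FileRecord and the "allRecords" key are hypothetical names.

    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.batch.core.ExitStatus;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.StepExecutionListener;
    import org.springframework.batch.item.ItemWriter;

    public class CollectingWriter implements ItemWriter<FileRecord>, StepExecutionListener {

        private final List<FileRecord> allRecords = new ArrayList<>();

        @Override
        public void write(List<? extends FileRecord> chunk) {
            allRecords.addAll(chunk); // called once per commit interval
        }

        @Override
        public void beforeStep(StepExecution stepExecution) {
        }

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            // Runs after the last chunk: every record from the file has now
            // passed through write(), so publish the full collection.
            stepExecution.getExecutionContext().put("allRecords", allRecords);
            return stepExecution.getExitStatus();
        }
    }

The writer must also be registered as a listener on the step for afterStep to be invoked.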