spring-batch

How is skipping implemented in Spring Batch?

家住魔仙堡 submitted on 2019-11-29 03:58:14
I was wondering how I could determine in my ItemWriter whether Spring Batch was currently in chunk-processing mode or in the fallback single-item-processing mode. I couldn't find any documentation on how this fallback mechanism is implemented in the first place. Even though I haven't solved my actual problem yet, I'd like to share what I learned about the fallback mechanism with you. Feel free to add answers with additional information if I missed anything ;-) The implementation of the skip mechanism can be found in the FaultTolerantChunkProcessor and in the RetryTemplate. Let's …
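The fallback described above can be illustrated with a framework-free sketch: the chunk is first written in one call, and only if that fails is it re-written one item at a time so the single bad item can be identified and skipped. The names here are illustrative, not the actual FaultTolerantChunkProcessor API.

```java
import java.util.ArrayList;
import java.util.List;

// Conceptual sketch of Spring Batch's fault-tolerant "scan" fallback.
// Not the real API: a minimal model of chunk-mode vs. single-item mode.
public class ScanFallbackSketch {
    interface Writer<T> { void write(List<T> items) throws Exception; }

    static <T> List<T> writeWithScan(Writer<T> writer, List<T> chunk) throws Exception {
        List<T> skipped = new ArrayList<>();
        try {
            writer.write(chunk);              // normal chunk-processing mode
        } catch (Exception chunkFailure) {
            for (T item : chunk) {            // fallback: single-item "scan" mode
                try {
                    writer.write(List.of(item));
                } catch (Exception itemFailure) {
                    skipped.add(item);        // only the offending item is skipped
                }
            }
        }
        return skipped;
    }

    public static void main(String[] args) throws Exception {
        Writer<String> writer = items -> {
            if (items.contains("bad")) throw new IllegalArgumentException("bad item");
        };
        System.out.println(writeWithScan(writer, List.of("a", "bad", "c"))); // [bad]
    }
}
```

Note that in this model the writer can tell it is in scan mode only indirectly, e.g. by observing that it receives lists of size one, which is exactly why the original question is hard to answer cleanly.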

Could not open JPA EntityManager for transaction; nested exception is java.lang.IllegalStateException

大兔子大兔子 submitted on 2019-11-29 03:33:24
I am quite new to Spring, and to Spring Batch in particular. Still, I somehow managed to install Spring Batch Admin. I added custom jobs and Hibernate/JPA for persistence. Everything works as expected, up to the point where the first chunk should be persisted. Then I receive the following error message: org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is java.lang.IllegalStateException: Already value [org.springframework.jdbc.datasource.ConnectionHolder@60d31437] for key [org.springframework.jdbc…
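The "Already value ... for key" error typically means two transaction managers are binding the same DataSource at once (e.g. the batch infrastructure's DataSourceTransactionManager plus a JpaTransactionManager). A common remedy, sketched here as a configuration fragment with assumed bean names, is to let a single JpaTransactionManager manage both the batch metadata and the JPA work:

```java
// Configuration sketch (assumed wiring): one transaction manager for everything.
// Point the batch job repository and the steps at this bean instead of letting
// a second DataSourceTransactionManager bind the same DataSource.
@Configuration
public class BatchTxConfig {

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf); // shared by batch and JPA persistence
    }
}
```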

Spring Batch: delete the flat file from the directory after it is processed

别来无恙 submitted on 2019-11-29 02:32:04
In Spring Batch, I am using a MultiResourceItemReader to read multiple files from a directory, with a FlatFileItemReader as the delegate to process the individual files. My use case is to delete each file once it has been processed completely (READ-WRITE is done), after which the MultiResourceItemReader should pick up the next file, and so on. I tried FileDeletingTasklet to delete files in a directory, but as per the Spring docs the execute method is called only once. How can I delete only the files that have been processed (READ-WRITE)? I don't want to delete the entire directory once all …
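The desired behaviour can be sketched without the framework: process each file fully, and delete it only after both the read and the write succeeded. In real Spring Batch this logic would live in a listener or a custom MultiResourceItemReader rather than a plain loop, so treat this as a conceptual sketch only.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Framework-free sketch: per-file READ -> WRITE -> delete-on-success.
public class ProcessAndDelete {

    public static List<String> processFolder(Path folder) throws IOException {
        List<String> processed = new ArrayList<>();
        try (DirectoryStream<Path> files = Files.newDirectoryStream(folder, "*.csv")) {
            for (Path file : files) {
                List<String> lines = Files.readAllLines(file); // READ
                processed.addAll(lines);                       // WRITE (stand-in)
                Files.delete(file);                            // delete only on success
            }
        }
        return processed;
    }
}
```

Deleting a file before its chunk has been committed risks data loss on a rollback, so in a real job the deletion belongs after the commit boundary, e.g. in a step or chunk listener.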

Spring Batch JpaPagingItemReader: why are some rows not read?

时光怂恿深爱的人放手 submitted on 2019-11-29 02:23:08
I'm using Spring Batch (3.0.1.RELEASE) / JPA with an HSQLDB server database. I need to browse an entire table (using paging) and update its items one by one, so I used a JpaPagingItemReader. But when I run the job I can see that some rows are skipped, and the number of skipped rows is equal to the page size. For example, if my table has 12 rows and jpaPagingItemReader.pageSize = 3, the job reads rows 1, 2, 3, then rows 7, 8, 9 (so it skips rows 4, 5, 6)… Could you tell me what is wrong in my code/configuration, or is this an issue with HSQLDB paging? Below is my code. [EDIT]: The problem is …
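This symptom is the classic paging-over-a-shrinking-result-set problem: if the writer updates each item so that it no longer matches the reader's query, the result set shrinks while the page offset keeps advancing, skipping exactly one page per page read. A small self-contained simulation (plain Java, not the real reader) reproduces the numbers from the question:

```java
import java.util.ArrayList;
import java.util.List;

// Simulates a paging reader whose "update" removes each processed row from
// the query's result set, mirroring JpaPagingItemReader over mutating data.
public class PagingSkipDemo {

    public static List<Integer> readAllPages(List<Integer> table, int pageSize) {
        List<Integer> read = new ArrayList<>();
        int page = 0;
        while (true) {
            int from = page * pageSize;                 // offset into *current* results
            if (from >= table.size()) break;
            int to = Math.min(from + pageSize, table.size());
            List<Integer> chunk = new ArrayList<>(table.subList(from, to));
            read.addAll(chunk);
            table.removeAll(chunk);                     // the update removes them from the query
            page++;                                     // next offset now skips pageSize rows
        }
        return read;
    }

    public static void main(String[] args) {
        List<Integer> table = new ArrayList<>(List.of(1,2,3,4,5,6,7,8,9,10,11,12));
        System.out.println(readAllPages(table, 3));     // [1, 2, 3, 7, 8, 9]
    }
}
```

Typical fixes are to use a cursor-based reader instead of paging, or to keep re-reading page 0 when each processed row drops out of the query.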

Spring Batch FlatFileItemWriter - How to use stepExecution.jobId to generate file name

夙愿已清 submitted on 2019-11-29 02:22:22
I have this file writer where I'm trying to append the current job id to the generated file name:

<bean id="csvFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
    <property name="resource">
        <bean class="org.springframework.core.io.FileSystemResource">
            <constructor-arg type="java.lang.String">
                <value>${csv.file}_#{stepExecution.jobExecution.jobId}</value>
            </constructor-arg>
        </bean>
    </property>
    <property name="lineAggregator">
        <bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
            <property name="delimiter">
                <util:constant …
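The usual answer is that the SpEL expression must be evaluated in step scope on the writer bean itself; a Java-config equivalent, sketched here with assumed property and type names, makes the late binding explicit:

```java
// Configuration sketch: a step-scoped writer whose file name embeds the job id.
// "${csv.file}" is bound from properties, the job id via step-scope SpEL.
@Bean
@StepScope
public FlatFileItemWriter<String> csvFileWriter(
        @Value("${csv.file}") String baseName,
        @Value("#{stepExecution.jobExecution.jobId}") Long jobId) {
    FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
    writer.setResource(new FileSystemResource(baseName + "_" + jobId + ".csv"));
    writer.setLineAggregator(new PassThroughLineAggregator<>()); // stand-in aggregator
    return writer;
}
```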

How to read all files in a folder with spring-batch and MultiResourceItemReader?

与世无争的帅哥 submitted on 2019-11-29 01:58:38
I want to configure spring-batch to read all CSV files inside a specific folder sequentially. The following does not work, because the delegate will try to open a single file literally named *.csv, which of course is invalid. What do I have to change here?

@Bean
public ItemReader<String> reader() {
    MultiResourceItemReader<String> reader = new MultiResourceItemReader<>();
    reader.setResources(new Resource[] {new FileSystemResource("/myfolder/*.csv")});
    reader.setDelegate(new FlatFileItemReader<>(..));
    return reader;
}

The equivalent XML configuration would be written as follows; how could I rewrite it in Java …
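A common fix is to resolve the wildcard with Spring's PathMatchingResourcePatternResolver instead of handing the pattern to a FileSystemResource; a configuration sketch (line mapper assumed):

```java
// Configuration sketch: resolve "*.csv" to concrete resources before
// giving them to the MultiResourceItemReader.
@Bean
public ItemReader<String> reader() throws IOException {
    Resource[] resources = new PathMatchingResourcePatternResolver()
            .getResources("file:/myfolder/*.csv"); // wildcard resolved here

    FlatFileItemReader<String> delegate = new FlatFileItemReader<>();
    delegate.setLineMapper(new PassThroughLineMapper()); // assumed mapping

    MultiResourceItemReader<String> reader = new MultiResourceItemReader<>();
    reader.setResources(resources);
    reader.setDelegate(delegate);
    return reader;
}
```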

commit-interval in Spring Batch and dealing with rollbacks

南楼画角 submitted on 2019-11-29 00:46:36
Question: My question relates to Spring Batch and transactions. Say I've chosen a commit-interval of 50 for one of my steps. Also suppose I have 1000 records in all, and among those records one will cause the ItemWriter to fail, thereby causing a rollback of the entire chunk (50 records in my example). What are the strategies to make sure that the 49 valid records are written to the database after the job has completed (with only the problematic record ignored)? Answer 1: After some research, I came up with the …
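The standard strategy is to make the step fault-tolerant: after the chunk rollback, Spring Batch re-processes the items one at a time and skips only the offending record, committing the other 49. A configuration sketch (the exception type and limits are assumptions):

```java
// Configuration sketch: fault-tolerant step with skip semantics.
@Bean
public Step importStep(StepBuilderFactory steps,
                       ItemReader<Record> reader,
                       ItemWriter<Record> writer) {
    return steps.get("importStep")
            .<Record, Record>chunk(50)                        // commit-interval of 50
            .reader(reader)
            .writer(writer)
            .faultTolerant()
            .skip(DataIntegrityViolationException.class)      // assumed failure type
            .skipLimit(10)                                    // assumed limit
            .build();
}
```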

Spring batch to upload a CSV file and insert into database accordingly

荒凉一梦 submitted on 2019-11-28 22:07:14
My project has a requirement where a user uploads a CSV file which has to be pushed to a MySQL database. I know we can use Spring Batch to process large numbers of records, but I'm not able to find any tutorial/sample code for this requirement of mine. All the tutorials I came across hardcode the CSV file name, like this one: https://spring.io/guides/gs/batch-processing/ I need to use the file uploaded by the user and process it accordingly. Any help here would be appreciated. If not with Spring Batch, is there any other way to insert the uploaded CSV data into MySQL? Please have …
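One widely used pattern is to save the upload to a temporary file, launch the job with the file path as a JobParameter, and bind that parameter into a step-scoped reader instead of hardcoding the name. A sketch with assumed names (`importJob`, `Person`, the mapping from the spring.io guide):

```java
// Sketch: dynamic input file via JobParameters + step-scoped reader.
@PostMapping("/upload")
public void handleUpload(@RequestParam("file") MultipartFile file) throws Exception {
    File tmp = File.createTempFile("upload-", ".csv");
    file.transferTo(tmp);
    JobParameters params = new JobParametersBuilder()
            .addString("input.file", tmp.getAbsolutePath())
            .addLong("run.id", System.currentTimeMillis()) // new JobInstance per upload
            .toJobParameters();
    jobLauncher.run(importJob, params);
}

@Bean
@StepScope
public FlatFileItemReader<Person> reader(
        @Value("#{jobParameters['input.file']}") String path) {
    FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource(path)); // bound at step start
    // line mapper / field set mapping as in the spring.io guide
    return reader;
}
```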

Routing data to multiple files in item writer based on item's property as criteria

断了今生、忘了曾经 submitted on 2019-11-28 22:00:38
I am getting a list of items in my reader. Each item object has a property called Code, whose possible values are not known to me beforehand. 1) Based on the value of Code in each item, I want to write that particular item to an output file pertaining to that Code. E.g. if the current item's Code is "abc", the writer should write the item to abc.txt. 2) If the current item has a Code "xyz" for which no file exists yet, a new file should be created and the item should go into that file. 3) For all such multiple files created based on Code, I also want to add …
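Requirements 1) and 2) map naturally onto a ClassifierCompositeItemWriter whose classifier lazily creates and caches one file writer per distinct Code. A sketch with an assumed `Item.getCode()` accessor:

```java
// Configuration sketch: route items to per-Code files, creating writers on demand.
@Bean
public ClassifierCompositeItemWriter<Item> routingWriter() {
    Map<String, FlatFileItemWriter<Item>> writers = new HashMap<>();
    ClassifierCompositeItemWriter<Item> composite = new ClassifierCompositeItemWriter<>();
    composite.setClassifier(item -> writers.computeIfAbsent(item.getCode(), code -> {
        FlatFileItemWriter<Item> w = new FlatFileItemWriter<>();
        w.setResource(new FileSystemResource(code + ".txt")); // e.g. abc.txt
        w.setLineAggregator(new PassThroughLineAggregator<>());
        w.open(new ExecutionContext()); // writers created on the fly must be opened
        return w;
    }));
    return composite;
}
```

Writers created this way are not registered with the step, so real code must also close them (e.g. in a StepExecutionListener) and handle restartability itself.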

Multiple ItemWriters in Spring Batch

我的未来我决定 submitted on 2019-11-28 21:13:03
I am currently writing a Spring Batch job where I read a chunk of data and process it, and then I wish to pass this data to two writers. One writer would simply update the database, whereas the second writer would write to a CSV file. I am planning to write my own custom writer, inject the two ItemWriters into it, and call the write methods of both item writers from the write method of the custom writer. Is this approach correct? Are there any ItemWriter implementations available which meet my requirements? Thanks in advance. You can use Spring's CompositeItemWriter and delegate …
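The built-in CompositeItemWriter does exactly what the custom writer would: it forwards each chunk to an ordered list of delegates. A configuration sketch with assumed delegate beans:

```java
// Configuration sketch: fan one chunk out to a database writer and a CSV writer.
@Bean
public CompositeItemWriter<Person> compositeWriter(
        JdbcBatchItemWriter<Person> dbWriter,
        FlatFileItemWriter<Person> csvWriter) {
    CompositeItemWriter<Person> writer = new CompositeItemWriter<>();
    writer.setDelegates(Arrays.asList(dbWriter, csvWriter)); // invoked in this order
    return writer;
}
```

If any delegate is an ItemStream (the FlatFileItemWriter is), it should also be registered as a stream on the step so it is opened and closed correctly.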