spring-batch

Unable to log an error record in DB within ItemProcessListener - onProcessError

北战南征 submitted on 2021-02-08 11:51:13
Question: We have implemented the ItemProcessListener and the SkipListener in a batch job using Spring Batch. We are able to log the skipped items in the database without creating a separate transaction. But when the onProcessError method is invoked in the ItemProcessListener, the transaction is rolled back due to the corresponding RuntimeException. We used @Transactional with propagation REQUIRES_NEW on the service interface for the DB update, but it still rolled back the
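A minimal sketch of one common shape of the fix, assuming a hypothetical ErrorLogService and ERROR_LOG table: put @Transactional(REQUIRES_NEW) on the concrete service class (not only the interface) and call it through the injected Spring proxy rather than by self-invocation, so the insert commits in its own transaction even while the chunk transaction rolls back. All names here are illustrative, not the asker's actual code.

```java
import org.springframework.batch.core.ItemProcessListener;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical error-log service: REQUIRES_NEW on the concrete class, so the
// insert commits independently of the chunk transaction being rolled back.
@Service
public class ErrorLogService {

    private final JdbcTemplate jdbcTemplate;

    public ErrorLogService(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void logProcessError(Object item, Exception e) {
        // ERROR_LOG and its columns are assumptions for illustration
        jdbcTemplate.update(
                "INSERT INTO ERROR_LOG (ITEM_DATA, ERROR_MSG) VALUES (?, ?)",
                String.valueOf(item), e.getMessage());
    }
}

// The listener delegates to the injected, proxied service bean; calling a
// @Transactional method on `this` would bypass the transaction advice.
class ErrorLoggingListener implements ItemProcessListener<Object, Object> {

    private final ErrorLogService errorLogService;

    ErrorLoggingListener(ErrorLogService errorLogService) {
        this.errorLogService = errorLogService;
    }

    @Override
    public void beforeProcess(Object item) { }

    @Override
    public void afterProcess(Object item, Object result) { }

    @Override
    public void onProcessError(Object item, Exception e) {
        errorLogService.logProcessError(item, e);
    }
}
```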

Using SystemCommandTasklet to split a large flat file into small files

天大地大妈咪最大 submitted on 2021-02-08 10:07:20
Question: Has anyone already split large flat files into small files using the Spring Batch SystemCommandTasklet? I would also like to know: is it really a time-consuming process? We would like to split a file with 100 million records (each record contains 15 fields). Any help on this would be really appreciated. Regards, Shankar Answer 1: I do it in my talk JSR-352, Spring Batch, and You (https://www.youtube.com/watch?v=yKs4yPs-5yU). There, I use the SystemCommandTasklet in combination with OS X's
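A minimal sketch of the approach the answer describes, assuming the Spring Batch 4.x SystemCommandTasklet API and the Unix `split` command; the file paths, line count, and timeout are illustrative assumptions:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.step.tasklet.SystemCommandTasklet;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SplitStepConfig {

    // Shell out to the OS `split` command; splitting by line count is
    // typically I/O-bound and far faster than reading records in Java.
    @Bean
    public SystemCommandTasklet splitFileTasklet() {
        SystemCommandTasklet tasklet = new SystemCommandTasklet();
        tasklet.setCommand("split -l 1000000 /data/input/large-file.txt /data/input/part-");
        tasklet.setWorkingDirectory("/data/input");
        tasklet.setTimeout(600_000); // ms; sized generously for a large file
        return tasklet;
    }

    @Bean
    public Step splitStep(StepBuilderFactory steps) {
        return steps.get("splitStep")
                .tasklet(splitFileTasklet())
                .build();
    }
}
```

The resulting part files can then be read by partitioned or parallel steps downstream.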

Is there a bug in the new Spring JSON reader or am I doing something wrong?

ⅰ亾dé卋堺 submitted on 2021-02-08 08:13:33
Question: I've got the following reader configured:

@Configuration
public class ReaderConfig {
    @Bean
    public JsonItemReader<String> jsonItemReader(Resource resource) {
        return new JsonItemReaderBuilder<String>()
                .jsonObjectReader(new JacksonJsonObjectReader<>(String.class))
                .resource(resource)
                .name("jsonItemReader")
                .build();
    }
}

With this test:

@Test
public void jsonItemReaderTest() throws Exception {
    ReaderConfig config = new ReaderConfig();
    Resource sampleJsonResource = new ClassPathResource(
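A sketch of how this reader is typically exercised outside a running job, under two assumptions that often explain apparent reader "bugs": JacksonJsonObjectReader binds each element of a top-level JSON array to a target type (a POJO usually works better than String.class), and a standalone reader must be opened with an ExecutionContext before read() is called. The Trade POJO and trades.json resource are hypothetical.

```java
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.json.JacksonJsonObjectReader;
import org.springframework.batch.item.json.JsonItemReader;
import org.springframework.batch.item.json.builder.JsonItemReaderBuilder;
import org.springframework.core.io.ClassPathResource;

public class JsonReaderSketch {

    // Hypothetical POJO matching the objects inside trades.json; the resource
    // is expected to contain a top-level JSON array of such objects.
    public static class Trade {
        public String isin;
        public double price;
    }

    public static void main(String[] args) throws Exception {
        JsonItemReader<Trade> reader = new JsonItemReaderBuilder<Trade>()
                .jsonObjectReader(new JacksonJsonObjectReader<>(Trade.class))
                .resource(new ClassPathResource("trades.json"))
                .name("tradeReader")
                .build();

        // Outside a step, open() must be called before read(), and close()
        // afterwards; the framework does this for you inside a running job.
        reader.open(new ExecutionContext());
        Trade trade;
        while ((trade = reader.read()) != null) {
            System.out.println(trade.isin);
        }
        reader.close();
    }
}
```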

Parallel step execution of ItemStreamReader in SpringBatch

江枫思渺然 submitted on 2021-02-08 07:42:35
Question: I have an ItemStreamReader (extends AbstractItemCountingItemStreamItemReader); the reader on its own is quite fast, but the subsequent processing takes quite some time. From a business point of view I can process as many items in parallel as I want. As my ItemStreamReader is reading a large JSON file with a JsonParser, it ends up being stateful. So just adding a TaskExecutor to the step does not work; it throws parsing exceptions, with the following log output from Spring Batch: 16:51:41.023
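One common pattern for this situation is to wrap the stateful reader in Spring Batch's SynchronizedItemStreamReader: reads are serialized behind a lock while processing still fans out across the task executor's threads. A minimal sketch, assuming hypothetical jsonReader, slowProcessor, and itemWriter beans and a MyItem type standing in for the question's actual types:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.SynchronizedItemStreamReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class ParallelStepConfig {

    // Serializes read() calls so the stateful JSON-parsing reader is only
    // ever touched by one thread at a time.
    @Bean
    public SynchronizedItemStreamReader<MyItem> synchronizedReader(
            ItemStreamReader<MyItem> jsonReader) {
        SynchronizedItemStreamReader<MyItem> wrapper = new SynchronizedItemStreamReader<>();
        wrapper.setDelegate(jsonReader);
        return wrapper;
    }

    @Bean
    public Step parallelStep(StepBuilderFactory steps,
                             SynchronizedItemStreamReader<MyItem> reader,
                             ItemProcessor<MyItem, MyItem> slowProcessor,
                             ItemWriter<MyItem> itemWriter) {
        return steps.get("parallelStep")
                .<MyItem, MyItem>chunk(100)
                .reader(reader)           // serialized behind a lock
                .processor(slowProcessor) // runs on multiple threads
                .writer(itemWriter)
                .taskExecutor(new SimpleAsyncTaskExecutor("worker-"))
                .build();
    }
}
```

Note that a multi-threaded step generally gives up restartability guarantees, since the reader's item count no longer maps cleanly to committed chunks.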

Spring Batch - FlatFileParseException (record with double quotes)

左心房为你撑大大i submitted on 2021-02-08 07:40:48
Question: I have a few records with double quotes in between the field values, so when I use FlatFileItemReader it throws a FlatFileParseException for those records. A sample record is: 7^A3989815^A2400284298^ABU^AA" - CLEANING INC.^A$ How do we handle this kind of record in Spring Batch item readers? Regards, Shankar Answer 1: You can change the default quote character to something which you are sure will not appear, as suggested here. We had similar problems and changed it to @ as suggested, and it works, but
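A sketch of the answer's suggestion: replace the tokenizer's default quote character (") with one that never appears in the data, so embedded double quotes are treated as ordinary characters. The ^A delimiter (\u0001) matches the sample record; the field names, target type, and file path are assumptions for illustration.

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class CompanyReaderConfig {

    @Bean
    public FlatFileItemReader<Company> companyReader() {
        // ^A (Control-A, \u0001) is the delimiter shown in the sample record
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer("\u0001");
        tokenizer.setNames("id", "code1", "code2", "type", "name", "suffix");
        // Any character guaranteed absent from the data; '@' per the answer
        tokenizer.setQuoteCharacter('@');

        BeanWrapperFieldSetMapper<Company> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(Company.class);

        DefaultLineMapper<Company> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);

        FlatFileItemReader<Company> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource("/data/companies.dat"));
        reader.setLineMapper(lineMapper);
        return reader;
    }
}
```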

Utility of FetchSize and PageSize in Reader Spring Batch

一个人想着一个人 submitted on 2021-02-08 07:21:26
Question: What is the difference between the properties fetchSize and pageSize in Spring Batch? Is pageSize the number of rows to retrieve at a time? Is fetchSize the number of DB calls? If my query returns 10,000 rows, what is the best setting? If I set pageSize to 1000 and fetchSize to 1000, can you confirm I just need 10 calls to return all rows? So if I increase the pageSize property (for example to 10,000), the number of DB calls is just 1, so the running time of the batch is
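To make the two knobs concrete: pageSize is how many rows each paging SQL query asks for (10,000 rows at pageSize 1000 means ten paging queries), while fetchSize is only a JDBC driver hint for how many rows to pull per network round trip within one query. A sketch with JdbcPagingItemReader; the table, columns, and Customer type are assumptions for illustration:

```java
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.Order;
import org.springframework.batch.item.database.builder.JdbcPagingItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

@Configuration
public class PagingReaderConfig {

    @Bean
    public JdbcPagingItemReader<Customer> pagingReader(DataSource dataSource) {
        return new JdbcPagingItemReaderBuilder<Customer>()
                .name("customerReader")
                .dataSource(dataSource)
                .selectClause("SELECT id, name")
                .fromClause("FROM customer")
                .sortKeys(Map.of("id", Order.ASCENDING)) // paging needs a unique sort key
                .pageSize(1000)   // rows requested per paging query
                .fetchSize(1000)  // driver hint: rows per round trip within a query
                .rowMapper(new BeanPropertyRowMapper<>(Customer.class))
                .build();
    }
}
```

Raising pageSize reduces the number of queries but holds more rows in memory per page, so the best setting is a trade-off rather than "one big page".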

Spring-Batch Java Based FileItemWriter for CSV

≯℡__Kan透↙ submitted on 2021-02-08 05:16:51
Question: I have a Spring Batch service containing an ItemWriter to write data to a CSV. I used the example given by the Spring Batch guide: https://spring.io/guides/gs/batch-processing/ I tried to modify the ItemWriter to create the CSV again. The problems I am facing are: it is not creating the CSV file if it is not present, and if I make the file available beforehand, it is not writing data to it.

@Bean
public ItemWriter<Person> writer(DataSource dataSource) {
    FlatFileItemWriter<Person> csvWriter =
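A sketch of a working CSV writer for the guide's Person type, under one assumption that commonly explains these exact symptoms: when the @Bean method is declared to return the ItemWriter interface, Spring Batch cannot see that the bean is also an ItemStream, so open() never runs and the file is never created or written. Declaring the concrete FlatFileItemWriter return type (and spelling the class with a capital I) avoids this; the output path is illustrative.

```java
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class CsvWriterConfig {

    // Concrete return type: the step registers the writer as an ItemStream,
    // so open() creates the file before the first chunk is written.
    @Bean
    public FlatFileItemWriter<Person> writer() {
        BeanWrapperFieldExtractor<Person> fieldExtractor = new BeanWrapperFieldExtractor<>();
        fieldExtractor.setNames(new String[] {"firstName", "lastName"});

        DelimitedLineAggregator<Person> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setDelimiter(",");
        lineAggregator.setFieldExtractor(fieldExtractor);

        FlatFileItemWriter<Person> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource("target/output/persons.csv"));
        writer.setLineAggregator(lineAggregator);
        return writer;
    }
}
```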
