spring-batch

@OnSkipInWrite is Not Called in SkipListener

Posted by Deadly on 2019-12-11 18:08:18
Question: I am reading a CSV file and inserting the data into a database using Spring Batch (read, process, and write). I am using "jpaRepository.save" in the ItemWriter class to save the data to the database, and I am trying to catch the skipped item and the skip message in an @OnSkipInWrite method, but this method is not called even when data is skipped. In the batch_step_execution table: read_count = 18, write_count = 10, write_skip_count = 0, roll_back_count = 8. Why is the write_skip_count 0? I just want to
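A minimal Java-config sketch of how this skip listener is usually wired, assuming a hypothetical Person item type and DataIntegrityViolationException as the exception actually thrown by save(): the step has to be declared fault tolerant with a skip policy covering that exception, the writer has to let the exception propagate instead of catching it, and the listener has to be registered on the step, otherwise @OnSkipInWrite is never invoked and write_skip_count stays at 0.

import org.springframework.batch.core.Step;
import org.springframework.batch.core.annotation.OnSkipInWrite;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.dao.DataIntegrityViolationException;

@Configuration
public class SkipConfig {

    public static class PersonSkipListener {
        // Invoked once per item skipped during the write phase
        @OnSkipInWrite
        public void onSkipInWrite(Person item, Throwable t) {
            System.err.println("Skipped " + item + ": " + t.getMessage());
        }
    }

    @Bean
    public Step importStep(StepBuilderFactory steps,
                           ItemReader<Person> reader,
                           ItemProcessor<Person, Person> processor,
                           ItemWriter<Person> writer) {
        return steps.get("importStep")
                .<Person, Person>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)                                   // must let the exception propagate
                .faultTolerant()
                .skip(DataIntegrityViolationException.class)      // the exception actually thrown by save()
                .skipLimit(10)
                .listener(new PersonSkipListener())               // register the skip listener on the step
                .build();
    }
}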

"Marshaller must support the class of the marshalled object" error in a split job with Spring Batch

Posted by 微笑、不失礼 on 2019-12-11 17:54:37
Question: In a Spring Batch application that I use for learning, the job reads from database tables and saves the results in XML files. I tried to split the steps so that two steps run in parallel (flow). (I will include only the relevant code; if you need more details I will add them.) When I run the job it reads the first 10 records for each of step1 and step2, but after each batch of 10 reads I get an error: java.lang.IllegalStateException: Marshaller must support the class of the marshalled object jobPerson.xml :
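This error is typically raised by StaxEventItemWriter when the Jaxb2Marshaller it was given is not bound to the class of the item being written, for example when two parallel steps share a single marshaller bean. A sketch of giving each writer its own marshaller, assuming hypothetical Person and Address item classes annotated with @XmlRootElement:

import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.batch.item.xml.builder.StaxEventItemWriterBuilder;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;

public class XmlWriters {

    // Writer for the step that exports Person rows
    public StaxEventItemWriter<Person> personWriter() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setClassesToBeBound(Person.class);   // must list the class being written
        return new StaxEventItemWriterBuilder<Person>()
                .name("personWriter")
                .resource(new FileSystemResource("jobPerson.xml"))
                .rootTagName("persons")
                .marshaller(marshaller)
                .build();
    }

    // The parallel step that exports Address rows gets its own marshaller
    public StaxEventItemWriter<Address> addressWriter() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setClassesToBeBound(Address.class);
        return new StaxEventItemWriterBuilder<Address>()
                .name("addressWriter")
                .resource(new FileSystemResource("jobAddress.xml"))
                .rootTagName("addresses")
                .marshaller(marshaller)
                .build();
    }
}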

Retry feature is not working in Spring Batch

Posted by 空扰寡人 on 2019-12-11 17:13:02
Question: I have a batch job where I am using Spring Batch 3.0.x. My use case is to retry the job in case of any intermediate failures. I am using chunk-based processing and a StepBuilderFactory for the job, but I could not see any difference after adding retry to it. return stepBuilderFactory.get("ValidationStepName") .<Long, Info> chunk(10) .reader(.....) .processor(.....) // .faultTolerant() // .retryLimit(5) // .retryLimit(5).retry(Exception.class) .writer(......) .faultTolerant()
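A sketch of the retry configuration, assuming the failure surfaces as an exception thrown from the processor or writer: retry(...) and retryLimit(...) only take effect after faultTolerant() has switched the builder into its fault-tolerant variant, and the retried exception type must match what is actually thrown (the broad Exception.class here is only for illustration).

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;

public class RetryStepConfig {

    public Step validationStep(StepBuilderFactory stepBuilderFactory,
                               ItemReader<Long> reader,
                               ItemProcessor<Long, Info> processor,
                               ItemWriter<Info> writer) {
        return stepBuilderFactory.get("ValidationStepName")
                .<Long, Info>chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .faultTolerant()               // switch to the fault-tolerant builder first
                .retry(Exception.class)        // which exceptions are retryable
                .retryLimit(5)                 // each failing chunk is re-attempted up to 5 times
                .build();
    }
}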

How to put data into a job-scoped Map inside the writer?

Posted by 允我心安 on 2019-12-11 16:55:45
Question: I start the job the following way: jobExecution = jobLauncher.run(job, jobParameters); jobExecution... /// I want to get a Map with the results here. I also have the following writer: @Component public class MyWriter implements ItemWriter<MyBean> { @Override public void write(@NonNull List<? extends MyBean> items) throws Exception { MyResult result = someComponent.doSmth(items); } } I want to put the result into a Map to collect all results within a single job execution. How can I achieve this? Answer 1: You can put the
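One way to do this, sketched below under the assumption that the results fit in memory, is to write them into the job's ExecutionContext from the writer and read them back from the JobExecution returned by the launcher. MyBean, MyResult and the someComponent call come from the question; SomeComponent and the key naming are illustrative.

import java.util.List;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemWriter;
import org.springframework.stereotype.Component;

@Component
public class MyWriter implements ItemWriter<MyBean> {

    private final SomeComponent someComponent;   // the component from the question that produces MyResult
    private StepExecution stepExecution;

    public MyWriter(SomeComponent someComponent) {
        this.someComponent = someComponent;
    }

    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    public void write(List<? extends MyBean> items) throws Exception {
        MyResult result = someComponent.doSmth(items);
        // Store the result under a chunk-specific key in the job-level context
        ExecutionContext jobContext = stepExecution.getJobExecution().getExecutionContext();
        jobContext.put("result-" + stepExecution.getWriteCount(), result);
    }
}

// After the job finishes, the accumulated results are available on the JobExecution:
// JobExecution jobExecution = jobLauncher.run(job, jobParameters);
// ExecutionContext results = jobExecution.getExecutionContext();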

Spring Batch: Parsing a CSV file with quoteCharacter

Posted by 帅比萌擦擦* on 2019-12-11 16:48:53
Question: I'm new to Spring Batch. We know that CSV files come in all forms and shapes… and some of them are syntactically incorrect. I'm trying to parse a CSV file whose lines start with '"' and end with '"'. This is my CSV: "1;Paris;13/4/1992;16/7/2006" "2;Lyon;31/5/1993;1/8/2009" "3;Metz;21/4/1990;27/4/2010" I tried this: <bean id="itemReader" class="org.springframework.batch.item.file.FlatFileItemReader"> <property name="resource" value="data-1.txt" /> <property name="lineMapper"> <bean class="org
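Since the quote here wraps the entire line rather than individual fields, one workable sketch is a small LineTokenizer that strips the outer quotes and then delegates to a DelimitedLineTokenizer configured with ';'. The field names below are illustrative; the tokenizer would be plugged into the reader's DefaultLineMapper in place of the default one.

import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.batch.item.file.transform.LineTokenizer;

// Strips the quote wrapping the whole line, then tokenizes on ';'
public class QuotedLineTokenizer implements LineTokenizer {

    private final DelimitedLineTokenizer delegate = new DelimitedLineTokenizer(";");

    public QuotedLineTokenizer() {
        delegate.setNames(new String[] {"id", "city", "startDate", "endDate"});  // illustrative field names
    }

    @Override
    public FieldSet tokenize(String line) {
        String unquoted = line;
        if (unquoted != null && unquoted.startsWith("\"") && unquoted.endsWith("\"")) {
            unquoted = unquoted.substring(1, unquoted.length() - 1);
        }
        return delegate.tokenize(unquoted);
    }
}

The reader's DefaultLineMapper would then use this tokenizer together with whatever FieldSetMapper the job already has, while the rest of the FlatFileItemReader configuration stays unchanged.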

Can't remove file from batch listener on Windows? (The process cannot access the file because it is being used by another process)

Posted by 一曲冷凌霜 on 2019-12-11 16:32:02
Question: I created a simple job which reads all files from a folder (D:\\chunk), does empty processing and empty writing, and I registered a listener to remove each file after processing. On a Windows machine (on Linux and macOS it does not happen) I experience the following error: 2019-09-09 12:08:13.752 WARN 4028 --- [ main] c.b.m.b.RemovingListener : Failed to remove chunk 0b9a2623-b4c3-42b2-9acf-373a2d81007c.csv java.nio.file.FileSystemException: D:\chunk\1.csv: The process cannot access the file because it is being
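On Windows the reader typically still holds the file handle when step-level callbacks fire (afterStep runs before the step closes its ItemStreams), which is enough to trigger this locking error. One sketch of a workaround, assuming the chunk directory path from the question, is to delete the files in a JobExecutionListener after the job has finished and all streams have been closed.

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;

public class RemoveProcessedFilesListener extends JobExecutionListenerSupport {

    private static final Path CHUNK_DIR = Paths.get("D:\\chunk");

    @Override
    public void afterJob(JobExecution jobExecution) {
        // By the time afterJob runs, every step has closed its ItemStreams,
        // so Windows no longer holds a lock on the input files.
        try (DirectoryStream<Path> files = Files.newDirectoryStream(CHUNK_DIR, "*.csv")) {
            for (Path file : files) {
                Files.deleteIfExists(file);
            }
        } catch (IOException e) {
            throw new IllegalStateException("Failed to remove processed files", e);
        }
    }
}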

Want to get the total row count in the footer in Spring Batch without customizing the writer (Delegate Pattern)

Posted by 时光怂恿深爱的人放手 on 2019-12-11 16:12:34
Question: This is my footer class: public class SummaryFooterCallback extends StepExecutionListenerSupport implements FlatFileFooterCallback { private StepExecution stepExecution; @Override public void writeFooter(Writer writer) throws IOException { writer.write("footer - number of items written: " + stepExecution.getWriteCount()); } @Override public void beforeStep(StepExecution stepExecution) { this.stepExecution = stepExecution; } } This is my XML: <bean id="writer" class="org.springframework
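For the footer to see the live write count, the same bean usually has to be wired in two places: as the writer's footerCallback and as a listener on the step, otherwise beforeStep never fires and stepExecution stays null. The question uses XML, but a sketch of the equivalent wiring in Java config, with hypothetical reader, item type and file name, looks roughly like this:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.core.io.FileSystemResource;

public class FooterStepConfig {

    public Step exportStep(StepBuilderFactory steps, ItemReader<String> reader) {
        SummaryFooterCallback footerCallback = new SummaryFooterCallback();

        FlatFileItemWriter<String> writer = new FlatFileItemWriterBuilder<String>()
                .name("writer")
                .resource(new FileSystemResource("output.txt"))
                .lineAggregator(item -> item)
                .footerCallback(footerCallback)     // the writer calls writeFooter() when it is closed
                .build();

        return steps.get("exportStep")
                .<String, String>chunk(10)
                .reader(reader)
                .writer(writer)
                .listener(footerCallback)           // same bean as StepExecutionListener, so beforeStep runs
                .build();
    }
}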

How to use Spring Batch to read a CSV, process it, and write it as a CSV when one row can produce more than one row?

Posted by 有些话、适合烂在心里 on 2019-12-11 15:43:02
Question: I'm using Spring Batch to read a CSV file, process it and write it back after some processing. It is pretty simple to do when there is a one-to-one relation between the source and the target, but according to my business logic, in some cases a row in the input can produce more than one row in the output file. This is what the processor looks like, but I couldn't find any information on how to write a Writer for it. public class CsvRowsProcessor implements ItemProcessor<RowInput, List
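When the processor returns a List, the step's output type becomes that list, so one common sketch is a thin ItemWriter<List<RowOutput>> that flattens the lists and delegates to a regular FlatFileItemWriter. RowInput/RowOutput come from the question; the delegate wiring is illustrative.

import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemWriter;

// Flattens the lists produced by the processor and hands the rows to a normal line-based writer
public class FlatteningCsvWriter implements ItemWriter<List<RowOutput>> {

    private final FlatFileItemWriter<RowOutput> delegate;

    public FlatteningCsvWriter(FlatFileItemWriter<RowOutput> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<? extends List<RowOutput>> items) throws Exception {
        List<RowOutput> flattened = new ArrayList<>();
        for (List<RowOutput> rows : items) {
            flattened.addAll(rows);
        }
        delegate.write(flattened);
    }
}

Because the delegate is hidden behind the wrapper, it usually also has to be registered as a stream on the step (for example via .stream(delegate) in the step builder) so that it gets opened and closed properly.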

In Spring Batch, how to insert a piece of code just after reading a list of items in bulk, with the given list of items as a parameter?

Posted by 人走茶凉 on 2019-12-11 15:41:57
Question: I'm using Spring Batch in chunk mode for processing items. I read them in bulk (6000 items per chunk), process them one by one, and write them all. I read them via a JdbcCursorItemReader, which is very convenient for bulk processing and reading. The problem is that once they are read, I need to retrieve additional data from another source. The simplest way is to do it in the processor, calling a custom method like getAdditionalDataById(String id). The wrong thing about this is that it consumes a lot of time. So
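Spring Batch has no listener that hands you the whole chunk immediately after reading, but ItemWriteListener.beforeWrite does receive the full list just before writing, which is often enough to fetch the additional data in a single query per chunk instead of one call per item, provided the enriched data is only needed from that point on. A sketch with illustrative MyItem, AdditionalData and repository names:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import org.springframework.batch.core.ItemWriteListener;

// Enriches a whole chunk with one bulk lookup instead of thousands of per-item calls
public class BulkEnrichmentListener implements ItemWriteListener<MyItem> {

    private final AdditionalDataRepository additionalDataRepository;  // illustrative secondary data source

    public BulkEnrichmentListener(AdditionalDataRepository additionalDataRepository) {
        this.additionalDataRepository = additionalDataRepository;
    }

    @Override
    public void beforeWrite(List<? extends MyItem> items) {
        List<String> ids = items.stream().map(MyItem::getId).collect(Collectors.toList());
        // Single query for the whole chunk
        Map<String, AdditionalData> dataById = additionalDataRepository.findAllByIds(ids);
        items.forEach(item -> item.setAdditionalData(dataById.get(item.getId())));
    }

    @Override
    public void afterWrite(List<? extends MyItem> items) { }

    @Override
    public void onWriteError(Exception exception, List<? extends MyItem> items) { }
}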

Cannot autowire beans when separating configuration classes

Posted by 懵懂的女人 on 2019-12-11 15:25:04
Question: I have a JavaConfig-configured Spring Batch job. The main job configuration class is CrawlerJobConfiguration. Until now, I had all the configuration (infrastructure, autowired beans, etc.) in this class and it worked fine. Then I decided to separate the job configuration from the autowired beans and infrastructure bean configuration and created another two configuration classes, Beans and MysqlInfrastructureConfiguration. But now I am having problems running my job. I'm receiving a
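A frequent cause is simply that the new configuration classes are not picked up by the application context that builds the job. A minimal sketch, assuming the class names from the question, is to pull them in explicitly with @Import (or to make sure they live in a package covered by component scanning):

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

@Configuration
@EnableBatchProcessing
@Import({Beans.class, MysqlInfrastructureConfiguration.class})  // pull in the separated config classes
public class CrawlerJobConfiguration {
    // job and step bean definitions stay here and can autowire beans
    // declared in the imported configuration classes
}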