spring-batch

Regarding transaction support for a Spring Batch job at job level

Submitted by 浪尽此生 on 2019-12-11 14:43:38
Question: Let's assume that I need to execute a Spring Batch job with two steps. Step 1 reads data from a Postgres table and updates values in the same table; step 2 reads data from another Postgres table and updates that table. How can I achieve transactions at job level for this scenario? That is, if the second step fails, the first step should be rolled back. Answer 1: I'm not sure if there even exists a solution with automatic chained/multi-level transaction handling that works reliably (or
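Spring Batch scopes its transactions to chunks within a step, not to the whole job, so a common workaround is compensation: if a later step fails, explicitly undo the work of the steps that already completed. A minimal pure-Java sketch of that pattern follows; the `CompensableStep` interface and its methods are hypothetical illustrations, not Spring Batch API.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class CompensatingJob {

    /** A unit of work paired with an action that undoes it (hypothetical). */
    public interface CompensableStep {
        void execute() throws Exception;
        void compensate();
    }

    /**
     * Runs steps in order; if any step fails, runs the compensations of the
     * steps that already completed, in reverse order, then rethrows.
     */
    public static void run(CompensableStep... steps) throws Exception {
        Deque<CompensableStep> completed = new ArrayDeque<>();
        try {
            for (CompensableStep step : steps) {
                step.execute();
                completed.push(step); // most recently completed on top
            }
        } catch (Exception e) {
            while (!completed.isEmpty()) {
                completed.pop().compensate();
            }
            throw e;
        }
    }
}
```

In a real job the `compensate()` of step 1 would issue the reverse UPDATE statements; this only works when every step's effect is actually reversible.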

Spring Batch - One or more ItemWriter among a list

Submitted by 偶尔善良 on 2019-12-11 14:38:56
Question: What I want to do is use one or more ItemWriters for a single step, depending on the batch configuration. What I need is a behaviour between a classic CompositeItemWriter (which calls all writers) and a classic ClassifierCompositeItemWriter (which calls only one writer); this behaviour would let me call one or more writers, with an external condition for each of the writers specified as delegates. One of the solutions I thought about is using a ClassifierCompositeItemWriter which contains as
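The in-between behaviour can be modeled directly: pair each delegate writer with its own predicate and invoke every delegate whose predicate holds, so zero, one, or many writers fire per chunk. This sketch uses plain `Predicate`/`Consumer` rather than Spring Batch's `ItemWriter` interface to stay self-contained; the class and its names are illustrative, not part of any library.

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

public class ConditionalCompositeWriter<T> {

    /** A delegate writer guarded by an external condition. */
    public static class Delegate<T> {
        final Predicate<List<? extends T>> condition;
        final Consumer<List<? extends T>> writer;
        public Delegate(Predicate<List<? extends T>> condition,
                        Consumer<List<? extends T>> writer) {
            this.condition = condition;
            this.writer = writer;
        }
    }

    private final List<Delegate<T>> delegates;

    public ConditionalCompositeWriter(List<Delegate<T>> delegates) {
        this.delegates = delegates;
    }

    /** Hands the chunk to every delegate whose condition holds: zero, one, or many. */
    public void write(List<? extends T> items) {
        for (Delegate<T> d : delegates) {
            if (d.condition.test(items)) {
                d.writer.accept(items);
            }
        }
    }
}
```

Wrapped in a real `ItemWriter`, the conditions could read the external configuration once at step start instead of per chunk.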

Spring-Batch job does not end after last step

Submitted by 一个人想着一个人 on 2019-12-11 14:14:33
Question: My Spring-Batch job is set up like this:

@Bean
Job myJob(JobBuilderFactory jobBuilderFactory,
          @Qualifier("stepA") Step stepA,
          @Qualifier("stepB") Step stepB) {
    return jobBuilderFactory.get("myJob")
        .incrementer(new RunIdIncrementer())
        .start(stepA)
        .next(stepB)
        .build();
}

And here is my launcher:

@Autowired
JobLauncher(@Qualifier("myJob") Job job, JobLauncher jobLauncher) {
    this.job = job;
    this.jobLauncher = jobLauncher;
}

@Scheduled(fixedDelay = 5000)
void launcher() throws

Spring step does not run properly when I “fib” the reader, must I use a tasklet?

Submitted by 自闭症网瘾萝莉.ら on 2019-12-11 14:09:54
Question: I'm aware that all Spring Batch chunk steps need to have a reader and a writer, and optionally a processor. So even though my step only needs a writer, I am also fibbing a reader that does nothing but make Spring happy. This is based on the solution found here. Is it outdated, or am I missing something? I have a Spring Batch job that has two chunked steps. My first step, deleteCount, is just deleting all rows from the table so that the second step has a clean slate. This means my first step doesn't need a
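The usual "fib" works because a chunk step stops reading as soon as the reader returns null: the dummy reader yields a single token on the first call and null afterwards, so the step runs exactly one pass and hands control to the writer. A pure-Java model of that one-shot reader (the Spring Batch `ItemReader` contract is mirrored by a `Supplier` here; for a delete-only step, a Tasklet is usually the simpler alternative):

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Supplier;

public class OneShotReader {

    /**
     * Returns a "reader" that yields a single token on the first call and
     * null on every later call. In Spring Batch, an ItemReader returning
     * null means "input exhausted", which ends the chunk step.
     */
    public static Supplier<String> oneShot(String token) {
        AtomicBoolean consumed = new AtomicBoolean(false);
        return () -> consumed.compareAndSet(false, true) ? token : null;
    }
}
```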

Reset state before each Spring scheduled (@Scheduled) run

Submitted by 烂漫一生 on 2019-12-11 13:25:31
Question: I have a Spring Boot Batch application that needs to run daily. It reads a daily file, does some processing on its data, and writes the processed data to a database. Along the way, the application holds some state, such as the file to be read (stored in the FlatFileItemReader and JobParameters), the current date and time of the run, some file data for comparison between read items, etc. One option for scheduling is to use Spring's @Scheduled, such as: @Scheduled(cron = "${schedule}") public
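Singleton beans keep their fields between @Scheduled invocations, so one approach is to rebuild all run-scoped state from a factory at the start of every run, which is essentially what @StepScope beans plus fresh JobParameters give you in Spring Batch. A pure-Java sketch of the idea; the class, field names, and file name are hypothetical.

```java
import java.time.LocalDateTime;
import java.util.function.Supplier;

public class ScheduledRunner {

    /** Per-run state: rebuilt from scratch on every run instead of being reused. */
    public static class RunState {
        public final LocalDateTime startedAt;
        public final String inputFile;
        public RunState(LocalDateTime startedAt, String inputFile) {
            this.startedAt = startedAt;
            this.inputFile = inputFile;
        }
    }

    private final Supplier<RunState> stateFactory;

    public ScheduledRunner(Supplier<RunState> stateFactory) {
        this.stateFactory = stateFactory;
    }

    /** Each scheduled invocation gets its own fresh state object. */
    public RunState runOnce() {
        RunState state = stateFactory.get(); // never cached between runs
        // ... read state.inputFile, process, write ...
        return state;
    }
}
```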

SimpleAsyncTaskExecutor trying to read record after completion of processing

Submitted by 空扰寡人 on 2019-12-11 13:24:27
Question: In my Spring Batch project, I need to read a list of rows from a table, create chunks of 4, process them, and then write to another table. I have configured a SimpleAsyncTaskExecutor to allow parallel processing of chunks, but I find that after all records in the result set are processed, Spring Batch keeps trying to read the next lot of results and failing. After it exceeds the skip limit, it obviously aborts the job. My question is: why does the batch continue to look for the next
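A step only stops reading when the reader returns null; if the reader throws instead of returning null once its result set is exhausted, the step keeps calling read(), accumulating skips until the skip limit aborts the job. With a multi-threaded step the reader must also be thread-safe. A minimal pure-Java sketch of that end-of-data contract, backed by a list:

```java
import java.util.Iterator;
import java.util.List;

public class ListBackedReader<T> {

    private final Iterator<T> iterator;

    public ListBackedReader(List<T> rows) {
        this.iterator = rows.iterator();
    }

    /**
     * Mirrors the ItemReader contract: returns the next item, or null once
     * the data is exhausted. Returning null, not throwing, is what tells
     * the step to stop. synchronized because SimpleAsyncTaskExecutor means
     * several threads call read() concurrently.
     */
    public synchronized T read() {
        return iterator.hasNext() ? iterator.next() : null;
    }
}
```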

How to send messages asynchronously, queuing them up without waiting for a reply to each message, in Spring AMQP with RabbitMQ in Java?

Submitted by 十年热恋 on 2019-12-11 13:17:40
Question: I am trying to use RabbitMQ via Spring AMQP; below is my configuration.

<rabbit:connection-factory id="rabbitConnectionFactory" port="${rabbitmq.port}" host="${rabbitmq.host}" />
<rabbit:admin connection-factory="rabbitConnectionFactory" />
<rabbit:queue name="${rabbitmq.import.queue}" />
<rabbit:template id="importAmqpTemplate" connection-factory="rabbitConnectionFactory" queue="${rabbitmq.import.queue}" routing-key="${rabbitmq.import.queue}" />
<rabbit:listener-container connection-factory

Processing/reading .BAI2 files in Java

Submitted by 给你一囗甜甜゛ on 2019-12-11 13:07:56
Question: I am working on reading .BAI2 files and processing transaction records using Java. I have been exploring various options, like reading and parsing the .BAI2 file using plain Java file IO, using Spring Batch, etc. But I am finding the .BAI2 file structure fairly complex and have not been able to get it to work correctly. I just wanted to hear opinions/thoughts on whether there are any standard tools or ways to read .BAI2 files using Java, and whether it can be achieved using Spring Batch. Thanks in advance. .BAI2
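BAI2 is line-oriented: each record begins with a two-digit type code (01 file header, 02 group header, 03 account identifier, 16 transaction detail, 88 continuation, 49 account trailer, 98 group trailer, 99 file trailer) followed by comma-separated fields. A minimal dispatch sketch by record code; a real parser must additionally fold 88 continuation records into their predecessor and handle the trailing "/" terminators, which this sketch does not attempt.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class Bai2RecordTypes {

    // Standard BAI2 record type codes (first comma-separated field of each line).
    private static final Map<String, String> TYPES = new LinkedHashMap<>();
    static {
        TYPES.put("01", "file header");
        TYPES.put("02", "group header");
        TYPES.put("03", "account identifier");
        TYPES.put("16", "transaction detail");
        TYPES.put("88", "continuation");
        TYPES.put("49", "account trailer");
        TYPES.put("98", "group trailer");
        TYPES.put("99", "file trailer");
    }

    /** Classifies one BAI2 line by its leading record code. */
    public static String classify(String line) {
        int comma = line.indexOf(',');
        String code = comma >= 0 ? line.substring(0, comma) : line;
        return TYPES.getOrDefault(code.trim(), "unknown");
    }
}
```

With this dispatch in place, a Spring Batch FlatFileItemReader could route each line to a per-record-type field mapper.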

How to aggregate over the full data set in Spring Batch jobs?

Submitted by 扶醉桌前 on 2019-12-11 13:05:24
Question: I need to insert aggregation into my Spring Batch jobs, but the aggregation step needs to have the entire data set available. In pure SQL, it's easy to write aggregation queries: the full data set (as stored in the database) is available. But in Spring Batch jobs, everything is done in memory and spread across chunks. So how do I deal with that kind of data spreading? Do you have any advice on best practices for inserting aggregation steps/processes? Thanks a lot for your insights. Answer 1: You
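Two common approaches: load the data first and let the database do the GROUP BY in a separate tasklet step afterwards, or carry running aggregate state across chunk boundaries inside the step. A pure-Java sketch of the second idea; the class is hypothetical and stands in for state a writer might keep (or persist to the step's ExecutionContext for restartability).

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ChunkAggregator {

    // Running per-key totals; survives across chunk boundaries, playing the
    // role of state kept between chunk writes.
    private final Map<String, Double> totals = new HashMap<>();

    /** Called once per chunk, like an ItemWriter: folds the chunk in. */
    public void writeChunk(List<Map.Entry<String, Double>> chunk) {
        for (Map.Entry<String, Double> row : chunk) {
            totals.merge(row.getKey(), row.getValue(), Double::sum);
        }
    }

    /** After the last chunk, the full-data-set aggregate is available. */
    public Map<String, Double> result() {
        return totals;
    }
}
```

The database-side GROUP BY is usually preferable when the data already lands in a table, since it avoids holding the aggregate in job memory.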

Spring Batch: processing multiple records at once

Submitted by 两盒软妹~` on 2019-12-11 12:49:40
Question: I am using Spring Batch and, as usual, I have a reader, a processor, and a writer. I have two questions. 1. The reader queries all 200 records (the total record count in the table is 200 and I have set pageSize=200), and thus it gets me all 200 records; in the processor we want the list of all these records, because we have to compare each record with the other 199 records to group them into different tiers. Thus I am thinking that if we can get that list in the processing step, I can manipulate them. How should I approach this?
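When the processor genuinely needs all rows at once, one approach is to make the reader emit the whole page as a single item (a List), so the processor receives all 200 records together; the grouping itself is then plain Java. A hypothetical tiering sketch, with records reduced to integer scores for illustration: sort descending and split into equal-sized tiers.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class TierProcessor {

    /**
     * Processes the whole record set at once: sorts scores descending and
     * splits them into the requested number of tiers. Stands in for an
     * ItemProcessor whose input item is the full List of records.
     */
    public static List<List<Integer>> toTiers(List<Integer> scores, int tierCount) {
        List<Integer> sorted = new ArrayList<>(scores);
        sorted.sort(Comparator.reverseOrder());
        List<List<Integer>> tiers = new ArrayList<>();
        int size = (int) Math.ceil((double) sorted.size() / tierCount);
        for (int i = 0; i < sorted.size(); i += size) {
            tiers.add(sorted.subList(i, Math.min(i + size, sorted.size())));
        }
        return tiers;
    }
}
```

Note that reading everything as one item trades away chunked restartability, which is acceptable for 200 rows but not for large tables.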