spring-batch

@TransactionalEventListener in Spring Batch

£可爱£侵袭症+ Submitted on 2020-04-18 06:31:10
Question: I want to process something after the commit in Spring Batch. I tried this example (https://dzone.com/articles/transaction-synchronization-and-spring-application), and it works perfectly on Spring Boot with this flow: 1. some update query and an event published by ApplicationEventPublisher; 2. some update query and an event published by ApplicationEventPublisher; 3. some update query and an event published by ApplicationEventPublisher; 4. commit; 5. after-commit logic I made; 6. after-commit logic I made.
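For context, here is a minimal sketch of the after-commit pattern the linked article describes; the class names (OrderSavedEvent, OrderService) are hypothetical and not from the question. In a chunk-oriented Spring Batch step the active transaction is the chunk transaction, so a listener like this would presumably fire once per committed chunk rather than once at the end of the job.

```java
// Minimal sketch, not the poster's code: events published inside a transaction are
// delivered to the AFTER_COMMIT listener only once that transaction commits.
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.event.TransactionPhase;
import org.springframework.transaction.event.TransactionalEventListener;

class OrderSavedEvent {
    final long orderId;
    OrderSavedEvent(long orderId) { this.orderId = orderId; }
}

@Service
class OrderService {
    private final ApplicationEventPublisher publisher;
    OrderService(ApplicationEventPublisher publisher) { this.publisher = publisher; }

    @Transactional
    public void save(long orderId) {
        // ... update queries ...
        publisher.publishEvent(new OrderSavedEvent(orderId)); // held until commit
    }
}

@Component
class AfterCommitHandler {
    @TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT)
    public void onOrderSaved(OrderSavedEvent event) {
        // runs only after the publishing transaction has committed
    }
}
```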

Using Spring Batch JdbcCursorItemReader with a List as named parameters

最后都变了- Submitted on 2020-04-18 06:26:19
Question: I am using the bean definition below to configure a reader that reads some data from a database table in a Spring Batch project. It uses a named parameter in the SQL, and I am passing a java.util.List as that parameter. However, I am getting an "Invalid column type" error when it tries to run the SQL. If I just hard-code one single value (namedParameters.put("keys", "138219");) instead of passing a list, it works. @Bean public JdbcCursorItemReader<MyDTO> myReader() { JdbcCursorItemReader<MyDTO> itemReader = new …
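One likely cause, offered as an assumption: JdbcCursorItemReader ultimately binds parameters through a plain PreparedStatement, and a whole List bound to a single placeholder typically produces exactly this kind of "Invalid column type" error. A minimal sketch of a common workaround is below, expanding the IN clause to one positional placeholder per list element; MyDTO is the poster's DTO, while the table, column names and dataSource are assumptions.

```java
// Minimal sketch of one workaround, not the poster's code.
import java.util.Collections;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.jdbc.core.ArgumentPreparedStatementSetter;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

public JdbcCursorItemReader<MyDTO> myReader(DataSource dataSource, List<String> keys) {
    // Build "?, ?, ?" with one placeholder per key
    String placeholders = String.join(", ", Collections.nCopies(keys.size(), "?"));

    JdbcCursorItemReader<MyDTO> itemReader = new JdbcCursorItemReader<>();
    itemReader.setDataSource(dataSource);
    itemReader.setSql("SELECT id, name FROM my_table WHERE id IN (" + placeholders + ")");
    itemReader.setPreparedStatementSetter(new ArgumentPreparedStatementSetter(keys.toArray()));
    itemReader.setRowMapper(new BeanPropertyRowMapper<>(MyDTO.class));
    return itemReader;
}
```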

Why does my Spring Batch application always output the same result?

一曲冷凌霜 Submitted on 2020-04-18 06:14:31
Question: Below is the code snippet that I am trying to execute in Spring Batch: @Bean public Job ioSampleJob() throws Exception { return this.jobBuilderFactory.get("DKSHSpringBatchApplication") .incrementer(new RunIdIncrementer()) .start(step1()).next(step2()) .next(step3()).next(step4()).build(); } Can anyone please help me with how to execute my batch job? It was working fine initially, but after several runs it just gives the old response. Source: https://stackoverflow.com/questions/61042136/why-does
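A likely explanation, stated as an assumption rather than a confirmed diagnosis: when a job is launched with the same identifying JobParameters, Spring Batch reuses the existing JobInstance, and a completed instance is not executed again, so the old result keeps coming back. RunIdIncrementer only takes effect when the launcher asks for the "next" parameters (for example via JobOperator or Spring Boot's job runner). A minimal sketch that forces a fresh JobInstance by adding a unique parameter:

```java
// Minimal sketch, not the poster's code: give each launch unique JobParameters.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public void runJob(JobLauncher jobLauncher, Job ioSampleJob) throws Exception {
    JobParameters params = new JobParametersBuilder()
            .addLong("run.timestamp", System.currentTimeMillis()) // unique each launch
            .toJobParameters();
    jobLauncher.run(ioSampleJob, params);
}
```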

Spring Batch: Item writer for a parent-child relationship

泪湿孤枕 Submitted on 2020-04-17 22:53:20
Question: I have written an item processor which returns a list of objects. Each object needs to be split across 2 database tables (one parent and one child): one header row, and for that header ID the associated child rows in the child table. I used the ListUnpackingItemWriter example to solve the list problem, and a CompositeItemWriter to split the result into 2 writers; now I need to split each one between the header and child tables, but each writer receives the same number of rows. Is there a better way…
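One alternative, sketched below under the assumption of Spring Batch 4's List-based ItemWriter signature (Spring Batch 5 uses Chunk instead): rather than fanning the same items out to two CompositeItemWriter delegates, a single custom writer can insert each header row and then its own child rows, so the parent/child split happens per item inside the chunk transaction. Header, Child and the SQL are hypothetical names, not from the question.

```java
// Minimal sketch: one writer that persists a parent row followed by its children.
import java.util.List;
import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.JdbcTemplate;

class HeaderWithChildrenWriter implements ItemWriter<Header> {
    private final JdbcTemplate jdbc;
    HeaderWithChildrenWriter(JdbcTemplate jdbc) { this.jdbc = jdbc; }

    @Override
    public void write(List<? extends Header> items) {
        for (Header header : items) {
            // parent first, so the child rows can reference it
            jdbc.update("INSERT INTO header (id, name) VALUES (?, ?)",
                    header.getId(), header.getName());
            for (Child child : header.getChildren()) {
                jdbc.update("INSERT INTO child (header_id, value) VALUES (?, ?)",
                        header.getId(), child.getValue());
            }
        }
    }
}
```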

Use Spring Batch to write to different data sources

若如初见. Submitted on 2020-04-11 12:20:44
Question: For a project I need to process items from one table and generate 3 different items for 3 different tables, all 3 in a second data source different from the one the first item comes from. The implementation is done with Spring Batch over an Oracle DB. I think this question is similar to what I need, but there only one different item is written at the end. To illustrate the situation: DataSource 1 contains Table A, while DataSource 2 contains Table B, Table C and Table D. The…
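A minimal sketch of one possible shape for this, not a confirmed solution: the processor returns a container object holding the three generated pieces, and a single writer pushes those pieces into the three tables through the second DataSource. ProcessedItem, the table names and dataSource2 are assumptions; the List-based write signature is Spring Batch 4's (Spring Batch 5 uses Chunk).

```java
// Minimal sketch: fan one processed item out to three tables in the second DataSource.
import java.util.List;
import javax.sql.DataSource;
import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.JdbcTemplate;

class SecondDataSourceWriter implements ItemWriter<ProcessedItem> {
    private final JdbcTemplate jdbc;
    SecondDataSourceWriter(DataSource dataSource2) { this.jdbc = new JdbcTemplate(dataSource2); }

    @Override
    public void write(List<? extends ProcessedItem> items) {
        for (ProcessedItem item : items) {
            jdbc.update("INSERT INTO table_b (id, value) VALUES (?, ?)", item.getId(), item.getB());
            jdbc.update("INSERT INTO table_c (id, value) VALUES (?, ?)", item.getId(), item.getC());
            jdbc.update("INSERT INTO table_d (id, value) VALUES (?, ?)", item.getId(), item.getD());
        }
    }
}
```

For atomicity across the three inserts, the step's transaction manager would presumably need to be the one bound to DataSource 2 (or a JTA/XA manager if both data sources must commit together); that wiring is not shown here.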

NoSuchJobException when running a job programmatically in Spring Batch

别说谁变了你拦得住时间么 Submitted on 2020-04-08 11:35:19
Question: I have a job running on startup. I want to run this job programmatically at a particular point of my application, not when I start my app. When running it on startup I have no problem, but I get a NoSuchJobException (No job configuration with the name [importCityFileJob] was registered) when I try to run it programmatically. After looking on the web, I think it is a problem related to the JobRegistry, but I don't know how to solve it. Note: my whole batch configuration is set up programmatically, I…
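A minimal sketch, assuming Java-based batch configuration: registering jobs in the JobRegistry so they can be looked up by name at runtime. Without such a registration step, a Job that only exists as a bean may not be in the registry, which is one common cause of "No job configuration with the name [...] was registered". The launch method and the "run.timestamp" parameter are illustrative assumptions; "importCityFileJob" is the name from the question.

```java
// Minimal sketch: make bean-defined jobs visible to the JobRegistry, then launch by name.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.configuration.support.JobRegistryBeanPostProcessor;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.annotation.Bean;

@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
    JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
    postProcessor.setJobRegistry(jobRegistry);
    return postProcessor;
}

// Later, wherever the job should actually run (jobRegistry and jobLauncher injected):
public void launchImport(JobRegistry jobRegistry, JobLauncher jobLauncher) throws Exception {
    Job job = jobRegistry.getJob("importCityFileJob");
    jobLauncher.run(job, new JobParametersBuilder()
            .addLong("run.timestamp", System.currentTimeMillis())
            .toJobParameters());
}
```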

Spring Batch metadata tables in a different schema

北慕城南 Submitted on 2020-04-08 10:21:34
Question: I have a datasource that connects to an Oracle database in my application. Is it possible to access another schema that contains the Spring Batch metadata tables through this datasource? The user of this datasource has all the rights needed to access the other schema. I have already tried the "tablePrefix" attribute of the JobRepository, such as "Schema.batch_", but it does not work. Briefly, I am looking for a way to tell Spring Batch to access the metadata tables like "select ....from Schema…
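A minimal sketch of one way to point the JobRepository at metadata tables in another schema; the schema name is an assumption. Note that the prefix replaces the entire default "BATCH_" prefix, so a schema-qualified value has to keep the trailing "BATCH_". In Spring Boot, the spring.batch.jdbc.table-prefix property (spring.batch.table-prefix in older versions) plays the same role.

```java
// Minimal sketch: JobRepository configured with a schema-qualified table prefix.
import javax.sql.DataSource;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;

public JobRepository jobRepository(DataSource dataSource,
                                   PlatformTransactionManager transactionManager) throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setTablePrefix("BATCH_SCHEMA.BATCH_"); // assumed schema name, qualified prefix
    factory.afterPropertiesSet();
    return factory.getObject();
}
```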

Multiple files of different data structure formats as input in Spring Batch

孤者浪人 Submitted on 2020-03-27 06:27:53
Question: Based on my research, I know that Spring Batch provides APIs for handling many different kinds of data file formats. But I need clarification on how we can supply multiple files of different formats in one chunk/tasklet. I know that MultiResourceItemReader can process multiple files, but AFAIK all the files have to be of the same format and data structure. So the question is: how can we supply multiple files of different data formats as input to a tasklet? Answer 1: Asoub is right…
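For context, a minimal sketch of the usual approach rather than the accepted answer's exact code: give each file format its own FlatFileItemReader and its own step, and chain the steps in one job, instead of mixing formats inside a single reader. Customer, Order, the file names and the column names are all assumptions.

```java
// Minimal sketch: two readers for two differently structured flat files,
// each intended to back its own step within the same job.
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.core.io.ClassPathResource;

public FlatFileItemReader<Customer> csvCustomerReader() {
    // Comma-separated file with customer records
    return new FlatFileItemReaderBuilder<Customer>()
            .name("csvCustomerReader")
            .resource(new ClassPathResource("customers.csv"))
            .delimited()
            .names(new String[] {"id", "name", "email"})
            .targetType(Customer.class)
            .build();
}

public FlatFileItemReader<Order> pipeDelimitedOrderReader() {
    // Pipe-separated file with a completely different record structure
    return new FlatFileItemReaderBuilder<Order>()
            .name("pipeDelimitedOrderReader")
            .resource(new ClassPathResource("orders.txt"))
            .delimited()
            .delimiter("|")
            .names(new String[] {"orderId", "customerId", "amount", "currency"})
            .targetType(Order.class)
            .build();
}
```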