spring-batch

Spring Batch: Assemble a job rather than configuring it (extensible job configuration)

瘦欲 submitted on 2021-02-18 18:31:09
Question: Background: I am working on designing a file-reading layer that can read delimited files and load them into a List. I have decided to use Spring Batch because it provides a lot of scalability options which I can leverage for different sets of files depending on their size. The requirement: I want to design a generic Job API that can be used to read any delimited file. There should be a single Job structure that is used for parsing every delimited file. For example, if the system needs to
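A minimal sketch of one way such a generic job could be assembled (the class, method, and parameter names here are assumptions, not the poster's design): the delimiter, column names, and target type are passed in, so the same job structure parses any delimited file.

    @Configuration
    @EnableBatchProcessing
    public class GenericDelimitedJobConfig {

        @Autowired private JobBuilderFactory jobBuilderFactory;
        @Autowired private StepBuilderFactory stepBuilderFactory;

        // Builds one Job per file description; only the reader configuration varies.
        public <T> Job delimitedFileJob(String jobName, Resource file, String delimiter,
                                        String[] columns, Class<T> targetType, ItemWriter<T> writer) {
            FlatFileItemReader<T> reader = new FlatFileItemReaderBuilder<T>()
                    .name(jobName + "Reader")
                    .resource(file)
                    .delimited()
                    .delimiter(delimiter)
                    .names(columns)
                    .targetType(targetType)   // maps fields onto the target bean by column name
                    .build();

            Step step = stepBuilderFactory.get(jobName + "Step")
                    .<T, T>chunk(1000)
                    .reader(reader)
                    .writer(writer)
                    .build();

            return jobBuilderFactory.get(jobName)
                    .incrementer(new RunIdIncrementer())
                    .start(step)
                    .build();
        }
    }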

Spring Batch: multiple processes for a heavy load, with multiple threads under every process

戏子无情 submitted on 2021-02-17 06:43:43
Question: I have a scenario where I need roughly 50-60 different processes running concurrently, each executing a task. Every process must fetch data from the DB using a SQL query, passing in a value and fetching the data to be run against in the subsequent task.
    select col_1, col_2, col_3 from table_1 where col_1 = :Process_1;

    @Bean
    public Job partitioningJob() throws Exception {
        return jobBuilderFactory.get("parallelJob")
                .incrementer(new RunIdIncrementer())
                .flow(masterStep())
                .end()
                .build();
    }
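One way to express this with Spring Batch partitioning, as a hedged sketch (the partition names, the "processValue" key, and the SourceRow type are hypothetical, and stepBuilderFactory/workerStep are assumed to exist as above): a Partitioner creates one ExecutionContext per process value, and the step-scoped worker reader binds that value into the query.

    @Bean
    public Partitioner processPartitioner() {
        return gridSize -> {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            for (int i = 1; i <= 60; i++) {
                ExecutionContext ctx = new ExecutionContext();
                ctx.putString("processValue", "Process_" + i);   // value the worker reader will use
                partitions.put("partition" + i, ctx);
            }
            return partitions;
        };
    }

    @Bean
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep")
                .partitioner("workerStep", processPartitioner())
                .step(workerStep())
                .taskExecutor(new SimpleAsyncTaskExecutor())   // consider a bounded pool for 60 partitions
                .gridSize(60)
                .build();
    }

    @Bean
    @StepScope
    public JdbcCursorItemReader<SourceRow> workerReader(
            @Value("#{stepExecutionContext['processValue']}") String processValue,
            DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<SourceRow>()
                .name("workerReader")
                .dataSource(dataSource)
                .sql("select col_1, col_2, col_3 from table_1 where col_1 = ?")
                .preparedStatementSetter(ps -> ps.setString(1, processValue))
                .rowMapper((rs, i) -> new SourceRow(rs.getString(1), rs.getString(2), rs.getString(3)))
                .build();
    }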

Java/Spring: processing Spring Batch job requests asynchronously

为君一笑 submitted on 2021-02-17 06:33:24
Question: Currently, I expose a REST endpoint in my application that kicks off Spring Batch jobs. However, the requests are not scheduled asynchronously. The response is provided after the job completes, with the batch status in the MyResponse object.
    @RestController
    @RequestMapping("/test")
    public class TestController {
        private MyProcessor processor;
        private RequestDataRepo repo;

        public TestController(final MyProcessor processor, final RequestDataRepo repo) {
            this.processor = processor;
            this.repo = repo;
        }
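A minimal sketch of one common approach (not the poster's code; the endpoint and bean names are assumptions): back the JobLauncher with a TaskExecutor so run() returns as soon as the job starts, and have the controller reply with 202 Accepted and the execution id instead of waiting for the batch status.

    @Configuration
    public class AsyncLauncherConfig {

        // A JobLauncher backed by a TaskExecutor: run() returns once the job has started,
        // instead of blocking until it completes.
        @Bean
        public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
            ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
            executor.setCorePoolSize(4);
            executor.initialize();

            SimpleJobLauncher launcher = new SimpleJobLauncher();
            launcher.setJobRepository(jobRepository);
            launcher.setTaskExecutor(executor);
            launcher.afterPropertiesSet();
            return launcher;
        }
    }

    // Inside the controller (asyncJobLauncher and job assumed injected), return immediately:
    @PostMapping
    public ResponseEntity<Long> submit() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = asyncJobLauncher.run(job, params);  // status is STARTING/STARTED here
        return ResponseEntity.accepted().body(execution.getId());    // HTTP 202 plus an id for later polling
    }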

Can we use Spring Batch to read Excel?

岁酱吖の submitted on 2021-02-16 09:12:14
Question: I want to know if it is possible to use Spring Batch in order to read from an Excel file and save it into a database. Remark: the content of the Excel file changes every 2 hours. And if it is not possible with Spring Batch, what other solution can I use?
Answer 1: Go take a look at spring-batch-extensions for Excel. You will find some examples of ExcelItemReader and ExcelItemWriter. Here is the introduction of the spring-batch-extensions project for Excel: Spring Batch extension which contains ItemReader
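A minimal sketch, assuming the spring-batch-excel extension mentioned in the answer (exact package names and the reader/RowMapper API differ between versions of the extension, and Product is a hypothetical type): a PoiItemReader turns each spreadsheet row into a domain object.

    @Bean
    public PoiItemReader<Product> excelProductReader() {
        PoiItemReader<Product> reader = new PoiItemReader<>();
        reader.setLinesToSkip(1);                                      // skip the header row
        reader.setResource(new FileSystemResource("data/products.xlsx"));
        reader.setRowMapper(rowSet -> {
            String[] row = rowSet.getCurrentRow();                     // assumed columns: name, price
            Product product = new Product();
            product.setName(row[0]);
            product.setPrice(new BigDecimal(row[1]));
            return product;
        });
        return reader;
    }

Since the file changes every 2 hours, the job could be triggered on a schedule, with the run time passed as a job parameter so that each run gets a fresh JobInstance.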

Spring Batch launches the SimpleJobLauncher run method before the Boot run method

前提是你 submitted on 2021-02-11 18:19:32
Question: I have a job configuration as below.
    @SpringBootApplication
    public class Test implements CommandLineRunner {
        @Autowired JobLauncher jobLauncher;
        @Autowired Job job;
        @Autowired private JobBuilderFactory jobs;
        @Autowired private StepBuilderFactory steps;

        public static void main(String[] args) {
            SpringApplication.run(Test.class, args);
        }

        @Override
        public void run(String... args) throws Exception {
            JobParameters params = new JobParametersBuilder()
                    .addString("JobID", String.valueOf(System
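The title suggests the classic double-launch: by default Spring Boot runs every Job bean at startup through its own runner, independently of (and before) the CommandLineRunner. A minimal sketch of the usual remedy, assuming that is the cause here (the completion of the truncated run method is also an assumption):

    // application.properties -- stop Boot from auto-running Job beans at startup:
    // spring.batch.job.enabled=false

    // The manual launch in run(...) is then the only one:
    @Override
    public void run(String... args) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("JobID", String.valueOf(System.currentTimeMillis()))
                .toJobParameters();
        jobLauncher.run(job, params);
    }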

Can we have multiple readers (reading records through pagination) in a single batch job?

巧了我就是萌 submitted on 2021-02-11 17:52:25
Question: I have a Spring Batch job to be written, where: I need to read, say, 10k records through pagination (fetching 1000 records at a time) from an Azure SQL DB. I then need to take those 1000 records at a time and use one column of these records (say some id) to read the corresponding records from another Cosmos DB table. How do I implement two readers in this case, given that I need to read 1000 records at a time, fetch the Cosmos DB records for those 1000 records first, and then process them? Answer 1: There is a common
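The answer is cut off, but a common pattern for this situation is a single driving reader plus an enriching processor: page through the Azure SQL table 1000 rows at a time, then look up the matching Cosmos DB record per item in the processor. A minimal sketch; SourceRow, EnrichedRow, and cosmosLookup are hypothetical names standing in for the project's own types and Cosmos DB access.

    @Bean
    public JdbcPagingItemReader<SourceRow> sqlPagingReader(DataSource dataSource) {
        return new JdbcPagingItemReaderBuilder<SourceRow>()
                .name("sqlPagingReader")
                .dataSource(dataSource)
                .selectClause("select id, col_a, col_b")
                .fromClause("from source_table")
                .sortKeys(Map.of("id", Order.ASCENDING))     // paging needs a deterministic sort key
                .pageSize(1000)                              // 1000 records per fetch
                .rowMapper(new BeanPropertyRowMapper<>(SourceRow.class))
                .build();
    }

    @Bean
    public ItemProcessor<SourceRow, EnrichedRow> cosmosEnrichProcessor() {
        // cosmosLookup stands in for whatever Cosmos DB client/repository the project already uses
        return row -> new EnrichedRow(row, cosmosLookup(row.getId()));
    }

    @Bean
    public Step enrichStep(JdbcPagingItemReader<SourceRow> reader,
                           ItemProcessor<SourceRow, EnrichedRow> processor,
                           ItemWriter<EnrichedRow> writer) {
        return stepBuilderFactory.get("enrichStep")
                .<SourceRow, EnrichedRow>chunk(1000)         // chunk size matches the page size
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }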

How to autoscale the Spring Batch application in OpenShift?

喜欢而已 submitted on 2021-02-11 15:42:14
Question: I have a Spring Batch application which triggers a job to transfer bulk data from one database to another through an API call. All jobs are configured for parallel processing (master/slave step partitioning), and the application is deployed in OpenShift. I need to autoscale the application based on the load during job execution. Even though I have used the OpenShift autoscale feature, I still couldn't see any improvement in the performance of the job. Pods are simply being created, but only
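A sketch of why pod-level autoscaling alone often does not help a locally partitioned job (this is an assumption about the configuration described above): with local partitioning, all worker partitions run on an in-JVM TaskExecutor inside the pod that launched the job, so newly scaled pods never receive any work.

    @Bean
    public Step masterStep(Step workerStep, Partitioner partitioner) {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(8);          // all parallelism stays inside this single JVM/pod
        executor.initialize();
        return stepBuilderFactory.get("masterStep")
                .partitioner("workerStep", partitioner)
                .step(workerStep)
                .taskExecutor(executor)       // local partition handler => in-process workers only
                .build();
    }

For additional pods to actually share the load, the partitions have to leave the JVM, e.g. via remote partitioning or remote chunking over a message broker, with the scaled-out pods running the worker steps.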

Spring Batch: one reader, composite processor (two classes with different entities), and two KafkaItemWriters

不打扰是莪最后的温柔 submitted on 2021-02-11 14:51:38
Question: The ItemReader reads data from DB2 and produces a Java object, ClaimDto. The ClaimProcessor then takes the ClaimDto object and returns a CompositeClaimRecord object, which comprises claimRecord1 and claimRecord2, which are to be sent to two different Kafka topics. How do I write claimRecord1 and claimRecord2 to topic1 and topic2 respectively? Answer 1: Just write a custom ItemWriter that does exactly that.
    public class YourItemWriter implements ItemWriter<CompositeClaimRecord> {
        private final ItemWriter
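A hedged completion of the answer's idea (the getter names and delegate writers are assumptions): split each CompositeClaimRecord and delegate to two writers, one per topic. With KafkaItemWriter, each delegate would be built around a KafkaTemplate whose default topic is topic1 or topic2 respectively, since KafkaItemWriter publishes to the template's default topic.

    public class CompositeClaimRecordWriter implements ItemWriter<CompositeClaimRecord> {

        private final ItemWriter<ClaimRecord1> claimRecord1Writer;   // e.g. a KafkaItemWriter for topic1
        private final ItemWriter<ClaimRecord2> claimRecord2Writer;   // e.g. a KafkaItemWriter for topic2

        public CompositeClaimRecordWriter(ItemWriter<ClaimRecord1> claimRecord1Writer,
                                          ItemWriter<ClaimRecord2> claimRecord2Writer) {
            this.claimRecord1Writer = claimRecord1Writer;
            this.claimRecord2Writer = claimRecord2Writer;
        }

        @Override
        public void write(List<? extends CompositeClaimRecord> items) throws Exception {
            List<ClaimRecord1> firstTopicRecords = new ArrayList<>();
            List<ClaimRecord2> secondTopicRecords = new ArrayList<>();
            for (CompositeClaimRecord item : items) {
                firstTopicRecords.add(item.getClaimRecord1());       // assumed accessor names
                secondTopicRecords.add(item.getClaimRecord2());
            }
            claimRecord1Writer.write(firstTopicRecords);
            claimRecord2Writer.write(secondTopicRecords);
        }
    }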

How to test a Spring Batch job within a @Transactional SpringBootTest test case?

早过忘川 submitted on 2021-02-11 14:46:25
Question: I just can't seem to win today... Is there a way to read from a OneToMany relationship in a Spock SpringBootTest integration test, without annotating the test as @Transactional or adding the unrealistic spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true? OR, is there a way to launch a Spring Batch Job from within a @Transactional test case? Let me elaborate... I'm trying to get a simple Spring Boot integration test working for my Spring Batch reporting process, which reads from
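A minimal sketch of the common workaround, shown in JUnit rather than the poster's Spock (the assumption being that the underlying problem is Spring Batch's JobRepository refusing to run inside an existing transaction): keep the test non-transactional, drive the job through JobLauncherTestUtils, and clean up explicitly instead of relying on rollback.

    @SpringBootTest
    @SpringBatchTest   // registers JobLauncherTestUtils and JobRepositoryTestUtils
    class ReportingJobIntegrationTest {

        @Autowired private JobLauncherTestUtils jobLauncherTestUtils;
        @Autowired private JobRepositoryTestUtils jobRepositoryTestUtils;

        @AfterEach
        void cleanUp() {
            jobRepositoryTestUtils.removeJobExecutions();   // explicit cleanup replaces transactional rollback
        }

        @Test
        void reportingJobCompletes() throws Exception {
            JobParameters params = new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();

            JobExecution execution = jobLauncherTestUtils.launchJob(params);

            assertEquals(BatchStatus.COMPLETED, execution.getStatus());
        }
    }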