spring-batch

Activate Batch on only one Server instance

Submitted by 不羁的心 on 2020-05-24 03:53:20
Question: I have an nginx load balancer in front of two Tomcat instances, each containing a Spring Boot application. Each Spring Boot application executes a batch that writes data to a database. The batch runs every day at 1 a.m. The problem is that both instances execute the batch simultaneously, which I don't want. Is there a way to keep the batch deployed on both instances and tell Tomcat or nginx to start the batch on the master server only (so the slave server doesn't run it)? If one of the servers
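
A common way to handle this (a minimal sketch, not taken from an answer) is to keep the job deployed on both instances but guard the scheduled launch with a configuration property, so only the instance started as the "primary" actually fires the trigger. The property name batch.scheduling.enabled, the bean names, and the cron expression below are assumptions, not from the question:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;

@Configuration
@EnableScheduling
// Only the instance launched with batch.scheduling.enabled=true registers the cron trigger.
@ConditionalOnProperty(name = "batch.scheduling.enabled", havingValue = "true")
public class NightlyBatchScheduler {

    private final JobLauncher jobLauncher;
    private final Job nightlyJob;

    public NightlyBatchScheduler(JobLauncher jobLauncher, Job nightlyJob) {
        this.jobLauncher = jobLauncher;
        this.nightlyJob = nightlyJob;
    }

    @Scheduled(cron = "0 0 1 * * *") // every day at 1 a.m.
    public void runNightlyJob() throws Exception {
        // A fresh parameter each night so the JobInstance is unique and can be launched again.
        jobLauncher.run(nightlyJob, new JobParametersBuilder()
                .addLong("runId", System.currentTimeMillis())
                .toJobParameters());
    }
}

If the "if one of the servers goes down" part matters, a database-backed lock (for example ShedLock, or a row in a shared table claimed before launching) lets either instance take over, instead of a static primary/secondary split.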

How can I remove StaxEventItemWriter <root> tag added by start/end Document methods?

Submitted by 烂漫一生 on 2020-05-23 17:23:06
Question: How can I remove the StaxEventItemWriter <root> tag added by the start/end document methods? It is added by default when I generate the XML file, so can anyone tell me how to remove the default root tag? Example:

<?xml version='1.0' encoding='UTF-8'?>
<root>
<ressourcespleiade date="2015-10-03 06:38:00.000">
---
---
</..>

Answer 1: I ended up doing something like this:

/**
 * {@link StaxEventItemWriter} which writes no root tag, as the written elements are roots
 * @param <T> Type of the written elements
 *
 * @author
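
The answer's class is cut off above; a minimal sketch of that idea, assuming a Spring Batch version where startDocument/endDocument are protected and can be overridden (the class name below is made up), could look like this:

import javax.xml.stream.XMLEventWriter;

import org.springframework.batch.item.xml.StaxEventItemWriter;

/**
 * {@link StaxEventItemWriter} that writes no root tag, because each written
 * element is itself a document root.
 */
public class NoRootTagStaxEventItemWriter<T> extends StaxEventItemWriter<T> {

    @Override
    protected void startDocument(XMLEventWriter writer) {
        // Intentionally empty: skip the XML declaration and the opening root tag.
    }

    @Override
    protected void endDocument(XMLEventWriter writer) {
        // Intentionally empty: skip the closing root tag.
    }
}

Note that startDocument also writes the <?xml ...?> declaration, so if the declaration should be kept, the override would have to re-add a StartDocument event rather than being left empty.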

Partitioner for JdbcCursorItemReader - Reader must be open before it can be read

Submitted by ↘锁芯ラ on 2020-05-18 12:06:06
Question: I had a working Spring Batch job; when I tried to make it multi-threaded using a partitioner, I started getting "Reader must be open before it can be read."

org.springframework.batch.item.ReaderNotOpenException: Reader must be open before it can be read.
    at org.springframework.batch.item.database.AbstractCursorItemReader.doRead(AbstractCursorItemReader.java:443) ~[spring-batch-infrastructure-3.0.7.RELEASE.jar:3.0.7.RELEASE]
    at org.springframework.batch.item.support
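
A frequent cause of this exception when a partitioner is added is that a single JdbcCursorItemReader bean is shared across the partition threads and is never opened by the slave step. A minimal sketch of the usual fix, step-scoping the reader so each partition gets its own freshly opened instance (the SQL, the stepExecutionContext key partitionId, and the column names are assumptions):

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.core.SingleColumnRowMapper;

@Bean
@StepScope // one reader instance per partition (slave) step execution, opened by that step
public JdbcCursorItemReader<String> partitionedOrderReader(
        DataSource dataSource,
        @Value("#{stepExecutionContext['partitionId']}") Long partitionId) {
    JdbcCursorItemReader<String> reader = new JdbcCursorItemReader<>();
    reader.setName("partitionedOrderReader");
    reader.setDataSource(dataSource);
    reader.setSql("SELECT order_ref FROM orders WHERE partition_id = ?");
    reader.setPreparedStatementSetter(ps -> ps.setLong(1, partitionId));
    reader.setRowMapper(new SingleColumnRowMapper<>(String.class));
    return reader;
}

Because a cursor-based reader is not thread-safe, a partitioned setup should give each partition its own reader like this rather than pointing several threads at one shared cursor.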

Spring Batch - Is there a way to commit data even if the chunk raises some exception?

Submitted by 允我心安 on 2020-05-17 06:26:28
Question: I have a process that reads from a queue, processes the data, and writes it into a DB. Even if the process fails, I have to store the data in the DB. But Spring Batch steps are transactional and always roll back the changes. So, is there a way to commit data even if the chunk raises an exception? EDIT I: I tried with a Tasklet but get the same behaviour. Thanks in advance. Answer 1: One way you could write your job to commit all your data even when exceptions are raised in your processing is to use a SkipPolicy and write
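
The answer is truncated above; a minimal sketch of the skip-based idea it describes could look like the step below, where the reader/processor/writer beans and item types are placeholders rather than anything from the original question:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.step.skip.SkipPolicy;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;

@Bean
public Step importStep(StepBuilderFactory stepBuilderFactory,
                       ItemReader<String> queueReader,                 // placeholder beans
                       ItemProcessor<String, String> messageProcessor,
                       ItemWriter<String> dbWriter) {
    // Skip any item whose processing or writing fails instead of failing the step,
    // so the remaining items of each chunk are still committed.
    SkipPolicy skipEverything = (throwable, skipCount) -> true;

    return stepBuilderFactory.get("importStep")
            .<String, String>chunk(10)
            .reader(queueReader)
            .processor(messageProcessor)
            .writer(dbWriter)
            .faultTolerant()
            .skipPolicy(skipEverything)
            .build();
}

When a write fails, Spring Batch re-processes the chunk item by item, so only the offending item is skipped; if the "store even on failure" requirement also means recording the bad items, that can be done from a SkipListener.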

Spring Batch: Test case for Tasklet - Key is not appearing in actual class when it is invoked from Test class

Submitted by 半腔热情 on 2020-05-17 05:59:22
Question: I am trying to learn Batch and Tasklet. I am writing a test case for Tasklet code in Spring Batch. I am setting a map in my test class, but when I debug, the actual class does not have the key which I am passing from my test class. MyEventTasklet.java:

public class MyEventTasklet implements Tasklet {
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        TreeMap<String, Map<Integer, Set<Student>>> studentMap = chunkContext.getStepContext().getJobExecutionContext()
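
A usual gotcha here is that chunkContext.getStepContext().getJobExecutionContext() returns a read-only copy, so the map has to be put into the JobExecution's ExecutionContext before the tasklet runs. A minimal test sketch under that assumption, using spring-batch-test's MetaDataInstanceFactory (the key name "studentMap" and the empty map are placeholders, since the original code is truncated):

import static org.junit.Assert.assertTrue;

import java.util.TreeMap;

import org.junit.Test;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.scope.context.StepContext;
import org.springframework.batch.test.MetaDataInstanceFactory;

public class MyEventTaskletTest {

    @Test
    public void taskletSeesStudentMapFromJobExecutionContext() throws Exception {
        StepExecution stepExecution = MetaDataInstanceFactory.createStepExecution();

        // Put the map into the *job* ExecutionContext, not into the copy returned
        // by getJobExecutionContext(), which the tasklet only reads.
        stepExecution.getJobExecution().getExecutionContext()
                .put("studentMap", new TreeMap<>());

        StepContribution contribution = new StepContribution(stepExecution);
        ChunkContext chunkContext = new ChunkContext(new StepContext(stepExecution));

        // Assertions on the result depend on what MyEventTasklet does with the map;
        // the point is that the tasklet now finds the "studentMap" key when it reads
        // the job ExecutionContext through the ChunkContext.
        new MyEventTasklet().execute(contribution, chunkContext);

        assertTrue(stepExecution.getJobExecution().getExecutionContext().containsKey("studentMap"));
    }
}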

Spring batch multithreading using partitioning

Submitted by 坚强是说给别人听的谎言 on 2020-05-16 07:03:56
Question: My problem statement is this: I have to pass multiple files to a Spring Batch reader, and the reader runs in parallel. If we use grid-size = 100, then there will be 100 threads, which is not practical. What is the way to solve this issue, i.e. process many files with a limited number of threads?

@Bean
public Step orderStep1() throws IOException {
    return stepBuilderFactory.get("orderStep1")
        .partitioner("slaveStep", partitioner())
        .step(slaveStep())
        .gridSize(100)
        .taskExecutor(taskExecutor())
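
A minimal sketch of one way to keep many partitions but few threads: let the partitioner still create one partition per file, and back the partitioned step with a bounded ThreadPoolTaskExecutor; the pool size of 10 below is an arbitrary example, not a recommendation from the question.

import org.springframework.context.annotation.Bean;
import org.springframework.core.task.TaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Bean
public TaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(10);                 // at most 10 partitions execute at the same time
    executor.setMaxPoolSize(10);
    executor.setQueueCapacity(Integer.MAX_VALUE); // the remaining partitions wait in the queue
    executor.setThreadNamePrefix("partition-");
    executor.initialize();
    return executor;
}

With a bounded pool like this, gridSize (or the number of ExecutionContexts the partitioner returns) only controls how the work is split, while the executor controls how many slave steps actually run in parallel.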

Caused by: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist - Spring Batch

Submitted by 爷,独闯天下 on 2020-05-14 12:39:05
Question: I am working with Spring Boot v2.2.6.RELEASE and Spring Batch. In this example, I am reading data from an Oracle system and putting it into a Postgres system after applying some data filtering. Note - Spring Batch is able to read the data from the Oracle DB but is unable to write it into the Postgres DB.

spring.datasource.url=jdbc:oracle:thin:@//localhost:1527/DB
spring.datasource.username=user
spring.datasource.password=password
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver
postgres
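
ORA-00942 in a job like this usually means the JobRepository is looking for its BATCH_* metadata tables on the primary (Oracle) DataSource, where they were never created. A minimal sketch of pointing the batch metadata at the Postgres DataSource instead, assuming Spring Boot 2.x with DefaultBatchConfigurer and a bean named postgresDataSource (both names are assumptions):

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchMetadataConfig extends DefaultBatchConfigurer {

    // Hand the Postgres DataSource to the JobRepository/JobExplorer so the BATCH_*
    // metadata tables are read and written in Postgres, not on the Oracle source system.
    public BatchMetadataConfig(@Qualifier("postgresDataSource") DataSource postgresDataSource) {
        super(postgresDataSource);
    }
}

The metadata schema still has to exist on that DataSource, either created from Spring Batch's schema-postgresql.sql script or, if Boot's initializer is pointed at the same DataSource, via spring.batch.initialize-schema=always.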

Spring Boot Batch - excluding JobLauncherCommandLineRunner

Submitted by 风流意气都作罢 on 2020-05-13 06:17:21
Question: I have a simple Spring Batch job configured in Spring Boot (something similar to the Spring guides). At startup, it auto-detects and invokes JobLauncherCommandLineRunner, and I want to stop that behavior. I want the job to be fired only by a defined trigger elsewhere in the app, not on startup. I've tried the @ComponentScan(excludeFilters... approach, but it still gets invoked. Is there any way to switch off this 'helper' class? Answer 1: You can set spring.batch.job.enabled=false or you can set spring.batch
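
A minimal sketch of the first option the answer mentions, setting the property as a default in the main class (it can equally go in application.properties; the class name is a placeholder):

import java.util.Collections;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class BatchApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(BatchApplication.class);
        // Same effect as spring.batch.job.enabled=false in application.properties:
        // the auto-configured JobLauncherCommandLineRunner is not created, so nothing
        // runs at startup and jobs are launched only through an explicitly invoked JobLauncher.
        app.setDefaultProperties(
                Collections.<String, Object>singletonMap("spring.batch.job.enabled", "false"));
        app.run(args);
    }
}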

SCDF - Create metadata tables in a different schema and run the Batch Job with a different schema

Submitted by 一曲冷凌霜 on 2020-05-12 08:53:34
Question: I'm using Spring Batch, which loads data from Oracle and puts it into MongoDB. I'm looking to use Spring Cloud Data Flow, but SCDF doesn't have support for MongoDB. Is there any way we can maintain the SCDF metadata in Postgres (as
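
One hedged sketch of a possible direction (not a confirmed answer to the question): keep the Spring Cloud Task and Spring Batch metadata on a relational DataSource SCDF does support, such as the Postgres instance, while the job's reader and writer keep their own Oracle and MongoDB connections. Assuming spring-cloud-task is on the classpath and a postgresDataSource bean exists, that could look like:

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.configuration.TaskConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TaskMetadataConfig {

    // Store the TASK_* (and, with a matching BatchConfigurer, BATCH_*) metadata tables
    // in Postgres so SCDF can track executions without touching MongoDB at all.
    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("postgresDataSource") DataSource postgresDataSource) {
        return new DefaultTaskConfigurer(postgresDataSource);
    }
}

The SCDF server itself would then be pointed at the same Postgres database through its standard spring.datasource.* properties.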