spring-batch

Deploying two Spring batch applications in same cluster in a single Weblogic Domain?

笑着哭i, submitted on 2019-11-29 16:21:19
Background: I am trying to deploy two Spring Batch applications as .war files to the same cluster in a single WebLogic domain. Each of them has the Spring Batch Admin console configured in servlet.xml like this:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
    <!-- Spring Batch Admin -->
    <import resource="classpath*:/org/springframework/batch/admin/web/resources/servlet…
```
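A frequent source of conflict in this kind of setup is the two WARs sharing one set of batch metadata tables in the domain's datasource. One hedged approach, assuming the standard `JobRepositoryFactoryBean` is in use, is to give each application its own table prefix so the two job repositories do not collide (the prefix and bean names here are made up for illustration):

```java
import javax.sql.DataSource;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class App1RepositoryConfig {

    // Each deployed WAR uses a distinct prefix (e.g. APP1_BATCH_, APP2_BATCH_),
    // so the two admin consoles do not read each other's job metadata.
    @Bean
    public JobRepository jobRepository(DataSource dataSource,
                                      PlatformTransactionManager txManager) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(txManager);
        factory.setTablePrefix("APP1_BATCH_"); // default is BATCH_
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```

The matching batch schema for each prefix has to exist in the database as well.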

How to run Spring Batch Jobs in certain order (Spring Boot)?

会有一股神秘感。, submitted on 2019-11-29 16:06:10
I'm developing with Spring Batch using Spring Boot, with the minimal configuration Spring Boot provides, and I have defined some jobs (no XML configuration at all). But when I run the application with `SpringApplication.run(App.class, args);`, the jobs are executed sequentially in some arbitrary order. I define the jobs this way in `@Configuration`-annotated classes, and Spring does the rest:

```java
@Bean
public Job requestTickets() {
    return jobBuilderFactory.get(Config.JOB_REQUEST_TICKETS)
            .start(stepRequestTickets())
            .build();
}
```

How can I instruct the framework to run the jobs in a certain order? EDIT: Could…
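One common way to get a deterministic order in Spring Boot is to turn off the automatic job launching (`spring.batch.job.enabled=false` in application.properties) and launch the jobs yourself from a runner. A sketch under that assumption; the second job bean, `processTickets`, is hypothetical:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JobOrderConfig {

    // With spring.batch.job.enabled=false, Boot no longer launches jobs
    // on startup; this runner controls the order explicitly.
    @Bean
    public CommandLineRunner runJobsInOrder(JobLauncher jobLauncher,
                                            @Qualifier("requestTickets") Job requestTickets,
                                            @Qualifier("processTickets") Job processTickets) {
        return args -> {
            JobParameters params = new JobParametersBuilder()
                    .addLong("run.id", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(requestTickets, params); // runs first
            jobLauncher.run(processTickets, params); // runs second
        };
    }
}
```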

How to control the number of parallel Spring Batch jobs

感情迁移, submitted on 2019-11-29 15:42:40
I have a report-generating application. Because preparing such reports is heavyweight, they are prepared asynchronously with Spring Batch. Requests for reports are created via a REST interface over HTTP. The goal is for the REST resource simply to queue the report execution and return (as described in the documentation). To that end, a TaskExecutor has been provided for the JobLauncher:

```xml
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name=…
```
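The number of jobs that actually run in parallel is governed by the pool behind the `TaskExecutor` handed to `SimpleJobLauncher`. A Java-config sketch of the XML above, assuming a cap of two concurrent jobs is wanted (the pool sizes are illustrative):

```java
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class LauncherConfig {

    @Bean
    public TaskExecutor jobTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2);
        executor.setMaxPoolSize(2);    // at most two jobs run concurrently
        executor.setQueueCapacity(50); // further launch requests wait in the queue
        return executor;
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.setTaskExecutor(jobTaskExecutor()); // REST call returns once queued
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```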

How to get Job parameteres in to item processor using spring Batch annotation

安稳与你, submitted on 2019-11-29 15:09:43
I am using Spring MVC. From my controller I call the JobLauncher, passing job parameters as shown below, and I use annotations to enable configuration:

```java
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
    // read, write, process and invoke job
}
```

```java
JobParameters jobParameters = new JobParametersBuilder()
        .addString("fileName", "xxxx.txt")
        .toJobParameters();
stasrtjob = jobLauncher.run(job, jobParameters);
```

And here is my item processor: public…
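The usual way to see job parameters inside a processor is to make the processor step-scoped and inject the parameter with a late-binding SpEL expression. A sketch, assuming the `fileName` parameter from the question and String-to-String processing:

```java
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
@StepScope // the bean is created per step execution, so parameters can bind
public class FileNameAwareProcessor implements ItemProcessor<String, String> {

    // Late-bound from the JobParameters passed to jobLauncher.run(...)
    @Value("#{jobParameters['fileName']}")
    private String fileName;

    @Override
    public String process(String item) {
        // The parameter is now available for any per-item logic.
        return item + " (source file: " + fileName + ")";
    }
}
```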

Spring batch :Restart a job and then start next job automatically

落花浮王杯, submitted on 2019-11-29 14:58:20
I need to create a recovery pattern. In my pattern I can launch a job only within a given time window. If the job fails, it will only be restarted in the next time window, and once it finishes I would like to start the scheduled job that was originally planned for that window. The only difference between the jobs is their time-window parameters. I thought about a JobExecutionDecider in conjunction with a JobExplorer, or overriding the JobLauncher, but both seem too intrusive. I failed to find an example that matches my needs; any ideas would be most welcome. Just to recap what was actually done based on the advice…
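One non-intrusive sketch of such a recovery pattern, assuming injected `jobExplorer`, `jobOperator`, and `jobLauncher` beans: at the start of each window, restart any failed execution left over from earlier windows, then launch the job planned for the current window (error handling and window bookkeeping omitted):

```java
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.JobParameters;

public void runWindow(Job job, JobParameters windowParams) throws Exception {
    // 1. Look at recent instances of this job and restart any FAILED execution,
    //    so the previous window's work completes first.
    for (JobInstance instance : jobExplorer.getJobInstances(job.getName(), 0, 5)) {
        for (JobExecution execution : jobExplorer.getJobExecutions(instance)) {
            if (execution.getStatus() == BatchStatus.FAILED) {
                jobOperator.restart(execution.getId());
            }
        }
    }
    // 2. Then run the job that was planned for this time window.
    jobLauncher.run(job, windowParams);
}
```

Whether step 2 waits for step 1 depends on the launcher's TaskExecutor: with a synchronous executor the calls run back to back.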

Spring Batch - how to convert String from file to Date?

浪子不回头ぞ, submitted on 2019-11-29 14:47:27
I am trying to process a CSV file in which some of the fields are dates in the format "yyyy-MM-dd", but the reader fails when it tries to convert the String from the CSV file to a Date in my model class. The error is:

```
org.springframework.validation.BindException: org.springframework.validation.BeanPropertyBindingResult: 1 error
Field error in object 'target' on field 'datetimeInactive': rejected value [2011-04-27]; codes [typeMismatch.target.datetimeInactive,typeMismatch.datetimeInactive,typeMismatch.java.util.Date,typeMismatch]; arguments [org.springframework.context.support…
```
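The binding fails because the default property editors don't know the "yyyy-MM-dd" pattern; the usual fix is to register a `CustomDateEditor` wrapping a matching `SimpleDateFormat` as a custom editor for `java.util.Date` on the field-set mapper. The pattern itself can be verified with plain JDK code:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateParseDemo {
    public static void main(String[] args) throws ParseException {
        // The pattern matching the CSV field, e.g. the rejected value 2011-04-27.
        SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
        Date date = format.parse("2011-04-27");
        // Round-trip to confirm the pattern parses the value as intended.
        System.out.println(format.format(date));
    }
}
```

The same `SimpleDateFormat` instance is what a `CustomDateEditor` would be constructed with when wiring the mapper.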

How to read CSV file with different number of columns with Spring Batch

巧了我就是萌, submitted on 2019-11-29 12:55:27
I have a CSV file that doesn't have a fixed number of columns, like this:

```
col1,col2,col3,col4,col5
val1,val2,val3,val4,val5
column1,column2,column3
value1,value2,value3
```

Is there any way to read this kind of CSV file with Spring Batch? I tried to do this:

```xml
<bean id="ItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <!-- Read a csv file -->
    <property name="resource" value="classpath:file.csv" />
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <!-- split it -->
            <property name="lineTokenizer">
                <bean class="org…
```
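The default tokenizer setup expects a fixed column count when field names are configured; one approach is a custom line mapper (or, in versions where the tokenizer supports it, a non-strict flag) that simply keeps however many columns each line has. The core splitting idea in plain Java:

```java
import java.util.Arrays;
import java.util.List;

public class VariableColumnsDemo {
    // Mirrors what a custom LineMapper could do: accept however many
    // columns a line happens to have, instead of enforcing a fixed count.
    static List<String> tokenize(String line) {
        return Arrays.asList(line.split(","));
    }

    public static void main(String[] args) {
        System.out.println(tokenize("col1,col2,col3,col4,col5").size());
        System.out.println(tokenize("column1,column2,column3").size());
    }
}
```

In a Spring Batch reader this logic would sit inside a `LineMapper` implementation set on the `FlatFileItemReader`, mapping each line to a list rather than a fixed bean.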

nested exception is redis.clients.jedis.exceptions.JedisConnectionException: Could not get a resource from the pool

懵懂的女人, submitted on 2019-11-29 12:47:25
I have already gone through many links, such as "Jedis, Cannot get jedis connection: cannot get resource from pool" and "Cannot get Jedis connection; Could not get a resource from the pool", but I am still getting the error below. I am using Spring Data Redis in Spring Batch, reading data from MySQL and writing to a Redis database. It seems to be a connection error. The error is shown below for reference:

```
2018-07-19 00:08:46 DEBUG o.s.t.support.TransactionTemplate - Initiating transaction rollback on application exception
org.springframework.data.redis.RedisConnectionFailureException: Cannot get Jedis connection; nested…
```
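When a batch job writes with many concurrent threads, pool exhaustion is a common cause of "could not get a resource from the pool". A hedged sketch of a connection factory with an explicitly sized pool; the host, port, and pool sizes are illustrative, not taken from the question:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import redis.clients.jedis.JedisPoolConfig;

@Configuration
public class RedisConfig {

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        JedisPoolConfig poolConfig = new JedisPoolConfig();
        poolConfig.setMaxTotal(32);       // headroom for concurrent batch writers
        poolConfig.setMaxIdle(16);
        poolConfig.setMinIdle(4);
        poolConfig.setTestOnBorrow(true); // validate connections before handing them out
        JedisConnectionFactory factory = new JedisConnectionFactory(poolConfig);
        factory.setHostName("localhost");
        factory.setPort(6379);
        return factory;
    }
}
```

If the Redis server itself is unreachable (wrong host, firewall, maxclients reached), no pool setting will help, so checking connectivity with redis-cli first is worthwhile.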

Spring batch execute dynamically generated steps in a tasklet

浪尽此生, submitted on 2019-11-29 12:46:57
I have a Spring Batch job that does the following:

Step 1. Creates a list of objects that need to be processed.
Step 2. Creates a list of steps, depending on how many items are in the list of objects created in step 1.
Step 3. Tries to execute the steps from the list created in step 2.

Executing the x steps is done in executeDynamicStepsTasklet(), below. While the code runs without any errors, it does not seem to be doing anything. Does what I have in that method look correct? Thanks.

```java
@Configuration
public class ExportMasterListCsvJobConfig {
    public static final String JOB_NAME…
```
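Steps executed by hand from inside a tasklet bypass the job's flow and repository bookkeeping, which is one reason such code can appear to do nothing. The usual alternative is to build the step sequence into the job definition itself before launch. A sketch, where `createSteps()` is a hypothetical helper that builds one step per object discovered:

```java
import java.util.List;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.job.builder.SimpleJobBuilder;
import org.springframework.context.annotation.Bean;

@Bean
public Job exportMasterListCsvJob(JobBuilderFactory jobs) {
    // Hypothetical helper: one step per item to process, built up front.
    List<Step> dynamicSteps = createSteps();
    SimpleJobBuilder builder = jobs.get("exportMasterListCsvJob")
            .start(dynamicSteps.get(0));
    for (int i = 1; i < dynamicSteps.size(); i++) {
        builder.next(dynamicSteps.get(i)); // chain the remaining steps in order
    }
    return builder.build();
}
```

The trade-off is that the item list must be known at configuration time; when it is only known at runtime, partitioning is the framework's intended mechanism for a variable amount of parallel work.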

Stopping a file inbound channel adapter after one file is read

孤街醉人, submitted on 2019-11-29 12:37:03
Our application uses a Spring Integration file:inbound-channel-adapter to poll a directory, listening for files dropped there. Spring Integration then starts a Spring Batch job, handing it the path and name of the file to process. Obviously, the file poller continues to run even after a file has been processed by the Spring Batch job, so the Spring context remains open and the application does not terminate. Is there a way, programmatically or (preferably) through configuration, to stop the poller after one file has been read? Thanks.

Gary Russell: You can use a…
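One programmatic sketch (the bean names are assumptions): the file:inbound-channel-adapter is backed by a `SourcePollingChannelAdapter`, which can be stopped from a `JobExecutionListener` once the batch job finishes, so no further files are polled:

```java
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.integration.endpoint.SourcePollingChannelAdapter;

public class StopPollerListener implements JobExecutionListener {

    // The endpoint behind the file:inbound-channel-adapter (assumed bean).
    private final SourcePollingChannelAdapter fileAdapter;

    public StopPollerListener(SourcePollingChannelAdapter fileAdapter) {
        this.fileAdapter = fileAdapter;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to do before the job
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        fileAdapter.stop(); // stop polling so the context can shut down
    }
}
```

A configuration-only alternative along the same lines is sending a stop command to the adapter over a control bus, which keeps the listener code out of the batch job.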