spring-batch

How to run a Spring Batch job with different parameters?

南楼画角 submitted on 2019-12-11 12:14:38

Question: I'm using Spring Batch with a custom reader and writer. I have a control table of customer IDs, and I need to run the same Step multiple times, once for each customer in the control table. The customerId should be passed in as a parameter, since I need it in the reader as well as in the writer. How can this best be achieved?

    @Bean
    public Step shipmentFactsStep() {
        return stepBuilderFactory.get("shipmentFactsStep")
            .<Shipmentfacts, Shipmentfacts>chunk(10000)
            .reader(shipmentfactsItemReader())
            …
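One common approach (a sketch under assumptions, not taken from the question) is to launch the job once per customer, passing the customerId as a JobParameter; step-scoped beans then receive it through late binding. The names controlTable, shipmentFactsJob, and buildReaderFor are illustrative.

```java
// Sketch: one job execution per customer; customerId travels as a
// JobParameter, so both reader and writer can see it.
for (String customerId : controlTable.findCustomerIds()) {   // hypothetical DAO
    JobParameters params = new JobParametersBuilder()
            .addString("customerId", customerId)
            .addLong("ts", System.currentTimeMillis())       // keeps each instance unique
            .toJobParameters();
    jobLauncher.run(shipmentFactsJob, params);
}

// Late binding in a step-scoped reader:
@Bean
@StepScope
public ItemReader<Shipmentfacts> shipmentfactsItemReader(
        @Value("#{jobParameters['customerId']}") String customerId) {
    // build a reader whose query is filtered by customerId
    return buildReaderFor(customerId);                       // hypothetical helper
}
```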

Spring Batch MDC Logging

扶醉桌前 submitted on 2019-12-11 11:54:45

Question: I want to know how to log things such as the job name and execution ID using MDC in Spring Batch. Here's some code. bootstrap.properties — this file has the list of items I currently log, and I've added execId as the third element:

    logging.pattern.level=%5p [%X{X-B3-TraceId:-},%X{sessionId:-},%X{execId:-},%X{program:-},%X{mainframeId:-}]
    spring.application.name=mcc
    spring.profiles.active=globals,local,local-override

MCC Application — this file has my main method. When I manually set the …
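A typical way to get these values into the MDC (a sketch, not the asker's code) is a JobExecutionListener that populates the MDC keys before the job runs and clears them afterwards:

```java
// Sketch: push the execution id and job name into SLF4J's MDC so the
// logging pattern's %X{execId} placeholder resolves.
public class MdcJobListener implements JobExecutionListener {

    @Override
    public void beforeJob(JobExecution jobExecution) {
        MDC.put("execId", String.valueOf(jobExecution.getId()));
        MDC.put("jobName", jobExecution.getJobInstance().getJobName());
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        MDC.remove("execId");
        MDC.remove("jobName");
    }
}
```

Note that MDC is thread-local: values set on the launching thread do not automatically propagate to partitioned or multi-threaded step workers.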

Spring Batch - Passing Resource Name from MultiResourceItemReader > FlatFileItemReader to StepExecutionListener

天涯浪子 submitted on 2019-12-11 11:47:55

Question: I have a Spring Batch job that needs to do the following:
- check a directory on the local file system that may contain more than one file;
- process each of the files, saving their data to the database;
- rename each file by adding a suffix of PROCESSED or ERROR.

I have used the following: a MultiResourceItemReader that reads the files and delegates to a FlatFileItemReader; the FlatFileItemReader reads data using a LineMapper and a FieldSetMapper; an ItemProcessor manipulates the data read; an ItemWriter writes to the …
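One way to track which file each record came from (a sketch, relying on the standard MultiResourceItemReader behavior) is to have the item type implement ResourceAware; the reader then injects the current Resource into every item it reads, so a listener or writer can rename the right file:

```java
// Sketch: items that implement ResourceAware receive the Resource they
// were read from; MultiResourceItemReader calls setResource per item.
public class FileRecord implements ResourceAware {

    private Resource resource;   // the file this record was read from

    @Override
    public void setResource(Resource resource) {
        this.resource = resource;
    }

    public Resource getResource() {
        return resource;
    }
    // ... fields populated by the FieldSetMapper
}
```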

Spring-Batch - MultiFileResourcePartitioner - Caused by: java.net.SocketException: Connection reset

北战南征 submitted on 2019-12-11 11:30:47

Question: I am using a FlatFileItemReader and have extended AbstractResource to return a stream from an Amazon S3 object:

    S3Object amazonS3Object = s3client.getObject(new GetObjectRequest(bucket, file));
    InputStream stream = amazonS3Object.getObjectContent();
    return stream;

In my batch job I have also implemented a MultiFileResourcePartitioner, to which I gave the bucket so that it partitions all the files. I am able to read only part of a few files, after which I get a socket reset error; see below: …
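A frequent cause of `Connection reset` with the AWS SDK (an assumption here, since the stack trace is truncated) is opening many object-content streams and never fully reading or closing them; the SDK's HTTP connection pool then aborts the connections. A sketch that drains and closes each object before handing the bytes to the reader:

```java
// Sketch: read the S3 object fully and close it so its HTTP connection
// is returned to the pool instead of being reset mid-read.
try (S3Object obj = s3client.getObject(new GetObjectRequest(bucket, file));
     InputStream in = obj.getObjectContent()) {
    byte[] data = in.readAllBytes();        // Java 9+
    return new ByteArrayResource(data);     // wrap for the FlatFileItemReader
}
```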

Can we run multiple instances of the same job with different parameters at the same time, using Spring Batch partitioning and RabbitMQ?

有些话、适合烂在心里 submitted on 2019-12-11 11:05:57

Question: I have implemented my batch jobs using Spring Batch partitioning, with RabbitMQ as the middleware. I studied the documentation and referred to these unit tests: https://github.com/sshcherbakov/spring-batch-talk/blob/master/src/main/resources/META-INF/master.xml https://github.com/sshcherbakov/spring-batch-talk/blob/master/src/main/resources/META-INF/slave.xml I can run my job steps concurrently, but I am a bit worried about how it will work if I launch multiple instances of the same job at the same …
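As far as the JobRepository is concerned (a general sketch, not specific to the linked configuration), two launches of the same Job count as separate JobInstances whenever their JobParameters differ, and separate instances are allowed to run at the same time:

```java
// Sketch: different JobParameters -> different JobInstances, which the
// framework allows to run concurrently.
JobParameters euRun = new JobParametersBuilder()
        .addString("region", "EU").toJobParameters();
JobParameters usRun = new JobParametersBuilder()
        .addString("region", "US").toJobParameters();

jobLauncher.run(partitionedJob, euRun);   // instance 1
jobLauncher.run(partitionedJob, usRun);   // instance 2, independent of the first
```

With remote partitioning over a shared queue, the remaining concern is reply correlation: worker replies must be routed back to the execution that sent them, which Spring Integration handles via correlation headers on the partition messages.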

SFTP using Spring Integration

跟風遠走 submitted on 2019-12-11 10:25:44

Question: I have a use case where a user drops multiple CSV files into a remote directory and then places a ready.txt to indicate that the files are ready to consume. When our application sees a ready.txt file in the remote directory, it should start copying all the files, including ready.txt, into a local directory using an SFTP inbound channel adapter. Is there a way to make sure ready.txt is the last file copied to the local directory? Because when files are copied from the remote directory to the local …
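The SFTP inbound channel adapter can be given a `Comparator<File>` that controls the order in which synchronized files are emitted. A sketch (the adapter wiring itself is omitted) of a comparator that always sorts ready.txt after everything else:

```java
import java.io.File;
import java.util.Comparator;

// Sketch: ready.txt sorts after every other file; the remaining files
// are ordered by name. Pass this comparator to the inbound adapter.
Comparator<File> readyLast = (a, b) -> {
    boolean ra = a.getName().equals("ready.txt");
    boolean rb = b.getName().equals("ready.txt");
    if (ra != rb) {
        return ra ? 1 : -1;   // the ready.txt side is "greater", i.e. last
    }
    return a.getName().compareTo(b.getName());
};
```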

Can we create multiple instances of the same Java (Spring) batch job?

拈花ヽ惹草 submitted on 2019-12-11 10:12:43

Question: I am using Quartz to schedule a Spring Batch job. The job reads a file from a folder (which has multiple files), does some processing, and copies it to another folder. Is it possible to create multiple instances of the job that will run concurrently, reading multiple files? My question is: in Spring Batch, is it possible to spawn multiple instances of the same job? I am using the Quartz scheduler.

Answer 1: In Spring Batch it is possible to start several jobs, provided you have supplied different …
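A sketch of the idea the answer is heading toward (the loop and bean names are assumptions): give each launch a distinguishing parameter, such as the file it should process, so every launch becomes a new JobInstance.

```java
// Sketch: one JobInstance per input file; instances with different
// parameters may run concurrently (given an async TaskExecutor on the
// JobLauncher - the default launcher runs jobs on the calling thread).
for (File f : inputDir.listFiles()) {
    jobLauncher.run(fileJob, new JobParametersBuilder()
            .addString("inputFile", f.getAbsolutePath())
            .toJobParameters());
}
```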

Error creating Job in Spring Batch

ε祈祈猫儿з submitted on 2019-12-11 10:09:30

Question: I'm using JDeveloper and WebLogic 12c (12.1.3), and I want to create a Spring Batch job defined in XML. When I deploy the project, it shows me an error. I need to use this IDE, without Maven, because of some restrictions at my job. I've seen several examples and I think my XML is fine. I suspect the problem is related to WebLogic (I had a similar issue before), because I made the same test with the same project structure and libraries using the NetBeans IDE and GlassFish Open Source Edition 4.1.1 …

Passing an object as a parameter when starting a Spring Batch job

我们两清 submitted on 2019-12-11 09:40:38

Question: My service starts a Spring Batch job. I want to be able to pass an object to the job, and this object parameter will be different each time. I need to use the object in my tasklet. I am starting the job via a JobLauncher. As far as I have googled, JobParameters won't help me in this case. I also found many answers suggesting the JobExecutionContext or similar, but I want to inject the parameter object right before the job starts. Is it possible? Service which starts the job: @Service …
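Since JobParameters only accept String, Long, Double, and Date values, one workable pattern (a sketch; Jackson and the MyInput type are assumptions, not from the question) is to serialize the object to a String parameter and deserialize it in a step-scoped tasklet:

```java
// Sketch: serialize the object into a String JobParameter...
ObjectMapper mapper = new ObjectMapper();
String payload = mapper.writeValueAsString(myInputObject);

jobLauncher.run(job, new JobParametersBuilder()
        .addString("payload", payload)
        .addLong("ts", System.currentTimeMillis())   // uniqueness
        .toJobParameters());

// ...and rebuild it inside a @StepScope tasklet via late binding:
@Bean
@StepScope
public Tasklet myTasklet(@Value("#{jobParameters['payload']}") String payload) {
    return (contribution, chunkContext) -> {
        MyInput input = new ObjectMapper().readValue(payload, MyInput.class);
        // ... use input ...
        return RepeatStatus.FINISHED;
    };
}
```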

Use a query that takes parameters in Spring Batch

夙愿已清 submitted on 2019-12-11 09:30:43

Question: I have a functioning program that reads from a DB and writes the data to a flat file; I'm using Spring Batch for this. I want to be able to choose the parameters for my query. How can I do that? My XML looks something like this:

    <bean id="databaseitemreader" class="JdbcCursorItemReader">
        <property name="dataSource" ref="..."/>
        <property name="sql" value="Select fname, lname, address from tbl_student"/>
    </bean>

(The item file writer does not need any change.) I want to be able to pass …
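A step-scoped reader can bind runtime values into the query. A Java-config sketch (the Student type and the `lname` parameter are illustrative) using a parameterized SQL statement with a PreparedStatementSetter:

```java
// Sketch: the WHERE clause takes a runtime value from jobParameters,
// bound through a PreparedStatementSetter.
@Bean
@StepScope
public JdbcCursorItemReader<Student> databaseItemReader(
        DataSource dataSource,
        @Value("#{jobParameters['lname']}") String lname) {
    JdbcCursorItemReader<Student> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql("SELECT fname, lname, address FROM tbl_student WHERE lname = ?");
    reader.setPreparedStatementSetter(ps -> ps.setString(1, lname));
    reader.setRowMapper(new BeanPropertyRowMapper<>(Student.class));
    return reader;
}
```

The same can be expressed in the XML configuration by setting the `preparedStatementSetter` property on the reader bean.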