spring-batch

Spring Batch and Spring Integration

Submitted by 怎甘沉沦 on 2019-12-01 09:31:55
I want to use Spring Batch and Spring Integration to import data from a database, write it to a file, and FTP it to a remote server. But I guess my problem is that I don't want to create a domain object for my table. My queries are random, and I want something that just reads the data, writes it to files, and transfers them. Can I use Spring Batch and Integration without creating the respective domain objects? Absolutely. You can use either of the JDBC ItemReaders or the JPA ItemReader with a ColumnMapRowMapper to retrieve a Map of the result set. You can use the FlatFileItemWriter pretty simply to …
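The combination suggested in the answer can be sketched without any domain object: a JdbcCursorItemReader configured with a ColumnMapRowMapper yields each row as a Map<String, Object>, and a FlatFileItemWriter turns each map into a line. The helper below is a minimal, framework-free sketch of only the row-to-line step; the class and method names are illustrative, not Spring Batch API.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal sketch: format one JDBC result-set row (as produced by
// ColumnMapRowMapper) into a delimited line, with no domain object.
public class RowToLine {

    public static String toLine(Map<String, Object> row) {
        // ColumnMapRowMapper returns a map that preserves column order,
        // so joining the values reproduces the SELECT column order.
        return row.values().stream()
                  .map(String::valueOf)
                  .collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", 1);
        row.put("name", "alice");
        System.out.println(toLine(row)); // prints "1,alice"
    }
}
```

In the real job this method would be the LineAggregator handed to FlatFileItemWriter, while the reader would be a JdbcCursorItemReader with its rowMapper set to new ColumnMapRowMapper().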

ArrayList cannot be cast to org.springframework.batch.core.JobParameter

Submitted by 放肆的年华 on 2019-12-01 09:20:07
Question: I want to send a list from a REST client to a REST web service that will start a job in Spring Batch. Is that possible, or must I save the list in a database/flat file before starting the job and read the input values from the database/flat file? I guess someone outlined how to do it in a certain Jira issue (see below), but I couldn't figure out even a basic idea of how to move forward. I have placed my controller below and how I am trying to cast it to JobParameter. I placed the Jira link and the possible direction …
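The ClassCastException in the title follows from JobParameter only supporting String, Long, Double, and Date, so an ArrayList cannot be passed directly. One common workaround, sketched below with illustrative names, is to flatten the list into a single String parameter on the client side and split it back out in the job:

```java
import java.util.Arrays;
import java.util.List;

// Sketch: flatten a List into a String JobParameter value and back,
// since JobParameter accepts only String/Long/Double/Date.
public class ListJobParameter {

    public static String flatten(List<String> values) {
        return String.join(",", values);
    }

    public static List<String> expand(String parameter) {
        return Arrays.asList(parameter.split(","));
    }

    public static void main(String[] args) {
        String p = flatten(List.of("a", "b", "c"));
        System.out.println(p);         // prints "a,b,c"
        System.out.println(expand(p)); // prints "[a, b, c]"
    }
}
```

The flattened string would then go into a JobParametersBuilder (e.g. addString("ids", flattened)), and a step-scoped reader can call expand() on the injected parameter.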

Calling Async REST api from spring batch processor

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-01 09:06:38
I wrote a Spring Batch job that processes a List of Lists. The reader returns a List of Lists. The processor works on each list item and returns the processed List. The writer writes stuff to the DB and SFTP from the List of Lists. I have a use case where I call an async REST API from the Spring Batch processor. On the ListenableFuture response I implemented a ListenableFutureCallback to handle success and failure, which works as expected, but before the async call returns anything, the ItemProcessor doesn't wait for callbacks from the async API and returns the object (List) to the writer. I am not sure how to implement and handle async calls from …
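Since process() returns before the callback fires, one straightforward fix is to block on the futures inside the processor until all REST responses arrive (Spring Batch's AsyncItemProcessor/AsyncItemWriter pair is the framework-native alternative). Below is a framework-free sketch of the blocking approach; the class name and the stand-in REST call are illustrative:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

// Sketch: wait for every async response before returning from the
// processor, so the writer only ever sees completed results.
public class BlockingProcessor {

    // Stand-in for the async REST call (illustrative, not a real client).
    static CompletableFuture<String> callRestAsync(String item) {
        return CompletableFuture.supplyAsync(item::toUpperCase);
    }

    public static List<String> process(List<String> items) {
        // Fire all calls first so they run concurrently...
        List<CompletableFuture<String>> futures = items.stream()
                .map(BlockingProcessor::callRestAsync)
                .collect(Collectors.toList());
        // ...then join() blocks until each response (or failure) arrives.
        return futures.stream()
                .map(CompletableFuture::join)
                .collect(Collectors.toList());
    }
}
```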

Job level Transactionality in Spring Batch

Submitted by 泄露秘密 on 2019-12-01 09:06:38
I know right now there is no such thing as inter-step transactionality in Spring Batch. I'm developing a complex batch job with many steps performing several actions in the database, and each one is related to the others in such a way that each of them belongs to the same transaction. The way I understand the Spring Batch paradigm, I'm bound to use a one-step job in order to have transactionality. Is there any thought (or any other way) to have some kind of job-level transactionality in later or future versions? Edit 1: I have found in this link, point 6.3.1, a way to concatenate several …
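Until job-level transactions exist, the one-step approach the question mentions can be made less painful by having a single tasklet delegate to one @Transactional service method, so every database action commits or rolls back together. A hedged sketch under Spring Batch 4 naming (bean and class names are illustrative):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Sketch only: all DB work for the job runs inside one transaction.
@Service
class AllInOneService {
    @Transactional
    public void runAllSteps() {
        // "step 1" actions...
        // "step 2" actions...
        // Any exception here rolls the whole batch of work back together.
    }
}

@Configuration
class OneStepJobConfig {
    @Bean
    public Step singleTransactionalStep(StepBuilderFactory steps,
                                        AllInOneService service) {
        return steps.get("singleTransactionalStep")
                .tasklet((contribution, chunkContext) -> {
                    service.runAllSteps();
                    return RepeatStatus.FINISHED;
                })
                .build();
    }
}
```

The trade-off is losing per-chunk restartability: the transaction spans everything, which is exactly what the question is asking for.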

Version Incompatibility between Spring batch and cloudera hadoop

Submitted by 南楼画角 on 2019-12-01 08:29:09
Question: I was trying the Spring Batch word-count program and faced a version issue like this: ERROR [org.springframework.batch.core.step.AbstractStep] - <Encountered an error executing the step> java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.Counter, but class was expected. I use Cloudera Hadoop 2 cdh4.5.0 and Spring Hadoop version 1.0.1.RELEASE. I can't identify the exact problem, as Spring Batch is compatible with Hadoop CDH4. My dependency tree is as shown below …
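The "Found interface org.apache.hadoop.mapreduce.Counter, but class was expected" error is the classic symptom of mixing MR1 and MR2 artifacts on the classpath: Counter is a class in Hadoop 1 but an interface in Hadoop 2, so code compiled against one fails at runtime against the other. A hedged sketch of the usual Maven-side fix: exclude the transitively pulled Hadoop 1 jar so only the CDH4 classes load (the exact coordinates to exclude depend on your own dependency tree):

```xml
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-hadoop</artifactId>
    <version>1.0.1.RELEASE</version>
    <exclusions>
        <exclusion>
            <!-- keep the MR1 jar out so the CDH4 (Hadoop 2) Counter wins -->
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

Running mvn dependency:tree and searching for hadoop-core vs. hadoop-common/hadoop-mapreduce-client artifacts shows which side is leaking in.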

Spring boot spring.batch.job.enabled=false not able to recognize

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-01 08:12:16
I tried spring.batch.job.enabled=false in application.properties and -Dspring.batch.job.enabled=false when running the jar file. However, @EnableBatchProcessing automatically starts running the batch jobs on application start. How can I debug such a scenario? TestConfiguration.class @Configuration @EnableBatchProcessing public class TestConfiguration {...} MainApplication @ComponentScan("com.demo") @EnableAutoConfiguration public class MainApplication { public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException, JobInstanceAlreadyCompleteException, …
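Two things worth checking: the property is only honored by Spring Boot's batch auto-configuration (so application.properties must actually be on the classpath the launcher reads), and java -Dspring.batch.job.enabled=false -jar app.jar only works if the -D flag comes before -jar; after -jar it should be --spring.batch.job.enabled=false. One way to rule out property-source problems is to set the flag programmatically; a sketch (the class name mirrors the question, the wiring is illustrative):

```java
import java.util.Collections;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.ComponentScan;

@ComponentScan("com.demo")
@EnableAutoConfiguration
public class MainApplication {
    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(MainApplication.class);
        // Same effect as spring.batch.job.enabled=false in
        // application.properties, but immune to classpath/flag-ordering issues.
        app.setDefaultProperties(
                Collections.singletonMap("spring.batch.job.enabled", "false"));
        app.run(args);
    }
}
```

If the jobs still run after this, something other than Boot's auto-run (e.g. an explicit JobLauncher call) is launching them.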

JSON array file reader with Spring Batch

Submitted by  ̄綄美尐妖づ on 2019-12-01 07:27:56
Question: I have a file as input which contains a JSON array: [ { ..., ... }, { ..., ... }, { ..., ... } ] I want to read it without breaking the Spring Batch principles (in the same way as FlatFileItemReader or StaxEventItemReader). I didn't find any way to do it with the readers already implemented in Spring Batch. What's the best way to implement this reader? Thanks in advance. Answer 1: Assuming you want to model the StaxEventItemReader in that you want to read each item of the JSON array as an item in Spring Batch, …
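Modeled loosely on StaxEventItemReader, the idea is to open a streaming parser on the array in open() and have read() return one element per call until the array ends (returning null signals end of input to Spring Batch). A hedged sketch using Jackson's streaming API; the class name is illustrative and error handling is trimmed:

```java
import java.io.IOException;
import java.util.Map;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.core.io.Resource;

// Sketch: stream a top-level JSON array, one object per read() call,
// without loading the whole file into memory.
public class JsonArrayItemReader implements ItemStreamReader<Map<String, Object>> {

    private final Resource resource;
    private final ObjectMapper mapper = new ObjectMapper();
    private JsonParser parser;

    public JsonArrayItemReader(Resource resource) {
        this.resource = resource;
    }

    @Override
    public void open(ExecutionContext ctx) throws ItemStreamException {
        try {
            parser = new JsonFactory().createParser(resource.getInputStream());
            if (parser.nextToken() != JsonToken.START_ARRAY) {
                throw new ItemStreamException("Expected a top-level JSON array");
            }
        } catch (IOException e) {
            throw new ItemStreamException(e);
        }
    }

    @Override
    public Map<String, Object> read() throws Exception {
        if (parser.nextToken() == JsonToken.START_OBJECT) {
            return mapper.readValue(parser, Map.class); // consumes one object
        }
        return null; // END_ARRAY: tells Spring Batch the input is exhausted
    }

    @Override
    public void update(ExecutionContext ctx) { }

    @Override
    public void close() throws ItemStreamException {
        try {
            if (parser != null) parser.close();
        } catch (IOException e) {
            throw new ItemStreamException(e);
        }
    }
}
```

Note that newer Spring Batch versions (4.1+) ship a built-in JsonItemReader that does essentially this, so on a recent version no custom reader is needed.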

Spring Batch JavaConfig - parameterize commit-interval aka chunk size

Submitted by 爱⌒轻易说出口 on 2019-12-01 07:10:42
Question: With Spring Batch XML-based configuration you can parameterize the commit-interval / chunk size like: <job id="basicSimpleJob" xmlns="http://www.springframework.org/schema/batch"> <step id="basicSimpleStep"> <tasklet> <chunk reader="reader" processor="processor" writer="writer" commit-interval="#{jobParameters['commit.interval']}"> </chunk> </tasklet> </step> </job> With JavaConfig-based configuration it could look like: @Bean public Step step( ItemStreamReader<Map<String, Object>> reader, …
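The JavaConfig equivalent of the XML's #{jobParameters[...]} is late binding: make the step bean job-scoped and inject the parameter with a SpEL @Value, so the chunk size is resolved when the job actually runs. A sketch under Spring Batch 4 naming (bean names are illustrative):

```java
import java.util.Map;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobScope;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ChunkSizeConfig {

    @Bean
    @JobScope // bean is created per job run, so jobParameters are resolvable
    public Step basicSimpleStep(
            StepBuilderFactory steps,
            ItemStreamReader<Map<String, Object>> reader,
            ItemProcessor<Map<String, Object>, Map<String, Object>> processor,
            ItemWriter<Map<String, Object>> writer,
            @Value("#{jobParameters['commit.interval']}") Long commitInterval) {
        return steps.get("basicSimpleStep")
                .<Map<String, Object>, Map<String, Object>>chunk(commitInterval.intValue())
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
```

Without @JobScope (or @StepScope) the @Value expression cannot be evaluated, because no job parameters exist when the singleton bean is built at startup.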

How to propagate an exception from a Spring Batch processor's process() method back to the method that started the job?

Submitted by 半世苍凉 on 2019-12-01 06:54:48
I have a web-service method that starts a Spring Batch job. If any exception occurs during Spring Batch processing, control comes back only as far as the processor's process() method. But I need control to return to the web-service method, where I have to catch the exception and write code to email it. Web-service method: public void processInputFiles() throws ServiceFault { String[] springConfig = { CONTEXT_FILE_NAME }; ApplicationContext context = new ClassPathXmlApplicationContext(springConfig); try { setClientInfo(); JobLauncher jobLauncher = (JobLauncher) context.getBean(JOB_LAUNCHER); Job job = (Job) …
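JobLauncher.run() returns normally even when a step fails, so the web-service method has to inspect the returned JobExecution and rethrow. A sketch of that check, which the processInputFiles() method could call right after jobLauncher.run(job, params) (the wrapping exception and helper name are illustrative):

```java
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;

// Sketch: surface step failures (including exceptions thrown from the
// processor's process() method) to the caller that launched the job.
public class JobFailureCheck {

    public static void rethrowIfFailed(JobExecution execution) throws Exception {
        if (execution.getStatus() == BatchStatus.FAILED) {
            // Every exception thrown by a step is collected here.
            for (Throwable t : execution.getAllFailureExceptions()) {
                throw new Exception("Job " + execution.getJobInstance()
                        .getJobName() + " failed", t); // wrap/email as needed
            }
        }
    }
}
```

In the web-service method, the rethrown exception can then be caught, wrapped in ServiceFault, and emailed, which is exactly the control flow the question asks for.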