spring-batch

Spring Batch autowired bean in writer not working when configured via Java-based job configuration

断了今生、忘了曾经 submitted on 2019-12-24 17:33:15
Question: I have to autowire an object into my writer class of a Spring Batch job. This has to be specific to the job (job scope). I am trying to configure it in my Spring Batch job Java configuration class and use it in an autowired fashion from my writer class. But this does not seem to work and I always get this bean as null in my writer class. What am I doing wrong? My job configuration class snippet: @org.springframework.context.annotation.Configuration @EnableBatchProcessing @Component public class
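The question is truncated, so the exact configuration isn't visible. A common cause of a null @Autowired field is that the writer is instantiated with `new` inside the step definition instead of being declared as a Spring-managed bean. A minimal sketch of the managed-bean approach, with hypothetical names (MyService, the writer lambda), might look like this:

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class JobConfig {

    // Hypothetical dependency the writer needs at runtime.
    public interface MyService {
        void handle(String item);
    }

    // Declaring the writer as a @Bean lets Spring inject MyService;
    // @StepScope defers creation until a step execution exists, so the
    // dependency is resolved when the job actually runs.
    @Bean
    @StepScope
    public ItemWriter<String> writer(MyService myService) {
        return items -> items.forEach(myService::handle);
    }
}
```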

Spring Batch job throws an error like 'Partition handler returned an unsuccessful step'

元气小坏坏 submitted on 2019-12-24 17:23:31
Question: We have a Spring Batch job which processes 100 million records in a multithreaded job scaled via partitioning. Here the master step creates 500 partitions, which are processed by 100 threads. But sometimes the job fails with just the following exception. If I rerun the job without any code change, it just works. Can someone explain what might be causing the issue in the slave step, which runs in a different thread, that makes the master step fail and stop processing further? 2015-09-11 17
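The failing configuration isn't shown in the excerpt. As a point of reference, a partitioned master step with 500 partitions processed by a 100-thread pool, roughly as described, could be wired like this (the partitioner and slaveStep beans are assumptions):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class PartitionedStepConfig {

    @Bean
    public Step masterStep(StepBuilderFactory steps, Partitioner partitioner, Step slaveStep) {
        return steps.get("masterStep")
                .partitioner("slaveStep", partitioner) // partitioner is expected to create ~500 partitions
                .step(slaveStep)
                .gridSize(500)
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(100);  // 100 worker threads, as in the question
        executor.setMaxPoolSize(100);
        executor.setThreadNamePrefix("partition-");
        return executor;
    }
}
```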

Spring Batch - web service to web service chunking

时间秒杀一切 submitted on 2019-12-24 15:50:37
Question: I have a hosted web service which allows pulling records in batches. This web service takes a starting record number (ROWID) and a page size (800 max) as parameters. There could be 50-60k records to pull from this service, and I then need to call another web service to post all this data again in chunks, without persisting the data in between. How could I use Spring Batch to pull the records page by page (chunking) by calling the web service, and how do I post the same records to the other web service? I was able to do this using
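The question is cut off before the attempted solution. One way to page through a web service in Spring Batch is a custom ItemReader that lazily fetches the next page by ROWID; the writer can then post each chunk to the target service. WsClient and Record below are hypothetical placeholders for the source service's client and payload type:

```java
import java.util.Iterator;
import java.util.List;
import org.springframework.batch.item.ItemReader;

// Hypothetical paging reader: pulls pages of up to 800 records from the source
// web service and hands them to the step one item at a time.
public class PagedWebServiceReader implements ItemReader<Record> {

    private static final int PAGE_SIZE = 800;

    private final WsClient client;      // hypothetical source web service client
    private long nextRowId = 1;
    private Iterator<Record> currentPage;

    public PagedWebServiceReader(WsClient client) {
        this.client = client;
    }

    @Override
    public Record read() {
        if (currentPage == null || !currentPage.hasNext()) {
            List<Record> page = client.fetch(nextRowId, PAGE_SIZE); // assumed client signature
            if (page.isEmpty()) {
                return null;            // null tells Spring Batch there is no more input
            }
            nextRowId += page.size();
            currentPage = page.iterator();
        }
        return currentPage.next();
    }
}
```

The writer side can then be a simple `ItemWriter<Record>` whose write method posts the received chunk to the second web service, with the step's chunk size controlling how many records are posted per call.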

Spring Batch or Spring Boot async method execution?

回眸只為那壹抹淺笑 submitted on 2019-12-24 14:53:01
Question: I have a situation where data is to be read from 4 different web services, processed, and the results stored in a database table, followed by a notification once the task is complete. The trigger for this process is a web service call. Should I write my job as a Spring Batch job, or write the whole read/process code as an async method (using @Async) which is called from the REST controller? Kindly suggest. Answer 1: In my opinion your choice should be @Async, because Spring Batch
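If the @Async route suggested in the answer is taken, a minimal sketch (service and method names are hypothetical) could look like this; the REST controller calls aggregate() and returns immediately while the work continues on another thread:

```java
import java.util.concurrent.CompletableFuture;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class AggregationService {   // requires @EnableAsync on a @Configuration class

    // Hypothetical async method: fetch from the four web services, process,
    // store the results in the database table, then send the notification.
    @Async
    public CompletableFuture<Void> aggregate() {
        // 1. call the four web services
        // 2. process/merge the responses
        // 3. persist the result rows
        // 4. send the completion notification
        return CompletableFuture.completedFuture(null);
    }
}
```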

Partitioned jobs using reply destination

邮差的信 submitted on 2019-12-24 14:16:22
Question: We have a JEE app that uses about 40 partitioned jobs on a cluster. It can be deployed on both JBoss and WebSphere. We are experiencing 2 problems: messaging system failures in both JBoss and WebSphere, typically related to temporary queue connection problems; and partitioned jobs effectively hung because of lost messages. I read a posting saying that switching the reply-destination of the outbound-gateway can improve robustness and allow for re-connection in the case of failures. The inbound-gateway
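The idea referred to is replacing the default temporary reply queue with a named reply destination. The question is about XML configuration, but a rough equivalent expressed with the Spring Integration Java DSL might look like the sketch below; the queue names and channel name are assumptions, and the exact builder methods should be checked against the Spring Integration JMS DSL version in use:

```java
import javax.jms.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class PartitionMessagingConfig {

    @Bean
    public IntegrationFlow partitionRequestsFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from("partitionRequests")          // hypothetical request channel
                .handle(Jms.outboundGateway(connectionFactory)
                        .requestDestination("partition.requests")
                        .replyDestination("partition.replies")      // explicit queue instead of a temporary one
                        .correlationKey("JMSCorrelationID"))
                .get();
    }
}
```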

Convert Message to Job to make it Spring Integration with Batch Processing

若如初见. submitted on 2019-12-24 13:29:15
Question: I am trying to process a series of files using Spring Integration in a batch fashion. I have this very old XML which tries to convert the messages into jobs: <int:transformer ref="messageToJobTransformer"/> <batch-int:job-launching-gateway job-launcher="jobLauncher"/> The messageToJobTransformer is a class which can convert a Message into a Job. The problem is I don't know where this file is now, nor do I want an XML config. I want it to be pure Java DSL. Here is my simple config: return
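The config in the question is truncated at `return`. A Java DSL flow that replaces both the messageToJobTransformer and the XML gateway could look roughly like the following; the input directory, job parameter name, and poller interval are assumptions:

```java
import java.io.File;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@Configuration
public class FileToJobFlowConfig {

    @Bean
    public IntegrationFlow fileToJobFlow(JobLauncher jobLauncher, Job job) {
        return IntegrationFlows
                .from(Files.inboundAdapter(new File("/in")),        // hypothetical input directory
                      e -> e.poller(Pollers.fixedDelay(1000)))
                // replaces messageToJobTransformer: build a JobLaunchRequest from the incoming file
                .transform(File.class, file -> new JobLaunchRequest(job,
                        new JobParametersBuilder()
                                .addString("input.file", file.getAbsolutePath())
                                .toJobParameters()))
                // replaces <batch-int:job-launching-gateway/>
                .handle(new JobLaunchingGateway(jobLauncher))
                .get();
    }
}
```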

Clone table from database to database

我的未来我决定 submitted on 2019-12-24 11:35:54
Question: I need a solution for getting a whole table from one DB instance (DB1) and creating the same table on another DB instance (DB2). Earlier I used Spring Integration extensively, but I heard that Spring Batch can fit better for such a case and I would like to try it. So, is it possible/does it make sense to use a Spring Batch job with the following steps: Create an empty table on DB2 having the same schema as the source table from DB1. Select from the DB1 table -> update the DB2 table. In case something goes wrong during the step
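For the copy step itself, a chunk-oriented step with a JDBC reader on DB1 and a JDBC batch writer on DB2 is a common shape. A minimal sketch follows; the data source qualifiers, table, and column names are assumptions:

```java
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.ColumnMapRowMapper;

@Configuration
public class TableCopyConfig {

    @Bean
    public JdbcCursorItemReader<Map<String, Object>> db1Reader(@Qualifier("db1DataSource") DataSource db1) {
        return new JdbcCursorItemReaderBuilder<Map<String, Object>>()
                .name("db1Reader")
                .dataSource(db1)
                .sql("SELECT id, name FROM source_table")      // hypothetical table/columns
                .rowMapper(new ColumnMapRowMapper())            // each row becomes a column->value map
                .build();
    }

    @Bean
    public JdbcBatchItemWriter<Map<String, Object>> db2Writer(@Qualifier("db2DataSource") DataSource db2) {
        return new JdbcBatchItemWriterBuilder<Map<String, Object>>()
                .dataSource(db2)
                .sql("INSERT INTO target_table (id, name) VALUES (:id, :name)")
                .columnMapped()                                 // map keys feed the named parameters
                .build();
    }
}
```

The "create an empty table on DB2" part could then be a preceding tasklet step that issues the DDL before this copy step runs.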

Spring Batch querying with state changes

回眸只為那壹抹淺笑 submitted on 2019-12-24 09:52:04
Question: I am using Spring Boot 1.5.7 with Spring Data JPA and Spring Batch. I use JpaPagingItemReader<T> to read entities and JpaItemWriter<T> to write them. What I am aiming to do is read data from a certain database table, convert it to a different format, and write it back to different tables (I read raw JSON strings, deserialize them, and insert them into their specific tables). I don't plan to delete the data I read after processing it; instead I just want to mark it as processed. The
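The question is truncated, but the reader side it describes could be set up roughly as below; RawRecord and the processed flag are hypothetical. Note that if the flag is flipped while the same step is still paging, already-read rows drop out of later pages, a known pitfall of paging readers whose query depends on data the step itself updates:

```java
import javax.persistence.EntityManagerFactory;
import org.springframework.batch.item.database.JpaPagingItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ProcessedFlagReaderConfig {

    // Minimal sketch: read only rows not yet marked as processed.
    @Bean
    public JpaPagingItemReader<RawRecord> rawRecordReader(EntityManagerFactory emf) {
        JpaPagingItemReader<RawRecord> reader = new JpaPagingItemReader<>();
        reader.setEntityManagerFactory(emf);
        reader.setQueryString(
                "SELECT r FROM RawRecord r WHERE r.processed = false ORDER BY r.id");
        reader.setPageSize(100);
        return reader;
    }
}
```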

DeadlockLoserDataAccessException in Spring Batch

走远了吗. submitted on 2019-12-24 08:47:13
Question: I am struggling to find a solution to this and I'm hoping that someone out there can help. We have a Spring/Hibernate/Wicket/Tomcat webapp. We use Spring Batch for executing jobs in the background. Some execute every minute and check a database table in an external system to see if there are new records. So there are several jobs (maybe 8 or so) executing on some fixed interval. For a couple of those jobs we have to do some manual queries to ensure there isn't a second one running
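The question continues past this excerpt, but a common mitigation when a chunk is chosen as the deadlock victim is to make the step fault-tolerant and retry on DeadlockLoserDataAccessException. A minimal sketch, where the step name, item type, and chunk size are assumptions:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.dao.DeadlockLoserDataAccessException;

@Configuration
public class RetryStepConfig {

    @Bean
    public Step syncStep(StepBuilderFactory steps,
                         ItemReader<ExternalRecord> reader,      // ExternalRecord is hypothetical
                         ItemWriter<ExternalRecord> writer) {
        return steps.get("syncStep")
                .<ExternalRecord, ExternalRecord>chunk(50)
                .reader(reader)
                .writer(writer)
                .faultTolerant()
                .retry(DeadlockLoserDataAccessException.class)   // re-run the chunk if it lost a deadlock
                .retryLimit(3)
                .build();
    }
}
```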

Spring Batch Integration job-launching-gateway

戏子无情 submitted on 2019-12-24 07:06:51
Question: I am working on a simple project that will launch a job when a new file is created in a specific folder, but I don't want to use XML, only Java annotations. So my question is: how can I implement the below in code? <batch-int:job-launching-gateway request-channel="outboundJobRequestChannel" reply-channel="jobLaunchReplyChannel"/> <int:logging-channel-adapter channel="jobLaunchReplyChannel"/> BR Shahbour Answer 1: Use the Spring Integration Java DSL; in your case, you would use ... .handle(jobLauncher())
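The answer is cut off at `.handle(jobLauncher())`. Expanding on that hint, the two XML elements map onto the Java DSL roughly as below, where the final log step stands in for the logging-channel-adapter; the request channel name is taken from the question, the rest is an assumption:

```java
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.handler.LoggingHandler;

@Configuration
public class JobLaunchFlowConfig {

    @Bean
    public IntegrationFlow jobLaunchFlow(JobLauncher jobLauncher) {
        return IntegrationFlows.from("outboundJobRequestChannel")
                // equivalent of <batch-int:job-launching-gateway/>
                .handle(new JobLaunchingGateway(jobLauncher))
                // equivalent of <int:logging-channel-adapter channel="jobLaunchReplyChannel"/>
                .log(LoggingHandler.Level.INFO, "jobLaunchReply")
                .get();
    }
}
```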