spring-batch

Is there any way to persist some data in the database after an exception occurs in an ItemWriter in Spring Batch?

泄露秘密 submitted on 2019-12-11 08:29:26
Question: I want to persist some data after an exception occurs in an ItemWriter step. But if I am not mistaken, a rollback happens in that case, so the persisting logic won't run. Is there any way to achieve this? Right now I am implementing an ItemWriteListener, and all the persisting logic is written inside the onWriteError method. This logic only changes the state of some entities to an error status. Answer 1: You'll want to use a separate connection that does not participate in the transaction. For…
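The approach in the answer can be sketched as follows. This is a minimal sketch, assuming a Spring-managed service bean (ErrorStateService is a hypothetical name) whose method runs with REQUIRES_NEW propagation, so the error-state update commits on its own connection even though the chunk transaction rolls back:

```java
import java.util.List;

import org.springframework.batch.core.ItemWriteListener;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

// Listener that marks failed items; registered on the step via .listener(...)
public class ErrorMarkingWriteListener implements ItemWriteListener<Object> {

    private final ErrorStateService errorStateService;

    public ErrorMarkingWriteListener(ErrorStateService errorStateService) {
        this.errorStateService = errorStateService;
    }

    @Override
    public void onWriteError(Exception exception, List<? extends Object> items) {
        // Runs in a NEW transaction (its own connection), so the update
        // survives the rollback of the failed chunk transaction.
        errorStateService.markAsError(items);
    }

    @Override
    public void beforeWrite(List<? extends Object> items) { }

    @Override
    public void afterWrite(List<? extends Object> items) { }
}

// Must be a separate Spring bean: calling a @Transactional method on 'this'
// would bypass the proxy and silently ignore REQUIRES_NEW.
class ErrorStateService {
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void markAsError(List<? extends Object> items) {
        // hypothetical: set each entity's status to ERROR via a repository
    }
}
```

The split into two beans matters: REQUIRES_NEW only takes effect when the call goes through Spring's transactional proxy.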

Spring Batch StaxEventItemReader throws an exception

萝らか妹 submitted on 2019-12-11 08:26:32
Question: When I run a Spring Batch project, an exception occurs. Exception detail: Caused by: java.lang.NullPointerException: null at org.springframework.batch.item.xml.StaxEventItemReader.moveCursorToNextFragment(StaxEventItemReader.java:141). The file name is correct! Configuration code: @Bean @StepScope public StaxEventItemReader xmlFileItemReader(@Value("#{jobParameters['fileType']}") String fileType, @Value("#{jobExecutionContext['extractFileName']}") String fileName) throws Exception { System.out…
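An NPE inside moveCursorToNextFragment usually means the reader was not fully configured, commonly a missing resource (e.g. the jobExecutionContext key was never populated by an earlier step) or an unset fragment root element. A sketch of a fully configured step-scoped reader, assuming a JAXB-annotated Employee class and an "employee" fragment element (both names are assumptions, not taken from the question):

```java
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.xml.StaxEventItemReader;
import org.springframework.batch.item.xml.builder.StaxEventItemReaderBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;

@Bean
@StepScope
public StaxEventItemReader<Employee> xmlFileItemReader(
        @Value("#{jobExecutionContext['extractFileName']}") String fileName) {
    Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
    marshaller.setClassesToBeBound(Employee.class); // Employee needs @XmlRootElement

    return new StaxEventItemReaderBuilder<Employee>()
            .name("xmlFileItemReader")
            .resource(new FileSystemResource(fileName)) // must resolve to an existing file
            .addFragmentRootElements("employee")        // must match the XML fragment tag
            .unmarshaller(marshaller)
            .build();
}
```

Note that jobExecutionContext values are only available if a previous step actually put the key there; otherwise fileName arrives as null and the reader fails at open time.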

Need a way to prevent unwanted job param from propagating to next execution of spring boot batch job

拜拜、爱过 submitted on 2019-12-11 08:24:37
Question: I am running a batch app using Spring Boot 2.1.2 and Spring Batch 4.1.1. The app uses a MySQL database for the Spring Batch metadata data source. First, I run the job with this command: java -jar target/batchdemo-0.0.1-SNAPSHOT.jar -Dspring.batch.job.names=echo com.paypal.batch.batchdemo.BatchdemoApplication myparam1=value1 myparam2=value2. Notice I am passing two params: myparam1=value1 and myparam2=value2. Since the job uses RunIdIncrementer, the actual params used by the app are logged as: Job:…
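RunIdIncrementer copies all of the previous execution's parameters forward and only bumps run.id, which is what propagates the unwanted params. One way around it is a custom JobParametersIncrementer that starts from an empty parameter set. A sketch (the class name is made up):

```java
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersIncrementer;

// Unlike RunIdIncrementer, this does NOT copy the previous execution's
// parameters; it carries only run.id forward.
public class FreshRunIdIncrementer implements JobParametersIncrementer {

    private static final String RUN_ID = "run.id";

    @Override
    public JobParameters getNext(JobParameters parameters) {
        Long runId = (parameters == null) ? null : parameters.getLong(RUN_ID);
        long next = (runId == null) ? 1L : runId + 1L;
        // Start from an empty builder instead of wrapping the old parameters
        return new JobParametersBuilder()
                .addLong(RUN_ID, next)
                .toJobParameters();
    }
}
```

Wire it in with .incrementer(new FreshRunIdIncrementer()) on the JobBuilder; parameters passed on the command line for the current launch are still merged in by the launcher as usual.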

ClassCastException in StaxEventItemReader

微笑、不失礼 submitted on 2019-12-11 08:18:27
Question: I have a problem with a ClassCastException. I created an XStreamItemEventReader in Spring Batch to read XML and write the result to a database. The reader reads XML which should be parsed into this class: @Data @AllArgsConstructor @NoArgsConstructor public class EmployeeDTO { private Long id; private String firstName; private String surname; private String email; private Integer age; } My item reader for XML is created in this class: public final class XMLReader<T> extends StaxEventItemReader<T>…
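A ClassCastException here typically means the unmarshaller produced a different type than the reader's declared generic, e.g. XStream returning its default mapping instead of EmployeeDTO. Aliasing the fragment element to the DTO usually fixes it. A sketch, assuming the fragment element is named "employee" and the file is employees.xml (both assumptions):

```java
import java.util.Collections;

import org.springframework.batch.item.xml.StaxEventItemReader;
import org.springframework.batch.item.xml.builder.StaxEventItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.ClassPathResource;
import org.springframework.oxm.xstream.XStreamMarshaller;

@Bean
public StaxEventItemReader<EmployeeDTO> employeeReader() {
    XStreamMarshaller marshaller = new XStreamMarshaller();
    // Map the <employee> fragment onto EmployeeDTO so unmarshalling
    // yields the DTO, not some other type that later fails the cast.
    marshaller.setAliases(Collections.singletonMap("employee", EmployeeDTO.class));

    return new StaxEventItemReaderBuilder<EmployeeDTO>()
            .name("employeeReader")
            .resource(new ClassPathResource("employees.xml"))
            .addFragmentRootElements("employee")
            .unmarshaller(marshaller)
            .build();
}
```

If a generic subclass like XMLReader<T> is used, the same rule applies: the marshaller's alias map, not the generic parameter (which is erased at runtime), determines the concrete type that comes out of read().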

Stopping a Job in the beforeStep in Spring Batch

旧城冷巷雨未停 submitted on 2019-12-11 07:38:02
Question: I want to be able to stop a job when a timing threshold is met. There are two approaches I was thinking about. The first was to stop the job in afterStep. However, I do not want it to have a STOPPED status if it is at the completion of the last step. Therefore, I am going with stopping it in beforeStep. I tried experimenting with public void beforeStep(StepExecution stepExecution) { stepExecution.setStatus(BatchStatus.STOPPED); return; } and public void beforeStep(StepExecution…
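Overwriting the status directly in beforeStep tends to be ignored, because the framework manages that field itself. A sketch of an alternative, asking the framework to stop via StepExecution.setTerminateOnly() when the job has run past a threshold (the listener name and threshold handling are assumptions):

```java
import java.time.Duration;
import java.util.Date;

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

public class TimeThresholdListener implements StepExecutionListener {

    private final Duration threshold;

    public TimeThresholdListener(Duration threshold) {
        this.threshold = threshold;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        Date jobStart = stepExecution.getJobExecution().getStartTime();
        long elapsedMs = System.currentTimeMillis() - jobStart.getTime();
        if (elapsedMs > threshold.toMillis()) {
            // Asks the framework to stop processing gracefully; the
            // execution ends as STOPPED and remains restartable.
            stepExecution.setTerminateOnly();
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }
}
```

Registered on every step, this checks the elapsed time before each step starts, so a job that finishes its last step within the threshold still completes with COMPLETED rather than STOPPED.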

Performance issue with MultiResourcePartitioner in Spring Batch

我的梦境 submitted on 2019-12-11 06:58:22
Question: I have a Spring Batch project that reads a huge zip file containing more than 100,000 XML files. I am using MultiResourcePartitioner, and I have a memory issue: my batch fails with java.lang.OutOfMemoryError: GC overhead limit exceeded. It seems as if all the XML files are loaded in memory and not garbage-collected after processing. Is there a performant way to do this? Thanks. Source: https://stackoverflow.com/questions/38793243/performance-issue-with-multiresourcepartitioner-in-spring-batch
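One contributing factor is that MultiResourcePartitioner eagerly creates one partition (and eventually one step execution) per resource, which is a lot of live metadata for 100,000 files. Where parallelism is not essential, MultiResourceItemReader is a lighter alternative: it streams through the resources one at a time, opening and closing the delegate per file. A sketch under assumed names (the item type, the unzip location, and the delegate bean are all hypothetical):

```java
import org.springframework.batch.item.file.MultiResourceItemReader;
import org.springframework.batch.item.xml.StaxEventItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

@Bean
public MultiResourceItemReader<EmployeeDTO> multiResourceReader(
        StaxEventItemReader<EmployeeDTO> delegate) throws Exception {
    Resource[] resources = new PathMatchingResourcePatternResolver()
            .getResources("file:/tmp/extracted/*.xml"); // assumed unzip location

    MultiResourceItemReader<EmployeeDTO> reader = new MultiResourceItemReader<>();
    reader.setResources(resources);
    // The delegate is opened for one file at a time and closed before the
    // next, so only one XML document is held in memory at once.
    reader.setDelegate(delegate);
    return reader;
}
```

The delegate must implement ResourceAwareItemReaderItemStream, which StaxEventItemReader does.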

How to set scheduler for Spring Batch jobs in Spring Cloud Data Flow?

风流意气都作罢 submitted on 2019-12-11 06:58:09
Question: I'm setting up a new Spring Batch job and want to deploy it using SCDF. However, I have found that SCDF does not support the scheduling feature on the local platform. I have three questions: Can someone explain how the SCDF scheduler works? Is there any way to schedule a job using SCDF? Can I use my local server as a Cloud Foundry, and how? Answer 1: That is correct: Spring Cloud Data Flow does not support scheduling on the local platform. Please note that the local SCDF server is for development purposes only…

Does Spring Batch acquire connections from the datasource for the whole job running time?

旧巷老猫 submitted on 2019-12-11 06:41:55
Question: Does Spring Batch acquire connections from the datasource for the whole job running time? In general, I have long-running steps in a Spring Batch job. During execution, Spring takes a connection from a datasource managed by C3P0, and when a step runs too long, C3P0 reclaims the connection via unreturnedConnectionTimeout, which prevents Spring from finishing its work with the DB. To manage this, I am considering refactoring the long-running tasklet steps to chunk-oriented ones, in the hope that Spring acquire…
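The refactoring being considered can be sketched like this. In a chunk-oriented step each chunk runs in its own transaction, so a connection is borrowed and returned around each commit rather than held for the whole step (all names below are placeholders, not from the question):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;

@Bean
public Step chunkedStep(StepBuilderFactory steps,
                        ItemReader<MyRecord> reader,
                        ItemWriter<MyRecord> writer) {
    return steps.get("chunkedStep")
            // One transaction per 100 items: the connection is used for the
            // duration of a chunk, not for the duration of the step.
            .<MyRecord, MyRecord>chunk(100)
            .reader(reader)
            .writer(writer)
            .build();
}
```

By contrast, a single tasklet execution is wrapped in one transaction, so a tasklet that runs for hours does hold its transactional connection the whole time, which is exactly what trips unreturnedConnectionTimeout.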

SPRING BATCH: How to configure remote chunking for multiple jobs running in a task executor

时光毁灭记忆、已成空白 submitted on 2019-12-11 06:12:45
Question: I am new to Spring Batch processing. I am using remote chunking, where there is a master, multiple slaves, and ActiveMQ for messaging. The master has a job and a job launcher, and the job launcher has a task executor with the following configuration: <task:executor id="batchJobExecutor" pool-size="2" queue-capacity="100" />. The chunk configuration is <bean id="chunkWriter" class="org.springframework.batch.integration.chunk.ChunkMessageChannelItemWriter" scope="step"> <property name=…
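For reference, the truncated XML chunkWriter bean has a Java-config equivalent along these lines. This is a sketch: the channel beans and the throttle values are assumptions, and the messaging template must be bound to the outbound channel feeding ActiveMQ:

```java
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.integration.chunk.ChunkMessageChannelItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.core.MessagingTemplate;
import org.springframework.messaging.PollableChannel;

@Bean
@StepScope
public ChunkMessageChannelItemWriter<MyItem> chunkWriter(
        MessagingTemplate messagingTemplate, PollableChannel replies) {
    ChunkMessageChannelItemWriter<MyItem> writer = new ChunkMessageChannelItemWriter<>();
    writer.setMessagingOperations(messagingTemplate); // sends chunks out to workers
    writer.setReplyChannel(replies);                  // receives acknowledgements back
    writer.setMaxWaitTimeouts(10);
    writer.setThrottleLimit(5); // at most 5 unacknowledged chunks in flight
    return writer;
}
```

Because the writer is step-scoped, each concurrently running job gets its own instance; the reply correlation is what must be kept distinct when several jobs share one broker.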

Spring Batch: what is the best way to use the data retrieved in one TaskletStep in the processing of another step?

安稳与你 submitted on 2019-12-11 06:09:28
Question: I have a job in which: The first step is a TaskletStep which retrieves some records (approx. 150-200) from a database table into a list. The second step retrieves data from some other table and requires the list of records retrieved in the previous step for processing. I came across three ways to do this: 1) putting the list retrieved in the first step into the StepExecutionContext and then promoting it to the JobExecutionContext to share data between steps; 2) using Spring's caching concept, i.e. using…
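Option 1 can be sketched with ExecutionContextPromotionListener. The tasklet stores the list under a key in the step's ExecutionContext, and the listener copies it to the JobExecutionContext after the step completes (the key "records" and the inline data are assumptions standing in for the real query):

```java
import java.util.Arrays;
import java.util.List;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.listener.ExecutionContextPromotionListener;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;

@Bean
public Step loadRecordsStep(StepBuilderFactory steps) {
    return steps.get("loadRecordsStep")
            .tasklet((contribution, chunkContext) -> {
                // stands in for the actual DB query returning ~150-200 rows
                List<String> records = Arrays.asList("r1", "r2", "r3");
                chunkContext.getStepContext().getStepExecution()
                        .getExecutionContext().put("records", records);
                return RepeatStatus.FINISHED;
            })
            .listener(promotionListener())
            .build();
}

@Bean
public ExecutionContextPromotionListener promotionListener() {
    ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
    // copied from the step's context to the job's context after the step ends
    listener.setKeys(new String[] {"records"});
    return listener;
}
```

A step-scoped component in the second step can then inject the list with @Value("#{jobExecutionContext['records']}"). Note that anything promoted this way is serialized into the batch metadata tables, which is fine for a couple of hundred records but not for large payloads.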