spring-batch

Using a JNDI datasource with Spring Batch Admin

Submitted by 强颜欢笑 on 2019-12-04 11:41:24
Question: When using Spring Batch Admin, it provides defaults for dataSource, transactionManager, etc. If you want to override these defaults, you create your own XML bean definitions under the META-INF/spring/batch/servlet/override/ folder, and during bootstrap it is guaranteed that the default properties will be overridden. In spring-batch-admin, the dataSource default is defined in data-source-context.xml with this definition: <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource
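To replace that default with a JNDI lookup, an override file such as META-INF/spring/batch/servlet/override/data-source-context.xml can redefine the bean (the id must stay `dataSource`). The following is a sketch; the JNDI name `java:comp/env/jdbc/batchDb` is a placeholder for your container's actual binding:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Overrides the BasicDataSource default; the id "dataSource" must match -->
    <bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
        <!-- Placeholder JNDI name: adjust to your container's binding -->
        <property name="jndiName" value="java:comp/env/jdbc/batchDb"/>
        <property name="resourceRef" value="true"/>
    </bean>
</beans>
```

With `resourceRef` set to true, the `java:comp/env/` prefix is prepended automatically if the name does not already contain it, so a plain `jdbc/batchDb` would also work in a servlet container.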

Spring Batch Meta-Data tables Purging

Submitted by 五迷三道 on 2019-12-04 06:38:48
Question: In a MySQL DB: 1. Does Spring Batch provide a way to purge the meta-data tables? 2. Or do we need to purge and archive the meta-data tables manually? 3. How are the Spring meta-data tables kept manageable in a PROD environment without a purging mechanism? Need guidance on this! Answer 1: I have been struggling with this for a long time, but there is no standard implementation for it. Then I came up with my own stored procedure; I created my own variable for clearing the last 6 months of data, AGO_SIX_MONTH
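Since Spring Batch ships no purge API, a scheduled SQL script that deletes old rows child-tables-first is a common approach. Below is a sketch for MySQL against the standard `BATCH_` schema; the six-month cutoff mirrors the answer, but verify the table set against the schema version you actually run:

```sql
-- Delete executions that ended more than six months ago,
-- children first to satisfy the foreign keys.
SET @cutoff = DATE_SUB(NOW(), INTERVAL 6 MONTH);

DELETE sec FROM BATCH_STEP_EXECUTION_CONTEXT sec
  JOIN BATCH_STEP_EXECUTION se ON sec.STEP_EXECUTION_ID = se.STEP_EXECUTION_ID
  JOIN BATCH_JOB_EXECUTION je ON se.JOB_EXECUTION_ID = je.JOB_EXECUTION_ID
 WHERE je.END_TIME < @cutoff;

DELETE se FROM BATCH_STEP_EXECUTION se
  JOIN BATCH_JOB_EXECUTION je ON se.JOB_EXECUTION_ID = je.JOB_EXECUTION_ID
 WHERE je.END_TIME < @cutoff;

DELETE jec FROM BATCH_JOB_EXECUTION_CONTEXT jec
  JOIN BATCH_JOB_EXECUTION je ON jec.JOB_EXECUTION_ID = je.JOB_EXECUTION_ID
 WHERE je.END_TIME < @cutoff;

DELETE jep FROM BATCH_JOB_EXECUTION_PARAMS jep
  JOIN BATCH_JOB_EXECUTION je ON jep.JOB_EXECUTION_ID = je.JOB_EXECUTION_ID
 WHERE je.END_TIME < @cutoff;

DELETE FROM BATCH_JOB_EXECUTION WHERE END_TIME < @cutoff;

-- Finally remove job instances that no longer have any executions.
DELETE ji FROM BATCH_JOB_INSTANCE ji
  LEFT JOIN BATCH_JOB_EXECUTION je ON ji.JOB_INSTANCE_ID = je.JOB_INSTANCE_ID
 WHERE je.JOB_EXECUTION_ID IS NULL;
```

If you need to archive rather than purge, `INSERT INTO ... SELECT` into archive tables before each DELETE, inside one transaction.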

Limit the lifetime of a batch job

Submitted by 流过昼夜 on 2019-12-04 06:37:14
Question: Is there a way to limit the lifetime of a running spring-batch job to e.g. 23 hours? We start a batch job daily via a cron job, and the job takes about 9 hours. Under some circumstances the DB connection was so slow that the job took over 60 hours to complete. The problem is that the next job instance gets started by the cron job the next day - and then another one the day after - and another one... If the job is not finished within e.g. 23 hours, I want to terminate it and return
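Spring Batch has no built-in job timeout, so a common pattern is a watchdog (for example a second cron entry) that looks up running executions via `JobExplorer.findRunningJobExecutions(jobName)` and calls `JobOperator.stop(executionId)` on any that exceed the limit. The age check itself is plain Java; a minimal sketch, with the 23-hour limit taken from the question:

```java
import java.time.Duration;
import java.time.Instant;

// Minimal sketch of the age check a watchdog would apply to each
// running JobExecution's start time before calling JobOperator.stop().
public class JobLifetimeCheck {

    static final Duration MAX_LIFETIME = Duration.ofHours(23);

    // Returns true when the execution has been running longer than the limit.
    static boolean exceededLifetime(Instant startTime, Instant now) {
        return Duration.between(startTime, now).compareTo(MAX_LIFETIME) > 0;
    }

    public static void main(String[] args) {
        Instant now = Instant.now();
        System.out.println(exceededLifetime(now.minus(Duration.ofHours(60)), now)); // true
        System.out.println(exceededLifetime(now.minus(Duration.ofHours(9)), now));  // false
    }
}
```

Note that `stop()` is cooperative: it flags the execution and the job ends at the next chunk boundary, so a step stuck on a single slow query may still need a DB-side statement timeout as well.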

Spring Batch integration: MessageSource<InputStream> to JobLaunchRequest

Submitted by 别等时光非礼了梦想. on 2019-12-04 06:25:51
Question: I am planning to use the S3 streaming message source to process an import file (XML) received in S3. I am not sure how to transform the MessageSource into a job launch request, since job parameters don't support anything other than primitive types; please throw some light on how to proceed - Thanks Answer 1: If you mean you want to pass the InputStream payload to an ItemReader , no, that can't be done with a JobLauncher . Instead, the ItemReader itself needs to open the input stream (perhaps using a
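In other words: pass the S3 object key (a String) as a job parameter and let the reader open the stream itself. A sketch of the transformer side, assuming spring-batch-integration's `JobLaunchRequest`; the job name `importJob` and parameter keys are placeholders:

```java
// Sketch: turn a message carrying an S3 object key into a JobLaunchRequest.
// Assumes spring-batch-integration on the classpath; "importJob" and the
// parameter names are placeholders, not from the original question.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.integration.launch.JobLaunchRequest;

public class FileMessageToJobRequest {

    private Job importJob; // injected elsewhere

    public JobLaunchRequest toRequest(String s3Key) {
        JobParameters parameters = new JobParametersBuilder()
                // Pass the key, not the InputStream: parameters must be
                // primitives/Strings/Dates.
                .addString("input.file.key", s3Key)
                // Unique parameter so each file creates a new JobInstance.
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters();
        return new JobLaunchRequest(importJob, parameters);
    }
}
```

A step-scoped reader can then receive the key via `@Value("#{jobParameters['input.file.key']}")` and open the S3 stream in its `open()` callback.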

Spring Batch DelimitedLineTokenizer class quoteCharacter property behavior

Submitted by 蓝咒 on 2019-12-04 06:24:30
Question: I have an item reader as below: <beans:bean id="myItemReader" class="org.springframework.batch.item.file.FlatFileItemReader"> <beans:property name="resource" ref="myFileResource" /> <beans:property name="lineMapper"> <beans:bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper"> <beans:property name="lineTokenizer"> <beans:bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer"> <beans:property name="delimiter" value="|"/> <beans:property name=
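For reference, a sketch of the tokenizer definition with `quoteCharacter` set explicitly (the double quote shown is also the tokenizer's default; the field names are placeholders):

```xml
<beans:bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
    <beans:property name="delimiter" value="|"/>
    <!-- Fields wrapped in this character may contain the delimiter;
         the surrounding quotes are stripped from the resulting token,
         and a doubled quote inside a quoted field is unescaped. -->
    <beans:property name="quoteCharacter" value="&quot;"/>
    <!-- Placeholder field names -->
    <beans:property name="names" value="field1,field2,field3"/>
</beans:bean>
```

So a line like `a|"b|c"|d` tokenizes to three fields: `a`, `b|c`, `d`.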

Retry not working with FaultTolerantStepBuilder

Submitted by 只愿长相守 on 2019-12-04 06:22:06
Question: I am trying to build retry into my error-prone ItemReader. I have set up a small POC to test that everything works, but I am not able to get retries running. Here is what I did: @Configuration @EnableBatchProcessing public static class TestConfiguration { // other beans @Bean @Qualifier("importFullJob") public Job importFullJob(ItemReader itemReader) { TaskletStep mockStep = stepBuilderFactory.get("mockStep") .chunk(1) .faultTolerant() .retry(RestClientException.class) .retryLimit(10)
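One thing worth checking here: in a fault-tolerant step, retry applies to the processor and writer, not to exceptions thrown by the ItemReader. A common workaround is to wrap the read itself with spring-retry's `RetryTemplate`; a sketch (the delegate and class name are placeholders, and spring-retry is assumed on the classpath):

```java
// Sketch: retrying reads by wrapping a delegate reader with spring-retry.
// RestClientException matches the exception class from the question.
import org.springframework.batch.item.ItemReader;
import org.springframework.retry.support.RetryTemplate;

public class RetryingItemReader<T> implements ItemReader<T> {

    private final ItemReader<T> delegate;
    private final RetryTemplate retryTemplate;

    public RetryingItemReader(ItemReader<T> delegate, RetryTemplate retryTemplate) {
        this.delegate = delegate;
        this.retryTemplate = retryTemplate;
    }

    @Override
    public T read() throws Exception {
        // Each read is retried according to the template's policy,
        // e.g. a SimpleRetryPolicy with 10 attempts on RestClientException.
        return retryTemplate.execute(context -> delegate.read());
    }
}
```

The fault-tolerant `.retry(...)` / `.retryLimit(...)` settings can stay for processor/writer failures; the wrapper covers the reader side.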

Spring Batch after JVM crash

Submitted by 此生再无相见时 on 2019-12-04 05:45:26
How to restart Jobs after a JVM crash? I was running a lot of Jobs implemented in the Spring Batch framework when my JVM crashed or the system failed. How can I restart these Jobs after the failure? Basically, you can do as follows: (1) Configure a JobExplorer factory bean in your application context. (2) Configure a JobOperator bean in your application context. (3) Query the jobExplorer for the distinct job names: jobExplorer.getJobNames() (4) For each job from step (3), query the jobExplorer for unfinished jobs: jobExplorer.findRunningJobExecutions(String jobName) (5) For each JobExecution from step (4), invoke: jobOperator
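Put together, the steps above look roughly like this. A sketch, not a drop-in implementation: after a JVM crash the stale executions are still marked STARTED in the repository, so they must be marked FAILED before `restart()` is permitted (check the JobOperator/JobRepository javadoc for your version):

```java
// Sketch of the recovery steps listed above. Assumes jobExplorer,
// jobOperator and jobRepository are configured beans, injected elsewhere.
import java.util.Date;
import java.util.Set;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.batch.core.repository.JobRepository;

public class CrashRecovery {

    private JobExplorer jobExplorer;
    private JobOperator jobOperator;
    private JobRepository jobRepository;

    public void restartUnfinishedJobs() throws Exception {
        for (String jobName : jobExplorer.getJobNames()) {
            Set<JobExecution> stale = jobExplorer.findRunningJobExecutions(jobName);
            for (JobExecution execution : stale) {
                // After a crash the execution is still marked STARTED;
                // mark it FAILED so that restart() is allowed.
                execution.setStatus(BatchStatus.FAILED);
                execution.setEndTime(new Date());
                jobRepository.update(execution);
                jobOperator.restart(execution.getId());
            }
        }
    }
}
```

This relies on the job being restartable and on steps being written to resume from their saved execution context, otherwise restarted steps reprocess from the beginning.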

Spring Batch :Aggregated reader / writer Issue

Submitted by 六月ゝ 毕业季﹏ on 2019-12-04 05:19:51
Question: I am trying to use Spring Batch to implement an aggregated reader (a batch file where multiple records should be treated as one record when writing). Here is the code snippet for my reader: public class AggregatePeekableReader implements ItemReader<List<T>>, ItemStream { private SingleItemPeekableItemReader<T> reader; private boolean process(T currentRecord , InvoiceLineItemsHolder holder) throws UnexpectedInputException, ParseException, Exception { next = peekNextInvoiceRecord(); // finish
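The usual shape for such an aggregating reader is to keep consuming records while the peeked record still belongs to the current group. A sketch built on `SingleItemPeekableItemReader`; `belongsToSameGroup` is a placeholder for the invoice-grouping rule from the question:

```java
// Sketch: read records until the peeked record starts a new group,
// returning each group as one List item.
import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;

public class AggregateReader<T> implements ItemReader<List<T>> {

    private SingleItemPeekableItemReader<T> delegate;

    @Override
    public List<T> read() throws Exception {
        T current = delegate.read();
        if (current == null) {
            return null; // end of input ends the step
        }
        List<T> group = new ArrayList<>();
        group.add(current);
        // Peek ahead; consume only while the next record is in the same group.
        for (T next = delegate.peek();
                next != null && belongsToSameGroup(current, next);
                next = delegate.peek()) {
            group.add(delegate.read());
        }
        return group;
    }

    private boolean belongsToSameGroup(T first, T candidate) {
        // Placeholder: compare e.g. invoice numbers here.
        return false;
    }
}
```

For restartability, register the peekable delegate as a stream on the step (it implements ItemStream), so its position is saved in the execution context.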

Remote partition - slave getting greedy

Submitted by 做~自己de王妃 on 2019-12-04 04:48:10
Question: The following is what we are trying to achieve. We want a big XML file to be staged into a database in parallel on different VMs. To achieve this, we are using the scalable Spring Batch remote partitioning approach, and we are running into some issues. The high-level setup is: master - splits the XML file into multiple partitions (we currently have a grid size of 3); slave 1 - processes partitions (reads index-based partitions and writes to the DB); slave 2 - processes partitions. We are running it
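A frequent cause of one slave "getting greedy" is consumer prefetch: with a default prefetch, a single listener can buffer several StepExecutionRequests before the other slaves see any. If the middleware is RabbitMQ via Spring AMQP (an assumption; the question does not name the transport, so adjust to yours), limiting each consumer to one unacknowledged message spreads the partitions. A sketch, with placeholder bean and queue names:

```xml
<!-- Sketch for Spring AMQP: one unacked partition request per consumer,
     so partitions distribute across slave VMs instead of piling up on one. -->
<bean id="partitionContainer"
      class="org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer">
    <property name="connectionFactory" ref="connectionFactory"/>
    <!-- Placeholder queue name -->
    <property name="queueNames" value="partition.requests"/>
    <property name="prefetchCount" value="1"/>
    <property name="concurrentConsumers" value="1"/>
</bean>
```

Since each partition here is long-running, a prefetch of 1 costs essentially nothing in throughput while keeping the work balanced across the grid.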

PoiItemReader with StepScope annotation does not read Excel file

Submitted by 梦想的初衷 on 2019-12-04 04:43:37
Question: I want to pass a JobParameter in Spring Batch to my PoiItemReader to locate the Excel file path, so I have to use the annotation @StepScope : @StepScope @Bean ItemReader<StudentDTO> excelStudentReader( @Value("#{jobParameters[filePath]}") String filePath) { PoiItemReader<StudentDTO> reader = new PoiItemReader<>(); reader.setResource(new ClassPathResource(filePath)); reader.setRowMapper(new StudentExcelRowMapper()); return reader; } The job launched without exceptions, but the reader does not read
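A likely culprit: when a @StepScope bean method is declared with the interface return type ItemReader, the generated scoped proxy implements only ItemReader, so the ItemStream `open()` callback never runs and no rows are read. Declaring the concrete type fixes this. Also, a path supplied via jobParameters is usually a filesystem path, so FileSystemResource tends to fit better than ClassPathResource. A sketch of the adjusted factory method (StudentDTO and StudentExcelRowMapper are from the question; whether your file is on disk or the classpath is an assumption to verify):

```java
// Sketch: return the concrete PoiItemReader so the step-scoped proxy
// also implements ItemStream and open() is actually called.
// Goes in the same @Configuration class as the original bean method.
@StepScope
@Bean
public PoiItemReader<StudentDTO> excelStudentReader(
        @Value("#{jobParameters[filePath]}") String filePath) {
    PoiItemReader<StudentDTO> reader = new PoiItemReader<>();
    // Assumption: filePath is an absolute path on disk, not on the classpath.
    reader.setResource(new FileSystemResource(filePath));
    reader.setRowMapper(new StudentExcelRowMapper());
    return reader;
}
```

If the reader is instead registered manually on the step, it must also be registered as a stream (`.stream(reader)`) so that `open()` is invoked.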