spring-batch

Version incompatibility between Spring Batch and Cloudera Hadoop

有些话、适合烂在心里 submitted on 2019-12-01 12:03:34
I was trying the Spring Batch word count program and ran into a version issue like this:

ERROR [org.springframework.batch.core.step.AbstractStep] - <Encountered an error executing the step> java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.Counter, but class was expected

I use Cloudera Hadoop 2 cdh4.5.0 and Spring for Apache Hadoop version 1.0.1.RELEASE. I can't identify the exact problem, as Spring Batch is supposed to be compatible with Hadoop CDH4. My dependency tree is shown below.

[INFO] org.springframework.data:batch-wordcount:jar:0.0.1
[INFO] +- org.springframework:spring

How to configure Spring Batch not to auto-create batch tables for storing metadata?

丶灬走出姿态 submitted on 2019-12-01 11:43:46
Question: I'm working on a Spring Batch job which uses JPA to perform CRUD operations on a Postgres database. I'm using Spring Boot 2.1.3. Even though I added the configuration below to stop Spring Batch from using my Postgres database for storing batch job metadata, I'm getting an "ERROR: relation "batch_job_instance" does not exist" exception as shown below. I have also followed the solution mentioned here. Can anyone please suggest what additional things need to be done? hibernate.temp.use
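A commonly suggested workaround for Spring Boot 2.x / Spring Batch 4.x (not shown in the truncated post above) is to keep the batch metadata out of the application database by overriding DefaultBatchConfigurer so that no DataSource reaches the job repository; Spring Batch then falls back to a map-based, in-memory repository. A minimal sketch, assuming default Spring Boot auto-configuration otherwise:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.context.annotation.Configuration;

@Configuration
public class InMemoryBatchConfig extends DefaultBatchConfigurer {

    // By not handing a DataSource to the parent class, DefaultBatchConfigurer
    // falls back to a map-based (in-memory) JobRepository, so no BATCH_*
    // tables are created or queried in the application database.
    @Override
    public void setDataSource(DataSource dataSource) {
        // intentionally empty
    }
}

Note that the spring.batch.initialize-schema=never property only prevents the schema from being created; the BATCH_* tables would still be queried at runtime, which is consistent with the "relation does not exist" error above.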

Spring Batch: Listener event when Tasklet throws an exception

风流意气都作罢 submitted on 2019-12-01 11:39:48
I'm using a tasklet and a StepExecutionListener, but it seems there's no listener callback for the case where my tasklet throws an exception. For various other listener types – ChunkListener, ItemProcessListener, etc. – there is, but none of those listeners work with tasklets. All I want is an event after my tasklet executes, regardless of whether it threw an exception or not. Is it possible to do that? It doesn't appear to be supported in the API. Edit: Responding to @danidemi: I'm registering the listener and tasklet using the programmatic API like this: steps.get(name) .listener(listener)
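For what it's worth (this is not from the original post), StepExecutionListener.afterStep is normally invoked even when the step fails, and any exception thrown by the tasklet is recorded on the StepExecution, so one option is to inspect it there. A minimal sketch:

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

public class TaskletOutcomeListener implements StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // nothing to do before the tasklet runs
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // Called after the step finishes, whether the tasklet completed or failed;
        // exceptions thrown by the tasklet end up in getFailureExceptions().
        if (!stepExecution.getFailureExceptions().isEmpty()) {
            Throwable cause = stepExecution.getFailureExceptions().get(0);
            System.err.println("Tasklet failed: " + cause.getMessage());
        }
        return stepExecution.getExitStatus();
    }
}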

Spring Batch JdbcPagingItemReader does not seem to read all the items

。_饼干妹妹 submitted on 2019-12-01 10:56:51
I'm working on an app that extracts records from an Oracle database and exports them as a single tab-delimited file. However, when I attempt to read from the DB using JdbcPagingItemReader and write to a file, I only get the number of records specified in pageSize. So if the pageSize is 10, then I get a file with 10 lines and the rest of the records seem to be ignored. So far, I haven't been able to find out what is really going on, and any help would be most welcome. Here is the JdbcPagingItemReader config: <bean id="databaseItemReader" class="org.springframework.batch.item.database
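A frequent cause of this symptom (not confirmed in the truncated post above) is a paging query whose sort key is missing or not unique, which makes every page after the first come back empty or wrong. For comparison, here is a minimal Java-config sketch of a JdbcPagingItemReader with an explicit, unique sort key; the table and column names are made up for illustration:

import java.util.Map;

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.jdbc.core.ColumnMapRowMapper;

public class ReaderConfig {

    public JdbcPagingItemReader<Map<String, Object>> databaseItemReader(DataSource dataSource) throws Exception {
        // The sort key must identify rows uniquely, otherwise paging can
        // skip or repeat records between pages.
        SqlPagingQueryProviderFactoryBean provider = new SqlPagingQueryProviderFactoryBean();
        provider.setDataSource(dataSource);
        provider.setSelectClause("SELECT ID, NAME");
        provider.setFromClause("FROM CUSTOMER");
        provider.setSortKey("ID");

        JdbcPagingItemReader<Map<String, Object>> reader = new JdbcPagingItemReader<>();
        reader.setName("databaseItemReader");
        reader.setDataSource(dataSource);
        reader.setQueryProvider(provider.getObject());
        reader.setPageSize(10);
        reader.setRowMapper(new ColumnMapRowMapper());
        return reader;
    }
}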

ArrayList cannot be cast to org.springframework.batch.core.JobParameter

本秂侑毒 submitted on 2019-12-01 10:54:05
I want to send a list from a REST client to a REST web service, which will start a job in Spring Batch. Is that possible, or must I save the list to a database/flat file before starting the job and read the input values from the database/flat file? I think someone pointed out how to do it in a certain Jira issue (see below), but I couldn't figure out even a basic idea of how to move forward. I have placed below my controller and how I am trying to cast the list to a JobParameter. I have also placed the Jira link and the possible direction I should perhaps take, but I really didn't understand the suggestion in this Jira issue. I added below
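As background for the error in the title (not from the original post): JobParameter values can only be String, Long, Double or Date, so a list cannot be passed directly. One common workaround is to serialize the list into a single String parameter and split it again inside the job. A minimal sketch with a made-up parameter name:

import java.util.Arrays;
import java.util.List;

import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;

public class JobParameterExample {

    public static JobParameters toJobParameters(List<String> ids) {
        // JobParameter only supports simple types, so the list is flattened
        // into one comma-separated String.
        return new JobParametersBuilder()
                .addString("ids", String.join(",", ids))
                .addLong("run.id", System.currentTimeMillis()) // keeps each launch unique
                .build();
    }

    public static void main(String[] args) {
        JobParameters params = toJobParameters(Arrays.asList("1", "2", "3"));
        System.out.println(params.getString("ids")); // prints 1,2,3
    }
}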

Spring Batch Java config - parameterize commit-interval aka chunk size

試著忘記壹切 submitted on 2019-12-01 10:31:40
With Spring Batch XML-based configuration you can parameterize the commit-interval / chunk size like this: <job id="basicSimpleJob" xmlns="http://www.springframework.org/schema/batch"> <step id="basicSimpleStep" > <tasklet> <chunk reader="reader" processor="processor" writer="writer" commit-interval="#{jobParameters['commit.interval']}"> </chunk> </tasklet> </step> </job> With Java-config-based configuration it could look like: @Bean public Step step( ItemStreamReader<Map<String, Object>> reader, ItemWriter<Map<String, Object>> writer, @Value("#{jobParameters['commit.interval']}") Integer
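Because the chunk size is fixed when the step bean is built, one workaround (not from the original post) is to pass a step-scoped CompletionPolicy whose size is resolved from the job parameters at execution time. A minimal sketch with made-up bean names:

import java.util.Map;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.repeat.CompletionPolicy;
import org.springframework.batch.repeat.policy.SimpleCompletionPolicy;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ChunkSizeConfig {

    @Bean
    @StepScope
    public CompletionPolicy chunkPolicy(
            @Value("#{jobParameters['commit.interval']}") Integer commitInterval) {
        // Resolved when the step runs, so the job parameter is available.
        return new SimpleCompletionPolicy(commitInterval);
    }

    @Bean
    public Step basicSimpleStep(StepBuilderFactory steps,
                                ItemReader<Map<String, Object>> reader,
                                ItemWriter<Map<String, Object>> writer,
                                CompletionPolicy chunkPolicy) {
        // chunk(CompletionPolicy) lets the policy decide the chunk size
        // instead of a hard-coded commit interval.
        return steps.get("basicSimpleStep")
                .<Map<String, Object>, Map<String, Object>>chunk(chunkPolicy)
                .reader(reader)
                .writer(writer)
                .build();
    }
}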

Performance optimization for processing 115 million records and inserting them into Oracle

蓝咒 submitted on 2019-12-01 10:04:01
Question: I have a requirement where I am reading a text file placed on Unix, about 19 GB in size and containing around 115 million records. My Spring Batch job (launcher) is triggered by Autosys and a shell script once the file is placed in the location. Initially this process took around 72 hours to read, process (null checks and date parsing) and write the data into the Oracle database. But after certain configuration changes, like using a throttle limit, task executor etc., I was able to reduce
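For context (not from the original post), the throttle limit / task executor tuning mentioned above usually means running the chunk-oriented step on multiple threads. A minimal sketch of that configuration, with made-up bean names and sizes; note that the reader must be thread-safe for this to be correct:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class MultiThreadedStepConfig {

    @Bean
    public Step loadStep(StepBuilderFactory steps,
                         ItemReader<String> reader,
                         ItemWriter<String> writer) {
        return steps.get("loadStep")
                // larger chunks mean fewer commits against Oracle
                .<String, String>chunk(1000)
                .reader(reader)
                .writer(writer)
                // process chunks concurrently on several threads
                .taskExecutor(new SimpleAsyncTaskExecutor("load-"))
                // cap the number of concurrent chunks
                .throttleLimit(10)
                .build();
    }
}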

JSON array file reader with Spring Batch

爷,独闯天下 submitted on 2019-12-01 09:49:56
I have a file as input which contains a JSON array: [ { ..., ... }, { ..., ... }, { ..., ... } ] I want to read it without breaking the Spring Batch principles (in the same way as FlatFileReader or XmlReader). I didn't find any way to do it with the readers already implemented in Spring Batch. What's the best way to implement this reader? Thanks in advance. Assuming you want to model the StaxEventItemReader in that you want to read each item of the JSON array as an item in Spring Batch, here's what I'd recommend: RecordSeparatorPolicy - You'll need to implement your own RecordSeparatorPolicy
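The truncated answer above describes building a custom reader on top of the flat-file infrastructure. As a side note (not part of that answer, which predates the feature), Spring Batch 4.1+ also ships a JsonItemReader that treats each element of a JSON array as one item. A minimal sketch, assuming a made-up Item class and file name:

import org.springframework.batch.item.json.JacksonJsonObjectReader;
import org.springframework.batch.item.json.JsonItemReader;
import org.springframework.batch.item.json.builder.JsonItemReaderBuilder;
import org.springframework.core.io.FileSystemResource;

public class JsonReaderConfig {

    // Plain POJO matching the fields of each JSON object (illustrative).
    public static class Item {
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    public JsonItemReader<Item> jsonItemReader() {
        return new JsonItemReaderBuilder<Item>()
                .name("jsonItemReader")
                // Jackson streams the array and maps one element at a time.
                .jsonObjectReader(new JacksonJsonObjectReader<>(Item.class))
                .resource(new FileSystemResource("input.json"))
                .build();
    }
}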