spring-batch

Spring Batch: Tasklet without ItemWriter

Submitted by 笑着哭i on 2019-12-04 14:57:42
I defined my tasklet without an ItemWriter like this: <b:tasklet> <b:chunk reader="baseReader" processor="baseProcessor" commit-interval="100" /> </b:tasklet> and I got this error: Configuration problem: The <b:chunk/> element has neither a 'writer' attribute nor a <writer/> element. Do you have any idea? Thanks. Well, in a chunk, a reader and a writer are mandatory; the ItemProcessor, however, is optional. This is from the official doc (5.1.1. Configuring a Step): Despite the relatively short list of required dependencies for a Step, it is an extremely complex class that can potentially contain
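If there is genuinely nothing to write, one way to satisfy the chunk contract is a no-op writer referenced as writer="noOpWriter" in the <b:chunk/> element. A minimal sketch, assuming the Spring Batch 4.x ItemWriter signature (write takes a List); the class and bean name are illustrative:

    import java.util.List;

    import org.springframework.batch.item.ItemWriter;

    // Discards every chunk; exists only because <b:chunk/> requires a writer.
    public class NoOpItemWriter<T> implements ItemWriter<T> {

        @Override
        public void write(List<? extends T> items) throws Exception {
            // intentionally empty
        }
    }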

Spring Batch Multiple Threads

Submitted by 爱⌒轻易说出口 on 2019-12-04 14:31:02
Question: I am writing a Spring Batch job with the idea of scaling it when required. My ApplicationContext looks like this: @Configuration @EnableBatchProcessing @EnableTransactionManagement @ComponentScan(basePackages = "in.springbatch") @PropertySource(value = {"classpath:springbatch.properties"}) public class ApplicationConfig { @Autowired Environment environment; @Autowired private JobBuilderFactory jobs; @Autowired private StepBuilderFactory stepBuilderFactory; @Bean public Job job() throws Exception {
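The rest of the configuration is cut off above, but since the question is about scaling, here is a minimal sketch of one common option: a multi-threaded chunk step built inside the same ApplicationConfig class (which already autowires stepBuilderFactory). The item type MyItem, the chunk size, and the concurrency limit are placeholders.

    import org.springframework.batch.core.Step;
    import org.springframework.batch.item.ItemProcessor;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.context.annotation.Bean;
    import org.springframework.core.task.SimpleAsyncTaskExecutor;
    import org.springframework.core.task.TaskExecutor;

    @Bean
    public TaskExecutor stepTaskExecutor() {
        SimpleAsyncTaskExecutor executor = new SimpleAsyncTaskExecutor("batch-");
        executor.setConcurrencyLimit(4); // cap the number of concurrent chunks
        return executor;
    }

    @Bean
    public Step scaledStep(ItemReader<MyItem> reader,
                           ItemProcessor<MyItem, MyItem> processor,
                           ItemWriter<MyItem> writer) {
        return stepBuilderFactory.get("scaledStep")
                .<MyItem, MyItem>chunk(100)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .taskExecutor(stepTaskExecutor()) // chunks are processed on worker threads
                .build();
    }

With a multi-threaded step the reader and writer must be thread-safe, and restartability is limited because reader state is not tracked reliably across threads; partitioning or remote chunking are the usual alternatives when that matters.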

Dynamically choose a spring-batch reader at runtime

Submitted by 血红的双手。 on 2019-12-04 14:12:29
Question: I have a spring-batch job that imports various bank statements into my app. There is a different reader for each bank statement type and only one writer for all of them. The job is very simple: read, process and write: <batch:job id="importer" restartable="true"> <batch:step id="import"> <batch:tasklet> <batch:chunk reader="reader" writer="writer" processor="processor" commit-interval="10" /> </batch:tasklet> </batch:step> </batch:job> Now, I would like the end user to provide a statement
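The excerpt stops before the answer, but one simple approach is a step-scoped reader that delegates to the right concrete reader based on a job parameter supplied at launch time. A minimal sketch in Java config (the XML job above would reference the resulting bean as reader="reader"); the parameter name statementType, the Statement item type, and the bean names bankAReader/bankBReader are assumptions:

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;

    @Bean
    @StepScope
    public ItemReader<Statement> reader(
            @Value("#{jobParameters['statementType']}") String statementType,
            @Qualifier("bankAReader") ItemReader<Statement> bankAReader,
            @Qualifier("bankBReader") ItemReader<Statement> bankBReader) {
        // Pick the concrete reader once per step execution, based on the launch parameter.
        return "bankA".equals(statementType) ? bankAReader : bankBReader;
    }

If the concrete readers implement ItemStream, remember to register them as streams on the step so open/update/close are still called on the chosen delegate.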

How to skip batch step when condition is false

Submitted by 橙三吉。 on 2019-12-04 13:20:38
I have one basic job with one basic step. This job executes every x seconds (I am using Quartz for this). In my config class I also have a variable "runStep". Where should I add this attribute so that my step runs only if runStep is true? <batch:job id="export1" parent="baseJob"> <batch:step id="registruj" parent="baseStep"> <tasklet> <chunk reader="registrujReader" processor="registrujProcessor" writer="registrujWriter" commit-interval="1" /> </tasklet> </batch:step> </batch:job> <bean id="baseJob" class="org.springframework.batch.core.job.SimpleJob" abstract="true"> <property name=
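The excerpt ends before an answer, but the usual approach is a JobExecutionDecider placed in front of the step, wired with a <batch:decision> element that routes "RUN" to the registruj step and "SKIP" to an end state. A minimal sketch of the decider, assuming the flag arrives as a job parameter (it could equally come from a property or a service call):

    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.job.flow.FlowExecutionStatus;
    import org.springframework.batch.core.job.flow.JobExecutionDecider;

    public class RunStepDecider implements JobExecutionDecider {

        @Override
        public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
            // "runStep" is assumed to be passed as a job parameter by the Quartz trigger.
            String runStep = jobExecution.getJobParameters().getString("runStep");
            return Boolean.parseBoolean(runStep)
                    ? new FlowExecutionStatus("RUN")
                    : new FlowExecutionStatus("SKIP");
        }
    }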

Repeating a Step x times in Spring Batch

Submitted by 心不动则不痛 on 2019-12-04 13:06:56
Question: I'm using Spring Batch 3.0.3, configured with annotations, to create a batch job that repeats a step an undetermined number of times. My first step will read into memory a list of items used during the repeating step. I'd like the repeating steps to iterate through this job-scoped list. How can I configure my job to run the same step x times? I've seen examples in XML of a step specifying the next step to run. I'm thinking I could point two steps to each other in an infinite loop until the list
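The excerpt cuts off, but rather than a literal infinite loop the flow can be closed with a JobExecutionDecider: the step transitions back to itself while the decider reports that items remain, and the job ends otherwise. A minimal Java-config sketch, assuming the usual autowired JobBuilderFactory; the step and decider bean names are placeholders, and the decider is expected to return CONTINUE while the job-scoped list still has items:

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.job.flow.JobExecutionDecider;
    import org.springframework.context.annotation.Bean;

    @Bean
    public Job repeatingJob(Step loadListStep, Step repeatedStep, JobExecutionDecider moreItemsDecider) {
        return jobBuilderFactory.get("repeatingJob")
                .start(loadListStep)                                    // reads the list into the job execution context
                .next(repeatedStep)
                .next(moreItemsDecider).on("CONTINUE").to(repeatedStep) // loop while items remain
                .from(moreItemsDecider).on("*").end()                   // anything else ends the job
                .end()
                .build();
    }

Two caveats for this kind of loop: the repeated step usually needs allowStartIfComplete(true) so it can run more than once, and the decider must eventually return something other than CONTINUE or the job never finishes.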

Spring Batch - Repeat step for each item in a data list

Submitted by 橙三吉。 on 2019-12-04 13:00:05
This is a tough one, but I am sure it is not unheard of. I have two datasets, Countries and Demographics. The Countries dataset contains the name of a country and an ID to its demographic data. The Demographics dataset is a hierarchical dataset starting from the country down to the suburb. Both of these datasets are pulled from a 3rd party on a weekly basis. I need to split the demographics out into files, one for each country. So far the steps that I have are: 1) Pull Countries, 2) Pull Demographics, 3) (this is what's needed) loop over the country dataset calling a "Write Country Demographics to File"
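Step 3 maps naturally onto a partitioned step: a Partitioner creates one execution context per country, and the file-writing step (with a step-scoped reader and writer) runs once per context. A minimal sketch; the key name countryId and the way the country list reaches the partitioner are assumptions.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;

    // One partition per country; the "write demographics to file" step runs once per partition.
    public class CountryPartitioner implements Partitioner {

        private final List<String> countryIds; // e.g. produced by the "Pull Countries" step

        public CountryPartitioner(List<String> countryIds) {
            this.countryIds = countryIds;
        }

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            for (String countryId : countryIds) {
                ExecutionContext context = new ExecutionContext();
                context.putString("countryId", countryId);
                partitions.put("country-" + countryId, context);
            }
            return partitions;
        }
    }

The step-scoped reader and writer can then pull the current country from #{stepExecutionContext['countryId']} to filter the demographics and name the output file.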

Reading file from HDFS using Spring batch

Submitted by 你离开我真会死。 on 2019-12-04 12:37:06
I have to write a Spring Batch job which will read a file from HDFS and update data in a MySQL DB. The source file in HDFS contains some report data in CSV format. Can someone point me to an example of reading a file from HDFS? Thanks. The FlatFileItemReader in Spring Batch works with any Spring Framework Resource implementation: @Bean public FlatFileItemReader<String> itemReader() { Resource resource; // get (or autowire) resource return new FlatFileItemReaderBuilder<String>() .resource(resource) // set other reader properties .build(); } So if you manage to have a Resource handle pointing
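The answer is cut off, but one way to obtain such a Resource with the plain Hadoop client is to wrap the HDFS input stream; a minimal sketch, with the namenode URI and file path as placeholders:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.springframework.core.io.InputStreamResource;
    import org.springframework.core.io.Resource;

    public Resource hdfsResource() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);
        // InputStreamResource can only be read once, which is fine for a single reader pass.
        return new InputStreamResource(fs.open(new Path("/reports/report.csv")));
    }

Spring for Apache Hadoop also ships an HDFS-aware resource loader, which may be more convenient if that dependency is already on the classpath.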

Using Spring Batch to write to a Cassandra Database

Submitted by 北城余情 on 2019-12-04 12:06:48
As of now, I'm able to connect to Cassandra via the following code: import com.datastax.driver.core.Cluster; import com.datastax.driver.core.Session; public static Session connection() { Cluster cluster = Cluster.builder() .addContactPoints("IP1", "IP2") .withCredentials("user", "password") .withSSL() .build(); Session session = null; try { session = cluster.connect("database_name"); session.execute("CQL Statement"); } finally { IOUtils.closeQuietly(session); IOUtils.closeQuietly(cluster); } return session; } The problem is that I need to write to Cassandra in a Spring Batch project. Most of
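The excerpt stops mid-answer; one straightforward sketch is a custom ItemWriter that reuses a single Session for the whole step instead of opening and closing the cluster per call. The Person item type, the table and the columns are placeholders, and the Spring Batch 4.x List-based write signature is assumed:

    import java.util.List;

    import com.datastax.driver.core.Session;
    import org.springframework.batch.item.ItemWriter;

    public class CassandraItemWriter implements ItemWriter<Person> {

        private final Session session; // opened once at startup, closed when the job/context shuts down

        public CassandraItemWriter(Session session) {
            this.session = session;
        }

        @Override
        public void write(List<? extends Person> items) throws Exception {
            for (Person person : items) {
                // A prepared/bound statement would be preferable; a simple statement keeps the sketch short.
                session.execute("INSERT INTO people (id, name) VALUES (?, ?)",
                        person.getId(), person.getName());
            }
        }
    }

Registering the Cluster and Session as Spring beans with destroy methods keeps the connection lifecycle out of the writer itself.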

Spring Batch Item Reader - use skippedLinesCallback to set input field names

Submitted by 瘦欲@ on 2019-12-04 12:01:46
I have a simple job as below: <batch:step id="step"> <batch:tasklet> <batch:chunk reader="itemReader" processor="itemProcessor" writer="itemWriter" commit-interval="5000" /> </batch:tasklet> </batch:step> The itemReader is defined as below: <bean id="itemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step"> <property name="linesToSkip" value="1"></property> <property name="skippedLinesCallback" ref="skippedLinesCallback" ></property> <property name="lineMapper"> <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper"> <property name="lineTokenizer"> <bean
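The bean definition is cut off above, but the usual trick is to point skippedLinesCallback at a LineCallbackHandler that feeds the header tokens into the same tokenizer the lineMapper uses. A minimal sketch, assuming a comma-delimited header:

    import org.springframework.batch.item.file.LineCallbackHandler;
    import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;

    public class HeaderFieldNamesCallback implements LineCallbackHandler {

        private final DelimitedLineTokenizer tokenizer; // the same tokenizer the lineMapper uses

        public HeaderFieldNamesCallback(DelimitedLineTokenizer tokenizer) {
            this.tokenizer = tokenizer;
        }

        @Override
        public void handleLine(String headerLine) {
            // The skipped header line becomes the field names for the BeanWrapperFieldSetMapper.
            tokenizer.setNames(headerLine.split(","));
        }
    }

For this to work, the DelimitedLineTokenizer has to be a standalone bean referenced from both the lineMapper and this callback, rather than an anonymous inner bean.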

Create new output file using FlatFileItemWriter in spring-batch

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-04 12:00:06
I have a simple Spring Batch job: read a file line by line, do something with the input string, and write some output. The output file contains every line of input plus some processing status for that line (success/failure). The job reads a file from <dir>/<inputFolder>/<inputFileName> and writes processed output to <dir>/<outputFolder>/<inputFileName>. All these values are passed as jobParameters. The file reader is like so: <bean id="itemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step"> <property name="resource" value="file:#{jobParameters['cwd']}/#{jobParameters[
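The reader definition is cut off above; for the output side, a step-scoped FlatFileItemWriter can assemble its resource from the same job parameters. A sketch in Java config (the XML equivalent is a step-scoped bean whose resource value uses the same #{jobParameters[...]} expressions); the parameter names outputFolder and inputFileName and the plain String item type are assumptions:

    import org.springframework.batch.core.configuration.annotation.StepScope;
    import org.springframework.batch.item.file.FlatFileItemWriter;
    import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
    import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.core.io.FileSystemResource;

    @Bean
    @StepScope
    public FlatFileItemWriter<String> itemWriter(
            @Value("#{jobParameters['cwd']}") String cwd,
            @Value("#{jobParameters['outputFolder']}") String outputFolder,
            @Value("#{jobParameters['inputFileName']}") String fileName) {
        return new FlatFileItemWriterBuilder<String>()
                .name("itemWriter")
                // Output file mirrors the input file name under the output folder.
                .resource(new FileSystemResource(cwd + "/" + outputFolder + "/" + fileName))
                .lineAggregator(new PassThroughLineAggregator<>())
                .build();
    }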