spring-batch

Using Spring Batch JdbcCursorItemReader with NamedParameters

旧街凉风 · Submitted on 2019-11-30 22:30:22
The Spring Batch JdbcCursorItemReader can accept a preparedStatementSetter:

    <bean id="reader" class="org.springframework.batch.item.database.JdbcCursorItemReader">
        <property name="dataSource" ref="..." />
        <property name="sql" value="SELECT * FROM test WHERE col1 = ?" />
        <property name="rowMapper" ref="..." />
        <property name="preparedStatementSetter" ref="..." />
    </bean>

This works well if the SQL uses ? as placeholder(s), as in the above example. However, our pre-existing SQL uses named parameters, e.g. SELECT * FROM test WHERE col1 = :param. Is there a way to get a JdbcCursorItemReader to…
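One commonly suggested workaround (not taken from this excerpt) is to switch to JdbcPagingItemReader, whose query provider does accept named parameters resolved from a parameterValues map. A minimal sketch, assuming hypothetical dataSource and testRowMapper beans and an id sort column:

```xml
<bean id="pagingReader" class="org.springframework.batch.item.database.JdbcPagingItemReader">
    <property name="dataSource" ref="dataSource" />
    <property name="queryProvider">
        <bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
            <property name="dataSource" ref="dataSource" />
            <property name="selectClause" value="SELECT *" />
            <property name="fromClause" value="FROM test" />
            <!-- named parameter, resolved from the parameterValues map below -->
            <property name="whereClause" value="WHERE col1 = :param" />
            <property name="sortKey" value="id" />
        </bean>
    </property>
    <property name="parameterValues">
        <map>
            <entry key="param" value="someValue" />
        </map>
    </property>
    <property name="pageSize" value="100" />
    <property name="rowMapper" ref="testRowMapper" />
</bean>
```

The trade-off is that the reader pages with repeated queries rather than holding a cursor open, so a sort key is required.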

Partitioned Job can't stop by itself after finishing? Spring Batch

会有一股神秘感。 · Submitted on 2019-11-30 22:01:49
I wrote a job of two steps, one of which is a partitioning step. The partition step uses a TaskExecutorPartitionHandler and runs 5 slave steps in threads. The job is started in the main() method. But it does not stop after every slave ItemReader has returned null (the end-of-data signal). Even after the program has run past the last line of code in main() (which is System.out.println("Finished")), the process won't exit; it hangs in memory doing nothing, and I have to press the stop button on Eclipse's panel to kill it. The following is the content of a JobExecution…
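A frequent cause of this symptom (a hedged guess, not confirmed by the excerpt) is a TaskExecutor whose worker threads are non-daemon, which keeps the JVM alive after main() returns. One sketch of a fix is to give the partition handler a ThreadPoolTaskExecutor configured with daemon threads; bean names here are illustrative:

```xml
<bean id="partitionTaskExecutor"
      class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="5" />
    <property name="maxPoolSize" value="5" />
    <!-- daemon threads do not prevent JVM shutdown once main() completes -->
    <property name="daemon" value="true" />
</bean>

<bean id="partitionHandler"
      class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
    <property name="taskExecutor" ref="partitionTaskExecutor" />
    <property name="step" ref="slaveStep" />
    <property name="gridSize" value="5" />
</bean>
```

Alternatively, closing the application context explicitly at the end of main() lets Spring shut the pool down.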

Is it a bad idea to change the Spring Batch Meta-Data tables manually?

时光怂恿深爱的人放手 · Submitted on 2019-11-30 21:37:21
Background: I'm using Spring Batch 2.1.8 and run jobs with CommandLineJobRunner, e.g.:

    java org.springframework.batch.core.launch.support.CommandLineJobRunner classpath:launchContext.xml theJobId

Problem: under some conditions, such as a server crash, a running job can be interrupted. But the interrupted job leaves a STARTED status in the Spring Batch Meta-Data tables and can't be run again:

    org.springframework.batch.core.repository.JobExecutionAlreadyRunningException: A job execution for this job is already running

I can think of two solutions. Solution 1: add a new job parameter and change it…
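One approach often taken after a crash (an assumption on my part, not stated in the excerpt) is to mark the stuck execution as FAILED directly in the meta-data tables so a restart is allowed. A sketch against the standard schema, where 123 is a placeholder for the stuck JOB_EXECUTION_ID:

```sql
-- Mark the interrupted execution as FAILED so the job can be restarted.
UPDATE BATCH_JOB_EXECUTION
   SET STATUS = 'FAILED',
       EXIT_CODE = 'FAILED',
       END_TIME = CURRENT_TIMESTAMP
 WHERE JOB_EXECUTION_ID = 123;

-- Bring any still-STARTED step executions in line with the job execution.
UPDATE BATCH_STEP_EXECUTION
   SET STATUS = 'FAILED',
       END_TIME = CURRENT_TIMESTAMP
 WHERE JOB_EXECUTION_ID = 123
   AND STATUS = 'STARTED';
```

Whether this is safe depends on restartability of the steps involved, which is essentially what the question is asking.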

Spring Batch - Skip Record On Process

北战南征 · Submitted on 2019-11-30 21:25:52
I want to skip some records during processing. What I have tried: I created a custom exception and throw it when I want to skip a record, and that calls the SkipListener's onSkipInProcess method. It's working fine. Please find the configuration:

    <batch:chunk reader="masterFileItemReader" writer="masterFileWriter" processor="itemProcessor"
            commit-interval="5000" skip-limit="100000">
        <batch:skippable-exception-classes>
            <batch:include class="org.springframework.batch.item.file.FlatFileParseException"/>
            <batch:include class="com.exception.SkipException"/>
        </batch:skippable-exception…
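For context, a complete version of this kind of configuration (an illustration, not the poster's actual file) registers the skip listener on the step alongside the skippable exceptions; skipListener here is a hypothetical bean implementing SkipListener:

```xml
<batch:step id="masterFileStep">
    <batch:tasklet>
        <batch:chunk reader="masterFileItemReader" processor="itemProcessor"
                writer="masterFileWriter" commit-interval="5000" skip-limit="100000">
            <batch:skippable-exception-classes>
                <batch:include class="com.exception.SkipException"/>
            </batch:skippable-exception-classes>
        </batch:chunk>
        <batch:listeners>
            <!-- skipListener is a hypothetical bean implementing SkipListener -->
            <batch:listener ref="skipListener"/>
        </batch:listeners>
    </batch:tasklet>
</batch:step>
```

An alternative worth noting: returning null from the ItemProcessor silently filters an item without counting against the skip limit or invoking skip listeners.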

How does Spring Batch CompositeItemWriter manage transaction for delegate writers?

爱⌒轻易说出口 · Submitted on 2019-11-30 21:18:46
Problem: in the batch job step configuration, I plan to execute two queries in the writer: the first updates records in table A, then the second inserts new records into table A. So far I think CompositeItemWriter can achieve the goal above, i.e., I need to create two JdbcBatchItemWriters, one for the update and the other for the insert. My first question is whether CompositeItemWriter fits this requirement. If yes, that leads to my second question, about transactions. For…
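The setup described can be sketched as follows (a minimal illustration with hypothetical SQL and a hypothetical dataSource bean; both delegates are invoked within the same chunk transaction, in list order):

```xml
<bean id="compositeWriter" class="org.springframework.batch.item.support.CompositeItemWriter">
    <property name="delegates">
        <list>
            <!-- executed in order, inside the chunk's transaction -->
            <ref bean="updateWriter" />
            <ref bean="insertWriter" />
        </list>
    </property>
</bean>

<bean id="updateWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="dataSource" ref="dataSource" />
    <property name="sql" value="UPDATE table_a SET col2 = :col2 WHERE id = :id" />
    <property name="itemSqlParameterSourceProvider">
        <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider" />
    </property>
</bean>

<bean id="insertWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="dataSource" ref="dataSource" />
    <property name="sql" value="INSERT INTO table_a (id, col2) VALUES (:id, :col2)" />
    <property name="itemSqlParameterSourceProvider">
        <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider" />
    </property>
</bean>
```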

Multiple Spring Batch jobs executing concurrently causing deadlocks in the Spring Batch metadata tables

China☆狼群 · Submitted on 2019-11-30 19:48:12
We have multiple Spring Batch jobs, each running in its own Java instance using the CommandLineJobRunner. All of the jobs are started simultaneously, only read/write flat files, and update the same Spring Batch metadata hosted in SQL Server. The only database involved is the Spring Batch metadata database. When the multiple jobs are started simultaneously we get SQL deadlock exceptions; a more detailed stack trace can be found below. From the database perspective we can see that the deadlock victims were doing one of the following: INSERT INTO BATCH_JOB_SEQ DEFAULT VALUES, or DELETE FROM BATCH…
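One mitigation commonly discussed for concurrent launches against SQL Server (an assumption, not confirmed by this excerpt) is to lower the job repository's isolation level for its create* methods from the SERIALIZABLE default, which reduces lock contention on the meta-data tables:

```xml
<!-- READ_COMMITTED reduces lock contention on the meta-data tables
     when several JVMs create job executions at the same time -->
<batch:job-repository id="jobRepository"
    data-source="dataSource"
    transaction-manager="transactionManager"
    isolation-level-for-create="READ_COMMITTED" />
```

The BATCH_JOB_SEQ insert/delete pattern in the deadlock graph comes from the table-backed sequence incrementer SQL Server uses, so replacing it with a less contended id strategy is the other direction usually explored.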

Spring batch restrict single instance of job only

ぃ、小莉子 · Submitted on 2019-11-30 19:36:25
I have one Spring Batch job which can be kicked off by a REST URL. I want to make sure only one job instance is allowed to run, and if another instance is already running then don't start another, even if the parameters are different. I searched and found no out-of-the-box solution. I'm thinking of extending SimpleJobLauncher to check whether any instance of the job is running or not.

Tomas Narros: You could try to intercept the job execution by implementing the JobExecutionListener interface:

    public class MyJobExecutionListener implements JobExecutionListener {
        // active JobExecution, used as a lock.
        private…
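A listener of that kind would then be registered at the job level; a minimal illustration with hypothetical bean and step ids:

```xml
<bean id="myJobExecutionListener" class="com.example.MyJobExecutionListener" />

<batch:job id="myJob">
    <batch:step id="step1">
        <batch:tasklet ref="myTasklet" />
    </batch:step>
    <batch:listeners>
        <!-- beforeJob/afterJob run around every execution of myJob -->
        <batch:listener ref="myJobExecutionListener" />
    </batch:listeners>
</batch:job>
```

The listener's beforeJob callback can throw to veto a launch while its stored JobExecution is still active, which approximates the "single running instance" requirement without subclassing SimpleJobLauncher.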

What are the Spring Batch “default” Context Variables?

筅森魡賤 · Submitted on 2019-11-30 19:33:12
In the Spring Batch step-scope documentation, there are three unexplained Spring Batch context maps: jobParameters, jobExecutionContext, and stepExecutionContext. Springsource sample code, combined:

    <bean id="flatFileItemReader" scope="step" class="org.springframework.batch.item.file.FlatFileItemReader">
        <property name="var1" value="#{jobParameters['input.file.name']}" />
        <property name="var2" value="#{jobExecutionContext['input.file.name']}" />
        <property name="var3" value="#{stepExecutionContext['input.file.name']}" />
    </bean>

What are the default parameters available within jobParameters…

Spring Batch: how to pass jobParameters to a custom bean?

落花浮王杯 · Submitted on 2019-11-30 19:27:22
Problem: I'm still studying Spring Batch and came across a scenario where I need to pass a jobParameter to a custom bean. The job parameter contains the path of a file. Here is how my context looks:

    <bean id="myBean" class=".....MyBean">
        <property name="path" value="file:#{jobParameters['PATH'}/fileName"/>
    </bean>

This is already included in a step scope from a reader that is not included here. The question is: when the class is instantiated, the value passed to the bean is "file:#{jobParameters[…
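The usual fix for a literal, unevaluated expression (a hedged sketch, not an answer quoted in this excerpt) is to make the bean itself step-scoped, so #{jobParameters[...]} is resolved lazily when the step runs; note also the closing bracket in the expression:

```xml
<!-- scope="step" defers property resolution until step execution,
     when jobParameters are actually available -->
<bean id="myBean" class=".....MyBean" scope="step">
    <property name="path" value="file:#{jobParameters['PATH']}/fileName" />
</bean>
```

Without step scope the bean is created at context startup, before any job parameters exist, so the SpEL expression is never evaluated.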

Reading line breaks in CSV which are quoted in the file in FlatfileItemReader of spring batch

寵の児 · Submitted on 2019-11-30 19:14:35
I am trying to parse a CSV file with FlatFileItemReader. The CSV contains some quoted newline characters, as shown below:

    email, name
    abc@z.com, "NEW
    NAME ABC"

But the parsing fails with: required fields are 2 but actual is 1. What am I missing in my FlatFileItemReader configuration?

    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <!-- The lineTokenizer divides individual lines up into units of work -->
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <!-- Names of the…
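The usual answer (hedged; not part of this excerpt) is that FlatFileItemReader treats each physical line as one record unless told otherwise. Setting a DefaultRecordSeparatorPolicy makes it keep reading while a quote is unbalanced, so quoted line breaks stay inside a single record; the resource path and lineMapper ref below are illustrative:

```xml
<bean id="csvItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="file:input.csv" />
    <!-- continues a record across physical lines while quotes are unbalanced -->
    <property name="recordSeparatorPolicy">
        <bean class="org.springframework.batch.item.file.separator.DefaultRecordSeparatorPolicy" />
    </property>
    <property name="lineMapper" ref="lineMapper" />
</bean>
```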