spring-batch

Has anyone tried implementing an unsupported database to use for the jobRepository in Spring Batch?

My database, SAP HANA, is not supported by Spring Batch. I am looking for a guide on how to implement my own DAOs for SimpleJobRepository in Spring Batch. Has anyone tried this before? I did not include the database type property because, according to the Spring Batch website, omitting it makes the factory auto-detect the database type. I also used JobRepositoryFactoryBean since the db is unsupported. I am excited to write my own implementation for this; maybe I can contribute it to the Spring Batch source. My setting is as follows: <bean id="jobRepository" class="org.springframework.batch.core.repository
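The excerpt is cut off, but for a database that Spring Batch does not recognize, the usual route is to implement the four metadata DAO interfaces and assemble a SimpleJobRepository from them. A minimal Java-config sketch of that idea follows (the question itself uses XML); the Hana* DAO classes are hypothetical placeholders for custom implementations and do not ship with Spring Batch.

import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.dao.ExecutionContextDao;
import org.springframework.batch.core.repository.dao.JobExecutionDao;
import org.springframework.batch.core.repository.dao.JobInstanceDao;
import org.springframework.batch.core.repository.dao.StepExecutionDao;
import org.springframework.batch.core.repository.support.SimpleJobRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HanaJobRepositoryConfig {

    @Bean
    public JobRepository jobRepository(DataSource dataSource) {
        // Hypothetical custom DAOs that issue HANA-compatible SQL against the
        // standard BATCH_* metadata tables.
        JobInstanceDao jobInstanceDao = new HanaJobInstanceDao(dataSource);
        JobExecutionDao jobExecutionDao = new HanaJobExecutionDao(dataSource);
        StepExecutionDao stepExecutionDao = new HanaStepExecutionDao(dataSource);
        ExecutionContextDao executionContextDao = new HanaExecutionContextDao(dataSource);

        // SimpleJobRepository just delegates to the four DAOs.
        return new SimpleJobRepository(jobInstanceDao, jobExecutionDao,
                stepExecutionDao, executionContextDao);
    }
}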

Unterminated Double Quotes in Spring Batch

I am new to Spring Batch and I have run into a problem. The batch application I am working on reads and processes lines from a delimited text file. I have configured the application to use a FlatFileReader to read the delimited text file, but the issue is that some of the data being read has a double quote in it. A FlatFileParseException is thrown when the FlatFileReader encounters a single double quote, but none is thrown when two double quotes are present. Has anyone come across this issue, and if so, what would be the proper resolution? Manipulating the data itself is not an option
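The behaviour described matches the default quote character of DelimitedLineTokenizer: a lone double quote leaves an unterminated quoted region and the read fails with FlatFileParseException, while a balanced pair parses. One common workaround, sketched below under the assumption that the data never needs real quoting, is to move the quote character to something that cannot occur in the file; the field names, delimiter, and path are illustrative.

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.core.io.FileSystemResource;

public class QuoteSafeReader {

    public static FlatFileItemReader<String[]> build(String path) {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer(";");
        tokenizer.setNames(new String[] {"fieldA", "fieldB", "fieldC"});
        // The default quote character is '"'; pointing it at a character that
        // never appears in the data effectively disables quote handling.
        tokenizer.setQuoteCharacter('\u0001');

        DefaultLineMapper<String[]> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fieldSet -> fieldSet.getValues());

        FlatFileItemReader<String[]> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(path));
        reader.setLineMapper(lineMapper);
        return reader;
    }
}

If some fields really are quoted, a custom LineTokenizer that pre-cleans the line would be the alternative.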

Spring Batch - How to prevent batch from storing transactions in DB

First, the problem statement: I am using Spring Batch in my DEV environment without issue. When I move the code to a production environment, I run into a problem. In my DEV environment, Spring Batch is able to create its transaction data tables in our DB2 database server without a problem. This is not an option when we go to PROD, as this is a read-only job. Attempted solution: searching Stack Overflow, I found this posting: Spring-Batch without persisting metadata to database? It sounded perfect, so I added @Bean public ResourcelessTransactionManager transactionManager() { return new
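The excerpt stops at the ResourcelessTransactionManager bean. The rest of that approach, sketched below for the pre-4.3 API, keeps all job metadata in an in-memory map so nothing touches the read-only DB2 instance (MapJobRepositoryFactoryBean was later deprecated in favour of an embedded database such as H2).

import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class InMemoryBatchConfig {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // Map-based repository: no BATCH_* tables are created or written.
    @Bean
    public JobRepository jobRepository(ResourcelessTransactionManager transactionManager) throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(transactionManager);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        return launcher;
    }
}

If @EnableBatchProcessing is in play, these beans are typically supplied through a custom BatchConfigurer rather than plain @Bean overrides.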

Access @JobScope bean in spring batch with partitioned step

Is there a way to access a bean defined as @JobScope in a partitioned step? We defined an HTTP client bean as @JobScope since it is unique per job but dynamically created, and we need it in the slave steps to issue POST requests. When we autowire everything we get: Error creating bean with name 'scopedTarget.captureErpStockTasklet': Scope 'step' is not active for the current thread; consider defining a scoped proxy for this bean if you intend to refer to it from a singleton; nested exception is java.lang.IllegalStateException: No context holder available for step scope. Here is the job configuration
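The job configuration is cut off, so the cause cannot be pinned down from the excerpt; the message itself says a step-scoped proxy (scopedTarget.captureErpStockTasklet) is being resolved on a thread where no step context is registered, which is a known pitfall when partitions run on worker threads. Purely for reference, a minimal Java-config sketch of the pattern the question describes (a job-scoped client injected into a step-scoped worker tasklet); the ErpClient class, the bean names, and the erpBaseUrl parameter are illustrative.

import org.springframework.batch.core.configuration.annotation.JobScope;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PartitionedJobScopeConfig {

    // Hypothetical wrapper standing in for the dynamically created HTTP client.
    public static class ErpClient {
        private final String baseUrl;
        public ErpClient(String baseUrl) { this.baseUrl = baseUrl; }
        public void postStock(String payload) { /* issue POST to baseUrl */ }
    }

    // One client per job execution, built from a job parameter.
    @Bean
    @JobScope
    public ErpClient erpClient(@Value("#{jobParameters['erpBaseUrl']}") String erpBaseUrl) {
        return new ErpClient(erpBaseUrl);
    }

    // Worker-side tasklet, one instance per partition (step execution).
    @Bean
    @StepScope
    public Tasklet captureErpStockTasklet(ErpClient erpClient) {
        return (contribution, chunkContext) -> {
            erpClient.postStock("stock payload for this partition");
            return RepeatStatus.FINISHED;
        };
    }
}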

Spring Batch: org.springframework.batch.item.ReaderNotOpenException: Reader must be open before it can be read

I read related SO questions but the solutions don't work for me. I get the org.springframework.batch.item.ReaderNotOpenException: Reader must be open before it can be read exception. Below is my configuration: @Bean @StepScope public ItemReader<Player> reader(@Value("#{jobParameters[inputZipfile]}") String inputZipfile) { final String [] header = { .. this part omitted for brevity ... }; FlatFileItemReader<Player> reader = new FlatFileItemReader<Player>(); System.out.println("\t\t\t\t\t"
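A frequent cause of this exception with step-scoped readers is the @Bean method declaring its return type as ItemReader: the step-scope proxy then only exposes ItemReader, the reader is never registered as an ItemStream, and open() is never called before the first read. A minimal sketch returning the concrete type instead, where Player is the question's domain class and the LineMapper<Player> bean stands in for the original header/tokenizer setup:

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class PlayerReaderConfig {

    // Declaring the concrete FlatFileItemReader (which implements ItemStream)
    // lets the step register it as a stream and open it before reading.
    @Bean
    @StepScope
    public FlatFileItemReader<Player> reader(
            @Value("#{jobParameters['inputZipfile']}") String inputZipfile,
            LineMapper<Player> playerLineMapper) {
        FlatFileItemReader<Player> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(inputZipfile));
        reader.setLineMapper(playerLineMapper);
        return reader;
    }
}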

using spring batch to execute jobs in parallel

I have a use case as follows: 1) There is a parent job which has multiple child jobs. 2) All child jobs should be executed in parallel. 3) The parent job should wait until all child jobs are done. 4) Once the child jobs are done, control returns to the master. 5) The master job is completed. 6) In case any of the child jobs throws an exception, control should still return to the master job. Is this possible using Spring Batch? EDIT: I am not looking to execute multiple steps of a job in parallel, but multiple child jobs of the same parent in parallel. Maybe something like that? Create a job. Add a chunk tasklet to this job. Reader get
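The excerpt is cut off, but one way to get this shape is to wrap each child job in a JobStep and run the resulting flows in a split: the parent waits for all flows, and a failing child marks its step (and eventually the parent) as FAILED after the other flows finish. A minimal Java-config sketch, where childJobA and childJobB are placeholders for the actual child job beans:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.builder.FlowBuilder;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class ParentJobConfig {

    // Wrap a child job in a JobStep so the parent sees its exit status.
    private Step childAsStep(String name, Job childJob,
                             StepBuilderFactory steps, JobLauncher jobLauncher) {
        return steps.get(name)
                .job(childJob)
                .launcher(jobLauncher)
                .build();
    }

    @Bean
    public Job parentJob(JobBuilderFactory jobs, StepBuilderFactory steps,
                         JobLauncher jobLauncher, Job childJobA, Job childJobB) {
        Flow flowA = new FlowBuilder<Flow>("flowA")
                .start(childAsStep("runChildA", childJobA, steps, jobLauncher))
                .build();
        Flow flowB = new FlowBuilder<Flow>("flowB")
                .start(childAsStep("runChildB", childJobB, steps, jobLauncher))
                .build();

        // split() runs the flows on separate threads; the parent completes
        // only after both flows have finished.
        return jobs.get("parentJob")
                .start(flowA)
                .split(new SimpleAsyncTaskExecutor())
                .add(flowB)
                .end()
                .build();
    }
}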

Spring-Batch: how do I return a custom Job exit STATUS from a StepListener to decide next step

The issue is this: I have a Spring Batch job with multiple steps. Based on step one, I have to decide the next steps. Can I set a status in STEP1's passTasklet based on a job parameter, so that I can set the exit status to a custom value and define in the job definition file which step to go to next? Example: <job id="conditionalStepLogicJob"> <step id="step1"> <tasklet ref="passTasklet"/> <next on="BABY" to="step2a"/> <stop on="KID" to="step2b"/> <next on="*" to="step3"/> </step> <step id="step2b"> <tasklet ref="kidTasklet"/> </step> <step id="step2a"> <tasklet ref="babyTasklet"/> </step> <step
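One way to drive those <next on="..."/> transitions is to return a custom ExitStatus from a StepExecutionListener attached to step1. A minimal sketch, assuming a hypothetical ageGroup job parameter; the on values in the transitions then match the returned exit codes.

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.listener.StepExecutionListenerSupport;

// Registered on step1 via a <listeners> element; maps a job parameter to the
// custom exit codes the job definition branches on.
public class AgeGroupExitStatusListener extends StepExecutionListenerSupport {

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        String ageGroup = stepExecution.getJobParameters().getString("ageGroup");
        if ("baby".equalsIgnoreCase(ageGroup)) {
            return new ExitStatus("BABY");
        }
        if ("kid".equalsIgnoreCase(ageGroup)) {
            return new ExitStatus("KID");
        }
        return ExitStatus.COMPLETED;
    }
}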

Spring Batch JUnit test for multiple jobs

I have two jobs configured in one context file: <batch:job id="JobA" restartable="true"> <batch:step id="abc"> <batch:tasklet > <batch:chunk reader="reader" writer="writer" processor="processor" /> </batch:tasklet> </batch:step> </batch:job> <batch:job id="JobB" restartable="true"> <batch:step id="abc"> <batch:tasklet > <batch:chunk reader="reader" writer="writer" processor="processor" /> </batch:tasklet> </batch:step> </batch:job> When I do unit testing for JobA using JobLauncherTestUtils and test the job launch, it throws an exception saying No unique bean of type [org
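JobLauncherTestUtils autowires a single Job by type, which fails as soon as two jobs live in the same context. One way around it, sketched below for JUnit 4, is to build the test utils by hand and point it at JobA explicitly; the context file name batch-context.xml is illustrative.

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import static org.junit.Assert.assertEquals;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:batch-context.xml")
public class JobATest {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    @Qualifier("JobA")
    private Job jobA;

    private JobLauncherTestUtils jobLauncherTestUtils;

    @Before
    public void setUp() {
        // Built by hand instead of as a Spring bean, so the by-type
        // autowiring of its Job setter never has to choose between JobA and JobB.
        jobLauncherTestUtils = new JobLauncherTestUtils();
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(jobA);
    }

    @Test
    public void jobACompletes() throws Exception {
        JobExecution execution = jobLauncherTestUtils.launchJob();
        assertEquals(BatchStatus.COMPLETED, execution.getStatus());
    }
}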

Grid Size in Spring batch

I have a batch job which reads data from bulk files, processes it and inserts it into the DB. I'm using Spring's partitioning features with the default partition handler. <bean class="org.spr...TaskExecutorPartitionHandler"> <property name="taskExecutor" ref="taskExecutor"/> <property name="step" ref="readFromFile" /> <property name="gridSize" value="10" /> </bean> What is the significance of gridSize here? I have configured it so that it is equal to the concurrency in the taskExecutor. gridSize specifies the number of data blocks to create, to be processed by (usually) the same number of workers.
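In the local setup, TaskExecutorPartitionHandler hands gridSize to the step's Partitioner, and each ExecutionContext the partitioner returns becomes one worker step execution run on the taskExecutor, which is why keeping gridSize in line with the executor's concurrency is a sensible default (some partitioners, such as MultiResourcePartitioner, ignore it). A minimal Partitioner sketch; the row-range scheme and TOTAL_ROWS are illustrative.

import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// The handler calls partition(gridSize); each returned ExecutionContext
// becomes one partition processed by a worker step execution.
public class RangePartitioner implements Partitioner {

    private static final long TOTAL_ROWS = 1_000_000;

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        long rowsPerPartition = TOTAL_ROWS / gridSize;
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putLong("minRow", i * rowsPerPartition);
            context.putLong("maxRow", (i + 1) * rowsPerPartition - 1);
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}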

Spring Batch: how to filter duplicated items before sending them to the ItemWriter

I read a flat file (for example a .csv file with one line per User, e.g. UserId;Data1;Date2). But how do I handle duplicated User items in the reader (there is no list of previously read users)? stepBuilderFactory.get("createUserStep1") .<User, User>chunk(1000) .reader(flatFileItemReader) // FlatFileItemReader .writer(itemWriter) // For example JDBC Writer .build(); Michael Minella: Filtering is typically done with an ItemProcessor. If the ItemProcessor returns null, the item is filtered and not passed to the ItemWriter. Otherwise, it is. In your case, you could keep a list of previously seen
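The answer is cut off at that point; a minimal sketch of the approach it describes, assuming the question's User class exposes a getUserId() accessor:

import java.util.HashSet;
import java.util.Set;

import org.springframework.batch.item.ItemProcessor;

// Returns null for any userId already seen in this run, so the duplicate is
// filtered and never reaches the ItemWriter.
public class DuplicateUserFilterProcessor implements ItemProcessor<User, User> {

    private final Set<String> seenUserIds = new HashSet<>();

    @Override
    public User process(User user) {
        if (!seenUserIds.add(user.getUserId())) {
            return null; // duplicate -> filtered
        }
        return user;
    }
}

It would be plugged into the step shown above via .processor(new DuplicateUserFilterProcessor()); note that the Set lives in memory for the whole step, which matters for very large files and for restarts.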