spring-batch

Spring batch - running multiple jobs in parallel

☆樱花仙子☆ · Submitted on 2019-11-28 10:34:37
I am new to Spring Batch and can't figure out how to do this. Basically, I have a Spring file poller that runs every N minutes looking for files with certain names (e.g. A.txt and B.txt) in a given directory. At any moment there can be at most 2 files in this directory (A and B). Through a Spring Batch job, these two files are processed and persisted to 2 different DB tables. The files are similar, so the same processor/writer is used. The way I have it set up now, each polling cycle picks up 1 file and runs the job. Let's say there are 2 files in the directory (A.txt and B.txt)…
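The usual Spring Batch answer is to give the `JobLauncher` an async `TaskExecutor` (e.g. `SimpleJobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor())`) and launch one job execution per file, passing the file name as a `JobParameter`. The sketch below is framework-agnostic: it uses plain `java.util.concurrent` to show the fan-out idea (one independent task per polled file), not the actual Spring API; the "processed:" result is a stand-in for the real job.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelFilePoller {
    // One task per polled file, so A.txt and B.txt are handled concurrently,
    // the way an async JobLauncher runs one JobExecution per file parameter.
    static List<String> processAll(List<String> files) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Callable<String>> tasks = new ArrayList<>();
        for (String file : files) {
            tasks.add(() -> "processed:" + file); // stand-in for the real job run
        }
        List<String> results = new ArrayList<>();
        for (Future<String> future : pool.invokeAll(tasks)) {
            results.add(future.get()); // invokeAll preserves submission order
        }
        pool.shutdown();
        return results;
    }
}
```

With the Spring variant, each launch must use distinct `JobParameters` (the file name works well), otherwise the second launch is treated as a restart of the same `JobInstance`.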

Separate datasource for JobRepository and writer in Spring Batch

旧街凉风 · Submitted on 2019-11-28 10:12:19
Question: The job is a simple CSV-to-DB file writer: read the CSV file name and location from oracle1, read the CSV file (batch reader), and write (batch writer) into tables in oracle2. I have 2 datasources. Oracle1: available when the application context loads; I read properties from Oracle1's tables to create oracle2; the JobRepository tables should be stored here (oracle1). Oracle2: the destination database where Spring Batch's writer finally outputs. Question 1: Do I need an XA transaction manager? Question 2: Can I run this…
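A config sketch of the split described above, with the batch metadata pinned to oracle1 and the business writer pointed at oracle2. The bean names `oracle1`/`oracle2`, `MyRecord`, and the SQL are placeholders taken from or invented for the question, not a drop-in configuration. On Question 1: XA is only required if a chunk's writes to oracle2 and the metadata updates in oracle1 must commit atomically; many setups accept two independent transaction managers and the small risk of re-processing a chunk after a crash.

```java
// Sketch only: assumes two DataSource beans named "oracle1" and "oracle2"
// are defined elsewhere; MyRecord and target_table are hypothetical.
@Bean
public JobRepository jobRepository(@Qualifier("oracle1") DataSource metadataDs,
                                   PlatformTransactionManager txManager) throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(metadataDs);      // BATCH_* metadata tables live in oracle1
    factory.setTransactionManager(txManager);
    factory.afterPropertiesSet();
    return factory.getObject();
}

@Bean
public JdbcBatchItemWriter<MyRecord> writer(@Qualifier("oracle2") DataSource targetDs) {
    JdbcBatchItemWriter<MyRecord> writer = new JdbcBatchItemWriter<>();
    writer.setDataSource(targetDs);         // business rows go to oracle2
    writer.setSql("INSERT INTO target_table (name) VALUES (:name)");
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
    return writer;
}
```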

Spring Batch: How to setup a FlatFileItemReader to read a json file?

删除回忆录丶 · Submitted on 2019-11-28 09:47:56
Question: My approach so far: @Bean FlatFileItemReader<Blub> flatFileItemReader() { FlatFileItemReader<Blub> reader = new FlatFileItemReader<>(); reader.setResource(new FileSystemResource("test.json")); JsonLineMapper lineMapper = new JsonLineMapper(); reader.setLineMapper(lineMapper); return reader; } The challenge is: reader.setLineMapper() cannot use the JsonLineMapper. How do I use the JsonLineMapper properly? Answer 1: How to set up a FlatFileItemReader to read a JSON file? It depends on the format of…
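The type mismatch in the question comes from `JsonLineMapper` being a `LineMapper<Map<String, Object>>`, so it cannot be assigned to a `FlatFileItemReader<Blub>`. A sketch of the fix: type the reader on `Map<String, Object>` and convert to `Blub` downstream (e.g. in an `ItemProcessor`). Note this only works when the file is line-delimited JSON, one object per line; for a whole-file JSON array, Spring Batch 4.1+ has a dedicated `JsonItemReader` instead.

```java
// JsonLineMapper maps each line to a Map<String, Object>, so the reader
// must be typed accordingly; mapping to Blub happens in a later step.
@Bean
public FlatFileItemReader<Map<String, Object>> flatFileItemReader() {
    FlatFileItemReader<Map<String, Object>> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource("test.json"));
    reader.setLineMapper(new JsonLineMapper()); // expects one JSON object per line
    return reader;
}
```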

Spring-Batch Multi-line record Item Writer with variable number of lines per record

Deadly · Submitted on 2019-11-28 09:29:45
I have the below requirement but am not able to decide on the approach: I need to write data to a fixed-format output file where each record spans multiple lines, as seen below:

000120992599999990000000000000009291100000000000000000000000010000
000000000000000000000006050052570009700000050000990494920000111100
ABCDE:WXYZ 0200
descriptiongoesheredescriptiongoesheredescriptiongoesher0200
descriptiongoesheredescriptiongoesheredescriptiongoesher0200
descriptiongoesheredescriptiongoesheredescriptiongoesher0200
descriptiongoesheredescriptiongoesheredescriptiongoesher0200
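The standard trick in Spring Batch is a custom `LineAggregator<T>` whose `aggregate()` returns a string that already contains the embedded newlines; `FlatFileItemWriter` then writes the whole multi-line record as one item. A minimal stdlib sketch of that aggregation, with an invented field layout (pad-to-56 plus a "0200" suffix is an assumption for illustration, not the question's real spec):

```java
import java.util.List;

public class MultiLineRecordAggregator {
    // Flattens one logical record (a fixed header line plus a variable number
    // of detail lines) into the multi-line string a FlatFileItemWriter would
    // emit in a single write. The 56-char padding and "0200" suffix are
    // illustrative, not the actual file spec from the question.
    static String aggregate(String header, List<String> details) {
        StringBuilder sb = new StringBuilder(header).append('\n');
        for (String detail : details) {
            sb.append(String.format("%-56s", detail)).append("0200").append('\n');
        }
        return sb.toString();
    }
}
```

Because the number of detail lines varies per record, the aggregator (not a fixed line count in the writer) is what decides how many lines each item produces.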

Spring batch :Restart a job and then start next job automatically

时光毁灭记忆、已成空白 · Submitted on 2019-11-28 09:20:47
Question: I need to create a recovery pattern. In my pattern I can launch a job only in a given time window. If the job fails, it will only be restarted in the next time window, and when it finishes I would like to start the scheduled job that was planned in advance for that window. The only difference between jobs is the time-window parameters. I thought about a JobExecutionDecider in conjunction with a JobExplorer, or overriding a JobLauncher, but both seem too intrusive. I failed to find an example that…
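One lightweight alternative to a decider or a custom launcher is to put the decision in the scheduler itself: at the start of each window, look up the previous window's execution status (in Spring Batch that lookup would come from `JobExplorer`, e.g. via `getJobInstances`/`getLastJobExecution`) and, if it failed, restart it before launching the current window's job. A stdlib sketch of just that planning step, with the window/status strings as stand-ins:

```java
import java.util.ArrayList;
import java.util.List;

public class RecoveryScheduler {
    // Decides, at the start of a time window, which launches to perform:
    // first a restart of the previous window if it FAILED, then the job
    // scheduled for the current window. Status strings mirror BatchStatus names.
    static List<String> planWindow(String previousWindow, String previousStatus,
                                   String currentWindow) {
        List<String> plan = new ArrayList<>();
        if ("FAILED".equals(previousStatus)) {
            plan.add("restart:" + previousWindow);
        }
        plan.add("start:" + currentWindow);
        return plan;
    }
}
```

Keeping the ordering logic outside the job definition means the jobs themselves stay identical except for their window parameters, as the question requires.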

Spring Batch - Reading a large flat file - Choices to scale horizontally?

我们两清 · Submitted on 2019-11-28 08:43:26
I have started researching Spring Batch in the last hour or two and would appreciate your input. The problem: read one or more CSV files with 20 million records, perform minor processing, store the data in a DB, and also write the output to another flat file, in the least time. Most important: I need to make choices that will scale horizontally in the future. Questions: Should I use remote chunking or partitioning to scale horizontally? Since the data is in a flat file, are both remote chunking and partitioning bad choices? Which multi-process solution makes it possible to read from a large file, spread processing across…
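For flat files, partitioning usually beats remote chunking: each worker can open its own slice of the file (or its own split file via `MultiResourcePartitioner`), so the single-reader bottleneck of remote chunking disappears. A stdlib sketch of the range arithmetic a custom `Partitioner` would put into each worker's `ExecutionContext` (which a `FlatFileItemReader` could then honor with `setLinesToSkip`/`setMaxItemCount`):

```java
import java.util.ArrayList;
import java.util.List;

public class LinePartitioner {
    // Splits a total line count into contiguous [start, end) ranges, one per
    // partition, mimicking what a Spring Batch Partitioner would hand to each
    // worker step via its ExecutionContext.
    static List<long[]> partitions(long totalLines, int gridSize) {
        List<long[]> ranges = new ArrayList<>();
        long chunk = (totalLines + gridSize - 1) / gridSize; // ceiling division
        for (long start = 0; start < totalLines; start += chunk) {
            ranges.add(new long[]{start, Math.min(start + chunk, totalLines)});
        }
        return ranges;
    }
}
```

The same partition map works locally (thread pool) first and can later move to remote workers, which is the horizontal-scaling path the question asks about.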

Spring Batch Processor

一个人想着一个人 · Submitted on 2019-11-28 07:04:16
Question: I have a requirement in Spring Batch where I have a file with thousands of records coming in sorted order. The key field is product code. The file may have multiple records with the same product code. The requirement is to group the records that have the same product code into a collection (i.e. a List) and then send them to a method, i.e. validateProductCodes(List prodCodeList). I am looking for the best way to do this. The approach I thought of was to read every record in the…
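Since the file is already sorted by product code, one linear pass is enough: accumulate records until the key changes, then hand the finished group to the validator. In Spring Batch terms this is usually done by wrapping the file reader in a `SingleItemPeekableItemReader` so a custom reader can peek at the next key and return one `List` per `read()` call. The grouping itself, sketched with plain strings standing in for records:

```java
import java.util.ArrayList;
import java.util.List;

public class ProductCodeGrouper {
    // Collects each run of equal product codes from sorted input into one
    // group, mirroring a reader that accumulates items until the key changes
    // and then emits the whole group as a single item.
    static List<List<String>> group(List<String> sortedCodes) {
        List<List<String>> groups = new ArrayList<>();
        List<String> current = new ArrayList<>();
        for (String code : sortedCodes) {
            if (!current.isEmpty() && !current.get(0).equals(code)) {
                groups.add(current);          // key changed: close the group
                current = new ArrayList<>();
            }
            current.add(code);
        }
        if (!current.isEmpty()) groups.add(current);
        return groups;
    }
}
```

Each emitted group is what would be passed to validateProductCodes(...) in the question.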

Spring batch execute dynamically generated steps in a tasklet

倾然丶 夕夏残阳落幕 · Submitted on 2019-11-28 06:47:47
Question: I have a Spring Batch job that does the following. Step 1: creates a list of objects that need to be processed. Step 2: creates a list of steps depending on how many items are in the list of objects created in step 1. Step 3: tries to execute the steps from the list created in step 2. Executing the x steps is done in executeDynamicStepsTasklet() below. While the code runs without any errors, it does not seem to be doing anything. Does what I have in that method look correct? Thanks…
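A likely reason "nothing happens" is that `Step` objects built inside a tasklet are never part of the job's flow, so nothing ever drives them. The usual fix is to invert the design: make step 1's list the input to a partitioned step, where a custom `Partitioner` returns one `ExecutionContext` per work item and a single worker step runs once per partition. A stdlib sketch of the partition map that `partition(gridSize)` would return:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DynamicPartitions {
    // Instead of constructing Step objects at runtime (they never join the
    // job's flow), expose one named partition per work item, the way a
    // Partitioner's partition(gridSize) returns a map of ExecutionContexts.
    static Map<String, String> partition(List<String> workItems) {
        Map<String, String> contexts = new LinkedHashMap<>();
        for (int i = 0; i < workItems.size(); i++) {
            contexts.put("partition" + i, workItems.get(i)); // value stands in for a context
        }
        return contexts;
    }
}
```

The worker step then reads its own item out of the step execution context, so the "dynamic steps" become dynamic partitions of one statically wired step.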

How to read CSV file with different number of columns with Spring Batch

放肆的年华 · Submitted on 2019-11-28 06:40:09
Question: I have a CSV file that doesn't have a fixed number of columns, like this:

col1,col2,col3,col4,col5
val1,val2,val3,val4,val5
column1,column2,column3
value1,value2,value3

Is there any way to read this kind of CSV file with Spring Batch? I tried this: <bean id="ItemReader" class="org.springframework.batch.item.file.FlatFileItemReader"> <!-- Read a csv file --> <property name="resource" value="classpath:file.csv" /> <property name="lineMapper"> <bean class="org.springframework.batch.item…
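When line shapes vary, Spring Batch's usual tools are a custom `LineTokenizer` or a `PatternMatchingCompositeLineMapper` that routes each line shape to its own tokenizer. A stdlib sketch of one interpretation of the sample above, assuming (this is an assumption, not stated in the question) that header lines and data lines strictly alternate, so each data row can be keyed by its own header:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class VariableColumnCsv {
    // Pairs each data line with the header line directly above it, tolerating
    // a different column count per header block; the result is the kind of
    // name->value map a FieldSetMapper would otherwise produce per row.
    static List<Map<String, String>> parse(List<String> lines) {
        List<Map<String, String>> rows = new ArrayList<>();
        for (int i = 0; i + 1 < lines.size(); i += 2) {
            String[] header = lines.get(i).split(",");
            String[] values = lines.get(i + 1).split(",");
            Map<String, String> row = new LinkedHashMap<>();
            for (int c = 0; c < Math.min(header.length, values.length); c++) {
                row.put(header[c], values[c]);
            }
            rows.add(row);
        }
        return rows;
    }
}
```

If the real file mixes shapes without that alternation, a composite line mapper keyed on a line pattern is the more robust route.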

Has anyone tried implementing an unsupported database as the JobRepository for Spring Batch?

你离开我真会死。 · Submitted on 2019-11-28 06:37:24
Question: My database (SAP HANA) is not supported by Spring Batch. I am looking for a guide on how to implement my own DAOs for SimpleJobRepository. Has anyone tried this before? I did not include the database-type property because, according to the Spring Batch website, omitting it triggers automatic detection of the database type. I also used JobRepositoryFactoryBean since the DB is unsupported. I am excited, though, to write my own implementation for this; maybe I can contribute it to the Spring Batch source…
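Before writing custom DAOs, it is often enough to stop auto-detection and force a close-enough supported dialect on `JobRepositoryFactoryBean`, since the metadata SQL is fairly generic; only the key-increment strategy usually differs. A config sketch; choosing "oracle" as the stand-in type is an assumption here, pick whichever supported dialect's SQL and sequence handling matches the target database best.

```java
// Sketch: force a supported dialect instead of relying on auto-detection
// for an unsupported database; dataSource points at that database.
@Bean
public JobRepository jobRepository(DataSource dataSource,
                                  PlatformTransactionManager txManager) throws Exception {
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(txManager);
    factory.setDatabaseType("oracle"); // stand-in dialect; skips auto-detection
    factory.setIncrementerFactory(new DefaultDataFieldMaxValueIncrementerFactory(dataSource));
    factory.afterPropertiesSet();
    return factory.getObject();
}
```

If no dialect's incrementer works, implementing a custom `DataFieldMaxValueIncrementerFactory` is a much smaller job than re-implementing the SimpleJobRepository DAOs.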