spring-batch

Spring Batch - Process lines in huge file, start new job for each line

帅比萌擦擦* submitted on 2020-01-06 04:13:11
Question: Addition to this question: Spring Batch - where does the process run. Once every hour I get a huge file with Usage Data for a set of services. A Usage Data line has a UserId and a ServiceId and how long/much the service has been used. For each Usage Data line I need to check which Subscription the User has, and how much the service costs per unit for this particular Subscription. This results in a line I can bill to the user. For each line: Database: Find Subscription by User; Database: Find
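The per-line pricing step described above can be sketched in plain Java. `UsageLine`, `BillingLine`, and the two lookup maps are hypothetical stand-ins for the asker's actual domain model and database queries:

```java
import java.util.Map;

// Hypothetical sketch: price one usage line against the user's subscription.
// In Spring Batch terms, process() would be the body of an
// ItemProcessor<UsageLine, BillingLine> in a chunk-oriented step.
public class UsageBilling {

    record UsageLine(long userId, long serviceId, double units) {}
    record BillingLine(long userId, long serviceId, double amount) {}

    private final Map<Long, Long> subscriptionByUser;    // userId -> subscriptionId (DB lookup 1)
    private final Map<String, Double> pricePerUnit;      // "subscriptionId:serviceId" -> price (DB lookup 2)

    public UsageBilling(Map<Long, Long> subscriptionByUser, Map<String, Double> pricePerUnit) {
        this.subscriptionByUser = subscriptionByUser;
        this.pricePerUnit = pricePerUnit;
    }

    // Find the user's subscription, look up the unit price for this service
    // under that subscription, and produce the billable line.
    public BillingLine process(UsageLine line) {
        Long subscriptionId = subscriptionByUser.get(line.userId());
        Double price = pricePerUnit.get(subscriptionId + ":" + line.serviceId());
        return new BillingLine(line.userId(), line.serviceId(), line.units() * price);
    }
}
```

In a real job the two maps would be replaced by repository calls (ideally cached, since the file is huge and many lines share the same user).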

Spring Batch : Aggregating records and write count

一曲冷凌霜 submitted on 2020-01-05 09:28:12
Question: We have some data coming in a flat file, e.g.:

EmpCode,Salary,EmpName,...
100,1000,...,...
200,2000,...,...
200,2000,...,...
100,1000,...,...
300,3000,...,...
400,4000,...,...

We would like to aggregate the salary based on the EmpCode and write it to the database as:

Emp_Code  Emp_Salary  Updated_Time  Updated_User
100       2000        ...           ...
200       4000        ...           ...
300       3000        ...           ...
400       4000        ...           ...

I have written classes as per Spring Batch as follows: ItemReader - to read the employee data into an Employee
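The aggregation itself can be sketched independently of the batch wiring. The field names come from the question; everything around them (parsing, the writer) is assumed. In Spring Batch this logic would typically live in an ItemProcessor, or in a custom ItemWriter that accumulates into the map and flushes it at the end of the step:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch: sum salaries per EmpCode, preserving first-seen order so the
// output rows come out in the order the codes first appeared in the file.
public class SalaryAggregator {

    record Employee(int empCode, long salary) {}

    public static Map<Integer, Long> aggregate(List<Employee> rows) {
        Map<Integer, Long> totals = new LinkedHashMap<>();
        for (Employee e : rows) {
            // merge() inserts the salary on first sight, adds on repeats
            totals.merge(e.empCode(), e.salary(), Long::sum);
        }
        return totals;
    }
}
```

Run against the sample rows above, this yields 100→2000, 200→4000, 300→3000, 400→4000, matching the expected table.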

Why does Spring Batch use 1 database connection for each thread?

泪湿孤枕 submitted on 2020-01-05 07:38:09
Question: Why does Spring Batch use 1 database connection for each thread?

Stack: Java 8, Spring Boot 1.5, Spring Batch 3.0.7, HikariCP 2.7.6

DataSource config: batcdb (postgres), readdb (oracle), writedb (postgres). Each datasource is using HikariCP with the default 10 connections.

Spring Batch config: ThreadExecutor-1: core-pool-size: 10, max-pool-size: 10, throttle-limit: 10

Job-1 Config / ThreadPoolTaskExecutor (pool sizes and throttle limit set via application.yml):

@Bean public Step job1Step() { return
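The short answer is that in a multi-threaded step each worker thread runs its own chunk transaction, and a transaction holds a connection from the pool for its whole duration, so concurrent threads cannot share one connection. A rough sizing rule, sketched below as a hypothetical Hikari fragment (the property path assumes Spring Boot's single-datasource binding; with three datasources each pool is sized on its own bean):

```yaml
# Assumption: 10-thread step (throttle-limit: 10). Each concurrent chunk
# transaction pins one connection per datasource it touches, so size each
# pool to at least the thread count, plus headroom on the JobRepository
# datasource (batcdb) for job/step metadata updates.
spring:
  datasource:
    hikari:
      maximum-pool-size: 11
```

With the default pool of 10 and a throttle limit of 10, the metadata updates on batcdb compete with the chunk transactions, which is why pools sized exactly to the thread count tend to stall.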

Spring Batch File Archiving

这一生的挚爱 submitted on 2020-01-05 05:55:27
Question: I am currently in the process of learning Spring Batch and have been challenged with a file-archiving task. Basically I need to read separate CSV files and put them in a new archive folder, with the original filename appended with the current date. What I want to know is how I can get the original filename from the MultiResourceItemReader, use it in the FlatFileItemWriter as the filename + date, and delete the original file afterwards. Here's my current code: @Autowired public
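The archive move itself can be sketched with plain java.nio, leaving the MultiResourceItemReader wiring out. In a Spring Batch job this would usually run in a Tasklet step after the main read/write step, taking the original filename from Resource#getFilename(); the date format and method names below are assumptions:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Sketch: rename a processed file into an archive folder with the current
// date appended before the extension.
public class FileArchiver {

    private static final DateTimeFormatter STAMP = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    // Build the archived name: data.csv -> data_2020-01-05.csv
    public static String archivedName(String originalName, LocalDate date) {
        int dot = originalName.lastIndexOf('.');
        String base = dot >= 0 ? originalName.substring(0, dot) : originalName;
        String ext = dot >= 0 ? originalName.substring(dot) : "";
        return base + "_" + STAMP.format(date) + ext;
    }

    // Move the file into the archive folder under its dated name; the move
    // removes the original, so no separate delete step is needed.
    public static Path archive(Path original, Path archiveDir) throws IOException {
        Files.createDirectories(archiveDir);
        Path target = archiveDir.resolve(archivedName(original.getFileName().toString(), LocalDate.now()));
        return Files.move(original, target);
    }
}
```

Using Files.move instead of copy-then-delete keeps the operation atomic on the same filesystem.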

Difference between Batch Status and Exit Status in Spring Batch

不羁的心 submitted on 2020-01-03 07:24:26
Question: What is the difference between Batch Status and Exit Status in Spring Batch?

Answer 1: From the Spring Batch documentation: a BatchStatus object indicates the status of the execution. While running it is BatchStatus.STARTED; if it fails, it is BatchStatus.FAILED; and if it finishes successfully, it is BatchStatus.COMPLETED. The ExitStatus indicates the result of the run. It is most important because it contains an exit code that will be returned to the caller. For more on the difference, see section 5.3

shared drive csv file load to Mssql table using spring

≯℡__Kan透↙ submitted on 2020-01-03 06:39:31
Question: I am searching for an approach/code base which can fulfill the requirements below:

1. We have a formatted source file on a shared drive with a record count of around one million; the drive gets a new file every day with a date prefix on it (e.g. 02-12-2018_abcd.txt).
2. While reading the file from the shared-drive location, if any failure occurs it should not commit the SQL insert.
3. This job should run at a scheduled time.

I found a couple of approaches to read a file from a shared drive, like a jar to read; another approach is
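For point 1, locating today's file reduces to building the expected dated name. A small sketch, assuming from the example (02-12-2018_abcd.txt) that the prefix is day-first; if the real files are month-first, the pattern needs adjusting:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Sketch: build the expected shared-drive filename for a given day from the
// dd-MM-yyyy prefix convention in the question. The suffix ("abcd.txt") is
// whatever fixed tail the real files use.
public class DailyFileName {

    private static final DateTimeFormatter PREFIX = DateTimeFormatter.ofPattern("dd-MM-yyyy");

    public static String forDate(LocalDate date, String suffix) {
        return PREFIX.format(date) + "_" + suffix;
    }
}
```

Points 2 and 3 map onto standard Spring Batch features: a chunk-oriented step rolls back the current chunk's inserts on failure, and scheduling can be done with @Scheduled or an external scheduler launching the job.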

Spring Batch remote partitioning how to shutdown slaves

风格不统一 submitted on 2020-01-03 06:03:07
Question: I want to use Spring Batch remote partitioning to handle large workloads in the cloud, and spin up/shut down VMs on demand. However, when configuring the slave steps, I'm using the StepExecutionRequestHandler to handle the step requests from a JMS queue. Right now the application just hangs. How can I shut down the application after the queue is depleted?

Answer 1: How can I shut down the application after the queue is depleted? In a remote partitioning setup, workers are listeners on a queue on

spring batch remains in EXECUTING

早过忘川 submitted on 2020-01-03 05:23:41
Question: I created a job which uses a reader of type org.springframework.batch.item.database.HibernateCursorItemReader to execute a query. The problem is that this hits the database connection limit (I get the Oracle error ORA-12519, TNS:no appropriate service handler found) and, surprisingly, I noticed exit_code=EXECUTING and status=STARTED in the BATCH_STEP_EXECUTION table. If I run the job again it responds "A job execution for this job is already running", and if I issue -restart on
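When the JVM dies (or the connection is killed) before Spring Batch can update its metadata, the execution stays STARTED/EXECUTING and blocks restarts. One common remediation is to mark the stuck execution as failed by hand before restarting. A hedged sketch against the default metadata tables (back up the tables first, and substitute the real execution id from BATCH_JOB_EXECUTION):

```sql
-- Mark the stuck step and job executions as failed so the job can be
-- restarted. :jobExecutionId is a placeholder for the actual id.
UPDATE BATCH_STEP_EXECUTION
   SET STATUS = 'FAILED', EXIT_CODE = 'FAILED', END_TIME = CURRENT_TIMESTAMP
 WHERE JOB_EXECUTION_ID = :jobExecutionId AND STATUS = 'STARTED';

UPDATE BATCH_JOB_EXECUTION
   SET STATUS = 'FAILED', EXIT_CODE = 'FAILED', END_TIME = CURRENT_TIMESTAMP
 WHERE JOB_EXECUTION_ID = :jobExecutionId AND STATUS = 'STARTED';
```

The underlying ORA-12519 itself is usually addressed separately, by raising the Oracle PROCESSES/SESSIONS limits or shrinking the application's connection pool.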
