spring-batch

Spring Batch - Could not @Autowired SimpleJobLauncher

寵の児 submitted on 2019-12-13 05:01:30
Question: I face the issue below even though I have correctly defined SimpleJobLauncher. Description: Field jobLauncher in com.abcplusd.application.BatchConfig required a bean of type 'org.springframework.batch.core.launch.support.SimpleJobLauncher' that could not be found. Action: Consider defining a bean of type 'org.springframework.batch.core.launch.support.SimpleJobLauncher' in your configuration. The following is my source code: package com.abcplusd.application; import org.springframework.batch.core…
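One likely cause, and a configuration sketch based on the question's class names: `@EnableBatchProcessing` registers the launcher under the `JobLauncher` interface, so autowiring the concrete `SimpleJobLauncher` type fails. Either inject the interface, or explicitly define the concrete bean:

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    // Explicitly expose a SimpleJobLauncher so @Autowired by the concrete type resolves
    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```

Declaring the field as `@Autowired JobLauncher jobLauncher` is usually the simpler fix, since the framework-provided bean already is a SimpleJobLauncher underneath.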

Spring Batch - Write multiple files based on records count

早过忘川 submitted on 2019-12-13 04:56:29
Question: In Spring Batch, I have a requirement to read from the database and write to a file. The number of rows allowed in a file is N, so if N+10 records are fetched, two files should be created containing N rows and 10 rows respectively. Can someone please help me with the writer implementation? Is there any other easy way of doing it? Thanks. Answer 1: Spring Batch has MultiResourceItemWriter, where you can write based on the number of lines: <bean id="multiWriter" class="org.springframework.batch.item…
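The same MultiResourceItemWriter wiring can be sketched in Java config (type names, path, and the limit of 1000 are assumptions for illustration):

```java
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.MultiResourceItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;

@Bean
public MultiResourceItemWriter<MyRecord> multiWriter(FlatFileItemWriter<MyRecord> delegate) {
    MultiResourceItemWriter<MyRecord> writer = new MultiResourceItemWriter<>();
    writer.setResource(new FileSystemResource("output/records")); // base name; suffixes .1, .2, ... are appended
    writer.setDelegate(delegate);
    writer.setItemCountLimitPerResource(1000); // N rows per file; rollover opens the next file
    return writer;
}
```

Note that rollover only happens at chunk boundaries, so the per-file limit behaves most predictably when it is a multiple of the commit interval.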

SpringBatch: Test a JobExecutionListener

邮差的信 submitted on 2019-12-13 04:42:12
Question: I have a JobExecutionListener as follows: public class JobListener implements JobExecutionListener { @Override public void beforeJob(final JobExecution jobExecution) { // do some work with the jobExecution } @Override public void afterJob(final JobExecution jobExecution) { // do some work with the jobExecution } } I want to write a test for my JobListener, and I am wondering whether I need to mock the JobExecution. Do you think that will be OK, or is there another nice solution to this? Answer 1: …
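One low-ceremony option, sketched below: spring-batch-test ships MetaDataInstanceFactory, which builds real JobExecution instances with sensible defaults, so a mocking framework is optional (the test method and status are illustrative):

```java
import org.junit.Test;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.test.MetaDataInstanceFactory;

public class JobListenerTest {

    private final JobListener listener = new JobListener();

    @Test
    public void afterJobHandlesCompletedExecution() {
        // Real JobExecution instance with default meta-data; no mock needed
        JobExecution jobExecution = MetaDataInstanceFactory.createJobExecution();
        jobExecution.setStatus(BatchStatus.COMPLETED);

        listener.afterJob(jobExecution);
        // assert on whatever state or side effects the listener produced
    }
}
```

A Mockito mock of JobExecution works just as well if the test only needs to verify interactions rather than state.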

Spring Batch - Parallel processing

可紊 submitted on 2019-12-13 04:37:52
Question: I am running a Spring Batch job on three machines. For example, the database has 30 records; the batch job on each machine has to pick up a unique 10 records and process them. I read about partitioning and parallel processing and am a bit confused: which one is suitable? Appreciate your help. Answer 1: What you are describing is partitioning. Partitioning is when the input is broken up into partitions and each partition is processed in parallel. Spring Batch offers two different ways to execute partitioning,…
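A partitioning sketch for the scenario in the question (30 records, 3 workers; the id-range keys are assumptions): the Partitioner hands each worker step its own range via a step ExecutionContext, and each worker reads only rows in its range:

```java
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class RangePartitioner implements Partitioner {

    private final int totalRecords = 30; // e.g. 30 rows in the table

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        int size = totalRecords / gridSize; // 10 rows each for gridSize = 3
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext ctx = new ExecutionContext();
            ctx.putInt("minId", i * size + 1);   // worker reads WHERE id BETWEEN minId
            ctx.putInt("maxId", (i + 1) * size); // AND maxId
            partitions.put("partition" + i, ctx);
        }
        return partitions;
    }
}
```

For three separate machines, the remote-partitioning variant (a manager step sending partition metadata to workers, e.g. via Spring Integration) fits; local partitioning only parallelizes threads within one JVM.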

SpringBatch - How to catch a Read Exception when resource doesn't exist

馋奶兔 submitted on 2019-12-13 04:28:36
Question: I have a Spring job that reads data from a flat-file resource. When the job can't find the resource file (it runs under Quartz), it throws an ItemStreamException that floods my app's log. org.springframework.batch.core.step.AbstractStep: Encountered an error executing the step org.springframework.batch.item.ItemStreamException: Failed to initialize the reader at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open…
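If a missing file should not fail the step, FlatFileItemReader has a strict flag for exactly this case. A minimal sketch, with an assumed file path and a pass-through line mapper:

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;

@Bean
public FlatFileItemReader<String> reader() {
    FlatFileItemReader<String> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource("input/data.csv")); // hypothetical path
    reader.setLineMapper(new PassThroughLineMapper());
    // strict=false: a missing resource logs a warning in open() instead of
    // throwing ItemStreamException and failing the step
    reader.setStrict(false);
    return reader;
}
```

With strict left at its default of true, open() throws when the resource does not exist, which is the stack trace shown in the question.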

ItemReader for records returned by CrudRepository

情到浓时终转凉″ submitted on 2019-12-13 04:26:31
Question: I have a Spring Batch application wherein the reader reads from an external DB, the processor transforms the result into the POJO of my destination DB, and the writer writes the transformed POJO to the destination DB. I am using the following CrudRepository: public interface MyCrudRepository extends CrudRepository<MyDbEntity, String> { List<MyDbEntity> findByPIdBetween(String from, String to); List<MyDbEntity> findByPIdGreaterThan(String from); } I wanted to know what the ItemReader for the above would look like.
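A RepositoryItemReader sketch could look like the following. One caveat: RepositoryItemReader appends a Pageable when it invokes the named method, so the repository signature would need adjusting, e.g. `Page<MyDbEntity> findByPIdBetween(String from, String to, Pageable page)`. The bounds, page size, and sort key below are assumptions:

```java
import java.util.Arrays;
import java.util.Collections;
import org.springframework.batch.item.data.RepositoryItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.data.domain.Sort;

@Bean
public RepositoryItemReader<MyDbEntity> reader(MyCrudRepository repository) {
    RepositoryItemReader<MyDbEntity> reader = new RepositoryItemReader<>();
    reader.setRepository(repository);
    reader.setMethodName("findByPIdBetween");
    reader.setArguments(Arrays.asList("0001", "9999")); // hypothetical pId bounds
    reader.setPageSize(100);                            // fetched page by page, not all at once
    reader.setSort(Collections.singletonMap("pId", Sort.Direction.ASC));
    return reader;
}
```

The explicit sort matters: paging through an unsorted result set can skip or repeat rows between pages.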

How to pass a String (Non managed bean) to a managed bean

放肆的年华 submitted on 2019-12-13 03:57:59
Question: I have a Spring Batch job. There is a step that calls the reader method. STEP: @Bean public Step myStep(FlatFileItemWriter<String> writer, Processor processor, @Value("${com.tableName}") String myTableName) { return stepBuilderFactory.get("step1") .<MyBean, String> chunk(this.chuckSize) .reader(reader(myTableName, this.myRowMapper)) .processor(processor) .writer(writer) .build(); } READER (working): @Bean public <T> JdbcCursorItemReader<T> reader(@Value("${com.tableName}") String tableName,…
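One way to thread the property through without registering the String as a managed bean, sketched with the question's property key (the row mapper is injected as a bean parameter rather than referenced via a field):

```java
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.core.RowMapper;

@Bean
public JdbcCursorItemReader<MyBean> reader(DataSource dataSource,
        RowMapper<MyBean> myRowMapper,
        @Value("${com.tableName}") String tableName) {
    JdbcCursorItemReader<MyBean> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(dataSource);
    reader.setSql("SELECT * FROM " + tableName); // table name resolved from the property at startup
    reader.setRowMapper(myRowMapper);
    return reader;
}
```

Because @Value is resolved on the @Bean method parameter, the step definition can simply take the reader bean as a parameter instead of calling reader(...) directly with the String.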

How to effectively and correctly load sequential activities to a database using Spring Batch?

别等时光非礼了梦想. submitted on 2019-12-13 03:55:51
Question: I'm currently working on a project loading a .dat file's info into a database. However, this .dat file contains not only data but also actions. The first field indicates the action of the record; everything else is just data. Below are some example records: A key1 key2 data1 data2 D key1 key2 data1 data2 C key1 key2 data1 data2, where A=add, D=delete, C=update. The file size is roughly 5 GB. In this case, the order in which the records are processed does matter. Is it possible to use Spring Batch to batch-process…
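One possible design (a sketch, with a hypothetical Record type carrying the action flag): route each item to an insert, delete, or update writer with a ClassifierCompositeItemWriter:

```java
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ClassifierCompositeItemWriter;
import org.springframework.context.annotation.Bean;

@Bean
public ClassifierCompositeItemWriter<Record> actionWriter(ItemWriter<Record> addWriter,
        ItemWriter<Record> deleteWriter, ItemWriter<Record> updateWriter) {
    ClassifierCompositeItemWriter<Record> writer = new ClassifierCompositeItemWriter<>();
    writer.setClassifier(record -> {
        switch (record.getAction()) { // first field of the .dat line: A, D, or C
            case "A": return addWriter;
            case "D": return deleteWriter;
            case "C": return updateWriter;
            default: throw new IllegalArgumentException("Unknown action " + record.getAction());
        }
    });
    return writer;
}
```

One caveat for the ordering requirement: within a chunk, items are grouped per target writer, so strict cross-action ordering holds only across chunks. If order must hold within a chunk too, a single ItemWriter that iterates the items and issues the matching SQL per record is the safer choice.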

Spring Batch MultiLineItemReader with MultiResourcePartitioner

☆樱花仙子☆ submitted on 2019-12-13 03:49:59
Question: I have a file which has multi-line data like this. DataID is the start of a new record, e.g. one record is the combination of the ID and the concatenation of the lines below it until the start of the next record. >DataID1 Line1asdfsafsdgdsfghfghfghjfgjghjgxcvmcxnvm Line2asdfsafsdgdsfghfghfghjfgjghjgxcvmcxnvm Line3asdfsafsdgdsfghfghfghjfgjghjgxcvmcxnvm >DataID2 DataID2asdfsafsdgdsfghfghfghjfgjghjgxcvmcxnvm >DataID3 DataID2asdfsafsdgdsfghfghfghjfgjghjgxcvmcxnvm I was able to implement this using…
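A common pattern for this record shape, sketched below: wrap a line-oriented reader in SingleItemPeekableItemReader so the reader can look ahead without consuming the next `>` header (DataRecord is a hypothetical aggregate type with an append method):

```java
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;

public class MultiLineRecordReader implements ItemReader<DataRecord> {

    private final SingleItemPeekableItemReader<String> delegate; // wraps a FlatFileItemReader<String>

    public MultiLineRecordReader(SingleItemPeekableItemReader<String> delegate) {
        this.delegate = delegate;
    }

    @Override
    public DataRecord read() throws Exception {
        String header = delegate.read();
        if (header == null) {
            return null; // end of input
        }
        DataRecord record = new DataRecord(header.substring(1)); // drop the '>' marker
        // aggregate lines until the next '>' header; peek() does not advance the delegate
        while (delegate.peek() != null && !delegate.peek().startsWith(">")) {
            record.append(delegate.read());
        }
        return record;
    }
}
```

With a MultiResourcePartitioner, each partition would get its own delegate instance bound to one resource, so the peeked state never crosses files.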

Spring-batch MultiResourceItemReader vs commit-interval

和自甴很熟 submitted on 2019-12-13 03:06:47
Question: I am using Spring Batch's MultiResourceItemReader to read a directory of XMLs, delegating to StaxEventItemReader. The commit-interval on the chunk acts on the MultiResourceItemReader, i.e. the commit happens for each XML. I want to make the commit-interval act on the StaxEventItemReader so that I can commit my huge XML data in chunks instead of one XML at a time. Any help? Source: https://stackoverflow.com/questions/29129056/spring-batch-multiresourceitemreader-vs-commit-interval
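For reference, a wiring sketch with assumed names: MultiResourceItemReader passes through the delegate's individual items, so the chunk's commit interval normally counts XML fragments rather than files. If commits appear to happen once per file, the delegate's fragment configuration (e.g. fragmentRootElementName on the StaxEventItemReader) is the first thing to check.

```java
import java.io.IOException;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.MultiResourceItemReader;
import org.springframework.batch.item.xml.StaxEventItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

@Bean
public MultiResourceItemReader<MyItem> multiReader(StaxEventItemReader<MyItem> delegate)
        throws IOException {
    MultiResourceItemReader<MyItem> reader = new MultiResourceItemReader<>();
    reader.setResources(new PathMatchingResourcePatternResolver()
            .getResources("file:input/*.xml")); // hypothetical directory
    reader.setDelegate(delegate);
    return reader;
}

@Bean
public Step xmlStep(StepBuilderFactory steps, MultiResourceItemReader<MyItem> reader,
        ItemWriter<MyItem> writer) {
    return steps.get("xmlStep")
            .<MyItem, MyItem>chunk(100) // counts fragments emitted by the delegate, not files
            .reader(reader)
            .writer(writer)
            .build();
}
```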