spring-batch

How to read both comma-separated and pipe-separated CSV files in a single item reader in Spring Batch

邮差的信 submitted on 2020-06-29 04:21:36

Question: I am new to Spring Batch. I have a folder containing multiple CSV files, and I have implemented a MultiResourceItemReader() to read them. It works only if all the CSV files are pipe ("|") separated. I want to read both comma (",") separated and pipe-separated CSV files using a single reader. Is it possible? If yes, how? Here is my code: @Bean @StepScope public MultiResourceItemReader<Person> multiResourceItemReader(@Value("#{jobParameters[x]}") String x, @Value("#{jobParameters[y]}")
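One common approach (a sketch, not the poster's code) is to detect the delimiter per line before tokenizing. In a real Spring Batch job this check would live in a custom LineMapper that delegates to one of two DelimitedLineTokenizer instances; the core detection logic can be shown in plain Java:

```java
// Sketch of per-line delimiter detection. In Spring Batch this would sit inside
// a custom LineMapper delegating to two DelimitedLineTokenizer instances; here
// it is reduced to plain Java for illustration.
public class DelimiterSketch {

    // Choose the delimiter by inspecting the line itself.
    static String[] tokenize(String line) {
        String delimiterRegex = line.contains("|") ? "\\|" : ",";
        return line.split(delimiterRegex, -1); // -1 keeps trailing empty fields
    }

    public static void main(String[] args) {
        System.out.println(String.join("/", tokenize("John,Doe,42")));  // John/Doe/42
        System.out.println(String.join("/", tokenize("Jane|Roe|37")));  // Jane/Roe/37
    }
}
```

Because the MultiResourceItemReader only iterates over resources, plugging such a delegating LineMapper into the underlying FlatFileItemReader lets one reader handle both formats.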

Unable to create Spring Batch metadata tables when using two data sources

匆匆过客 submitted on 2020-06-29 04:08:32

Question: I am trying to define two datasources in a Spring Boot v2.2.5.RELEASE batch project, one for Postgres and the other for Oracle. I will read from Oracle and load the data into Postgres, so I am looking to create all the Spring Batch metadata tables in the Postgres DB. When I run the code with this configuration, I get the error below: 2020-06-19 12:35:35.423 INFO 16028 --- [ main] org.hibernate.dialect.Dialect : HHH000400: Using dialect: org.hibernate.dialect.PostgreSQLDialect 2020-06-19 12:35:35.692 INFO 16028
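A hedged sketch of the usual wiring (bean names and property prefixes below are assumptions, not the poster's code): with two DataSource beans, Spring Batch must be told which one holds its metadata tables. Since Spring Boot 2.2, the @BatchDataSource annotation marks the datasource to use for batch metadata, separate from the @Primary one:

```java
// Illustrative configuration: Oracle is the primary application datasource,
// Postgres carries the Spring Batch metadata tables via @BatchDataSource.
@Configuration
public class DataSourceConfig {

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.oracle")
    public DataSource oracleDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @BatchDataSource // Spring Batch creates/uses its metadata tables here
    @ConfigurationProperties("app.datasource.postgres")
    public DataSource postgresDataSource() {
        return DataSourceBuilder.create().build();
    }
}
```

Without such a marker, Boot points the batch metadata at the primary datasource, which would try to create the tables in Oracle instead of Postgres.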

StaxEventItemWriter - File not writable issue

孤街醉人 submitted on 2020-06-28 07:34:07

Question: I have an item writer, configured as below, which generates an XML file: <beans:bean id="delegateItemWriter" class="org.springframework.batch.item.xml.StaxEventItemWriter" scope="step"> <beans:property name="resource" value="file:#{jobParameters['OutputDirPath']}${myFileName}" /> <beans:property name="overwriteOutput" value="true"/> <beans:property name="rootTagName" value="disclosure-feed" /> <beans:property name="rootElementAttributes" > <beans:map> <beans:entry key="xmlns:xsi" value="http://www
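One thing worth checking (a hedged sketch, not a confirmed diagnosis): the quoted resource value mixes a ${...} property placeholder with #{jobParameters[...]} late binding. If ${myFileName} is not a resolvable property, part of the path stays unresolved and the writer can surface this as a file-writability error. Resolving both segments from job parameters avoids the mix; the parameter name myFileName below is an assumption:

```xml
<!-- Sketch: both path segments late-bound from job parameters in the
     step-scoped bean, instead of mixing ${...} with #{jobParameters[...]}. -->
<beans:bean id="delegateItemWriter"
            class="org.springframework.batch.item.xml.StaxEventItemWriter" scope="step">
    <beans:property name="resource"
        value="file:#{jobParameters['OutputDirPath']}#{jobParameters['myFileName']}" />
    <beans:property name="overwriteOutput" value="true" />
    <beans:property name="rootTagName" value="disclosure-feed" />
</beans:bean>
```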

The method stream(ItemStream) in the type AbstractTaskletStepBuilder<SimpleStepBuilder<Customer,Customer>> is not applicable for the arguments (ItemWriter)

被刻印的时光 ゝ submitted on 2020-06-28 06:55:06

Question: How can I classify elements using Spring Batch? I want to write data into two different tables or files; for now I am writing to the console. Error: The method stream(ItemStream) in the type AbstractTaskletStepBuilder<SimpleStepBuilder<Customer,Customer>> is not applicable for the arguments (ItemWriter) @EnableBatchProcessing @SpringBootApplication public class ClassifierCompositeItemApplication { private final JobBuilderFactory jobBuilderFactory; private final StepBuilderFactory stepBuilderFactory; @Value("classpath:input
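A hedged sketch of the usual fix (names are illustrative): .stream(...) requires an ItemStream, and a delegate declared as plain ItemWriter<Customer> does not expose that interface at compile time. Declaring the delegate bean with its concrete type, such as FlatFileItemWriter, which implements both ItemWriter and ItemStream, makes the call compile:

```java
// Declare the delegate with its concrete type so it is also an ItemStream.
@Bean
public FlatFileItemWriter<Customer> customerFileWriter() { // not ItemWriter<Customer>
    return new FlatFileItemWriterBuilder<Customer>()
            .name("customerFileWriter")
            .resource(new FileSystemResource("customers.txt"))
            .delimited().names("firstName", "lastName")
            .build();
}

@Bean
public Step step(ClassifierCompositeItemWriter<Customer> classifierWriter) {
    return stepBuilderFactory.get("step")
            .<Customer, Customer>chunk(10)
            .reader(reader())
            .writer(classifierWriter)
            // Accepted now: FlatFileItemWriter implements ItemStream. Registering
            // it is needed because ClassifierCompositeItemWriter does not open/close
            // its delegates itself.
            .stream(customerFileWriter())
            .build();
}
```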

Exit Spring Batch Job within tasklet

社会主义新天地 submitted on 2020-06-28 02:20:33

Question: I have a Spring Batch tasklet and I can't figure out how to fail the job from it. I want to check for certain parameters and, if they aren't there, fail the job on that step. @Component public class Tfp211SetupTasklet extends AbstractSetupTasklet { final static Logger LOGGER = LoggerFactory.getLogger(Tfp211SetupTasklet.class); @Override protected RepeatStatus performTask(ExecutionContext ec, ChunkContext chunkContext) { //TODO //add error checking. If the parameter is not there, fail out or throw
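The common answer (a sketch, not from the thread): throw an exception out of the tasklet; Spring Batch then marks the step FAILED and the job fails with it. The parameter validation itself is plain Java; the tasklet wiring around it is assumed context:

```java
// Sketch: failing a job from a tasklet usually means throwing from execute()
// (or performTask() here). Spring Batch marks the step FAILED and the job
// stops with a FAILED status. Parameter names below are illustrative.
import java.util.Map;

public class ParamCheck {
    static void requireParameters(Map<String, Object> params, String... required) {
        for (String key : required) {
            if (!params.containsKey(key)) {
                // Thrown inside Tasklet.execute(), this fails the step and the job.
                throw new IllegalStateException("Missing required job parameter: " + key);
            }
        }
    }

    public static void main(String[] args) {
        try {
            requireParameters(Map.of("inputFile", "a.csv"), "inputFile", "runDate");
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // Missing required job parameter: runDate
        }
    }
}
```

Alternatively, the tasklet can set chunkContext.getStepContext().getStepExecution().setExitStatus(ExitStatus.FAILED) for a non-exception exit, but throwing is the simpler and more idiomatic route.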

Spring-Batch: Writing objects to lines with fixed length?

橙三吉。 submitted on 2020-06-27 11:06:15

Question: Spring Batch provides the class FixedLengthTokenizer, which makes it easy to read different offsets of a single line into the fields of an object, whereby the content of each field is extracted from ranges of fixed length: FixedLengthTokenizer tokenizer = new FixedLengthTokenizer(); String[] names = {"A", "B", "C", "D"}; tokenizer.setNames(names); Range[] ranges = {new Range(1, 4), new Range(5, 12), new Range(13, 14), new Range(15, 15)}; tokenizer.setColumns(ranges); I want to do the
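For the writing direction, Spring Batch's FormatterLineAggregator takes a java.util.Formatter pattern; the padding itself is plain String.format. A sketch using the same widths as the four ranges above (4, 8, 2, 1 characters; field values are illustrative):

```java
// Sketch: producing fixed-width output lines with the widths of the read
// ranges above. In Spring Batch the format string would be passed to a
// FormatterLineAggregator via setFormat(); String.format shows the effect.
public class FixedWidthSketch {
    static String toLine(String a, String b, String c, String d) {
        // %-Ns left-justifies each field, padding with spaces on the right.
        return String.format("%-4s%-8s%-2s%-1s", a, b, c, d);
    }

    public static void main(String[] args) {
        String line = toLine("AB", "HELLO", "XY", "Z");
        System.out.println("[" + line + "]"); // 15 characters wide
    }
}
```

Note that %-1s pads but does not truncate over-long values; a BeanWrapperFieldExtractor (or manual substring) is needed if values can exceed their field width.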

Spring Batch and JPA performance issue

有些话、适合烂在心里 submitted on 2020-06-27 05:28:09

Question: Before posting this question I went through many links on the web, like those below, and I am facing a drastic performance issue when calling JPA saveAll(entities) to load data into the target DB using Spring Boot and Spring Batch. Spring Boot JPA saveAll() inserting to database extremely slowly Spring Boot JPARepository performance on save() https://vladmihalcea.com/the-best-way-to-do-batch-processing-with-jpa-and-hibernate/ https://dzone.com/articles/50-best-performance-practices-for-hibernate
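The usual first step (a hedged sketch; values are illustrative starting points, not tuned for this workload) is enabling Hibernate's JDBC statement batching, which is off by default:

```properties
# Sketch: typical Hibernate batching settings in application.properties.
spring.jpa.properties.hibernate.jdbc.batch_size=50
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.order_updates=true
```

One well-known caveat: with IDENTITY-generated ids, Hibernate disables insert batching entirely, so entities usually need a SEQUENCE-based generator for batch_size to take effect.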

How to reprocess failed partition items in Spring Batch?

徘徊边缘 submitted on 2020-06-25 07:20:19

Question: How can I reprocess items that failed during partitioned processing in Spring Batch with JdbcCursorItemReader? Partitions failed due to the error below. The source DB is Oracle, holding around 1.7 million records, and the table doesn't have any PK; because of this I need to create partitions based on OFFSET and LIMIT parameters when dealing with the Oracle DB. Due to the complexity I was unable to use JdbcPagingItemReader, so I decided to use JdbcCursorItemReader, where I somehow managed
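A sketch of the range arithmetic such a setup relies on (a real Spring Batch Partitioner would put each offset/limit pair into a per-partition ExecutionContext; only the arithmetic is shown, and the figures mirror the 1.7 million rows mentioned above). Reprocessing failed partitions then falls out of Spring Batch's restart semantics: restarting the same job instance re-runs only the FAILED partition steps, provided each partition's identity (its offset/limit) is stable across runs:

```java
// Sketch: computing OFFSET/LIMIT ranges for partitioning a table with no PK.
import java.util.ArrayList;
import java.util.List;

public class OffsetLimitRanges {
    // Returns {offset, limit} pairs covering totalRows in gridSize chunks.
    static List<long[]> ranges(long totalRows, int gridSize) {
        long chunk = (totalRows + gridSize - 1) / gridSize; // ceiling division
        List<long[]> out = new ArrayList<>();
        for (long offset = 0; offset < totalRows; offset += chunk) {
            out.add(new long[] { offset, Math.min(chunk, totalRows - offset) });
        }
        return out;
    }

    public static void main(String[] args) {
        for (long[] r : ranges(1_700_000, 4)) {
            System.out.println("offset=" + r[0] + " limit=" + r[1]);
        }
    }
}
```

One caution worth hedging: without a PK or a stable ORDER BY, Oracle gives no ordering guarantee, so OFFSET-based partitions can overlap or skip rows between runs; ordering by some stable column combination is advisable.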