spring-batch

Can we use Spring Batch to read Excel?

二次信任 submitted on 2021-02-02 09:55:48
Question: I want to know if it is possible to use Spring Batch to read from an Excel file and save its contents to a database. Remark: the content of the Excel file changes every 2 hours. And if it is not possible with Spring Batch, what other solution can I use? Answer 1: Take a look at spring-batch-extensions for Excel. You will find some examples of ExcelItemReader and ExcelItemWriter. Here is the introduction of the spring-batch-extensions project for Excel: Spring Batch extension which contains ItemReader
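As a starting point, a reader bean built on the spring-batch-extensions Excel module might look like the sketch below. This is a sketch only: it assumes the extension's POI-based reader is on the classpath, the file path and the mapping logic are hypothetical, and package names vary between versions of the extension, so check the artifact you actually pull in.

```java
// Sketch: Excel reader from the spring-batch-extensions Excel module (Apache POI based).
@Bean
public PoiItemReader<MyRow> excelReader() {
    PoiItemReader<MyRow> reader = new PoiItemReader<>();
    reader.setResource(new FileSystemResource("input/data.xlsx")); // hypothetical path
    reader.setLinesToSkip(1);                                      // skip the header row
    reader.setRowMapper(rowSet -> mapToMyRow(rowSet));             // your own mapping logic
    return reader;
}
```

Since the file changes every 2 hours, the job can be relaunched on a schedule (for example with `@Scheduled` and a `JobLauncher`), passing the current timestamp as a job parameter so each run is a new job instance.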

Spring Boot + Spring Batch + MySQL not working

戏子无情 submitted on 2021-01-29 18:40:19
Question: I am creating a Spring Batch application using Spring Boot and MySQL as the JPA repository, and deploying it on Cloud Foundry. During cf push the batch process crashes with the error Process crashed with type web. However, the underlying error message shows 2019-01-11T07:56:08.856-06:00 [APP/PROC/WEB/0] [OUT] org.springframework.dao.DataAccessResourceFailureException: Unable to commit new sequence value changes for BATCH_JOB_EXECUTION_SEQ 2019-01-11T07:56:08.856-06:00 [APP/PROC/WEB/0] [OUT]
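The BATCH_JOB_EXECUTION_SEQ error usually means the Spring Batch metadata tables, including the MySQL sequence-emulation tables, are missing or uninitialized in the bound MySQL service. On Spring Boot 2.x, one fix is to let Boot run Spring Batch's schema-mysql.sql at startup; this is a config fragment, not a guaranteed fix for every cause of the error:

```properties
# application.properties - ask Boot to create the BATCH_* metadata tables.
# (On Spring Boot 1.x the equivalent property is spring.batch.initializer.enabled=true.)
spring.batch.initialize-schema=always
```

Alternatively, run org/springframework/batch/core/schema-mysql.sql from the Spring Batch jar against the database once; the sequence tables it creates (BATCH_JOB_SEQ, BATCH_JOB_EXECUTION_SEQ, BATCH_STEP_EXECUTION_SEQ) must each contain their single seed row for new sequence values to be committed.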

Spring Batch partitioned step stopped hours after a non-skippable exception occurred

喜欢而已 submitted on 2021-01-29 16:23:31
Question: I want to verify a behaviour of Spring Batch. When running a partitioned step of a job I got this exception: org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step at org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:111) at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:195) at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:137) at org
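This is expected behaviour: when a worker step hits an exception that is not declared skippable, that worker fails, and the PartitionStep reports "Partition handler returned an unsuccessful step" only after all partitions have finished, which can account for the hours-long delay. If the exception should not fail the partition, the worker step can be made fault-tolerant; this is a sketch with hypothetical item types and a hypothetical exception class:

```java
// Sketch: worker step that skips a known-recoverable exception instead of failing.
@Bean
public Step workerStep(StepBuilderFactory stepBuilderFactory) {
    return stepBuilderFactory.get("workerStep")
            .<InputRecord, OutputRecord>chunk(100)   // hypothetical item types
            .reader(reader())
            .writer(writer())
            .faultTolerant()
            .skip(RecoverableRecordException.class)  // hypothetical exception type
            .skipLimit(100)
            .build();
}
```

The long gap between the exception and the job stopping typically just reflects the remaining partitions running to completion before the PartitionStep aggregates their statuses.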

Implementing Java Bean Validation with annotations in Spring Batch app

你离开我真会死。 submitted on 2021-01-29 15:39:16
Question: I'm attempting to implement JSR 380 (Bean Validation) annotations on a Java bean used in a Spring Batch app. Spring Batch provides a ValidatingItemProcessor, but I want to validate before I get to the processor step, so I decided to go with the Java annotations. What I thought I could do was add the annotation to a method parameter, and if that parameter didn't validate, the method would never be called. But it's not working that way. Apparently I still have to call Validator.validate() first.
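That behaviour is by design: method-parameter constraints are only enforced when something intercepts the call (a Bean Validation ExecutableValidator, usually via a proxy such as Spring's @Validated); a plain method invocation never triggers them. Inside a Spring Batch step, a simpler route is BeanValidatingItemProcessor (available in Spring Batch 4.1+), which applies the bean's annotations to every item; the Person item type below is a hypothetical placeholder:

```java
// Sketch: validates each item's Bean Validation annotations before real processing.
@Bean
public BeanValidatingItemProcessor<Person> validatingProcessor() {
    BeanValidatingItemProcessor<Person> processor = new BeanValidatingItemProcessor<>();
    processor.setFilter(true); // drop invalid items instead of failing the step
    return processor;
}
```

With setFilter(false) (the default), an invalid item throws a ValidationException and fails the chunk instead of being silently filtered out.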

Spring Batch - read a byte stream, process, write to 2 different CSV files, convert them to InputStream, store them in ECS, and then write to the database

心不动则不痛 submitted on 2021-01-29 10:57:37
Question: I have a requirement where we receive a CSV file as a byte stream through an ECS S3 pre-signed URL. I have to validate the data, write the validation-successful and failed records to 2 different CSV files, and store them in an ECS S3 bucket by converting them to an InputStream. I also need to write the successful records to the database, along with the pre-signed URLs of the inbound, success and failure files. I'm new to Spring Batch. How should I approach this requirement? If I choose a FlatFileItemReader
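Independent of the Spring Batch wiring, the validate-and-split core of this job can be sketched in plain Java. The class name and the validation rule (a record is valid when it has exactly 3 non-blank fields) are assumptions for illustration, not from the question:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CsvValidator {

    // Hypothetical rule: a record is valid when it has exactly 3 non-blank fields.
    static boolean isValid(String line) {
        String[] fields = line.split(",", -1);
        return fields.length == 3 && Arrays.stream(fields).noneMatch(String::isBlank);
    }

    // Split incoming lines into the two buckets that would become the
    // "success" and "failure" CSV files described in the question.
    static Map<Boolean, List<String>> partition(List<String> lines) {
        return lines.stream().collect(Collectors.partitioningBy(CsvValidator::isValid));
    }

    public static void main(String[] args) {
        Map<Boolean, List<String>> result =
                partition(List.of("a,b,c", "x,,z", "only,two"));
        System.out.println("valid=" + result.get(true));
        System.out.println("invalid=" + result.get(false));
    }
}
```

In the actual job, each bucket would then be serialized and streamed back to ECS S3 as an InputStream, and the valid bucket handed to a database writer, for example via a CompositeItemWriter.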

Why do I need an ItemReader in my job step if I only need to delete rows using an ItemWriter?

拜拜、爱过 submitted on 2021-01-29 10:51:11
Question: I have a step in my batch job that I want to use only to delete rows from a table. The step looks like this: @Bean public Step step2(StepBuilderFactory factory, PurgeAggBalanceWriter writer, DataSource dataSource, PlatformTransactionManager platformTransactionManager){ return factory.get("step2") .transactionManager(platformTransactionManager) .<Assessment,Assessment>chunk(10) .reader(getReader(dataSource, READER_QUERY2, "AggBalanceMapper", new AggBalanceMapper())) .writer(writer)
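A chunk-oriented step always needs a reader, because the reader drives the chunk loop. For a pure DELETE, the usual answer is to drop the chunk model entirely and use a tasklet step; the table name and the JdbcTemplate wiring below are assumptions for illustration:

```java
// Sketch: delete-only step as a tasklet - no ItemReader/ItemWriter needed.
@Bean
public Step purgeStep(StepBuilderFactory factory,
                      JdbcTemplate jdbcTemplate,
                      PlatformTransactionManager transactionManager) {
    return factory.get("purgeStep")
            .tasklet((contribution, chunkContext) -> {
                int deleted = jdbcTemplate.update("DELETE FROM AGG_BALANCE"); // hypothetical table
                contribution.incrementWriteCount(deleted);
                return RepeatStatus.FINISHED;
            })
            .transactionManager(transactionManager)
            .build();
}
```

The whole delete runs in one transaction, which is usually what a purge step wants anyway; chunking only pays off when items are processed one at a time.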

Writing List of Items using JdbcBatchItemWriter

生来就可爱ヽ(ⅴ<●) submitted on 2021-01-29 09:40:36
Question: Currently I am using JpaItemWriter to write a list of objects as below, which is working fine. Now I want to change the JpaItemWriter to JdbcBatchItemWriter due to a performance issue. public class MyItemWriter implements ItemWriter<List<MyDomainObject>> { @Override public void write(List<? extends List<MyDomainObject>> items) { JpaItemWriter<MyDomainObject> writer = new JpaItemWriter<>(); for(List<MyDomainObject> o : items) { writer.write(o); } } } Please suggest a sample snippet which uses the
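Since JdbcBatchItemWriter<T> writes items of type T, one option is a thin delegating writer that flattens each List<MyDomainObject> before handing it to a standard JdbcBatchItemWriter. This is a sketch: the SQL, table, and column names are placeholders:

```java
// Sketch: JDBC writer plus a delegate that unpacks List-typed items.
@Bean
public JdbcBatchItemWriter<MyDomainObject> jdbcWriter(DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<MyDomainObject>()
            .dataSource(dataSource)
            .sql("INSERT INTO my_table (id, name) VALUES (:id, :name)") // hypothetical SQL
            .beanMapped() // bind :id, :name to MyDomainObject getters
            .build();
}

public class ListUnpackingItemWriter implements ItemWriter<List<MyDomainObject>> {

    private final JdbcBatchItemWriter<MyDomainObject> delegate;

    public ListUnpackingItemWriter(JdbcBatchItemWriter<MyDomainObject> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<? extends List<MyDomainObject>> items) throws Exception {
        List<MyDomainObject> flattened = new ArrayList<>();
        items.forEach(flattened::addAll);
        delegate.write(flattened); // one JDBC batch per chunk
    }
}
```

Note also that the original code builds a new JpaItemWriter inside write(); writers should be created once as beans so their lifecycle callbacks run, rather than per chunk.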

Getting data from the DB in Spring Batch and storing it in memory

点点圈 submitted on 2021-01-29 07:26:11
Question: In my Spring Batch program, I am reading records from a file and checking against the DB whether the data, say column1 from the file, already exists in table1. Table1 is fairly small and static. Is there a way I can get all the data from table1 and store it in memory in the Spring Batch code? Right now, for every record in the file, the select query hits the DB. The file has 3 columns delimited with "|". The file I am reading has on average 12 million records and it is taking
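A common pattern is to load the small, static table once into a java.util.Set and do the existence check in memory. The sketch below is plain Java with hypothetical names (Column1Cache, the query it mentions); in the actual job the load would run a single SELECT in a StepExecutionListener or @PostConstruct method before the step starts:

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class Column1Cache {

    private final Set<String> known = new HashSet<>();

    // In the real job this would be filled once from
    // "SELECT column1 FROM table1" before the step starts.
    public void load(Collection<String> column1Values) {
        known.addAll(column1Values);
    }

    // O(1) membership check, replacing the per-record SELECT.
    public boolean exists(String column1) {
        return known.contains(column1);
    }

    public static void main(String[] args) {
        Column1Cache cache = new Column1Cache();
        cache.load(List.of("A100", "B200"));
        System.out.println(cache.exists("A100"));
        System.out.println(cache.exists("Z999"));
    }
}
```

For 12 million input records against a small table, this turns millions of database round trips into one query; the check itself would typically live in an ItemProcessor.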

How do I define a bean of type 'java.lang.String' in Spring Batch?

假装没事ソ submitted on 2021-01-29 07:22:47
Question: "Consider defining a bean of type 'java.lang.String' in your configuration." is the error I get with the following reader method: @Bean public JdbcCursorItemReader<Assessment> getReader(DataSource datasource, String query, String name) { return new JdbcCursorItemReaderBuilder<Assessment>() .dataSource(datasource) .sql(query) .name(name) .rowMapper(new AssessmentMapper()) .build(); } where the step config looks like: public Step step1(StepBuilderFactory factory, DataSource dataSource,
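The error comes from declaring String query and String name as parameters of a @Bean method: Spring tries to resolve every @Bean method parameter from the application context, and there is no String bean to inject. One fix, sketched below with an assumed constant and reader name, is to keep those values inside the method (as constants or @Value-injected properties) instead of as parameters:

```java
// Sketch: no String parameters, so Spring only has to inject the DataSource.
@Bean
public JdbcCursorItemReader<Assessment> getReader(DataSource dataSource) {
    return new JdbcCursorItemReaderBuilder<Assessment>()
            .dataSource(dataSource)
            .sql(READER_QUERY)         // constant defined in this config class
            .name("assessmentReader")  // hypothetical reader name
            .rowMapper(new AssessmentMapper())
            .build();
}
```

If the same method must build several readers with different queries, make it a plain (non-@Bean) factory method and call it yourself from the step definitions, as the original step config already does.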

How to call StepExecutionListener in Spring Batch with Kafka integration?

时光怂恿深爱的人放手 submitted on 2021-01-29 06:37:03
Question: Below is the config of the job in etl.xml <batch:job id="procuerJob"> <batch:step id="Produce"> <batch:partition partitioner="partitioner"> <batch:handler grid-size="${partitioner.limit}"></batch:handler> <batch:step> <batch:tasklet> <batch:chunk reader="Reader" writer="kafkaProducer" commit-interval="20000"> </batch:chunk> <batch:listeners> <batch:listener ref="producingListener" /> </batch:listeners> </batch:tasklet> </batch:step> </batch:partition> </batch:step> </batch:job> Below is the code
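For reference, a producingListener like the one registered above is just a bean implementing StepExecutionListener; in a partitioned step it runs once per worker partition, not once per job. A minimal sketch (the Kafka-flush detail is an assumption about what such a listener might do):

```java
// Sketch: listener wired as <batch:listener ref="producingListener"/> above.
public class ProducingListener implements StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // open resources before this partition starts producing to Kafka
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // e.g. flush the Kafka producer and log how much this partition wrote
        System.out.println(stepExecution.getStepName()
                + " wrote " + stepExecution.getWriteCount());
        return stepExecution.getExitStatus();
    }
}
```

If the listener never fires, check that the bean name in ref matches the listener bean and that the listener is registered on the worker step inside the partition (as in the XML above), not only on the outer partitioned step.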