spring-batch

Spring Batch JdbcPagingItemReader paging not working

Submitted by ☆樱花仙子☆ on 2020-01-16 17:59:08
Question: I use Spring Batch for a data migration job. There is a lot of data, so I decided to use JdbcPagingItemReader to read it page by page. Below is how I define the reader (excerpt truncated in the source):

```java
private JdbcPagingItemReader<Map<String, Object>> buildItemReader(final DataSource dataSource, String tableName, String tenant) {
    String tenantName = tenantHelper.determineTenant(tableName);
    Map<String, Object> sqlParameterValues = new HashMap<>();
    sqlParameterValues.put("tableName", tableName);
    sqlParameterValues…
```
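The excerpt above cuts off before the query provider is configured. A frequent cause of paging "not working" is a missing or non-unique sort key, which the reader needs to build each page's WHERE clause. The following is a minimal sketch of a complete reader, not the asker's actual code; the table name `customer`, the key column `id`, and the use of `ColumnMapRowMapper` are assumptions for illustration.

```java
// Hypothetical sketch of a full JdbcPagingItemReader setup.
// Table ("customer") and sort column ("id") are illustrative assumptions.
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.Order;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.jdbc.core.ColumnMapRowMapper;

public class ReaderConfig {

    public JdbcPagingItemReader<Map<String, Object>> buildItemReader(DataSource dataSource) throws Exception {
        SqlPagingQueryProviderFactoryBean provider = new SqlPagingQueryProviderFactoryBean();
        provider.setDataSource(dataSource);
        provider.setSelectClause("select *");
        provider.setFromClause("from customer");
        // Paging silently misbehaves without a unique, ordered sort key:
        // the reader uses it to position each subsequent page.
        provider.setSortKeys(Map.of("id", Order.ASCENDING));

        JdbcPagingItemReader<Map<String, Object>> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setQueryProvider(provider.getObject());
        reader.setPageSize(1000);
        reader.setRowMapper(new ColumnMapRowMapper());
        reader.afterPropertiesSet();
        return reader;
    }
}
```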

Spring Batch CSV: adding multiple headers to a CSV file

Submitted by 旧街凉风 on 2020-01-16 08:12:19
Question: How can I write multiple header rows to a CSV file, such that the values of the second header row come from the database? Expected output:

```
personId,firstName,lastName,email,age
fullname,total    // this is the second header row, which should come from the database
kaa,karthi,sa,123@,34
```

Below is the code snippet for the Spring Batch writer (excerpt truncated in the source):

```java
@Bean(destroyMethod = "")
public JdbcCursorItemReader<Person> reader() {
    JdbcCursorItemReader<Person> cursorItemReader = new JdbcCursorItemReader<>();
    cursorItemReader.setDataSource…
```
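One way to get a second header row is a `FlatFileHeaderCallback` on the writer: it runs once before the first chunk, so it can emit the static header, a line separator, and then a line built from a database lookup. The sketch below assumes the question's `Person` bean and an illustrative `person_summary` table/query; neither is from the original post.

```java
// Hypothetical sketch: two header rows via FlatFileItemWriter's headerCallback.
// The "person_summary" query is an assumption, not from the question.
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.JdbcTemplate;

public class WriterConfig {

    public FlatFileItemWriter<Person> writer(JdbcTemplate jdbcTemplate) {
        FlatFileItemWriter<Person> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource("persons.csv"));

        // Called once, before any items are written: emit the static header,
        // then a second header line fetched from the database.
        writer.setHeaderCallback(w -> {
            w.write("personId,firstName,lastName,email,age");
            w.write(System.lineSeparator());
            w.write(jdbcTemplate.queryForObject(
                    "select concat(fullname, ',', total) from person_summary",
                    String.class));
        });

        // Person is the domain class from the question.
        DelimitedLineAggregator<Person> aggregator = new DelimitedLineAggregator<>();
        BeanWrapperFieldExtractor<Person> extractor = new BeanWrapperFieldExtractor<>();
        extractor.setNames(new String[] {"personId", "firstName", "lastName", "email", "age"});
        aggregator.setFieldExtractor(extractor);
        writer.setLineAggregator(aggregator);
        return writer;
    }
}
```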

Can you initialize Spring Batch metadata tables with Liquibase?

Submitted by 白昼怎懂夜的黑 on 2020-01-15 11:57:10
Question: Currently I have the setup below. When running the batch job locally, the job creates the necessary metadata tables automatically from the data-source property values, since initialize-schema is set to always. Liquibase also runs and creates any tables listed in its changelog. Here is my application.yml file (excerpt truncated in the source):

```yaml
spring:
  batch:
    initialize-schema: always
    job:
      enabled: true
  liquibase:
    url: db_url
    user: deploy_user
    password: deploy_pass
    change-log: classpath:db/changelog/db.changelog-master…
```
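A common approach is to turn off Spring Boot's own schema creation (`spring.batch.initialize-schema: never`) and have Liquibase run the DDL script that ships inside the spring-batch-core jar. The fragment below is a sketch under assumptions: the `schema-h2.sql` variant, changeSet id, and author are illustrative, and the script path must match your database vendor.

```yaml
# Hypothetical changelog fragment: let Liquibase create the Spring Batch
# metadata tables. Pair this with spring.batch.initialize-schema: never.
# schema-h2.sql is an assumption; spring-batch-core ships one script per vendor.
databaseChangeLog:
  - changeSet:
      id: create-spring-batch-metadata
      author: deploy_user
      changes:
        - sqlFile:
            path: org/springframework/batch/core/schema-h2.sql
            encoding: UTF-8
```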

Local Partitioning in Spring batch

Submitted by 霸气de小男生 on 2020-01-15 10:07:06
Question: I'm currently using local partitioning to export a customer table with 100K rows to multiple XML files (I can't export the data to a single file because StaxEventItemWriter<T> isn't thread-safe), but I don't get better results with multiple threads, even when increasing gridSize to 10. I think the problem is in the StaxEventItemWriter, because I get errors like (stack trace truncated in the source):

```
java.lang.NullPointerException: null
    at com.sun.xml.internal.stream.writers.XMLStreamWriterImpl.flush(XMLStreamWriterImpl.java:397…
```
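The NullPointerException suggests the single writer instance is being shared across partition threads. The usual fix is to give each partition its own step-scoped writer and its own output file, with the `Partitioner` putting an ID range and file name into each partition's `ExecutionContext`. The sketch below is illustrative, not the asker's code; the ID-range scheme and file-name pattern are assumptions.

```java
// Hypothetical sketch: split the customer table into ID ranges, one per
// partition. Each partition carries its own output file name, so a
// @StepScope StaxEventItemWriter (one instance per partition, injected via
// #{stepExecutionContext['outputFile']}) is never shared between threads.
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class IdRangePartitioner implements Partitioner {

    private final long minId;  // assumed to be looked up from the table beforehand
    private final long maxId;

    public IdRangePartitioner(long minId, long maxId) {
        this.minId = minId;
        this.maxId = maxId;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        long range = (maxId - minId) / gridSize + 1;
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext ctx = new ExecutionContext();
            ctx.putLong("minId", minId + i * range);
            ctx.putLong("maxId", Math.min(minId + (i + 1) * range - 1, maxId));
            ctx.putString("outputFile", "customers-part" + i + ".xml");
            partitions.put("partition" + i, ctx);
        }
        return partitions;
    }
}
```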

Spring Batch Integration config using Java DSL

Submitted by 假装没事ソ on 2020-01-15 09:40:01
Question: The Spring Integration Java DSL reference and the Spring Batch Java configuration documentation show how to use Java configuration for Spring Integration and Spring Batch, but they don't show how to configure Spring Batch Integration. How is a JobLaunchingGateway configured using the DSL? Cheers, Menno

Answer 1: The JobLaunchingGateway is a MessageHandler, so you would define it as a @Bean and use it in a .handle() method in the flow.

Answer 2: Gary is correct. You can see a demo of the Java config…
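A minimal sketch of what Answer 1 describes, assuming messages carrying a `JobLaunchRequest` payload; the channel names `jobRequests` and `jobExecutions` are illustrative assumptions, not from the answers.

```java
// Hypothetical sketch: JobLaunchingGateway as a @Bean, wired into a
// Java DSL flow via .handle(). Channel names are assumptions.
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class BatchIntegrationConfig {

    @Bean
    public JobLaunchingGateway jobLaunchingGateway(JobLauncher jobLauncher) {
        return new JobLaunchingGateway(jobLauncher);
    }

    @Bean
    public IntegrationFlow jobFlow(JobLaunchingGateway gateway) {
        // Messages with a JobLaunchRequest payload arrive on "jobRequests";
        // the gateway launches the job and replies with the JobExecution.
        return IntegrationFlows.from("jobRequests")
                .handle(gateway)
                .channel("jobExecutions")
                .get();
    }
}
```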

Spring boot repository does not save to the DB if called from scheduled job

Submitted by ぐ巨炮叔叔 on 2020-01-15 08:52:07
Question: I have a Spring Boot application in which I need to schedule a job that reads files from a specific directory and stores the data in the DB. I used Spring Batch for handling the files, as the number of files is very large. The application has a component named PraserStarer with a method named startParsing. This method is annotated with @Scheduled:

```java
@Scheduled(fixedDelay = 60 * 1000)
public void startParsing() {
    // start spring batch job
}
```

I have a repository interface…
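A sketch of how such a scheduled launcher is commonly wired, assuming the job is launched through a `JobLauncher` with fresh `JobParameters` on each run (Spring Batch refuses to re-run a completed job instance with identical parameters, which can make a scheduled job appear to do nothing). The bean and job names here are illustrative assumptions, not the asker's code.

```java
// Hypothetical sketch of a scheduled Spring Batch launcher. The job bean
// name "parseFilesJob" is an assumption. A unique JobParameters value per
// run ensures each scheduled tick starts a new job instance.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ParserStarter {

    private final JobLauncher jobLauncher;
    private final Job parseFilesJob;

    public ParserStarter(JobLauncher jobLauncher, Job parseFilesJob) {
        this.jobLauncher = jobLauncher;
        this.parseFilesJob = parseFilesJob;
    }

    @Scheduled(fixedDelay = 60 * 1000)
    public void startParsing() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("startedAt", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(parseFilesJob, params);
    }
}
```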

Using @SpringApplicationConfiguration: how to set JobParameters in tests when using Spring Batch and Spring Boot

Submitted by 大憨熊 on 2020-01-15 06:56:27
Question: Is there a way to define JobParameters in an integration test when using pure Spring Boot with Spring Batch? When I define a simple batch job in Spring Boot and start it with SpringApplication.run(args), I can pass my program arguments in the run method. Since Spring Boot's batch support is active, those arguments are converted into JobParameters and passed to the job. This happens inside the class JobLauncherCommandLineRunner. Afterwards, I can read these JobParameters via the…
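One common route, independent of program arguments, is to launch the job from the test through `JobLauncherTestUtils` (from spring-batch-test) and pass `JobParameters` explicitly. The sketch below assumes a `JobLauncherTestUtils` bean is registered in the test context and uses an illustrative `inputFile` parameter; both are assumptions, not from the question.

```java
// Hypothetical sketch: setting JobParameters explicitly in an integration
// test via JobLauncherTestUtils, rather than relying on program arguments.
// The "inputFile" parameter name is an illustrative assumption.
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.beans.factory.annotation.Autowired;

public class MyJobIntegrationTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;  // must be registered as a @Bean

    public void runsJobWithExplicitParameters() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("inputFile", "classpath:data/input.csv")
                .addLong("run.id", System.currentTimeMillis())  // keep instances unique
                .toJobParameters();

        JobExecution execution = jobLauncherTestUtils.launchJob(params);
        // assert on execution.getExitStatus() with your test framework of choice
    }
}
```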