spring-batch

Passing stream to job as parameter

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-22 10:38:50

Question: Is there a way to pass a stream while launching a job through the JobLauncher, similar to passing JobParameters? I have a separate service for fetching the file, and I then want to initiate the batch job to load it. Code scenario: consider this sample. The job is defined here, but the actual launcher resides in the dependency underneath. So, in the sample, I add a controller which reads the user's input file and then triggers the sample-job defined in the sample, which is run by jobLauncher.run of …
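Since JobParameters only support String, Long, Double, and Date values, a stream itself cannot be passed directly; a common workaround is to persist the uploaded stream to a file and pass its path as a parameter. A minimal sketch, assuming hypothetical names (`sampleJob`, `uploadedFile`) that are not from the question:

```java
// Sketch: hand the file's location to the job as a String parameter;
// the step's reader opens the stream later, inside the job.
JobParameters params = new JobParametersBuilder()
        .addString("input.file.path", uploadedFile.toAbsolutePath().toString())
        .addLong("run.id", System.currentTimeMillis()) // keeps each JobInstance unique
        .toJobParameters();
jobLauncher.run(sampleJob, params);
```

A @StepScope reader bean can then bind the path with `@Value("#{jobParameters['input.file.path']}")` and open the stream itself.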

Spring Batch | At least one JPA metamodel must be present

Submitted by 孤者浪人 on 2019-12-22 09:27:47

Question: I am getting a java.lang.IllegalArgumentException: At least one JPA metamodel must be present! error while trying to run a simple Spring Batch application. Relevant code and configuration: pom.xml: <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-batch</artifactId> <version>1.2.3.RELEASE</version> </dependency> Java configuration: /** * @author Kumar Sambhav Jain */ @Configuration @EnableBatchProcessing @PropertySources({ @PropertySource("file:${baseDir} …
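This error typically appears when JPA auto-configuration is triggered (for example by a JPA starter elsewhere on the classpath) but no @Entity classes exist for it to scan. A commonly suggested fix, sketched here as an assumption about this setup rather than a verified answer, is to exclude the Hibernate JPA auto-configuration when JPA is not actually used:

```java
// Sketch: disable Hibernate JPA auto-configuration so Spring Boot
// does not demand a JPA metamodel in an entity-less batch application.
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration(exclude = HibernateJpaAutoConfiguration.class)
public class BatchConfiguration {
    // reader, writer, and job definitions go here
}
```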

Spring Boot CommandLineRunner : filter option argument

Submitted by 牧云@^-^@ on 2019-12-22 05:34:37

Question: Considering a Spring Boot CommandLineRunner application, I would like to know how to filter out the "switch" options passed to Spring Boot as externalized configuration. For example, with: @Component public class FileProcessingCommandLine implements CommandLineRunner { @Override public void run(String... strings) throws Exception { for (String filename : strings) { File file = new File(filename); service.doSomething(file); } } } I can call java -jar myJar.jar /tmp/file1 /tmp/file2 and the service …
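One way to separate "--key=value" switches from plain arguments is to implement ApplicationRunner instead of CommandLineRunner; Spring Boot then pre-parses the command line. A sketch under the assumption that only the non-option file names should be processed (`FileService` is a hypothetical stand-in for the question's service):

```java
// Sketch: ApplicationArguments distinguishes "--option=value" switches
// from plain (non-option) arguments such as file names.
@Component
public class FileProcessingCommandLine implements ApplicationRunner {

    private final FileService service; // hypothetical service from the question

    public FileProcessingCommandLine(FileService service) {
        this.service = service;
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        for (String filename : args.getNonOptionArgs()) { // skips --switches
            service.doSomething(new File(filename));
        }
    }
}
```

With this, `java -jar myJar.jar /tmp/file1 --some.property=x` would process only /tmp/file1.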

Spring Batch process an encoded zipped file

Submitted by 99封情书 on 2019-12-22 05:17:10

Question: I'm investigating the use of Spring Batch to process records from an encoded zipped file. The records are variable length, with nested variable-length data fields encoded within them. I'm new to Spring and Spring Batch; this is how I plan to structure the batch configuration. The ItemReader would need to read a single record from the zipped (*.gz) file input stream into a POJO (byte array); the length of this record would be contained in the first two bytes of the stream. The ItemProcessor …
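The length-prefixed read from a GZIP stream described above can be done with plain Java I/O inside a custom ItemReader. A framework-free sketch of the core logic, assuming a big-endian two-byte length prefix (the byte order is an assumption, since the question does not specify it):

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class RecordReader {

    // Reads one length-prefixed record from the stream. The first two
    // bytes (big-endian, assumed) hold the payload length; returns null
    // on a clean end of input, just as a Spring Batch reader would.
    public static byte[] readRecord(DataInputStream in) throws IOException {
        int hi = in.read();
        if (hi < 0) {
            return null; // end of stream reached between records
        }
        int lo = in.read();
        if (lo < 0) {
            throw new EOFException("stream ended inside a length prefix");
        }
        int length = (hi << 8) | lo;
        byte[] record = new byte[length];
        in.readFully(record); // throws EOFException if the payload is truncated
        return record;
    }
}
```

Wrapping the file as `new DataInputStream(new GZIPInputStream(Files.newInputStream(path)))` yields the decompressed stream; the ItemReader's read() can delegate to this method and return null to signal end of input.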

Spring boot + spring batch without DataSource

Submitted by 跟風遠走 on 2019-12-22 03:48:12

Question: I'm trying to configure Spring Batch inside a Spring Boot project, and I want to use it without a DataSource. I've found that ResourcelessTransactionManager is the way to go, but I cannot make it work. The problem is that I already have three other dataSources defined, and I don't want to use any of them in Spring Batch. I've checked the default implementation, DefaultBatchConfigurer: if it is not able to find a dataSource, it does exactly what I want. The problem is that I have three of them and don't want to use any. Please …
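A fix often suggested for this situation, sketched here as an assumption rather than a verified answer, is to extend DefaultBatchConfigurer and swallow the injected DataSource; with none set, the parent class falls back to a map-based JobRepository with a ResourcelessTransactionManager:

```java
// Sketch: keep all three application DataSources away from Spring Batch,
// so the parent class falls back to its in-memory JobRepository.
@Configuration
public class InMemoryBatchConfigurer extends DefaultBatchConfigurer {

    @Override
    public void setDataSource(DataSource dataSource) {
        // Intentionally empty: none of the three DataSources is wired
        // into Spring Batch; job metadata will not survive a restart.
    }
}
```

The trade-off is that job execution metadata lives only in memory, so restartability across JVM restarts is lost.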

How best to configure Spring Batch: annotations or XML files?

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-22 00:27:46

Question: Firstly, thanks for your attention. My Spring Batch project defines many jobs, for example: <batch:job id="helloWorldJob1" job-repository="jobRepository"> <batch:step id="step1"> <batch:tasklet> <batch:chunk reader="itemReader1" writer="itemWriter1" processor="itemProcessor1"> </batch:chunk> </batch:tasklet> </batch:step> </batch:job> <batch:job id="helloWorldJob2" job-repository="jobRepository"> <batch:step id="step1"> <batch:tasklet> <batch:chunk reader="itemReader2" writer="itemWriter2" …
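The XML jobs above translate fairly mechanically to Java configuration with @EnableBatchProcessing. A sketch of the first job (the chunk size of 10 and the String item types are assumptions, since the XML fragment shows neither a commit-interval nor the generics):

```java
// Sketch: Java-config equivalent of the <batch:job id="helloWorldJob1"> XML.
@Configuration
@EnableBatchProcessing
public class JobConfiguration {

    @Bean
    public Job helloWorldJob1(JobBuilderFactory jobs, StepBuilderFactory steps,
                              ItemReader<String> itemReader1,
                              ItemProcessor<String, String> itemProcessor1,
                              ItemWriter<String> itemWriter1) {
        Step step1 = steps.get("step1")
                .<String, String>chunk(10) // assumed commit interval
                .reader(itemReader1)
                .processor(itemProcessor1)
                .writer(itemWriter1)
                .build();
        return jobs.get("helloWorldJob1").start(step1).build();
    }
}
```

helloWorldJob2 follows the same pattern with itemReader2/itemWriter2; the explicit job-repository attribute is unnecessary, since @EnableBatchProcessing wires the JobRepository automatically.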

Field error in object 'target' on field '': rejected value []; codes [typeMismatch.target.,typeMismatch.,typeMismatch.java.util.Date,typeMismatch]

Submitted by 你离开我真会死。 on 2019-12-22 00:02:43

Question: I've created https://jira.spring.io/browse/BATCH-2778. I am developing a Spring Batch + Redis (Spring Data Redis) example. In this example, I'm reading a student.csv file and storing all the data as-is in a Redis DB. I want to use dateOfBirth as a Date, and I am sure that I need some date-conversion logic to store a Date value in Redis. As per my analysis, it looks like I won't be able to use @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd"), because I am not dealing with …
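One common way to get a typed Date out of a CSV column, sketched under the assumption that a FlatFileItemReader with a DefaultLineMapper is in use, is a custom FieldSetMapper that parses the field explicitly (the column names here are guesses based on the question):

```java
// Sketch: parse dateOfBirth from the CSV into a java.util.Date during
// the read phase, instead of relying on Jackson's @JsonFormat.
public class StudentFieldSetMapper implements FieldSetMapper<Student> {

    @Override
    public Student mapFieldSet(FieldSet fieldSet) throws BindException {
        Student student = new Student();
        student.setId(fieldSet.readLong("id"));       // assumed column name
        student.setName(fieldSet.readString("name")); // assumed column name
        student.setDateOfBirth(fieldSet.readDate("dateOfBirth", "yyyy-MM-dd"));
        return student;
    }
}
```

FieldSet.readDate applies the given pattern during parsing, so the Student object already carries a real Date by the time it reaches the writer that stores it in Redis.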

To use the default BatchConfigurer the context must contain no more than one DataSource, found 2

Submitted by 血红的双手。 on 2019-12-21 23:44:02

Question: I am using Spring Boot and Spring Batch. For the metadata tables I want to use MySQL, and for all business data I want to use DB2 as the database. When I implemented this, I got the error above. application.properties: spring.datasource.url = jdbc:mysql://localhost:3306/local spring.datasource.username = root spring.datasource.password = root spring.jpa.show-sql = true spring.jpa.hibernate.ddl-auto = update spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.MySQL5Dialect spring.seconddatasource.url=jdbc …
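With two DataSources in the context, the DefaultBatchConfigurer cannot choose one by itself; marking the metadata DataSource as @Primary is the usual way to resolve the ambiguity. A sketch based on the property prefixes shown above (the bean names are assumptions):

```java
// Sketch: the @Primary MySQL DataSource is used for Batch metadata,
// the second (DB2) DataSource for business queries, injected by name.
@Configuration
public class DataSourceConfiguration {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource batchDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "spring.seconddatasource")
    public DataSource businessDataSource() {
        return DataSourceBuilder.create().build();
    }
}
```

Business readers and writers can then take the DB2 DataSource via `@Qualifier("businessDataSource")`, while Spring Batch picks up the primary one for its metadata tables.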

Spring Batch Process Indicator Pattern

Submitted by ╄→гoц情女王★ on 2019-12-21 22:30:35

Question: In a Spring Batch job I am writing each item to a target file (using FlatFileItemWriter) and updating the input record's "process indicator" field to "processed"/"failed" (using JdbcBatchItemWriter). Which is the best way to make this happen within an "item transaction"? Option 1: use a CompositeItemWriter (delegating to FlatFileItemWriter for writing to the file and to JdbcBatchItemWriter for updating the "process indicator"). Option 2: use the ItemWriteListener methods "afterWrite" and "onWriteError" to update the "process indicator". Answer 1: …
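Of the two options, a CompositeItemWriter keeps both writes inside the same chunk transaction, since every delegate is invoked within the step's transaction boundary. A sketch (the Record item type is a placeholder):

```java
// Sketch: both delegates run inside the same chunk transaction, so the
// file write and the process-indicator update commit or roll back together
// (within the limits of what a flat file can guarantee).
@Bean
public CompositeItemWriter<Record> compositeWriter(
        FlatFileItemWriter<Record> fileWriter,
        JdbcBatchItemWriter<Record> indicatorWriter) {
    CompositeItemWriter<Record> writer = new CompositeItemWriter<>();
    writer.setDelegates(Arrays.asList(fileWriter, indicatorWriter));
    return writer;
}
```

A flat file is not truly transactional; in its transactional mode FlatFileItemWriter buffers output and flushes only on commit, which is an approximation worth keeping in mind when weighing the two options.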

Spring Batch Stax XML reading job is not ending when out of input

Submitted by 自作多情 on 2019-12-21 21:48:46

Question: I'm using Spring Batch to set up a job that will process a potentially very large XML file. I think I've set it up appropriately, but at runtime I'm finding that the job runs, processes its input, and then just hangs in an executing state (I can confirm this by viewing the JobExecution's status in the JobRepository). I've read through the Batch documentation several times, but I don't see any obvious "make the job stop when out of input" configuration that I'm missing. Here's the relevant portion …
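A chunk-oriented step normally ends when its reader returns null, so a hang at end of input usually means the reader never signals exhaustion, for instance when the fragment root element name does not match the repeating element in the XML. A sketch of a StaxEventItemReader configuration (the element, class, and file names are placeholders):

```java
// Sketch: the reader emits one item per <record> fragment and returns
// null once no fragments remain, which lets the step finish normally.
@Bean
public StaxEventItemReader<Record> xmlReader() {
    Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
    marshaller.setClassesToBeBound(Record.class);

    StaxEventItemReader<Record> reader = new StaxEventItemReader<>();
    reader.setResource(new FileSystemResource("input.xml")); // assumed path
    reader.setFragmentRootElementName("record"); // must match the XML exactly
    reader.setUnmarshaller(marshaller);
    return reader;
}
```

If the fragment name (including any namespace) is wrong, the reader can consume the whole file without ever producing a null, leaving the execution stuck in a STARTED state like the one described.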