spring-batch

Triggering spark jobs with REST

a 夏天 submitted on 2019-11-27 00:37:07
Question: I have lately been trying out Apache Spark. My question is specifically about triggering Spark jobs. Here I had posted a question on understanding Spark jobs. After getting my hands dirty with jobs, I moved on to my requirement. I have a REST endpoint where I expose an API to trigger jobs; I used Spring 4.0 for the REST implementation. Going ahead, I thought of implementing Job as a Service in Spring, where I would submit a job programmatically, meaning when the endpoint is triggered, with given parameters I …
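The excerpt cuts off before the implementation, but one common way to submit a Spark job from a Spring controller is Spark's own SparkLauncher API. A minimal sketch, assuming SPARK_HOME is set and a Spring MVC endpoint; the jar path, main class, and master URL are placeholders, not values from the question:

import org.apache.spark.launcher.SparkLauncher;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SparkJobController {

    @RequestMapping(value = "/jobs/spark", method = RequestMethod.POST)
    public String submit(@RequestParam String inputPath) throws Exception {
        // Forks a spark-submit process; real code would track it
        // asynchronously rather than blocking the request thread.
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/spark-job.jar")  // placeholder jar
                .setMainClass("com.example.SparkJobMain")  // placeholder class
                .setMaster("yarn")                         // or spark://..., local[*]
                .addAppArgs(inputPath)
                .launch();
        return "submitted";
    }
}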

Integrating Spring Batch Admin into an existing application

别说谁变了你拦得住时间么 submitted on 2019-11-27 00:32:07
Question: I have an application which uses Spring Batch and Spring MVC. I am able to deploy Spring Batch Admin as a separate WAR and use it against the same DB my application uses, but I would like to integrate it into my own application, and possibly modify some of the views as well. Is there an easy way to do this, or do I have to fork it and go from there? Answer 1: There is an easy way, apparently, according to this thread: define a DispatcherServlet for Batch Admin in web.xml: <servlet> <servlet-name …
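The snippet is truncated; a plausible completion, assuming the servlet-config.xml resource shipped with spring-batch-admin-resources (the /batch/* URL pattern is a free choice):

<servlet>
    <servlet-name>Batch Servlet</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath*:/org/springframework/batch/admin/web/resources/servlet-config.xml</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>Batch Servlet</servlet-name>
    <url-pattern>/batch/*</url-pattern>
</servlet-mapping>

With this in place the Admin views are served under /batch/ alongside the application's own DispatcherServlet.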

How to Override Spring-boot application.properties programmatically

倖福魔咒の submitted on 2019-11-27 00:14:58
Question: I have JDBC property files which I fetch from an external configuration web service. In Spring Boot, setting the MySQL properties is as easy as adding these to application.properties: spring.datasource.url=jdbc:mysql://localhost/mydb spring.datasource.username=root spring.datasource.password=root spring.datasource.driver-class-name=com.mysql.jdbc.Driver How can I override these programmatically in my app? The same goes for the Spring Batch properties: database.driver=com.mysql.jdbc.Driver database.url=jdbc:mysql:/ …
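One approach (not confirmed by the excerpt) is to register an ApplicationContextInitializer that prepends a MapPropertySource to the environment; addFirst gives it precedence over application.properties. A sketch, with hardcoded values standing in for the external web-service lookup:

import java.util.HashMap;
import java.util.Map;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.core.env.MapPropertySource;

public class ExternalConfigInitializer
        implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext ctx) {
        Map<String, Object> props = new HashMap<>();
        // In reality these would be fetched from the configuration web service.
        props.put("spring.datasource.url", "jdbc:mysql://localhost/mydb");
        props.put("spring.datasource.username", "root");
        props.put("spring.datasource.password", "root");
        // addFirst puts this source ahead of application.properties.
        ctx.getEnvironment().getPropertySources()
           .addFirst(new MapPropertySource("externalConfig", props));
    }
}

Register it at startup, e.g. new SpringApplicationBuilder(App.class).initializers(new ExternalConfigInitializer()).run(args);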

FlatFileItemWriter should write output file named same as input file

泄露秘密 submitted on 2019-11-26 23:43:43
Question: I have a Spring Batch job that reads files matching a naming pattern from a directory, does some processing, and writes back the processing status for each line of the input file. The writer must produce output files with the same names as the input files. To the MultiResourceItemReader I pass the pattern "files-*.txt" and expect the FlatFileItemWriter to use the name of the input file. How do I specify this constraint in the context XML file? Reader bean: <bean id="multiResourceReader" class="org …
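The excerpt does not include the accepted answer, but a common pattern here is to partition by file with MultiResourcePartitioner, which stores each input resource's URL in the step execution context under the key fileName, and then late-bind a step-scoped writer to that value. A Java-config sketch; the /out/ directory is an assumption:

import java.io.File;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;

@Bean
@StepScope
public FlatFileItemWriter<String> statusWriter(
        @Value("#{stepExecutionContext['fileName']}") String inputUrl) {
    FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
    // Reuse the input file's name for the output file.
    String name = new File(inputUrl).getName();
    writer.setResource(new FileSystemResource("/out/" + name));
    writer.setLineAggregator(new PassThroughLineAggregator<>());
    return writer;
}

The same late-binding expression works in XML via scope="step" and #{stepExecutionContext['fileName']}.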

Spring Batch Framework - Auto create Batch Table

倾然丶 夕夏残阳落幕 submitted on 2019-11-26 23:12:48
Question: I just created a batch job using the Spring Batch framework, but I don't have the database privileges to run CREATE SQL. When I try to run the batch job I hit an error while the framework tries to create TABLE_BATCH_INSTANCE. I tried to disable the initializer with <jdbc:initialize-database data-source="dataSource" enabled="false"> ... </jdbc:initialize-database> but I still hit the error org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, …
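Disabling the initializer only stops Spring from executing the DDL; the metadata tables still have to exist, which is why the subsequent SELECT fails with bad SQL grammar. Someone with CREATE privileges has to run the schema script bundled in spring-batch-core once. A sketch, assuming MySQL and a privileged DataSource:

import javax.sql.DataSource;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DatabasePopulatorUtils;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

// One-off setup: creates the BATCH_* metadata tables from the bundled script.
public static void createBatchTables(DataSource privilegedDataSource) {
    ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
    populator.addScript(new ClassPathResource(
            "org/springframework/batch/core/schema-mysql.sql"));
    DatabasePopulatorUtils.execute(populator, privilegedDataSource);
}

Alternatively, hand the schema-mysql.sql file to a DBA to run manually.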

org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step

空扰寡人 submitted on 2019-11-26 22:12:01
Question: I am learning Spring Batch and I was able to create a simple single-step application (GitHub repo link). This application contains a job which does the following: 1. reads persons from a CSV file, 2. lowercases their names, 3. saves them to a database. Now I want to learn the partitioning feature, so I added the following partitioner: @Component public class MyPartitioner implements Partitioner { @Override public Map<String, ExecutionContext> partition(int gridSize) { Map<String, ExecutionContext> map = new HashMap<> …
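The excerpt truncates the partitioner; a minimal complete version of the same idea, with partitionNumber as an illustrative context key that each worker step would use to select its slice:

import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.stereotype.Component;

@Component
public class MyPartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> map = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putInt("partitionNumber", i); // consumed by the worker step
            map.put("partition" + i, context);
        }
        return map;
    }
}

As for the exception in the title: "Partition handler returned an unsuccessful step" usually just means one of the worker steps failed, so the root cause is in that worker's own stack trace or in the BATCH_STEP_EXECUTION table.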

How to read csv lines chunked by id-column with Spring-Batch?

╄→гoц情女王★ submitted on 2019-11-26 22:08:54
Question: I'm using Spring Batch to read a CSV file, format the content and write it to a database, like: StepBuilder<T, T> builder = stepBuilderFactory.get("step") .<T, T>chunk(100) .reader(flatFileItemReader) .processor(processor) .writer(jpaItemWriter); The CSV contains an ID column. How can I modify the reader to base the chunks on that ID? Example: #id, #value 1, first 1000, second 1001, second 1005, second In this case the chunk would only read the first line, then commit, and then continue. Is …
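The excerpt ends before any answer; one workable approach is to wrap the flat-file reader in a SingleItemPeekableItemReader and return one whole ID group per read(), so that with chunk(1) a commit can never split a group. A sketch; Record is a hypothetical POJO with a long getId() mapped from one CSV line, and the grouping predicate should be adapted to whatever actually delimits a unit of work in the file:

import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;

public class IdGroupingReader implements ItemReader<List<Record>> {

    private final SingleItemPeekableItemReader<Record> delegate;

    public IdGroupingReader(SingleItemPeekableItemReader<Record> delegate) {
        this.delegate = delegate; // wraps the FlatFileItemReader
    }

    @Override
    public List<Record> read() throws Exception {
        Record first = delegate.read();
        if (first == null) {
            return null; // end of input ends the step
        }
        List<Record> group = new ArrayList<>();
        group.add(first);
        // Peek ahead and keep consuming while the next line belongs to the group.
        Record next = delegate.peek();
        while (next != null && next.getId() == first.getId()) {
            group.add(delegate.read());
            next = delegate.peek();
        }
        return group;
    }
}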

Storing in JobExecutionContext from tasklet and accessing in another tasklet

喜欢而已 submitted on 2019-11-26 19:13:14
Question: I have a requirement in which a tasklet stores all the files in the directories in an ArrayList. The size of the list is stored in the job execution context. Later this count is accessed from another tasklet in another step. How do I do this? I tried to store it in the job execution context, but at runtime it throws an unmodifiable-collection exception: public RepeatStatus execute(StepContribution arg0, ChunkContext arg1) throws Exception { StepContext stepContext = arg1.getStepContext(); StepExecution …
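The unmodifiable-collection exception typically comes from StepContext.getJobExecutionContext(), which returns a read-only map; writes have to go through the StepExecution to the mutable ExecutionContext on the JobExecution. A sketch completing the excerpt's execute method (fileList and the fileCount key are illustrative):

import java.util.List;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.scope.context.StepContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.repeat.RepeatStatus;

public class FileCountingTasklet implements Tasklet {

    private List<String> fileList; // filled with directory contents elsewhere

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
            throws Exception {
        StepContext stepContext = chunkContext.getStepContext();
        StepExecution stepExecution = stepContext.getStepExecution();
        // Mutable, unlike stepContext.getJobExecutionContext():
        ExecutionContext jobContext = stepExecution.getJobExecution().getExecutionContext();
        jobContext.putInt("fileCount", fileList.size());
        return RepeatStatus.FINISHED;
    }
}

The tasklet in the later step reads it back the same way: chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext().getInt("fileCount").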

Use of multiple DataSources in Spring Batch

百般思念 submitted on 2019-11-26 18:57:56
I am trying to configure a couple of DataSources within Spring Batch. On startup, Spring Batch throws the following exception: To use the default BatchConfigurer the context must contain no more than one DataSource, found 2. Snippet from the batch configuration: @Configuration @EnableBatchProcessing public class BatchJobConfiguration { @Primary @Bean(name = "baseDatasource") public DataSource dataSource() { // first datasource definition here } @Bean(name = "secondaryDataSource") public DataSource dataSource2() { // second datasource definition here } ... } Not sure why I am seeing this …
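The standard fix is to supply your own BatchConfigurer so Spring Batch no longer has to guess which DataSource holds its metadata tables. A minimal sketch pinning it to the primary one:

import javax.sql.DataSource;
import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;

@Bean
public BatchConfigurer batchConfigurer(@Qualifier("baseDatasource") DataSource dataSource) {
    // Explicit choice disables the "exactly one DataSource" auto-detection.
    return new DefaultBatchConfigurer(dataSource);
}

The secondary DataSource then stays free for the job's own readers and writers.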

Receiving multiple messages from MQ asynchronously

我怕爱的太早我们不能终老 submitted on 2019-11-26 18:40:15
Question: I use Spring + Hibernate + JPA in my application. I need to read messages from WebSphere MQ and insert them into a DB. Sometimes there is a continuous stream of messages, sometimes very few, and sometimes none at all in the queue. Currently I read the messages one by one and insert them into the database, but this does not help much in terms of performance. I mean, when I have a large batch of messages (for example, 300k messages in the queue) I could not insert them …
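The excerpt stops before any answer, but one way to batch the inserts is a chunk-oriented step fed by Spring Batch's JmsItemReader: messages are still read one at a time, yet the writer commits them to the database in chunks, amortizing the round-trips. A sketch; MyMessage, the writer bean, and the chunk size of 500 are assumptions:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.jms.JmsItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.core.JmsTemplate;

@Bean
public Step mqToDbStep(StepBuilderFactory steps,
                       JmsTemplate jmsTemplate,          // connected to the MQ queue
                       ItemWriter<MyMessage> dbWriter) { // e.g. a JdbcBatchItemWriter
    JmsItemReader<MyMessage> reader = new JmsItemReader<>();
    reader.setJmsTemplate(jmsTemplate); // set a receive timeout so read() returns null on an empty queue
    return steps.get("mqToDbStep")
            .<MyMessage, MyMessage>chunk(500) // 500 inserts per transaction
            .reader(reader)
            .writer(dbWriter)
            .build();
}

When the queue runs dry the reader returns null and the step ends; scheduling the job periodically covers the no-message case.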