spring-batch

Spring Cloud Task's SimpleTaskConfiguration and Spring Batch's SimpleBatchConfiguration preventing Spring Boot auto-configuration of XA transactions

Submitted by 限于喜欢 on 2019-12-08 19:53:01
Question: I am trying to configure XA/distributed transactions for a Spring Batch / Spring Cloud Task application configured with Spring Boot. I have added the following dependency, hoping to rely on Spring Boot auto-configuration: compile("org.springframework.boot:spring-boot-starter-jta-atomikos") However, the following two classes cause two transaction managers to be configured: org.springframework.cloud.task.configuration.SimpleTaskConfiguration org.springframework.batch.core.configuration.annotation
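A common workaround for this (a sketch, not a confirmed fix for this exact setup) is to provide your own configurer beans that hand the Boot-created JTA transaction manager to both frameworks, so neither registers its own. The class names below assume Spring Batch 4 and Spring Cloud Task 1.x/2.x and may differ in other versions:

```java
// Sketch: route both Batch and Task onto the Boot-provided JTA transaction manager.
@Configuration
public class JtaConfiguration {

    @Bean
    public BatchConfigurer batchConfigurer(PlatformTransactionManager jtaTransactionManager) {
        return new DefaultBatchConfigurer() {
            @Override
            public PlatformTransactionManager getTransactionManager() {
                return jtaTransactionManager; // the Atomikos JtaTransactionManager
            }
        };
    }

    @Bean
    public TaskConfigurer taskConfigurer(DataSource dataSource,
                                         PlatformTransactionManager jtaTransactionManager) {
        return new DefaultTaskConfigurer(dataSource) {
            @Override
            public PlatformTransactionManager getTransactionManager() {
                return jtaTransactionManager;
            }
        };
    }
}
```

With both configurers in place, SimpleTaskConfiguration and SimpleBatchConfiguration back off and the single JTA manager coordinates both resources.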

Spring Boot integration with Spring Batch and JPA

Submitted by 家住魔仙堡 on 2019-12-08 14:56:14
Question: I am integrating a Spring Boot project with a Spring Batch and Spring Data JPA project. Everything related to the job and data configuration is right, except persisting my job writer's results in the database. After I read a file and process it, I can't write it to the MySQL database. There is no error, but no inserting either. The interesting thing is that my datasource is configured, because before inserting I can fetch a sample record from the database. Please assist me in solving this problem. My application.properties:
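"No error but no insert" with JPA writers is often a transaction-manager mismatch (an assumption here, since the configuration is cut off): the chunk commits through a DataSourceTransactionManager, so the JPA EntityManager is never flushed. A sketch of wiring the writer and a JPA-aware transaction manager, where Person stands in for the question's entity type:

```java
// Sketch: JpaItemWriter flushes through the EntityManager, so the step's
// transaction manager must be a JpaTransactionManager for commits to stick.
@Bean
public JpaItemWriter<Person> writer(EntityManagerFactory emf) {
    JpaItemWriter<Person> writer = new JpaItemWriter<>();
    writer.setEntityManagerFactory(emf);
    return writer;
}

@Bean
public BatchConfigurer batchConfigurer(EntityManagerFactory emf) {
    return new DefaultBatchConfigurer() {
        @Override
        public PlatformTransactionManager getTransactionManager() {
            return new JpaTransactionManager(emf); // chunk commit = JPA flush + commit
        }
    };
}
```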

ItemReader integration testing throwing ClassCastException

Submitted by 旧时模样 on 2019-12-08 14:22:09
Question: I am trying to integration-test an ItemReader. Here is the class: @Slf4j public class StudioReader implements ItemReader<List<Studio>> { @Setter private zoneDao zoneDao; @Getter @Setter private BatchContext context; private AreaApi areaApi = new AreaApi(); public List<Studio> read() throws Exception { return areaApi.getStudioLocations(); } Here is my bean.xml: <bean class="org.springframework.batch.core.scope.StepScope" /> <bean id="ItemReader" class="com.sync.studio.reader.StudioReader"
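A frequent cause of a ClassCastException in tests like this (an inference, since the stack trace isn't shown) is that step-scoped beans are wrapped in a proxy that implements ItemReader but is not a StudioReader, so casting or injecting by the concrete class fails. Injecting by the interface avoids the cast:

```java
// Sketch: inject the step-scoped bean by its interface, not the concrete class.
// The step-scope proxy implements ItemReader<List<Studio>> but cannot be
// cast to StudioReader, which typically triggers the ClassCastException.
@Autowired
@Qualifier("ItemReader")
private ItemReader<List<Studio>> reader;

// In the test, resolve the proxy inside a step context, e.g.:
// StepScopeTestUtils.doInStepScope(stepExecution, () -> reader.read());
```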

Aggregating processor or aggregating reader

Submitted by 吃可爱长大的小学妹 on 2019-12-08 14:21:13
Question: I have a requirement which is like this: I read items from a DB, if possible in a paging way, where the page size represents the later "batch size". I do some processing steps, like filtering, and then I want to accumulate the items to send them to a REST service in batches, e.g. n of them at once instead of one by one. Parallelising at the step level is what I am doing, but I am not sure how to get the batching to work. Do I need to implement a reader that returns a list and a
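One thing the chunk model already gives you: the ItemWriter's write method receives the whole chunk (commit-interval items) as a list, so a custom writer can send each chunk to the REST service in one call. If you instead accumulate items yourself, the grouping is plain Java; a minimal sketch (class and method names are mine, not from the question):

```java
import java.util.ArrayList;
import java.util.List;

public class Batcher {

    // Partition the already-filtered items into sub-lists of at most n elements,
    // so each sub-list can go to the REST service in a single call.
    public static <T> List<List<T>> partition(List<T> items, int n) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += n) {
            batches.add(new ArrayList<>(items.subList(i, Math.min(i + n, items.size()))));
        }
        return batches;
    }
}
```

With a commit-interval of n, the same effect falls out of the step configuration without any helper.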

Logging in Spring Batch

Submitted by 二次信任 on 2019-12-08 13:50:50
Question: I am new to Spring Batch. I want to know how to put logging statements in Spring Batch readers and writers. For example, if I define readers and writers in the Spring context XML, then later on I can't debug where my code failed. How can I achieve logging in Spring Batch? Do I need to extend the available reader classes in Java, for example FlatFileItemReader, and put logging statements in a Java class? Or can I achieve this while maintaining my code in the context file? And in case I want to
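Rather than subclassing the stock readers, Spring Batch's listener callbacks let you log around every read and write while keeping the readers declared in XML; register the listener on the step with a <listeners> element. A sketch, assuming SLF4J for logging:

```java
// Sketch: log item reads/writes via a listener instead of extending the reader.
public class LoggingItemListener extends ItemListenerSupport<Object, Object> {

    private static final Logger log = LoggerFactory.getLogger(LoggingItemListener.class);

    @Override
    public void afterRead(Object item) {
        log.debug("read item: {}", item);
    }

    @Override
    public void onReadError(Exception ex) {
        log.error("read failed", ex);
    }

    @Override
    public void onWriteError(Exception ex, List<? extends Object> items) {
        log.error("write failed for {} items", items.size(), ex);
    }
}
```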

Adding “with ur” or any other prefix in query being generated by JdbcPagingItemReader

Submitted by 扶醉桌前 on 2019-12-08 12:42:14
Question: I am writing a Java application to read data from one table and write it to a file. Our table has millions of records, which we need to read daily and write to a file. So I am using Spring Batch with JdbcPagingItemReader as the reader, as I want to read records in pages. Below is my bean definition: <bean id="pagingItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step"> <property name="dataSource" ref="dataSource" /> <property name="queryProvider">
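JdbcPagingItemReader generates its SQL from the configured queryProvider, so there is no property for appending a suffix like "with ur". One workaround (a sketch; the method names come from AbstractSqlPagingQueryProvider and may vary across Spring Batch versions) is to subclass the DB2 provider and append the clause to each generated page query:

```java
// Sketch: append DB2's "WITH UR" (uncommitted read) to every page query.
public class Db2WithUrPagingQueryProvider extends Db2PagingQueryProvider {

    @Override
    public String generateFirstPageQuery(int pageSize) {
        return super.generateFirstPageQuery(pageSize) + " WITH UR";
    }

    @Override
    public String generateRemainingPagesQuery(int pageSize) {
        return super.generateRemainingPagesQuery(pageSize) + " WITH UR";
    }
}
```

Reference this subclass in the queryProvider property instead of the stock provider.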

How do you distribute a Spring Batch job effectively across JVMs?

Submitted by 和自甴很熟 on 2019-12-08 11:40:38
Question: In the job I read from a file and store something in a database. I would like to have many running JARs of the batch job in different processes and partition the data from the file among the running instances. I would also like to be able to keep adding files to be processed and distribute the reads from those as well. I read that Spring XD might be a good fit, but I can't find good tutorials on it. Yes, I am also a noob at Spring Batch and XD. Answer 1: The first thing to understand is how to remotely
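For spreading one job's work across JVMs, Spring Batch's remote partitioning is the usual fit: a master step splits the input into partitions and worker JVMs execute them (Spring XD, and later Spring Cloud Data Flow, wire this up over messaging). For the "keep adding files" part, a partitioner can map one partition per file; a sketch with a hypothetical inbox directory:

```java
// Sketch: one partition per input file, picked up from a hypothetical directory.
// Each partition's ExecutionContext carries the file URL for its worker step.
@Bean
public Partitioner filePartitioner() throws IOException {
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(new PathMatchingResourcePatternResolver()
            .getResources("file:/data/inbox/*.csv")); // assumed location
    return partitioner;
}
```

Each worker's step-scoped reader then binds its file via #{stepExecutionContext['fileName']}.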

JdbcTemplate - SQLWarning ignored: SQL state '22007', error code '1292', message [Truncated incorrect DOUBLE value: 'stepExecutionContext[toId]']

Submitted by 筅森魡賤 on 2019-12-08 10:58:22
Question: I am developing partitioned Spring Batch code following http://www.mkyong.com/spring-batch/spring-batch-partitioning-example/ and have already gone through the link Error Code 1292 - Truncated incorrect DOUBLE value - Mysql, but it did not solve my problem. I have the following bean <bean id="pagingItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step"> <property name="dataSource" ref="dataSource" /> <property name="pageSize" value="200" />
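The warning usually means the literal text 'stepExecutionContext[toId]' reached MySQL as a value, i.e. the late-binding expression was never resolved (an inference from the message, since the full bean XML is cut off). That happens when the placeholder syntax is wrong or the bean is not step-scoped; with scope="step", the #{...} form is resolved per partition:

```xml
<!-- Sketch: late binding of partition bounds; fromId/toId are assumed to be
     the keys the partitioner puts into each step's ExecutionContext. -->
<bean id="pagingItemReader"
      class="org.springframework.batch.item.database.JdbcPagingItemReader"
      scope="step">
  <property name="dataSource" ref="dataSource" />
  <property name="pageSize" value="200" />
  <property name="parameterValues">
    <map>
      <entry key="fromId" value="#{stepExecutionContext[fromId]}" />
      <entry key="toId" value="#{stepExecutionContext[toId]}" />
    </map>
  </property>
</bean>
```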

Using CompositeItemWriter, the writer or classify method is not getting called

Submitted by …衆ロ難τιáo~ on 2019-12-08 10:19:23
Question: I am writing a Spring Batch job using Spring Boot, and I need to write to two different tables based on conditions, so I am trying CompositeItemWriter; however, when I invoke the batch, the writer is not getting called. Here is my Job Configuration class. @Configuration public class JobConfiguration { ... ... ... @Bean public JdbcCursorItemReader<Notification> reader() { JdbcCursorItemReader<Notification> reader = new JdbcCursorItemReader<Notification>(); reader.setDataSource(dataSource); ... ..
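Without the full configuration it's hard to be sure, but two usual suspects with composite writers are: the step is wired to a different writer bean than the composite, and, for conditional routing, CompositeItemWriter invokes all delegates rather than choosing one — classification needs ClassifierCompositeItemWriter. A sketch with a hypothetical isUrgent() flag on Notification and hypothetical delegate names:

```java
// Sketch: route each Notification to exactly one of two delegate writers.
// isUrgent(), urgentWriter and normalWriter are assumed names, not from the question.
@Bean
public ClassifierCompositeItemWriter<Notification> compositeWriter(
        JdbcBatchItemWriter<Notification> urgentWriter,
        JdbcBatchItemWriter<Notification> normalWriter) {
    ClassifierCompositeItemWriter<Notification> writer = new ClassifierCompositeItemWriter<>();
    writer.setClassifier(n -> n.isUrgent() ? urgentWriter : normalWriter);
    return writer;
}
```

The step must reference compositeWriter(); if a delegate is an ItemStream (e.g. a file writer), register it on the step explicitly so it gets opened.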

Spring Batch out of memory after a huge select in a job

Submitted by China☆狼群 on 2019-12-08 09:19:59
Question: I'm facing a problem with my job. I'm trying to read records from a database and write them to a txt file. The database contains 1,800,000 records with 149 columns. The problem is that the select is in jobConfig.xml, in the 'mysqlItemReader' bean, but I think the select tries to load all records into JVM memory, and then I get an out-of-memory error. Using randtb.cliente limit 200000 it runs OK, but with more than 500k records I run out of memory. How can I avoid this error? Thanks! <beans xmlns="http://www
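With MySQL this is typically the JDBC driver, not the reader: Connector/J buffers the entire result set in memory unless it is told to stream, which it does only when fetchSize is Integer.MIN_VALUE. A sketch of the reader bean with streaming enabled (the SQL here is a stand-in for the question's 149-column select); switching to JdbcPagingItemReader is the other common fix:

```xml
<!-- Sketch: stream rows from MySQL instead of buffering all 1.8M in memory.
     fetchSize -2147483648 is Integer.MIN_VALUE, Connector/J's streaming signal. -->
<bean id="mysqlItemReader"
      class="org.springframework.batch.item.database.JdbcCursorItemReader">
  <property name="dataSource" ref="dataSource" />
  <property name="sql" value="SELECT * FROM randtb.cliente" />
  <property name="fetchSize" value="-2147483648" />
  <property name="verifyCursorPosition" value="false" />
  <property name="rowMapper" ref="clienteRowMapper" /> <!-- assumed bean name -->
</bean>
```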