spring-batch

Getting “Scope 'step' is not active for the current thread” while creating Spring Batch beans

Submitted on 2019-12-02 05:33:37
Question: In my Spring Batch configuration, I'm trying to set up a partitioned step that accesses values from JobParameters, as follows:

```java
@Bean
@Qualifier("partitionJob")
public Job partitionJob() throws Exception {
    return jobBuilderFactory
            .get("partitionJob")
            .incrementer(new RunIdIncrementer())
            .start(partitionStep(null))
            .build();
}

@Bean
@StepScope // I'm getting the exception here -> Error creating bean
public Step partitionStep(
        @Value("#{jobParameters[gridSize]}") String gridSize) throws Exception
```
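The exception arises because @StepScope sits on the Step bean itself: the step scope only becomes active while a step is executing, yet the Job definition needs the Step bean at configuration time. Below is a minimal sketch of the usual fix, not the asker's final code — the stepBuilderFactory field, workerStep() bean, and the SimplePartitioner subclass are assumptions: keep the Step bean a plain singleton and move the late-bound job parameter onto a step-scoped collaborator.

```java
import java.util.Map;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.core.partition.support.SimplePartitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;

// bean methods inside the @Configuration class, which is assumed to
// have stepBuilderFactory injected and a workerStep() bean defined
@Bean
public Step partitionStep(Partitioner gridSizePartitioner) {
    // no @StepScope here: the Job can safely reference this singleton bean
    return stepBuilderFactory.get("partitionStep")
            .partitioner("workerStep", gridSizePartitioner)
            .step(workerStep())
            .build();
}

@Bean
@StepScope // late binding of jobParameters is legal on this collaborator
public Partitioner gridSizePartitioner(@Value("#{jobParameters['gridSize']}") String gridSize) {
    return new SimplePartitioner() {
        @Override
        public Map<String, ExecutionContext> partition(int ignoredGridSize) {
            // use the job parameter instead of the statically configured grid size
            return super.partition(Integer.parseInt(gridSize));
        }
    };
}
```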

Complex XML using Spring Batch; StaxEventItemWriter; Jaxb2Marshaller

Submitted on 2019-12-02 05:12:07
Question: I need to write a slightly complex XML document using Spring Batch. Can anyone please help me with the appropriate Spring configuration? Below is the output the process requires:

```xml
<XML>
    <USERLIST ID="something" NAME="Sample">
        <USER ID="userID" NAME="Name"/>
        <USER ID="userID" NAME="Name"/>
        ........
    </USERLIST>
</XML>
```

The USERLIST element in the XML above only needs to occur once. This is the Spring configuration I have so far:

```xml
<bean id="userXMLWriter" class="org.springframework.batch.item.xml
```
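One way to get a once-only wrapper element with StaxEventItemWriter is to make `<XML>` the root tag and emit the `<USERLIST>` opening and closing events from header and footer callbacks. The following is a sketch under the assumption of a JAXB-annotated User class mapped to the `<USER>` element, not the asker's final configuration:

```java
import javax.xml.stream.XMLEventFactory;
import javax.xml.stream.XMLStreamException;

import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;

public class UserXmlWriterFactory {

    public static StaxEventItemWriter<User> userXMLWriter() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setClassesToBeBound(User.class); // assumed @XmlRootElement(name = "USER") class

        XMLEventFactory events = XMLEventFactory.newInstance();
        StaxEventItemWriter<User> writer = new StaxEventItemWriter<>();
        writer.setResource(new FileSystemResource("users.xml"));
        writer.setRootTagName("XML");
        writer.setMarshaller(marshaller);
        // open <USERLIST ...> exactly once, before the first item
        writer.setHeaderCallback(w -> {
            try {
                w.add(events.createStartElement("", "", "USERLIST"));
                w.add(events.createAttribute("ID", "something"));
                w.add(events.createAttribute("NAME", "Sample"));
            } catch (XMLStreamException e) {
                throw new RuntimeException(e);
            }
        });
        // close </USERLIST> exactly once, after the last item
        writer.setFooterCallback(w -> {
            try {
                w.add(events.createEndElement("", "", "USERLIST"));
            } catch (XMLStreamException e) {
                throw new RuntimeException(e);
            }
        });
        return writer;
    }
}
```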

Spring Batch: Aggregated reader/writer issue

Submitted on 2019-12-02 04:07:11
I am trying to use Spring Batch to implement an aggregated reader (a batch file where multiple records should be treated as one record while writing). Here is the code snippet for my reader:

```java
public class AggregatePeekableReader implements ItemReader<List<T>>, ItemStream {

    private SingleItemPeekableItemReader<T> reader;

    private boolean process(T currentRecord, InvoiceLineItemsHolder holder)
            throws UnexpectedInputException, ParseException, Exception {
        next = peekNextInvoiceRecord();
        // finish processing if we hit the end of file
        if (currentRecord == null) {
            LOG.info("Exhausted ItemReader ( END
```
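For reference, here is a compact, self-contained sketch of the same aggregation pattern; InvoiceRecord and getInvoiceNumber() are assumed stand-ins for the asker's domain type. The idea: keep reading from a peekable delegate and group records into one list until the peeked next record belongs to a different invoice.

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.SingleItemPeekableItemReader;

public class AggregatingInvoiceReader implements ItemReader<List<InvoiceRecord>> {

    private final SingleItemPeekableItemReader<InvoiceRecord> delegate;

    public AggregatingInvoiceReader(SingleItemPeekableItemReader<InvoiceRecord> delegate) {
        this.delegate = delegate;
    }

    @Override
    public List<InvoiceRecord> read() throws Exception {
        InvoiceRecord first = delegate.read();
        if (first == null) {
            return null; // end of input ends the step
        }
        List<InvoiceRecord> group = new ArrayList<>();
        group.add(first);
        // keep pulling records while the next one belongs to the same invoice
        InvoiceRecord next = delegate.peek();
        while (next != null && next.getInvoiceNumber().equals(first.getInvoiceNumber())) {
            group.add(delegate.read());
            next = delegate.peek();
        }
        return group;
    }
}
```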

Standalone example of JBeret (JSR-352)

Submitted on 2019-12-02 04:06:16
Question: Is there any way to use JBeret as a standalone module to execute batch jobs? All the samples I find use it together with WildFly, and I was surprised to see it look for a container to load implementations when trying some samples. Any insight on why/why not would be helpful.

Answer (kaape): Here is a tutorial on how to use JBeret in a standalone application: http://www.mastertheboss.com/batch-api/running-batch-jobs-in-j2se-applications You'll need to include various JBoss dependencies for it to work. Furthermore, you need to configure JBeret with a separate jberet.properties file. I've built a (hopefully) minimal
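For a feel of what "standalone" means here: JSR-352 defines a portable launch API, and with jberet-se (plus its dependencies) on the classpath, BatchRuntime resolves to JBeret's implementation. A sketch, where "myJob" and the inputFile parameter stand for an assumed META-INF/batch-jobs/myJob.xml job definition of your own:

```java
import java.util.Properties;

import javax.batch.operations.JobOperator;
import javax.batch.runtime.BatchRuntime;

public class Main {
    public static void main(String[] args) {
        // standard JSR-352 entry point; JBeret supplies the implementation in SE
        JobOperator operator = BatchRuntime.getJobOperator();

        Properties params = new Properties();
        params.setProperty("inputFile", "/tmp/input.csv"); // hypothetical job parameter

        long executionId = operator.start("myJob", params);
        System.out.println("Started job execution " + executionId);
    }
}
```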

Spring Batch - MongoDB to XML - Caused by: java.lang.IllegalStateException: A type to convert the input into is required

Submitted on 2019-12-02 03:09:57
I am developing a Spring Batch MongoDB-to-XML example. When I run the main method, I see the error below. Please advise; I tried to find a solution on the web but haven't found anything helpful yet.

```
Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'step1': Cannot resolve reference to bean 'mongodbItemReader' while setting bean property 'itemReader'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'mongodbItemReader'
```
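The IllegalStateException in the title is what MongoItemReader throws when no target type has been configured: the reader needs a class to convert each Mongo document into. Here is a sketch of the usual fix in Java config (the question appears to use XML, where the equivalent is a targetType property on the reader bean); Report is a hypothetical document class standing in for the asker's own:

```java
import java.util.Collections;

import org.springframework.batch.item.data.MongoItemReader;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoTemplate;

public class MongoReaderConfig {

    public static MongoItemReader<Report> mongodbItemReader(MongoTemplate mongoTemplate) {
        MongoItemReader<Report> reader = new MongoItemReader<>();
        reader.setTemplate(mongoTemplate);
        // the missing piece: the type to convert each document into
        reader.setTargetType(Report.class);
        reader.setQuery("{}"); // read everything; adjust as needed
        reader.setSort(Collections.singletonMap("_id", Sort.Direction.ASC));
        return reader;
    }
}
```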

Spring Batch - Read once and write twice

Submitted on 2019-12-02 03:05:24
I am new to Spring Batch. My requirement: I have a reader that gets records through a web service call or database call, and currently I write those records to one table. Now the same records (the records read by the reader) need to be processed and written into another table. The point to note here is that the items stored by the second write are of a different type than those of the first write. I need something like the below:

1st step: Read items of type A --> Write items of type A
2nd step: Read items of type A --> Process to type B --> Write 10 items of type B

For the same job above I need
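One common shape for "read once, write twice" is a single step whose writer fans each chunk out to two delegate writers, converting on the way; Spring Batch's CompositeItemWriter covers the simpler case where both writers take the same item type. A sketch, where TypeA, TypeB, and the convert() mapping are assumptions about the asker's domain:

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.item.ItemWriter;

public class ReadOnceWriteTwiceWriter implements ItemWriter<TypeA> {

    private final ItemWriter<TypeA> tableAWriter;
    private final ItemWriter<TypeB> tableBWriter;

    public ReadOnceWriteTwiceWriter(ItemWriter<TypeA> tableAWriter,
                                    ItemWriter<TypeB> tableBWriter) {
        this.tableAWriter = tableAWriter;
        this.tableBWriter = tableBWriter;
    }

    @Override
    public void write(List<? extends TypeA> items) throws Exception {
        // first write: the items exactly as read
        tableAWriter.write(items);

        // second write: the same items converted to the other table's type
        List<TypeB> converted = new ArrayList<>();
        for (TypeA item : items) {
            converted.add(convert(item));
        }
        tableBWriter.write(converted);
    }

    private TypeB convert(TypeA item) {
        return new TypeB(item); // hypothetical A-to-B mapping
    }
}
```

Both writes then happen inside the same chunk transaction, so the two tables stay consistent if anything fails.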

Remote partition - slave getting greedy

Submitted on 2019-12-02 02:45:43
The following is what we are trying to achieve: we want a big XML file to be staged into a database in parallel on different VMs. To achieve this, we are using the scalable Spring Batch remote-partitioning approach, and we are running into some issues. The high-level setup:

- master - splits the XML file into multiple partitions (we currently have a grid size of 3)
- slave 1 - processes partitions (reads index-based partitions and writes to the DB)
- slave 2 - processes partitions

We are running on Linux with ActiveMQ 5.15.3. With the above setup, slave 1 is processing 2 partitions at the same
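A classic reason for one JMS consumer hogging messages is ActiveMQ's default consumer prefetch of 1000: the first slave's listener buffers several partition-request messages before the second slave ever sees them. A sketch of the usual mitigation (the broker URL and host are placeholders for this setup) is to cap the queue prefetch at 1 so each slave takes only one partition at a time:

```java
import org.apache.activemq.ActiveMQConnectionFactory;

public class SlaveConnectionFactory {
    public static ActiveMQConnectionFactory create() {
        // jms.prefetchPolicy.queuePrefetch=1 stops a single consumer from
        // buffering multiple partition requests ahead of the other slaves
        return new ActiveMQConnectionFactory(
                "tcp://broker-host:61616?jms.prefetchPolicy.queuePrefetch=1");
    }
}
```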

Entities not persisting. Are RepositoryItemWriter & SimpleJpaWriter thread-safe?

Submitted on 2019-12-02 01:59:00
I have encountered an odd issue with a RepositoryItemWriter, where it does not appear to be persisting entities correctly through my configured Spring Data JPA repository to the data source. Step configuration:

```java
@Bean
public Step orderStep(StepBuilderFactory stepBuilderFactory,
                      ItemReader<OrderEncounter> orderEncounterReader,
                      ItemWriter<List<Order>> orderWriter,
                      ItemProcessor<OrderEncounter, List<Order>> orderProcessor,
                      TaskExecutor taskExecutor) {
    return stepBuilderFactory.get("orderStep")
            .<OrderEncounter, List<Order>> chunk(10)
            .reader(orderEncounterReader)
            .processor(orderProcessor)
            .writer
```
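One detail worth checking in this setup: the chunk's output type is List<Order>, but RepositoryItemWriter invokes the repository once per *item*, and each item here is a whole list. A small flattening writer makes the intent explicit and keeps every Order inside every list flowing through the repository. This is a sketch, not the asker's code; OrderRepository is assumed to be their Spring Data interface:

```java
import java.util.List;

import org.springframework.batch.item.ItemWriter;

public class FlatteningOrderWriter implements ItemWriter<List<Order>> {

    private final OrderRepository repository;

    public FlatteningOrderWriter(OrderRepository repository) {
        this.repository = repository;
    }

    @Override
    public void write(List<? extends List<Order>> items) {
        for (List<Order> orders : items) {
            // saveAll is CrudRepository API; the surrounding chunk
            // transaction commits (or rolls back) all of these together
            repository.saveAll(orders);
        }
    }
}
```

With a TaskExecutor-driven multi-threaded step, each chunk runs its transaction on its own thread, so a writer like this must hold no shared mutable state.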

Spring Batch custom completion policy for dynamic chunk size

Submitted on 2019-12-02 01:58:07
Context: We have a batch job that replicates localized country names (i.e. translations of country names to different languages) from an external DB to ours. The idea was to process all localized country names for a single country in one chunk (i.e. first chunk - all translations for Andorra, next chunk - all translations for the U.A.E., etc.). We use JdbcCursorItemReader for reading the external data, plus some Oracle analytic functions to provide the total number of translations available for each country, something like:

```sql
select country_code, language_code, localized_name,
       COUNT(1) OVER(PARTITION BY c_lng
```
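One way to implement such a boundary-driven chunk is a custom CompletionPolicy that peeks at the next row and closes the chunk whenever the country changes or input ends. Below is a sketch, assuming a LocalizedName item type and leaving out the reader decorator that would call recordCountry() after each read; it is not a complete wiring:

```java
import org.springframework.batch.item.support.SingleItemPeekableItemReader;
import org.springframework.batch.repeat.RepeatContext;
import org.springframework.batch.repeat.policy.CompletionPolicySupport;

public class CountryBoundaryCompletionPolicy extends CompletionPolicySupport {

    private final SingleItemPeekableItemReader<LocalizedName> reader;
    private String currentCountry;

    public CountryBoundaryCompletionPolicy(SingleItemPeekableItemReader<LocalizedName> reader) {
        this.reader = reader;
    }

    // invoked by a reader decorator (not shown) after each successful read
    public void recordCountry(String countryCode) {
        this.currentCountry = countryCode;
    }

    @Override
    public boolean isComplete(RepeatContext context) {
        try {
            LocalizedName next = reader.peek();
            // close the chunk at end of input or at a country boundary
            return next == null || !next.getCountryCode().equals(currentCountry);
        } catch (Exception e) {
            throw new IllegalStateException("Unable to peek next item", e);
        }
    }
}
```

The policy then replaces the fixed chunk size: the step builder's chunk(CompletionPolicy) overload accepts it in place of chunk(int).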