Spring Batch: Writing data to multiple files with dynamic File Name

Backend · unresolved · 2 answers · 1972 views

Asked by 野性不改 on 2020-12-16 05:16

I have a requirement to read data from a database and write it to files, with the file name for each record given in the database.

This is how the data is defined in the database:

2 Answers
  • 2020-12-16 05:57

    I'm not sure, but I don't think there is an easy way to achieve this. You could build your own ItemWriter like this:

    public class DynamicItemWriter  implements ItemStream, ItemWriter<YourEntry> {
    
        private Map<String, FlatFileItemWriter<YourEntry>> writers = new HashMap<>();
    
        private LineAggregator<YourEntry> lineAggregator;
    
        private ExecutionContext executionContext;
    
        @Override
        public void open(ExecutionContext executionContext) throws ItemStreamException {
            this.executionContext = executionContext;
        }
    
        @Override
        public void update(ExecutionContext executionContext) throws ItemStreamException {
        }
    
        @Override
        public void close() throws ItemStreamException {
            for (FlatFileItemWriter<YourEntry> writer : writers.values()) {
                writer.close();
            }
        }
    
        @Override
        public void write(List<? extends YourEntry> items) throws Exception {
            for (YourEntry item : items) {
                FlatFileItemWriter<YourEntry> ffiw = getFlatFileItemWriter(item);
                ffiw.write(Arrays.asList(item));
            }
        }
    
        public LineAggregator<YourEntry> getLineAggregator() {
            return lineAggregator;
        }
    
        public void setLineAggregator(LineAggregator<YourEntry> lineAggregator) {
            this.lineAggregator = lineAggregator;
        }
    
    
        public FlatFileItemWriter<YourEntry> getFlatFileItemWriter(YourEntry item) {
            // Lazily create one writer per file name and cache it for reuse.
            String key = item.getFileName();
            FlatFileItemWriter<YourEntry> writer = writers.get(key);
            if (writer == null) {
                writer = new FlatFileItemWriter<>();
                writer.setLineAggregator(lineAggregator);
                try {
                    UrlResource resource = new UrlResource("file:" + key);
                    writer.setResource(resource);
                    writer.open(executionContext);
                } catch (MalformedURLException e) {
                    throw new ItemStreamException("Invalid file name: " + key, e);
                }
                writers.put(key, writer);
            }
            return writer;
        }
    }
    

    and configure it as a writer:

    <bean id="csvWriter" class="com....DynamicItemWriter">
        <property name="lineAggregator">
            <bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
                <property name="delimiter" value=","/>
                <property name="fieldExtractor" ref="csvFieldExtractor"/>
            </bean>
        </property>
    </bean>
  • 2020-12-16 05:57

    In spring-batch, you can do this using ClassifierCompositeItemWriter.

    Since ClassifierCompositeItemWriter gives you access to your object during write, you can write custom logic to instruct Spring to write to different files.

    Take a look at the sample below. ClassifierCompositeItemWriter needs an implementation of the Classifier interface. Here I have created a lambda that implements the classify() method of the Classifier interface; classify() is where you create your ItemWriter. In this example, we create a FlatFileItemWriter that gets the file name from the item itself and creates a resource for it.

    @Bean
    public ClassifierCompositeItemWriter<YourDataObject> yourDataObjectItemWriter(
        Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier
    ) {
      ClassifierCompositeItemWriter<YourDataObject> compositeItemWriter = new ClassifierCompositeItemWriter<>();
      compositeItemWriter.setClassifier(itemWriterClassifier);
      return compositeItemWriter;
    }
    
    @Bean
    public Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier() {
      return yourDataObject -> {
        String fileName = yourDataObject.getFileName();
    
        BeanWrapperFieldExtractor<YourDataObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
        fieldExtractor.setNames(new String[]{"recId", "name"});
        DelimitedLineAggregator<YourDataObject> lineAggregator = new DelimitedLineAggregator<>();
        lineAggregator.setFieldExtractor(fieldExtractor);
    
        FlatFileItemWriter<YourDataObject> itemWriter = new FlatFileItemWriter<>();
        itemWriter.setResource(new FileSystemResource(fileName));
        itemWriter.setAppendAllowed(true);
        itemWriter.setLineAggregator(lineAggregator);
        itemWriter.setHeaderCallback(writer -> writer.write("REC_ID,NAME"));
    
        itemWriter.open(new ExecutionContext());
        return itemWriter;
      };
    }
    

    Finally, you can attach your ClassifierCompositeItemWriter in your batch step like you normally attach your ItemWriter.

    @Bean
    public Step myCustomStep(
        StepBuilderFactory stepBuilderFactory
    ) {
      return stepBuilderFactory.get("myCustomStep")
          .<YourDataObject, YourDataObject>chunk(1000)
          .reader(myCustomReader())
          .writer(yourDataObjectItemWriter(itemWriterClassifier()))
          .build();
    }
    

    NOTE: As pointed out in comments by @Ping, a new writer will be created for each chunk, which is usually a bad practice and not an optimal solution. A better solution would be to maintain a hashmap of filename and writer so that you can reuse the writer.
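    The caching approach the note recommends boils down to `Map.computeIfAbsent`: look the writer up by file name and build it only on the first miss. The sketch below strips out the Spring Batch types so it runs standalone (`WriterCacheSketch` and `FakeWriter` are stand-ins invented for illustration, not Spring classes); in the real classifier bean you would cache `FlatFileItemWriter` instances with the same call.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Standalone sketch of the writer cache described in the note above.
// FakeWriter stands in for FlatFileItemWriter so the example compiles
// without Spring Batch on the classpath.
class WriterCacheSketch {

    // Stand-in for FlatFileItemWriter: remembers its target file.
    static class FakeWriter {
        final String fileName;
        final List<String> lines = new ArrayList<>();
        FakeWriter(String fileName) { this.fileName = fileName; }
        void write(String line) { lines.add(line); }
    }

    // One writer per output file, keyed by file name.
    private final Map<String, FakeWriter> writers = new ConcurrentHashMap<>();

    // classify(): returns the cached writer for a file name, creating it
    // at most once instead of once per chunk.
    FakeWriter classify(String fileName) {
        return writers.computeIfAbsent(fileName, FakeWriter::new);
    }

    int cachedWriterCount() {
        return writers.size();
    }

    public static void main(String[] args) {
        WriterCacheSketch cache = new WriterCacheSketch();
        FakeWriter a1 = cache.classify("a.csv");
        FakeWriter a2 = cache.classify("a.csv");
        FakeWriter b = cache.classify("b.csv");
        System.out.println(a1 == a2);                  // true: same file, same writer
        System.out.println(a1 == b);                   // false: different file
        System.out.println(cache.cachedWriterCount()); // 2
    }
}
```

    Note that writers created this way are opened outside Spring's ItemStream lifecycle, so closing them yourself (for example in an afterStep callback) remains your responsibility.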
