Spring Batch: Writing data to multiple files with dynamic File Name


Question


I have a requirement to read data from a database and write it to files, where the target file name for each record is given in the database.

This is how data is defined in the database:

Columns --> FILE_NAME, REC_ID, NAME
 Data --> file_1.csv, 1, ABC
 Data --> file_1.csv, 2, BCD
 Data --> file_1.csv, 3, DEF
 Data --> file_2.csv, 4, FGH
 Data --> file_2.csv, 5, DEF
 Data --> file_3.csv, 6, FGH
 Data --> file_3.csv, 7, DEF
 Data --> file_4.csv, 8, FGH

As you can see, the file names are stored alongside the data in the database, so Spring Batch should read this data and write each record to the file named in its FILE_NAME column (i.e., file_1.csv should contain only records 1, 2 and 3, file_2.csv only records 4 and 5, etc.).

Is it possible to use MultiResourceItemWriter for this requirement? (Please note that the entire file name is dynamic and needs to be retrieved from the database.)


Answer 1:


I'm not sure, but I don't think there is an easy way of achieving this. You could try building your own ItemWriter like this:

import java.net.MalformedURLException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.LineAggregator;
import org.springframework.core.io.UrlResource;

public class DynamicItemWriter implements ItemStream, ItemWriter<YourEntry> {

    // one FlatFileItemWriter per target file name, created lazily and reused
    private Map<String, FlatFileItemWriter<YourEntry>> writers = new HashMap<>();

    private LineAggregator<YourEntry> lineAggregator;

    private ExecutionContext executionContext;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.executionContext = executionContext;
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void close() throws ItemStreamException {
        // close every delegate writer so buffers are flushed and files are released
        for (FlatFileItemWriter<YourEntry> f : writers.values()) {
            f.close();
        }
    }

    @Override
    public void write(List<? extends YourEntry> items) throws Exception {
        for (YourEntry item : items) {
            // route each item to the writer for its file name
            FlatFileItemWriter<YourEntry> ffiw = getFlatFileItemWriter(item);
            ffiw.write(Arrays.asList(item));
        }
    }

    public LineAggregator<YourEntry> getLineAggregator() {
        return lineAggregator;
    }

    public void setLineAggregator(LineAggregator<YourEntry> lineAggregator) {
        this.lineAggregator = lineAggregator;
    }

    public FlatFileItemWriter<YourEntry> getFlatFileItemWriter(YourEntry item) {
        String key = item.getFileName();
        FlatFileItemWriter<YourEntry> rr = writers.get(key);
        if (rr == null) {
            rr = new FlatFileItemWriter<>();
            rr.setLineAggregator(lineAggregator);
            try {
                UrlResource resource = new UrlResource("file:" + key);
                rr.setResource(resource);
                rr.open(executionContext);
            } catch (MalformedURLException e) {
                throw new ItemStreamException("Invalid file name: " + key, e);
            }
            writers.put(key, rr);
        }
        return rr;
    }
}
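The writer above assumes a YourEntry domain object that exposes the target file name. A minimal sketch of such a class (the class name and accessors are placeholders of mine, not part of the original answer) could look like this:

// Hypothetical domain object mapped from the FILE_NAME, REC_ID and NAME columns.
public class YourEntry {

    private String fileName;
    private int recId;
    private String name;

    public String getFileName() { return fileName; }
    public void setFileName(String fileName) { this.fileName = fileName; }

    public int getRecId() { return recId; }
    public void setRecId(int recId) { this.recId = recId; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}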

and configure it as a writer:

<bean id="csvWriter" class="com....DynamicItemWriter">
        <property name="lineAggregator">
        <bean
         class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
            <property name="delimiter" value=","/>
            <property name="fieldExtractor" ref="csvFieldExtractor"/>
        </bean>
    </property>



Answer 2:


In Spring Batch, you can do this using ClassifierCompositeItemWriter.

Since ClassifierCompositeItemWriter gives you access to your object during write, you can add custom logic to instruct Spring to write to different files.

Take a look at the sample below. ClassifierCompositeItemWriter needs an implementation of the Classifier interface. Below, I have created a lambda implementing the classify() method of the Classifier interface; classify() is where you create your ItemWriter. In this example, it creates a FlatFileItemWriter that takes the file name from the item itself and builds a resource for it.

@Bean
public ClassifierCompositeItemWriter<YourDataObject> yourDataObjectItemWriter(
    Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier
) {
  ClassifierCompositeItemWriter<YourDataObject> compositeItemWriter = new ClassifierCompositeItemWriter<>();
  compositeItemWriter.setClassifier(itemWriterClassifier);
  return compositeItemWriter;
}

@Bean
public Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier() {
  return yourDataObject -> {
    String fileName = yourDataObject.getFileName();

    BeanWrapperFieldExtractor<YourDataObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[]{"recId", "name"});
    DelimitedLineAggregator<YourDataObject> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setFieldExtractor(fieldExtractor);

    FlatFileItemWriter<YourDataObject> itemWriter = new FlatFileItemWriter<>();
    itemWriter.setResource(new FileSystemResource(fileName));
    itemWriter.setAppendAllowed(true);
    itemWriter.setLineAggregator(lineAggregator);
    itemWriter.setHeaderCallback(writer -> writer.write("REC_ID,NAME"));

    itemWriter.open(new ExecutionContext());
    return itemWriter;
  };
}

Finally, you attach the ClassifierCompositeItemWriter to your batch step just like you would normally attach an ItemWriter.

@Bean
public Step myCustomStep(
    StepBuilderFactory stepBuilderFactory
) {
  return stepBuilderFactory.get("myCustomStep")
      .<YourDataObject, YourDataObject>chunk(1000)
      .reader(myCustomReader())
      .writer(yourDataObjectItemWriter(itemWriterClassifier(null)))
      .build();
}
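myCustomReader() is not shown in the original answer. A minimal sketch of a JDBC-backed reader, assuming the FILE_NAME/REC_ID/NAME table from the question, an injected javax.sql.DataSource, a table name of MY_TABLE, and a YourDataObject bean with fileName, recId and name properties (all of these are assumptions of mine), might look like this:

// Hypothetical reader bean; the table name and the injected dataSource are assumptions.
@Bean
public JdbcCursorItemReader<YourDataObject> myCustomReader(DataSource dataSource) {
  JdbcCursorItemReader<YourDataObject> reader = new JdbcCursorItemReader<>();
  reader.setName("myCustomReader");
  reader.setDataSource(dataSource);
  // reading ordered by FILE_NAME keeps records for the same file together
  reader.setSql("SELECT FILE_NAME, REC_ID, NAME FROM MY_TABLE ORDER BY FILE_NAME, REC_ID");
  // BeanPropertyRowMapper maps underscored columns (REC_ID) to camelCase properties (recId)
  reader.setRowMapper(BeanPropertyRowMapper.newInstance(YourDataObject.class));
  return reader;
}

With a signature like this, the step definition above would typically receive the reader as an injected parameter rather than calling myCustomReader() directly.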

NOTE: As pointed out in the comments by @Ping, a new writer is created every time the classifier is invoked, which is usually bad practice and not an optimal solution. A better solution is to maintain a map from file name to writer so that each writer is reused.
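A minimal sketch of that improvement (my own assumption of how it could look, not code from the original answer): cache the writers in a map inside the configuration class so that each file name gets exactly one writer, and close them once the step is done.

// Hypothetical caching classifier; the field name and the close() hook are assumptions.
private final Map<String, FlatFileItemWriter<YourDataObject>> writerCache = new ConcurrentHashMap<>();

@Bean
public Classifier<YourDataObject, ItemWriter<? super YourDataObject>> cachingItemWriterClassifier() {
  return yourDataObject -> writerCache.computeIfAbsent(yourDataObject.getFileName(), fileName -> {
    BeanWrapperFieldExtractor<YourDataObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[]{"recId", "name"});
    DelimitedLineAggregator<YourDataObject> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setFieldExtractor(fieldExtractor);

    FlatFileItemWriter<YourDataObject> itemWriter = new FlatFileItemWriter<>();
    itemWriter.setResource(new FileSystemResource(fileName));
    itemWriter.setLineAggregator(lineAggregator);
    itemWriter.setHeaderCallback(writer -> writer.write("REC_ID,NAME"));
    // open once per file; the cached instance is reused for all later chunks
    itemWriter.open(new ExecutionContext());
    return itemWriter;
  });
}

// Remember to close the cached writers when the step finishes, e.g. from a StepExecutionListener:
// writerCache.values().forEach(FlatFileItemWriter::close);

Because each file is opened only once, append mode is no longer needed to keep earlier records; the trade-off is that the classifier now holds state, so the cache must be cleared and the writers closed at the end of the step.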



Source: https://stackoverflow.com/questions/15974458/spring-batch-writing-data-to-multiple-files-with-dynamic-file-name
