How to read huge CSV file in Mule

Backend · Unresolved · 2 answers · 1101 views

北荒 2021-01-06 04:24

I'm using Mule Studio 3.4.0 Community Edition. I have a big problem with parsing a large CSV file arriving through a File endpoint. The scenario is that I have 3 CSV fil…

2 Answers
  •  粉色の甜心
    2021-01-06 05:10

    I believe the csv-to-maps-transformer will force the whole file into memory. Since you are dealing with one large file, I would personally just write a Java class to handle it. The File endpoint will pass a file stream to your custom transformer; you can then open a JDBC connection and pick off the data one row at a time without ever loading the whole file. I have used OpenCSV to parse the CSV for me. Your Java class would contain something like the following:

    protected Object doTransform(Object src, String enc) throws TransformerException {

        // The File endpoint hands the payload over as a stream/reader.
        FileReader csvFileData = (FileReader) src;

        try (BufferedReader br = new BufferedReader(csvFileData);
             CSVReader reader = new CSVReader(br)) {

            // Make a JDBC connection here.

            // Read the CSV row by row so the whole file never sits in memory.
            String[] nextLine;
            while ((nextLine = reader.readNext()) != null) {
                // Push each row into the database through your JDBC connection.
            }

            // Close the JDBC connection here.
        } catch (Exception e) {
            throw new TransformerException(
                MessageFactory.createStaticMessage("CSV processing failed"), this, e);
        }

        return null;
    }
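To see the row-at-a-time idea outside of Mule, here is a minimal, self-contained sketch. It swaps OpenCSV for a naive `String.split` (no quoting support) purely so it runs with the standard library alone; the hypothetical `streamRows` helper and its `Consumer` callback stand in for the per-row JDBC insert you would do in the real transformer.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.function.Consumer;

public class CsvStreamDemo {

    // Read the CSV one row at a time; rowHandler stands in for the JDBC
    // insert (e.g. stmt.addBatch(...)). Only one line is ever in memory.
    static int streamRows(Reader src, Consumer<String[]> rowHandler) throws IOException {
        int rows = 0;
        try (BufferedReader br = new BufferedReader(src)) {
            String line;
            while ((line = br.readLine()) != null) {
                String[] fields = line.split(",", -1); // naive: no quote handling
                rowHandler.accept(fields);
                rows++;
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        String csv = "id,name\n1,alice\n2,bob\n";
        int n = streamRows(new StringReader(csv),
                r -> System.out.println(String.join("|", r)));
        System.out.println("rows=" + n);
    }
}
```

In the real transformer you would replace the `split` with OpenCSV's `CSVReader.readNext()` (which handles quoting and embedded commas) and have the callback add each row to a JDBC batch rather than print it.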
    
