How to read huge CSV file in Mule

Backend · Unresolved · 2 answers · 1.1k views
北荒 2021-01-06 04:24

I'm using Mule Studio 3.4.0 Community Edition. I have a big problem with parsing a large CSV file that arrives via a File Endpoint. The scenario is that I have 3 CSV fil…

2 Answers
  •  既然无缘
    2021-01-06 05:12

    As SteveS said, the csv-to-maps-transformer may try to load the entire file into memory before processing it. What you can try instead is to split the CSV file into its individual lines and dispatch those lines to a VM endpoint to be processed one at a time. First, create a component to handle this first step:

    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    import org.mule.api.MuleEventContext;
    import org.mule.api.client.MuleClient;
    import org.mule.api.lifecycle.Callable;

    public class CSVReader implements Callable {
        @Override
        public Object onCall(MuleEventContext eventContext) throws Exception {

            // With streaming enabled, the file connector hands us an InputStream,
            // so only the line currently being read is held in memory.
            InputStream fileStream = (InputStream) eventContext.getMessage().getPayload();
            BufferedReader br = new BufferedReader(new InputStreamReader(fileStream));

            MuleClient muleClient = eventContext.getMuleContext().getClient();

            try {
                // Dispatch each line asynchronously to vm://in, where the
                // second flow picks it up for processing.
                String line;
                while ((line = br.readLine()) != null) {
                    muleClient.dispatch("vm://in", line, null);
                }
            } finally {
                br.close(); // also closes the underlying file stream
            }
            return null;
        }
    }
    

    Then split your main flow in two: the first flow reads the file and runs the CSVReader component; the second consumes individual lines from the VM queue. Something along these lines (flow names, paths, and the component's package are illustrative, so adapt them to your project):

        <flow name="csv-file-flow">
            <file:inbound-endpoint path="/data/inbox" moveToDirectory="/data/done"/>
            <component class="com.example.CSVReader"/>
        </flow>

        <flow name="csv-line-flow">
            <vm:inbound-endpoint path="in"/>
            <!--
                .
                .
                Your JDBC stuff
                .
                .
            -->
        </flow>

    Keep your current file-connector configuration with streaming enabled. With this solution the CSV data can be processed without loading the entire file into memory first. HTH
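    The memory point can be checked outside Mule with plain Java I/O. This sketch (the sample data and field names are made up for illustration) streams a CSV line by line, the same pattern the component above uses, so memory use stays constant regardless of file size:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class StreamCsvDemo {
    public static void main(String[] args) throws IOException {
        // Write a small sample CSV (stands in for the large incoming file).
        Path csv = Files.createTempFile("orders", ".csv");
        Files.write(csv, Arrays.asList(
                "id,amount",
                "1,10.50",
                "2,7.25",
                "3,3.00"));

        // Stream line by line: only the current line is held in memory,
        // which is why the same approach scales to multi-gigabyte files.
        double total = 0;
        int rows = 0;
        try (BufferedReader br = Files.newBufferedReader(csv)) {
            br.readLine(); // skip the "id,amount" header
            String line;
            while ((line = br.readLine()) != null) {
                String[] fields = line.split(",");
                total += Double.parseDouble(fields[1]);
                rows++;
            }
        }
        System.out.println(rows + " rows, total=" + total);
        Files.delete(csv);
    }
}
```

    In a real Mule flow each `line` would be dispatched to the VM queue instead of being aggregated locally, but the reading loop is identical.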
