As SteveS said, the csv-to-maps-transformer may try to load the entire file into memory before processing it. What you can try instead is to split the CSV file into smaller parts (one line each) and send those parts to a VM queue to be processed individually.
First, create a component that reads the file as a stream and dispatches each line to the VM queue:
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

import org.mule.api.MuleEventContext;
import org.mule.api.client.MuleClient;
import org.mule.api.lifecycle.Callable;

public class CSVReader implements Callable {

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        // With streaming enabled on the file connector, the payload is an InputStream.
        InputStream fileStream = (InputStream) eventContext.getMessage().getPayload();
        BufferedReader br = new BufferedReader(new InputStreamReader(fileStream));
        MuleClient muleClient = eventContext.getMuleContext().getClient();
        try {
            String line;
            // Dispatch each line as its own message, so only one line
            // is held in memory at a time.
            while ((line = br.readLine()) != null) {
                muleClient.dispatch("vm://in", line, null);
            }
        } finally {
            br.close(); // also closes the underlying file stream
        }
        return null;
    }
}
Then, split your main flow in two, as sketched below: a first flow that picks up the file and runs the CSVReader component, and a second flow that listens on vm://in and contains your JDBC stuff.
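The exact endpoints depend on your project, but the two flows could look roughly like this (the flow names, the inbox path, and the com.example package are placeholders, not your actual configuration):

&lt;flow name="csvToLines"&gt;
    &lt;!-- streaming file connector delivers the payload as an InputStream --&gt;
    &lt;file:inbound-endpoint path="/path/to/inbox" connector-ref="File"/&gt;
    &lt;!-- the CSVReader component above, dispatching each line to vm://in --&gt;
    &lt;component class="com.example.CSVReader"/&gt;
&lt;/flow&gt;

&lt;flow name="storeInDatabase"&gt;
    &lt;!-- each message received here is a single CSV line --&gt;
    &lt;vm:inbound-endpoint exchange-pattern="one-way" path="in"/&gt;
    &lt;!-- your JDBC stuff goes here --&gt;
&lt;/flow&gt;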
Keep your current file connector configuration with streaming enabled. With this solution the CSV data can be processed without loading the entire file into memory first.
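In case streaming is not already on, it is a single attribute on the connector (the connector name here is just an example):

&lt;file:connector name="File" streaming="true"/&gt;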
HTH