Processing a large dataset from a database with Mule ESB: how do I update processed records in batches?
Question: I have a large set of records to process, e.g. 100,000 records. My use case has 4 steps:

1. Pick the records from a database table using the JDBC inbound adapter.
2. Convert each record to XML format.
3. Post the message to a queue.
4. Update the same record with a status flag marking it as processed, so it is not picked up again.

I don't want to pick all the records from the table in one shot. Is there a way to pick them in batches, and to avoid updating the table one record at a time?
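For reference, here is a minimal sketch of the kind of flow I mean, assuming Mule 3.x with the JDBC transport, a JMS queue named `records.queue`, and a hypothetical `records` table with `id`, `payload`, and `status` columns (all names are placeholders). The `LIMIT` clause caps each poll at one batch, but the conventional `.ack` query still fires once per returned row, which is exactly the per-record update I'd like to avoid:

```xml
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:jdbc="http://www.mulesoft.org/schema/mule/jdbc"
      xmlns:jms="http://www.mulesoft.org/schema/mule/jms"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="
        http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
        http://www.mulesoft.org/schema/mule/jdbc http://www.mulesoft.org/schema/mule/jdbc/current/mule-jdbc.xsd
        http://www.mulesoft.org/schema/mule/jms http://www.mulesoft.org/schema/mule/jms/current/mule-jms.xsd">

    <jdbc:connector name="jdbcConnector" dataSource-ref="dataSource"
                    pollingFrequency="10000">
        <!-- Fetch only one batch per poll; the LIMIT syntax is
             database-specific (ROWNUM on Oracle, TOP on SQL Server). -->
        <jdbc:query key="pickRecords"
                    value="SELECT id, payload FROM records
                           WHERE status = 'NEW' LIMIT 1000"/>
        <!-- The ".ack" query runs automatically for EACH row returned by
             the select above, i.e. one UPDATE per record. -->
        <jdbc:query key="pickRecords.ack"
                    value="UPDATE records SET status = 'PROCESSED'
                           WHERE id = #[map-payload:id]"/>
    </jdbc:connector>

    <flow name="batchRecordsFlow">
        <jdbc:inbound-endpoint queryKey="pickRecords"
                               connector-ref="jdbcConnector"/>
        <!-- Each row arrives as a Map; record-to-XML transformer omitted. -->
        <jms:outbound-endpoint queue="records.queue"/>
    </flow>
</mule>
```

What I'm after is something like a single batched update per poll (e.g. `UPDATE records SET status = 'PROCESSED' WHERE id IN (...)` over the whole batch) instead of the row-by-row `.ack` behavior shown above.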