LBS

Best Practices for Batch Processing

Hi There,

We have a business process that executes batch inserts/updates and searches. Here is the scenario: we have a community portal where external users select a list of records (custom objects) based on certain logic. This list may contain 100-500 records.
Based on their selection, we have to process them: create Accounts, Contacts, and a set of custom records, send email alerts, etc., plus searches. This process is significantly heavy, and after inserting some of the custom records above, triggers and Apex classes run as well. This easily exceeds most of the governor limits.
This community was previously a client-side application, where we used the SOAP API to handle the process; that way we didn't have to worry about governor limits.
So I'm seeking your help to understand the best option for implementing this. A few options we are considering right now are:
Using Batch Apex
Using BulkTK
Using @future
Using Apex REST
I'd appreciate your help in understanding the best possible approach, or please share your thoughts.

Many Thanks, Lahiru
Narveer Singh
Hi Lahiru,

I went through the same situation in my current project, where I needed to create some Tasks and Actions based on one-by-one record insertion and the records' old values. For this I used a batch class. Below are the steps by which I handled the scenario:

Create a checkbox field (false by default) on the incoming records, then fetch and flag the records to process like this:

// Utility.getFieldMap is a custom helper returning the object's field API names.
// Replace My_Object__c / Checkbox__c with your object's and field's API names.
Map<String, String> fields = Utility.getFieldMap('My_Object__c', 'My_Object__c');
List<My_Object__c> records = Database.query(
    'SELECT ' + String.join(fields.values(), ',') +
    ' FROM My_Object__c WHERE Checkbox__c = false LIMIT 100');
for (My_Object__c rec : records) {
    rec.Checkbox__c = true;
}
Engine.Var = 1; // static flag on the Engine batch class
update records; // this update fires the trigger below

Whenever a record is updated, the trigger executes logic like this:

if (!System.isBatch()) { // don't re-enqueue the batch from inside batch context

    List<My_Object__c> recordsToProcess = new List<My_Object__c>();
    for (My_Object__c obj : (List<My_Object__c>) Trigger.new) {
        if (Trigger.isUpdate
                && obj.Checkbox__c == true
                && obj.Checkbox__c != ((My_Object__c) Trigger.oldMap.get(obj.Id)).Checkbox__c) {
            recordsToProcess.add(obj);
        }
    }
    if (recordsToProcess.size() > 0) {
        Database.executeBatch(new Engine(recordsToProcess, 1), 1); // batch size 1
    }
}

Engine will be your main batch class, where you put the logic to process records one by one.
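For illustration, here is a minimal sketch of what such an Engine batch class could look like. The class name, constructor parameters, and the `My_Object__c`/`Checkbox__c` API names are assumptions based on the snippets above, not the actual implementation:

```apex
// Hypothetical sketch of the Engine batch class referenced above.
global class Engine implements Database.Batchable<sObject>, Database.Stateful {

    global static Integer Var = 0; // static flag set in the earlier snippet

    private List<My_Object__c> records; // records handed in by the trigger
    private Integer mode;               // assumed meaning of the second constructor argument

    global Engine(List<My_Object__c> records, Integer mode) {
        this.records = records;
        this.mode = mode;
    }

    global Iterable<sObject> start(Database.BatchableContext bc) {
        return records;
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // With a batch size of 1, scope holds a single record per call,
        // so each record gets a fresh set of governor limits.
        for (sObject s : scope) {
            My_Object__c rec = (My_Object__c) s;
            // Put the heavy per-record logic here:
            // create Accounts/Contacts, custom records, send email alerts, etc.
        }
    }

    global void finish(Database.BatchableContext bc) {
        // Optional: notify the user or chain another batch.
    }
}
```

Because each `execute` call runs with its own governor limits, splitting the 100-500 selected records into scopes of 1 keeps any single record's triggers and Apex classes well inside the limits.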

This is how I handled all the governor limits, and my code works fine for more than 70,000 records.

Hope this helps you!

Please accept my solution as Best Answer if my reply was helpful, so it's available to others as the proper solution. Let me know if you need anything else.

Best Regards
Narveer