Vanessa Bell 16

Apex CPU Time Limit exceeded error received for Batch Apex

Hello Everyone!

I have built an Apex Class and Scheduler Class in our FSB2 Full Sandbox Environment. The classes are intended to update all open Opportunities that have Days_Stale__c > 30 and (Today_s_Date__c < TODAY() OR Today_s_Date__c = null), setting the Today_s_Date__c Opportunity field to TODAY().

When the Today_s_Date__c field is updated to TODAY() on the Opportunity record, it is intended to invoke a Process Builder that updates a "Reminder Date" field with a date and sends an email to the Opportunity Owner advising that the Opportunity is "stale" and needs to be updated.

Process Builder: Opportunity - Email Alerts for Stale Dated Opps 

Batch Apex Class: DailyStaleOppsProcessor
Batch Apex Test Class: DailyStaleOppsProcessorTest
Scheduled Apex Class: DailyStaleOppsscheduledBatchable
Scheduled Apex Test Class: DailyStaleOppsscheduledBatchableTest

When we execute the DailyStaleOppsProcessor Apex class from the Open Execute Anonymous window, we eventually receive an error saying Apex CPU time limit exceeded.
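For reference, the batch is kicked off from the Execute Anonymous window roughly like this (a minimal sketch; when no scope size is passed, Salesforce uses the default of 200 records per execute() call):

Database.executeBatch(new DailyStaleOppsProcessor());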

I have 100% Code Coverage on both classes. I am unsure of how to rewrite my class to stop receiving this error. Here's my code: 

global class DailyStaleOppsProcessor implements Database.Batchable<SObject>, Database.Stateful{

    // Stateful list accumulated across execute() calls and updated in finish()
    List<Opportunity> listRecords = new List<Opportunity>();

    global Database.QueryLocator start (Database.BatchableContext BC)
    {
        String query = 'select id, today_s_date__c from opportunity where isclosed= false and (Days_Stale__c > 30) and (today_s_date__c < today or today_s_date__c = null)' ;
        return Database.getQueryLocator(query);
    }
    
    global void execute(Database.BatchableContext BC, List<SObject> scope){

        for(Opportunity obj : (Opportunity []) scope) {
            
        if(obj.Today_s_Date__c!=date.today()){
            obj.Today_s_Date__c = System.today();
            listRecords.add(obj);
            }
         }
     
    }
     
    global void finish(Database.BatchableContext BC){
        system.debug('records to be updated :: ' + listRecords.size());
        if(!listRecords.isEmpty())
            {
              update listRecords;
            }

    }
}
Suraj Makandar
Hi Vanessa Bell 16,


Salesforce has a timeout limit for transactions based on CPU usage. If a transaction consumes too much CPU time, it is shut down as a long-running transaction.
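One quick way to see how close a transaction is getting to that limit is to log the consumed CPU time with the Limits methods, for example inside execute() while debugging (a minimal sketch; the limit is 10,000 ms for synchronous Apex and 60,000 ms for asynchronous Apex such as batch jobs):

// Log CPU time consumed so far versus the per-transaction limit
System.debug('CPU time used: ' + Limits.getCpuTime() + ' ms of ' + Limits.getLimitCpuTime() + ' ms');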

Please refer to the article below for more details:
https://help.salesforce.com/articleView?id=000339361&language=en_US&type=1&mode=1

Thanks,
Suraj
Amit Kumar1

You should not perform any DML in the finish method. You can track the total number of records processed by maintaining a counter variable with Database.Stateful.

global class DailyStaleOppsProcessor implements Database.Batchable<SObject>, Database.Stateful{
    //tracking the total records processed
    global Integer recordsProcessed = 0;
    global Database.QueryLocator start (Database.BatchableContext BC)
    {
        String query = 'select id, today_s_date__c from opportunity where isclosed= false and (Days_Stale__c > 30) and (today_s_date__c < today or today_s_date__c = null)' ;
        return Database.getQueryLocator(query);
    }
    
    global void execute(Database.BatchableContext BC, List<SObject> scope){
        List<Opportunity> listRecords = new List<Opportunity>();
        for(Opportunity obj : (List<Opportunity>) scope) {
            
            if(obj.Today_s_Date__c!=date.today()){
                obj.Today_s_Date__c = System.today();
                listRecords.add(obj);
            }
        }
        system.debug('records to be updated :: ' + listRecords.size());
        if(listRecords.size()>0)
        {
            try{
                update listRecords;
                // increment the stateful counter by the number of records updated
                recordsProcessed += listRecords.size();}
            catch(Exception ex){
                System.debug(ex);//catch the exception
            }
        }
    }
    
    global void finish(Database.BatchableContext BC){
        //View the total Records processed.
        System.debug(recordsProcessed + ' records processed. Shazam!');
        // Query the finished job's status (could be used to notify the job submitter)
        AsyncApexJob job = [SELECT Id, Status, NumberOfErrors, 
                            JobItemsProcessed,
                            TotalJobItems, CreatedBy.Email
                            FROM AsyncApexJob
                            WHERE Id = :bc.getJobId()];
    }
}
David Zhu 🔥
The CPU time limit is usually exceeded when the scope size is too large.
The default size is 200 records per batch, and with a scope of 200 a batch job alone usually won't have such an issue.
But in your case a Process Builder kicks in. Process Builder does not run in batch mode; it runs once for each record changed in the batch, so it could run as many as 200 times per scope.

You can do a test by decreasing the scope size to less than 10 (like 1 or 2). The issue should be gone.

Database.executeBatch(new DailyStaleOppsProcessor(), 2);

Another issue is what Amit indicated: it is not good practice to put DML in the finish method, even though it seems in your case it won't hit the 10,000 record DML limit.
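If the smaller scope size resolves it, the same size can also be passed from the scheduler. A minimal sketch, assuming DailyStaleOppsscheduledBatchable simply wraps executeBatch (the actual class isn't shown in the post, so the body here is illustrative):

global class DailyStaleOppsscheduledBatchable implements Schedulable {
    global void execute(SchedulableContext sc) {
        // A small scope size keeps the per-chunk Process Builder work within the CPU limit
        Database.executeBatch(new DailyStaleOppsProcessor(), 2);
    }
}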