Vanessa Bell
Apex CPU Time Limit exceeded error received for Batch Apex
Hello Everyone!
I have built an Apex class and a Scheduler class in our FSB2 Full Sandbox environment. The classes are intended to update all open Opportunities that have Days_Stale__c > 30 and (Today_s_Date__c < TODAY() OR Today_s_Date__c = null), setting the Today_s_Date__c Opportunity field to TODAY().
When the Today_s_Date__c field is updated to TODAY() on the Opportunity record, it is intended to invoke a Process Builder that updates a "Reminder Date" field with a date and sends an email to the Opportunity Owner advising that the Opportunity is "stale" and needs to be updated.
Process Builder: Opportunity - Email Alerts for Stale Dated Opps
Batch Apex Class: DailyStaleOppsProcessor
Batch Apex Test Class: DailyStaleOppsProcessorTest
Scheduled Apex Class: DailyStaleOppsscheduledBatchable
Scheduled Apex Test Class: DailyStaleOppsscheduledBatchableTest
When we execute the DailyStaleOppsProcessor Apex class from the Execute Anonymous window, we eventually receive an error saying Apex CPU Time Limit exceeded.
I have 100% code coverage on both classes. I am unsure how to rewrite my class to stop receiving this error. Here's my code:
global class DailyStaleOppsProcessor implements Database.Batchable<SObject>, Database.Stateful {

    List<Opportunity> listRecords = new List<Opportunity>();

    global Database.QueryLocator start(Database.BatchableContext BC) {
        String query = 'SELECT Id, Today_s_Date__c FROM Opportunity ' +
                       'WHERE IsClosed = false AND Days_Stale__c > 30 ' +
                       'AND (Today_s_Date__c < TODAY OR Today_s_Date__c = null)';
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<SObject> scope) {
        for (Opportunity obj : (List<Opportunity>) scope) {
            if (obj.Today_s_Date__c != Date.today()) {
                obj.Today_s_Date__c = System.today();
                listRecords.add(obj);
            }
        }
    }

    global void finish(Database.BatchableContext BC) {
        System.debug('records to be updated :: ' + listRecords.size());
        if (!listRecords.isEmpty()) {
            update listRecords;
        }
    }
}
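For context, the scheduler class referenced above is not shown in the post. A minimal Schedulable wrapper for this batch might look like the following (a sketch only; the asker's actual DailyStaleOppsscheduledBatchable may differ):

```apex
// Sketch of a Schedulable wrapper for the batch class above; the body of
// the asker's actual DailyStaleOppsscheduledBatchable is not shown.
global class DailyStaleOppsscheduledBatchable implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Kick off the batch with the default scope size of 200
        Database.executeBatch(new DailyStaleOppsProcessor());
    }
}
```

Such a wrapper is typically scheduled from the Execute Anonymous window with System.schedule and a cron expression.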
Salesforce has a timeout limit for transactions based on CPU usage. If transactions consume too much CPU time, they will be shut down as a long-running transaction.
Please refer to the article below for more details:
https://help.salesforce.com/articleView?id=000339361&language=en_US&type=1&mode=1
Thanks,
Suraj
You should not perform any DML in the finish method. You can see the total number of processed records by tracking a variable with Database.Stateful.
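A sketch of that approach, moving the DML into execute and keeping only a stateful counter (the counter name here is illustrative, not from the original post):

```apex
global class DailyStaleOppsProcessor implements Database.Batchable<SObject>, Database.Stateful {

    // Stateful member survives across execute() chunks
    global Integer recordsProcessed = 0;

    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator(
            'SELECT Id, Today_s_Date__c FROM Opportunity ' +
            'WHERE IsClosed = false AND Days_Stale__c > 30 ' +
            'AND (Today_s_Date__c < TODAY OR Today_s_Date__c = null)');
    }

    global void execute(Database.BatchableContext BC, List<SObject> scope) {
        List<Opportunity> toUpdate = new List<Opportunity>();
        for (Opportunity obj : (List<Opportunity>) scope) {
            if (obj.Today_s_Date__c != Date.today()) {
                obj.Today_s_Date__c = System.today();
                toUpdate.add(obj);
            }
        }
        update toUpdate;                       // DML per chunk, not in finish
        recordsProcessed += toUpdate.size();   // tracked via Database.Stateful
    }

    global void finish(Database.BatchableContext BC) {
        System.debug('total records updated :: ' + recordsProcessed);
    }
}
```

With the DML in execute, each chunk's updates (and the Process Builder they trigger) run in their own transaction instead of accumulating into a single finish transaction.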
The default scope size for a batch is 200. With a size of 200, a batch job alone won't hit such an issue.
But in your case, a Process Builder kicks in. Process Builder does not run in batch mode; it runs once for each record changed in the batch, so it could run as many times as the scope size of 200.
You can do a test by decreasing the scope size to less than 10 (like 1 or 2). The issue should be gone:
Database.executeBatch(new DailyStaleOppsProcessor (),2);
Another issue is what Amit indicated: it is not good practice to put DML in the finish method, even though in your case it seems it won't hit the 10,000-row DML limit.
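To run that reduced-scope test from the Execute Anonymous window and confirm the job completes, something like the following can be used (a sketch; querying AsyncApexJob is just one way to inspect the result):

```apex
// Run the batch with a reduced scope of 2 records per execute() chunk
Id jobId = Database.executeBatch(new DailyStaleOppsProcessor(), 2);

// Later, inspect the async job to confirm it finished without errors
AsyncApexJob job = [SELECT Status, NumberOfErrors,
                           JobItemsProcessed, TotalJobItems
                    FROM AsyncApexJob
                    WHERE Id = :jobId];
System.debug(job);
```

If the job completes with this scope size but fails at 200, that points to the per-record Process Builder work as the main CPU consumer.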