Patrick Marks 2

Need help - Apex CPU time limit exceeded

Hi everyone,

I scheduled an Apex job to run daily at midnight, but noticed that the job failed with the "Apex CPU time limit exceeded" error. The code works in my developer sandbox but not in production. I wonder if something in my SOQL query is the cause, but I'm unclear on where the culprit would be; the query only returns about 500 results. Any help is appreciated. Here is my class:
 
global without sharing class UnassignAccountsDueToInactivity implements Schedulable {
    global void execute(SchedulableContext ctx) {
        List<Account> acctList = [SELECT Id, Name, OwnerId 
                                  FROM Account 
                                  WHERE (OwnerProfileName__c LIKE '%AE%' OR OwnerProfileName__c LIKE '%SDR%') AND 
                                  (Current_Customer__c != 'Current Customer A' AND Current_Customer__c != 'Current Customer B' AND Current_Customer__c != 'Current Customer C') AND
                                  Number_of_Open_Opportunities__c <= 0 AND
                                  DaysSinceLastActivity__c >= 30 AND
                                  DaysSinceLastOwnerChangeDate__c >= 7];
        
        if(!acctList.isEmpty()){
            for(Account acc : acctList){
                acc.DisqualifiedReason__c = 'No response';
                acc.OwnerId = '000000000000000';
            }
            update acctList;
        }
    }
}
Alain Cabon
Hi,

What are the results of the query plans when you launch the following queries with the developer console? 

Query Plan Tool FAQ
https://help.salesforce.com/articleView?id=000199003&type=1

[Screenshots of the Query Plan tool omitted]

====================================

SELECT Id, Name, OwnerId
FROM Account
WHERE (OwnerProfileName__c LIKE '%AE%' OR OwnerProfileName__c LIKE '%SDR%')
AND (Current_Customer__c != 'Current Customer A'
AND Current_Customer__c != 'Current Customer B'
AND Current_Customer__c != 'Current Customer C')
AND Number_of_Open_Opportunities__c <= 0
AND DaysSinceLastActivity__c >= 30
AND DaysSinceLastOwnerChangeDate__c >= 7

=====================================

SELECT Id, Name, OwnerId
FROM Account
WHERE (OwnerProfileName__c LIKE '%AE%' OR OwnerProfileName__c LIKE '%SDR%')
AND Current_Customer__c NOT IN ( 'Current Customer A','Current Customer B','Current Customer C')
AND Number_of_Open_Opportunities__c <= 0
AND DaysSinceLastActivity__c >= 30
AND DaysSinceLastOwnerChangeDate__c >= 7

=====================================

SELECT Id, Name, OwnerId
FROM Account
WHERE (OwnerProfileName__c LIKE '%AE%')
AND Current_Customer__c NOT IN ( 'Current Customer A','Current Customer B','Current Customer C')
AND Number_of_Open_Opportunities__c <= 0
AND DaysSinceLastActivity__c >= 30
AND DaysSinceLastOwnerChangeDate__c >= 7

===================

SELECT Id, Name, OwnerId
FROM Account
WHERE (OwnerProfileName__c LIKE '%SDR%')
AND Current_Customer__c NOT IN ( 'Current Customer A','Current Customer B','Current Customer C')
AND Number_of_Open_Opportunities__c <= 0
AND DaysSinceLastActivity__c >= 30
AND DaysSinceLastOwnerChangeDate__c >= 7

=====================
Patrick Marks 2
Hi Alain, here's the results:

Notes:
Not considering filter for optimization because unindexed. Table: Account Fields: ["IsDeleted"]
Not considering filter for optimization because unindexed. Table: Profile Fields: ["Name"]
Not considering filter for optimization because the operator is not optimizable and the filter is not owner-optimizable. Table: Account Fields: ["Current_Customer__c"]
Not considering filter for optimization because unindexed. Table: Account Fields: ["Number_of_Open_Opportunities__c"]
Not considering filter for optimization because unindexed. Table: Account Fields: ["DaysSinceLastActivity__c"]
Not considering filter for optimization because unindexed. Table: Account Fields: ["DaysSinceLastOwnerChangeDate__c"]

It won't let me paste an image of the table with Cardinality, Fields, Leading Operation Type, etc.
Patrick Marks 2
Here's the results from the table:


Cardinality   Fields   Leading Operation Type   Cost       sObject Cardinality   sObject Type
2                      TableScan                0.82       54                    Profile
7885                   TableScan                1.006637   98557                 Account
405                    TableScan                2.79878    405                   User
Jayant
The time taken by queries to run does not count against the Apex CPU Time Limit.

How many records does your query return in Sandbox and in Production? Probably the number of records in Production is much larger.
You are also not using a LIMIT on your query. Put a limit there: a transaction allows up to 50,000 rows to be queried, but the number of rows allowed for DML is only 10,000. The day your query starts returning more than 10,000 rows, it will fail on the DML rows limit as well.

If the number of records to process is larger, schedule more frequently. Let's say you need to process 5,000 records on a daily basis. You can schedule once to process all 5,000 in one go, or you can schedule the job to run twice a day and process 2,500 each time.

The easiest solution is determining a good LIMIT and applying it to the query.
To determine this limit you can try different numbers and observe the results. If 1,000 is the maximum your class can process successfully within 10 seconds (the CPU limit), you might set the limit to 500 for safety.
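A minimal sketch of the LIMIT approach applied to the original class (the batch size of 500 is an assumption to tune against your org, and the profile/customer filters are trimmed here for brevity):

```apex
global without sharing class UnassignAccountsDueToInactivity implements Schedulable {
    global void execute(SchedulableContext ctx) {
        // Cap the batch size so a single run stays inside the CPU and DML limits;
        // schedule the job more often to work through the remaining records.
        List<Account> acctList = [SELECT Id, OwnerId, DisqualifiedReason__c
                                  FROM Account
                                  WHERE Number_of_Open_Opportunities__c <= 0
                                    AND DaysSinceLastActivity__c >= 30
                                    AND DaysSinceLastOwnerChangeDate__c >= 7
                                  LIMIT 500]; // tune: largest size that runs reliably
        for (Account acc : acctList) {
            acc.DisqualifiedReason__c = 'No response';
        }
        update acctList;
    }
}
```

Each scheduled run then processes one slice of the backlog instead of everything at once.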
Alain Cabon
Hi,

Honestly, given that your query works fine in the sandbox, the plan may already be fine.

A TableScan is not good overall, but fixing it would mean indexing some fields, and the only self-service way to do that is to mark a field as "external ID" or "unique" (a very restrictive option) to change the query plan.

Cost is the cost of the query compared to the Force.com Query Optimizer's selectivity threshold. Values above 1 mean that the query won't be selective.

 1.006637 => very close to one (good)
 2.79878  => the worst result, but it is a small object (User)
 Profile  => < 1, and it is a tiny object anyway

You can consider caching some data, such as the User records, but it is a heavy solution (it requires recoding).

Safely Cache Values with the CacheBuilder Interface
Let’s look at an example. Suppose you’re coding an Apex controller class for a Visualforce page. In the Apex class, you often run a SOQL query that looks up a User record based on a user ID. SOQL queries can be expensive, and Salesforce user records don’t typically change much, so the User information is a good candidate for CacheBuilder.
In your controller class, create an inner class that implements the CacheBuilder interface and overrides the doLoad(String var) method. Then add the SOQL code to the doLoad(String var) method with the user ID as its parameter.

https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_platform_cache_builder.htm
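A minimal sketch of the CacheBuilder pattern from that doc (the class name is illustrative, and it assumes an org cache partition has been set up in your org):

```apex
// Cache.CacheBuilder implementation: doLoad runs only on a cache miss,
// so the SOQL query executes once per key instead of on every request.
class UserInfoCache implements Cache.CacheBuilder {
    public Object doLoad(String userId) {
        // The expensive lookup whose result gets stored in the org cache.
        return [SELECT Id, IsActive, Username FROM User WHERE Id = :userId];
    }
}
```

Usage: `User u = (User) Cache.Org.get(UserInfoCache.class, someUserId);` — subsequent calls with the same ID read from the cache instead of re-running the query.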

You can try alternatives (with or without "OR": one very slow request versus two faster requests), but I am not sure it will help (in Oracle, "OR" is often a problem).

Sometimes the problem is linked to deleted data still sitting in the Recycle Bin (we have had this problem).

Integer count1 = [SELECT COUNT() FROM Account ALL ROWS];
System.debug('count: ' + count1);


SELECT Id, Name, OwnerId
FROM Account
WHERE (OwnerProfileName__c LIKE '%AE%' OR OwnerProfileName__c LIKE '%SDR%')
AND Current_Customer__c NOT IN ( 'Current Customer A','Current Customer B','Current Customer C')
AND Number_of_Open_Opportunities__c <= 0
AND DaysSinceLastActivity__c >= 30
AND DaysSinceLastOwnerChangeDate__c >= 7

... faster?
Jayant
Just reiterating -
The time taken by queries to run does not count against the Apex CPU Time Limit.
$hwet@
Reason for CPU Time Limit:
https://help.salesforce.com/articleView?id=000232681&language=en_US&type=1
https://salesforce.stackexchange.com/questions/155336/contact-mass-update-system-limitexception-apex-cpu-time-limit-exceeded
Alain Cabon
@Patrick Marks

If you have found a solution or a workaround, your feedback would be interesting, because the options are very few: we cannot index the fields easily, and there is always the dilemma of a non-selective query (cursor opened quickly but very slow to read) versus a very selective query on unindexed fields, which times out because the cursor is never initiated.

The other idea is to slice the data to read by periods of time (date fields such as CreatedDate are indexed by default).

Q: Will NOT IN work the same as IN when it comes to optimization?
A: The optimizer interprets "field NOT IN (value, value, value)" as "field != value AND field != value AND field != value". Notice at present only != null and != on booleans are optimizable.
 
Q: What is the best way to write a "not in" SOQL query, so that the query optimizer will use an index?
A: These are some ideas:
1) Switch to an IN operator if the number of values is doable and the union of them would be selective
2) Add another selective filter to your query.

https://help.salesforce.com/articleView?id=000181277&type=1&language=en_US
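For example, if the remaining Current_Customer__c picklist values are known, the NOT IN filter can be inverted per idea 1 (the values below are hypothetical placeholders, not from the original query):

```apex
// Hypothetical rewrite: enumerate the picklist values you DO want,
// instead of excluding the three customer values with NOT IN.
List<Account> accts = [SELECT Id, Name, OwnerId
                       FROM Account
                       WHERE Current_Customer__c IN ('Prospect', 'Former Customer')];
```

Whether this helps depends on the union of those values being selective, as the FAQ notes.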
 
Jayant
Please don't waste your time optimizing the query; it has nothing to do with the exception you are getting, and it is already optimal. Time taken by SOQL query execution is not counted against the CPU Time Limit.
Following are your best options -
1. Move your code to a method in another class that does not implement the Schedulable interface. Mark that method as @future and call it from your scheduled class. This raises the allowed CPU Time Limit to 60 seconds (for future methods) instead of 10 seconds (for scheduled or synchronous methods). 60 seconds should be more than enough to process all the records.
2. Reduce the number of records your query returns by applying a LIMIT clause, and execute multiple times to process all your records instead of processing everything in one transaction.

Approach 1 is really simple and effective and won't take more than 5 minutes :).
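A sketch of approach 1 (the service class and method names are made up; each class would live in its own file, and the query filters are trimmed here for brevity):

```apex
// File 1: the heavy work, moved into an @future method so it runs
// asynchronously under the higher 60-second CPU limit.
public without sharing class UnassignAccountsService {
    @future
    public static void unassignInactiveAccounts() {
        List<Account> acctList = [SELECT Id, OwnerId, DisqualifiedReason__c
                                  FROM Account
                                  WHERE Number_of_Open_Opportunities__c <= 0
                                    AND DaysSinceLastActivity__c >= 30
                                    AND DaysSinceLastOwnerChangeDate__c >= 7];
        for (Account acc : acctList) {
            acc.DisqualifiedReason__c = 'No response';
        }
        update acctList;
    }
}

// File 2: the Schedulable becomes a thin wrapper that just hands off.
global without sharing class UnassignAccountsDueToInactivity implements Schedulable {
    global void execute(SchedulableContext ctx) {
        UnassignAccountsService.unassignInactiveAccounts();
    }
}
```

The scheduled transaction itself then does almost no work, and the record processing runs under the asynchronous limits.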
Patrick Marks 2
Hey everyone - I manually updated the 500 Accounts, which allowed the code to work the following day on the next batch. Appreciate everyone's help!