Joel CHADET

How to work with more than 50000 rows

Hi, 

In the controller class of a lightning component, I have this following group by query :
 
SELECT CALENDAR_YEAR(Transaction_Date__c) year, CALENDAR_MONTH(Transaction_Date__c) gbfield, SUM(Net_Net_Sales_EUR__c) amount
FROM Sales_Reporting__c
WHERE Transaction_Date__c IN (LAST_N_FISCAL_YEARS:1, THIS_FISCAL_YEAR)
AND Sold_To__r.Country__r.Id = 'a4O0L000000QJsKUAW'
GROUP BY CALENDAR_YEAR(Transaction_Date__c), CALENDAR_MONTH(Transaction_Date__c)
ORDER BY CALENDAR_MONTH(Transaction_Date__c)

In the Developer Console it returns 33 rows.
When I try to run my component I get this error message:

FATAL_ERROR System.LimitException: Too many query rows: 50001
Number of SOQL queries: 1 out of 100
number of query rows: 139180 out of 50000 *******

My query cannot be filtered any further, and my component uses Chart.js to display a chart, so all 139,180 rows are needed.

Any suggestions?
Best Answer chosen by Joel CHADET
challier fabrice

Hi Joel, in order to bypass the SOQL row limit from Apex, you'll need to execute your query through the REST API, then process the result within your class. (Note that for an aggregate query, every row scanned counts toward the 50,000-row governor limit, not just the grouped rows returned; that is why 33 result rows still trip the limit.) Here is an example:

 

public class TestClass {

    public static void tryMe(){
        // Query the org's own REST API; the HTTP callout is not subject to
        // the 50,000-row SOQL governor limit.
        String endPoint = URL.getSalesforceBaseUrl().toExternalForm()+'/services/data/v44.0/query/?q=SELECT+Name+FROM+Account+LIMIT+10';
        HttpRequest req = new HttpRequest();

        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        req.setHeader('Content-Type', 'application/json');
        req.setEndpoint(endPoint);
        req.setMethod('GET');
        Http h = new Http();
        HttpResponse res = h.send(req);
        System.debug(res.getBody());

        Map<String, Object> fieldMap = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        System.debug('fcha>>' + fieldMap);
        List<Object> recordsMap = (List<Object>) fieldMap.get('records');
        for (Object anObj : recordsMap) {
            Map<String, Object> record = (Map<String, Object>) anObj;
            // The JSON response keys use the field's API name ('Name'), not lowercase 'name'
            System.debug('name>' + record.get('Name'));
        }
    }

}

As you are running your code in Lightning, you'll face a new issue: the session ID available in a Lightning context will not let you call a REST API service.

Follow the blog below to solve this new issue:
https://sfdcpanther.wordpress.com/2018/02/05/how-to-call-salesforce-api-from-lightning-component/
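In short, the workaround in that blog reads an API-enabled session ID out of a rendered Visualforce page, because the session ID you get in Lightning cannot call the API. A minimal sketch of that pattern (the page name SessionId and the Start_Of_Session_Id/End_Of_Session_Id markers are illustrative, and your org's own URL must be registered as a Remote Site Setting before the callout is allowed):

```apex
// Assumes a Visualforce page named "SessionId" whose entire markup is:
//   <apex:page>Start_Of_Session_Id{!$Api.Session_ID}End_Of_Session_Id</apex:page>
public class SessionIdHelper {
    public static String getApiSessionId() {
        // Rendering the VF page yields an API-enabled session ID between the markers
        String content = Page.SessionId.getContent().toString();
        Integer startIdx = content.indexOf('Start_Of_Session_Id') + 'Start_Of_Session_Id'.length();
        Integer endIdx = content.indexOf('End_Of_Session_Id');
        return content.substring(startIdx, endIdx);
    }
}
```

You would then send 'Bearer ' + SessionIdHelper.getApiSessionId() in the Authorization header instead of UserInfo.getSessionId().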

I hope it will help you
 

Fabrice

All Answers

Raj Vakati
Can you check whether this query is inside a for loop? Please share the Apex class.
Joel CHADET
Hi Raj, no, it is not in any loop.
Here is the Apex class; the query executed is in my first post.
@AuraEnabled
public static Map<Integer, Map<String, Decimal[]>> getSalesByMonthData(Id recordId, String startDate, String endDate, String groupByField, String AccountCurrency, String whereClause){
    System.debug('@@@ getSalesByMonthData amountFields ' + AccountCurrency);
    String amountFields;
    Map<String, String> currencyMap = getCurrency(recordId, AccountCurrency);
    for (String key : currencyMap.keySet()) {
        amountFields = currencyMap.get(key);
    }
    Map<Integer, Map<String, Decimal[]>> datasets = new Map<Integer, Map<String, Decimal[]>>();
    // Build the query
    String reportQuery = 'SELECT CALENDAR_YEAR(Transaction_Date__c) year, ' + groupByField + ' gbfield, SUM(' + amountFields + ') amount FROM Sales_Reporting__c WHERE ';
    if (whereClause != null && whereClause != '') {
        System.debug('@@@ where clause before: ' + whereClause);
        String whereClauseAfter = whereClause.replaceAll('recordId', recordId);
        System.debug('@@@ where clause after: ' + whereClauseAfter);
        reportQuery += whereClauseAfter;
    }
    // Note: LIMIT caps the rows returned, but for an aggregate query every
    // row scanned still counts toward the 50,000-row governor limit.
    reportQuery += ' GROUP BY CALENDAR_YEAR(Transaction_Date__c),' + groupByField + ' ORDER BY SUM(' + amountFields + ') DESC LIMIT 50000';

    List<AggregateResult> result = new List<AggregateResult>();
    System.debug('@@@ reportQuery ' + reportQuery);
    try {
        result = Database.query(reportQuery);
    } catch (Exception e) {
        System.debug(e); // <- Too many query rows: 50001
    }
...
challier fabrice

This was selected as the best answer
Joel CHADET
Thanks a lot Fabrice, the REST API call idea is genius and solved my issue!
Arnie Ug 18
Hey CHADET, how did you aggregate all the rows with the REST API method?
I am hitting heap size limit exceptions when I try to aggregate the records.
Fabrice Challier 12
Hi Arnie, does your aggregate query run fine within the Developer Console or Workbench first?
Fabrice
Arnie Ug 18
Hi Fabrice,

Sorry for the confusion, but I'm not using an aggregate query; it's a child-to-parent query. Yes, it runs fine in the Developer Console, and it works in Workbench.

I am having a problem when I try to get the multiple batches of 2,000 records into one list variable.
I am getting a heap size limit exception using the multiple requests below.
It is understandable, as my actual query retrieves 10 fields, but I have not yet found a way to resolve or work around it.
        String endPoint = URL.getSalesforceBaseUrl().toExternalForm()+'/services/data/v44.0/query/?q=SELECT+name+from+Account';
        HttpRequest req = new HttpRequest();

        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        req.setHeader('Content-Type', 'application/json');
        req.setEndpoint(endPoint);
        req.setMethod('GET');
        Http h = new Http();
        HttpResponse res = h.send(req);
        System.debug(res.getBody());

        Map<String, Object> fieldMap = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        System.debug('fcha>>' + fieldMap);
        List<Object> recordsMap = (List<Object>) fieldMap.get('records');

        while (fieldMap.get('nextRecordsUrl') != null) {
            // Reassign rather than redeclare: Apex does not allow shadowing a
            // variable already declared in an enclosing scope. (The original
            // snippet redeclared endPoint/fieldMap and had a missing parenthesis.)
            endPoint = URL.getSalesforceBaseUrl().toExternalForm() + (String) fieldMap.get('nextRecordsUrl');
            req.setEndpoint(endPoint);

            h = new Http();
            res = h.send(req);
            System.debug(res.getBody());

            fieldMap = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
            System.debug('fcha>>' + fieldMap);
            recordsMap.addAll((List<Object>) fieldMap.get('records'));
        }

 
Fabrice CHALLIER 16

Sorry, I was a bit busy. Could you figure out where the heap size exception is happening?

I mean, does it happen on line 12 (in which case you could not see the System.debug),
or is it happening after?

If the issue is happening by line 12, you'll have to split your query into several smaller queries so that you don't hit the limit.
If it's happening after line 12,
you'll need to do your own garbage collection. This means removing records from memory once you no longer need them, in order to free up heap space.
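A minimal sketch of that per-batch approach (the object, field, and API version are illustrative): aggregate each 2,000-record batch as soon as it arrives and keep only the running total, so each raw response body becomes unreachable and its heap can be reclaimed before the next request.

```apex
Decimal total = 0;
String nextUrl = '/services/data/v44.0/query/?q=' + EncodingUtil.urlEncode('SELECT Amount FROM Opportunity', 'UTF-8');
while (nextUrl != null) {
    HttpRequest req = new HttpRequest();
    req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm() + nextUrl);
    req.setMethod('GET');
    req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
    HttpResponse res = new Http().send(req);
    Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
    // Keep only the aggregate, never the full record list
    for (Object rec : (List<Object>) body.get('records')) {
        Object amt = ((Map<String, Object>) rec).get('Amount');
        if (amt != null) total += Decimal.valueOf(String.valueOf(amt));
    }
    // nextRecordsUrl is absent on the last batch, which ends the loop
    nextUrl = (String) body.get('nextRecordsUrl');
    // body and res are reassigned or dropped on the next iteration, freeing heap
}
System.debug('total>' + total);
```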

Could you share the code you are actually running? (The snippet as originally posted did not compile.)