
Calculating Record Counts with More than 50,000 records?

Using count():

Integer totalShipments = [SELECT count() FROM Invoice_Detail__c WHERE Invoice__r.Invoice_Type__c = 'SHP'];

produces (as expected, since this code predates aggregate queries):

System.LimitException: Too many query rows: 50001


Using AggregateResult:

Integer totalShipments = 0;
List<AggregateResult> groupedResults = [SELECT COUNT(Id) theCount FROM Invoice_Detail__c WHERE Invoice__r.Invoice_Type__c = 'SHP'];
for (AggregateResult ar : groupedResults) {
    totalShipments = Integer.valueOf(ar.get('theCount'));
}

produces (not as expected):

System.LimitException: Too many query rows: 50001

LIMIT_USAGE_FOR_NS Number of query rows: 52493 out of 50000 ******* CLOSE TO LIMIT


So apparently, even though Oracle is quite capable of efficiently producing a summary aggregate without returning the detail rows (and I imagine an ambitious programmer could compose such a SQL query, even having to generate it from a SOQL aggregate query against the internal Salesforce multi-tenant schema), we are charged a query row for each record included in the aggregate calculation, not just for the single aggregate row returned.


Is there any workaround, other than perhaps calculating this client-side with JavaScript in a Visualforce page (and being charged one API call per 2,000 records)? I imagine 25+ sequential API calls on a button click would be quite slow.


I'm trying to place a company metric in a dashboard via a Visualforce page with a custom controller, yet I seem unable to make this simple calculation. I can't use Batch Apex to pre-calculate, as eventually there will be varying filter criteria, such as date ranges that could quite reasonably include "for all time".


I think the new @ReadOnly feature that is in pilot might fit this use case.


Do a search in the Apex docs, and maybe request access to the pilot to see if it will do what you want?
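
A rough sketch of how that might look, assuming the pilot behaves as documented (the @ReadOnly annotation relaxes the query-row governor limit for a read-only request, at the cost of disallowing DML in the same transaction). The controller and method names here are made up for illustration, not from your org:

```apex
public with sharing class ShipmentMetricsController {
    public Integer totalShipments { get; private set; }

    // Assumption: @ReadOnly lifts the 50,000 query-row limit for this
    // request. No insert/update/delete is allowed in the same transaction.
    @ReadOnly
    public void calculateTotal() {
        AggregateResult ar = [SELECT COUNT(Id) theCount
                              FROM Invoice_Detail__c
                              WHERE Invoice__r.Invoice_Type__c = 'SHP'];
        totalShipments = Integer.valueOf(ar.get('theCount'));
    }
}
```

Since your page is display-only, the no-DML restriction shouldn't hurt you, but double-check the docs for where the annotation is allowed to appear, since the pilot may limit it to certain entry points.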