Always Thinkin

Very Large Query Failing - Help!

I have a query against a table of 100,000+ records that I have been optimizing. Yesterday, Salesforce added a custom index on the Contacted__c field, and since then the query fails even after I reduced the table to fewer than 100,000 records (I had to permanently delete 20,000 Leads, to Marketing's disgust).


The query is: 

List<Lead> leadsNotContacted = [
    SELECT Id, OwnerId, FirstName
    FROM Lead
    WHERE Contacted__c = FALSE
      AND LastModifiedDate > 2010-04-30T00:00:00.000Z
      AND IsConverted = FALSE
      AND IsDeleted = FALSE
      AND OwnerId IN :idScheduledStaff
];

For optimization, there is a custom indexed field (Contacted__c) that matches about 7,500 records, and an audit field (LastModifiedDate) that matches about 45,000 records. The bind pulls in about 20-25 User IDs. Ultimately, the query should return about 400-500 records.


The code worked fine up to 100,000 records before the index was added. After indexing, once we passed 100,000 records, I could not restore its functionality even by dropping the table down to 80,000 records.


Why would the query continue to throw errors when the table is well below 100,000 records and the indexed field matches fewer than 10,000?



I can't help on the core issue - you might want to log a ticket with support.


Is there a reason you can't use Batch Apex? Then you won't have to worry about table size, since Batch Apex supports up to 50 million records.
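To illustrate, here is a minimal sketch of what the batch version could look like. The class name and constructor are hypothetical; the query filters are taken from the original post (IsDeleted = FALSE is implicit in a QueryLocator, but it is kept for fidelity), and the processing step is left as a stub:

```apex
// Hypothetical sketch of a Batch Apex job for the query above.
global class LeadNotContactedBatch implements Database.Batchable<sObject> {
    global List<Id> idScheduledStaff;

    // The caller supplies the 20-25 scheduled-staff User IDs.
    global LeadNotContactedBatch(List<Id> staffIds) {
        idScheduledStaff = staffIds;
    }

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can iterate up to 50 million records,
        // so the selectivity limits on ordinary SOQL do not apply.
        return Database.getQueryLocator([
            SELECT Id, OwnerId, FirstName
            FROM Lead
            WHERE Contacted__c = FALSE
              AND LastModifiedDate > 2010-04-30T00:00:00.000Z
              AND IsConverted = FALSE
              AND IsDeleted = FALSE
              AND OwnerId IN :idScheduledStaff
        ]);
    }

    global void execute(Database.BatchableContext bc, List<Lead> scope) {
        // Process each chunk of leads here (default scope is 200 records).
    }

    global void finish(Database.BatchableContext bc) {}
}
```

You would kick it off with something like `Database.executeBatch(new LeadNotContactedBatch(idScheduledStaff));`.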