System.QueryException: Non-selective query against large object type
This error just started on a trigger that has been running perfectly for many months.
PaymentDuplicatePreventer: execution of BeforeInsert caused by: System.QueryException: Non-selective query against large object type (more than 100000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times) Trigger.PaymentDuplicatePreventer: line 23, column 2
All this trigger is supposed to do is stop duplicates based on the unique code, which is just a formula field that combines two other fields on the record. Here is the trigger. It is pretty much copied exactly from the force.com cookbook about how to prevent duplicates. Any help is appreciated, thanks.
trigger PaymentDuplicatePreventer on Payments__c (before insert) {
    // Create a map to hold all the payments we have to query against
    Map<String, Payments__c> payMap = new Map<String, Payments__c>();

    // Loop over all passed-in payments
    for (Payments__c payment : System.Trigger.new) {
        // As long as this payment has a payment code and either it's an insert
        // or the code doesn't conflict with another payment in this batch
        if ((payment.UniquePaymentCode__c != null) &&
            (System.Trigger.isInsert ||
             (payment.UniquePaymentCode__c != System.Trigger.oldMap.get(payment.Id).UniquePaymentCode__c))) {

            // Make sure another new payment isn't also a duplicate.
            // If it is, flag it; if not, add it to the map.
            if (payMap.containsKey(payment.UniquePaymentCode__c)) {
                payment.UniquePaymentCode__c.addError('Another new payment has the same unique identifier.');
            } else {
                payMap.put(payment.UniquePaymentCode__c, payment);
            }
        }
    }

    /* Using a single database query, find all the payments in the database
       that have the same UniquePaymentCode__c as any of the payments being
       inserted or updated. */
    if (payMap.size() > 0) {
        for (Payments__c payment : [SELECT Id, UniquePaymentCode__c
                                    FROM Payments__c
                                    WHERE UniquePaymentCode__c IN :payMap.keySet()]) {
            Payments__c newPay = payMap.get(payment.UniquePaymentCode__c);
            if (newPay != null) {
                newPay.UniquePaymentCode__c.addError('A payment for this person in this study already exists.');
            }
        }
    }
}
Ended up adding a date range to my duplicate finder as duplicates can only happen within a short time span for this particular object. For the record, salesforce says they cannot do indexing on formula fields.
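That fix amounts to adding a second, indexed filter so the duplicate-check query becomes selective: CreatedDate carries a standard index, so narrowing on it lets the optimizer avoid a full scan even though the formula field itself cannot be indexed. A minimal sketch of the modified query from the trigger above (the 30-day window is an assumed value; pick whatever span duplicates can realistically occur in for your object):

```apex
// Sketch only: narrow the duplicate-check query with an indexed date filter.
// CreatedDate has a standard index, which makes the query selective even
// though UniquePaymentCode__c (a formula field) cannot be indexed.
// The 30-day window is an assumption, not part of the original trigger.
Datetime cutoff = Datetime.now().addDays(-30);
List<Payments__c> existing = [
    SELECT Id, UniquePaymentCode__c
    FROM Payments__c
    WHERE UniquePaymentCode__c IN :payMap.keySet()
      AND CreatedDate >= :cutoff
];
```

This keeps the rest of the trigger logic unchanged; only the SOQL in the final loop needs to swap in the date-bounded query.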
All Answers
Check out this thread for more explanation and some ways you can resolve the issue:
http://community.salesforce.com/t5/Apex-Code-Development/SystemQueryException-Non-Selective-Query-against-large-object/m-p/193105
There are also other previous threads on the topic, just search for the error.
Thank you. I read through that, and I understand my options: add a more selective WHERE clause (but I don't know what clause would make sense here beyond the one I have), index the field by making it an external ID (which I can't, because it's a formula field), somehow break the operation into smaller batches (not sure how to do this), or talk to Salesforce about custom indexing.
Guess I'm not the only one who had this just suddenly crop up. Any suggestions on tweaks I could make to the code to resolve this? I have logged a case with salesforce, but I'm not sure they'll respond in time.
It is happening now because your custom object "Payments__c" probably just exceeded 100,000 records.
It looks like you are supplying the WHERE condition on the field UniquePaymentCode__c. If that is the formula field you're talking about, Salesforce may be able to create a custom index on it for you. Yeah, response times from Salesforce are sketchy at best; I'd advise you to give them a call if this is critical.
Yeah, kinda figured I probably just breached the 100,000 mark, that would make sense. I'll call them up and see what we can do. Thanks!