Greg Rohman 9

Assistance with 50k record limit in trigger - comparing to existing records

Hello.

I have a business use case where we are looking to generate a unique identifier every time an Account record is created. The identifier needs to follow a specific format, so an autonumber field is not an option in this scenario.

We have a fairly simple bulkified trigger in place to generate this code. Basically, the trigger:
  • Queries all of the existing Account records for their existing unique identifiers (outside of the trigger loop) for comparison
  • Then, for each record in Trigger.new:
    • Generates a new code in the format we need
    • Compares it against all of the existing codes (from the Account query outside the loop)
      • If it's unique:
        • update the Account in Trigger.new
        • Add the code to the full Account query list so it’s included in the next comparison
      • If it's not unique, re-generate the code and repeat the process
This has worked well, but we've exceeded 50k Account records in our org, and the initial query to obtain all of the codes to check against is now failing. Since our unique code does contain the first letter of the Account name, I considered adding a WHERE clause to the Account query… but since that query has to be outside of the Trigger.new loop, I don’t believe that will work either. And, since the 50k limit is per transaction and not per query, splitting the Account query into multiple queries likely will not work either.
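
For reference, here's a simplified sketch of what the trigger does today (the trigger name is just an example, and the code-generation line is only a stand-in for our real formatting rules):

trigger AccountUniqueId on Account (before insert) {
    // Pull every existing code once, outside the record loop. This is the
    // query that now exceeds the 50,000-row limit.
    Set<String> existingCodes = new Set<String>();
    for (Account existing : [SELECT UNIQUE_ID__c FROM Account]) {
        existingCodes.add(existing.UNIQUE_ID__c);
    }

    for (Account acc : Trigger.new) {
        // Stand-in for our real formatting rules (first letter of the name plus a suffix)
        String code = acc.Name.left(1).toUpperCase() + '-' + String.valueOf(Math.abs(Crypto.getRandomInteger()));
        while (existingCodes.contains(code)) {
            // Not unique: regenerate and check again
            code = acc.Name.left(1).toUpperCase() + '-' + String.valueOf(Math.abs(Crypto.getRandomInteger()));
        }
        acc.UNIQUE_ID__c = code;
        existingCodes.add(code); // so later records in the same batch compare against it
    }
}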

Any suggestions for an approach to this issue? Much appreciated. Thank you.

-Greg
 
Vinay (Salesforce Developers)
Hi Greg,

Check the similar best practices discussed below.

https://salesforce.stackexchange.com/questions/111977/design-practises-against-50-000-row-limit

Thanks,
Antoninus Augustus
This is not very scalable; I would suggest rethinking the business requirement and finding a more sustainable approach. That said, something that might work is using a batch class to process the records asynchronously.
You would have to change the trigger context to 'after insert', create an instance of the batch class, and pass the list of Accounts being inserted to it through the constructor (see the sketch after the class below).

The batch class would then generate the unique IDs for the newly created Accounts in the background, with the capability of processing millions of records.

Here’s a sample class. I’m not sure whether it will actually work or fit your need, but it’s something to consider.
 
public with sharing class UniqueIdGeneratorBatch implements Database.Batchable<SObject>, Database.Stateful {

    // Database.Stateful keeps these members populated across execute() chunks.
    List<Account> lstAccounts = new List<Account>();
    Set<Id> setAccountIds = new Set<Id>();
    Set<String> setUniqueIds = new Set<String>();

    public UniqueIdGeneratorBatch(Map<Id, Account> mapAccounts) {
        this.lstAccounts = mapAccounts.values();
        this.setAccountIds = mapAccounts.keySet();
    }

    public Database.QueryLocator start(Database.BatchableContext context) {
        // A QueryLocator can iterate up to 50 million rows, so the 50k
        // per-transaction query row limit no longer applies here.
        return Database.getQueryLocator([
            SELECT UNIQUE_ID__c FROM Account WHERE Id NOT IN :setAccountIds
        ]);
    }

    public void execute(Database.BatchableContext context, List<Account> lstAccountsToCompare) {
        // Collect the existing codes, chunk by chunk.
        for (Account objAccount : lstAccountsToCompare) {
            setUniqueIds.add(objAccount.UNIQUE_ID__c);
        }
    }

    public void finish(Database.BatchableContext context) {
        for (Account objAccount : lstAccounts) {
            while (true) {
                String uniqueId = ''; // Generate a candidate code in the required format here
                if (!setUniqueIds.contains(uniqueId)) {
                    objAccount.UNIQUE_ID__c = uniqueId;
                    setUniqueIds.add(uniqueId);
                    break;
                }
                // Otherwise loop and generate a new candidate.
            }
        }
        update this.lstAccounts;
    }
}
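
And here's a rough, untested sketch of the after insert trigger that would kick the batch off (the trigger name is just an example):

trigger GenerateUniqueIdTrigger on Account (after insert) {
    // Hand the newly inserted Accounts to the batch through its constructor;
    // the batch gathers the org's existing codes asynchronously and then
    // updates these records with their generated unique IDs.
    Database.executeBatch(new UniqueIdGeneratorBatch((Map<Id, Account>) Trigger.newMap));
}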