Starz26

What is the best practice here?

If I return 200 records and put them in a list, and then update 50 of them, what is the best way to perform the update?

 

1. Update the whole list (which fires the update-context triggers for all 200 records):

    

for (Account[] al : [SELECT Id, Name, BillingPostalCode FROM Account LIMIT 200]) {
    for (Account a : al) {
        if (a.Name.contains('50')) { // matches only 50 of the 200 records
            a.BillingPostalCode = '54321';
        }
    }
    update al; // DML on all 200 records, touched or not
}

 

2. Or keep track of the updated records and only update those 50 (at a cost to heap size):

 

Account[] tbuAcc = new Account[]{};

for (Account[] al : [SELECT Id, Name, BillingPostalCode FROM Account LIMIT 200]) {
    for (Account a : al) {
        if (a.Name.contains('50')) { // matches only 50 of the 200 records
            a.BillingPostalCode = '54321';
            tbuAcc.add(a); // track only the touched records
        }
    }
}

if (tbuAcc.size() > 0) {
    update tbuAcc; // DML on just the 50 touched records
}

 

 

Best Answer chosen by Admin (Salesforce Developers) 
mulveling

Your tbuAcc array should be using very little heap space -- I can't imagine a scenario in which saving that little bit would be worth the bother. Remember, it's an array of references, each of which is fairly small. The objects themselves have already been allocated (once each) in memory by the time you put them in tbuAcc. Even the memory allocated for the objects you've shown here wouldn't be big, relative to your 3MB of heap space.
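If you want to see the numbers for yourself, here is a quick anonymous-Apex sketch (not from the original post; it assumes the org has some Account records) using the standard Limits methods:

// Measure how much heap the tracking list itself adds.
List<Account> accs = [SELECT Id, Name, BillingPostalCode FROM Account LIMIT 200];
Integer heapBefore = Limits.getHeapSize();

Account[] tbuAcc = new Account[]{};
for (Account a : accs) {
    tbuAcc.add(a); // adds a reference only; the sObject was already allocated
}

System.debug('Heap added by tbuAcc: ' + (Limits.getHeapSize() - heapBefore) +
    ' bytes, out of a limit of ' + Limits.getLimitHeapSize());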

 

I certainly try to keep DML update calls limited to only records that have been "touched" -- yes, that usually means an extra list (or lists). You must consider the validations/workflows that may be lurking on the config side; you don't want to risk firing these until it's absolutely necessary. These things can have "side effects" like sending emails, and you don't want to risk sending extras (this is mostly a problem with poorly written/conceived rules, which is...most of them =P). Also consider what would happen if one of those untouched records is in an invalid state (e.g. due to a newly configured validation rule). DML updating it along with the batch of 200 could fire that rule and cause the whole batch to fail, even if the 50 records that actually count were OK.

 

While we're at it, I would also test whether a DML update on a record that hasn't been touched still updates the LastModifiedDate -- if so, that would be yet another compelling reason to filter your DML updates (not doing so would dilute the value of that mod-stamp).
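One way to run that test (a hypothetical anonymous-Apex check, assuming at least one Account exists; pick a record that hasn't been modified recently, since the timestamp has one-second granularity):

Account a = [SELECT Id, LastModifiedDate FROM Account LIMIT 1];
Datetime stampBefore = a.LastModifiedDate;

update a; // no fields were changed

a = [SELECT Id, LastModifiedDate FROM Account WHERE Id = :a.Id];
System.debug('Timestamp bumped by a no-op update: ' + (a.LastModifiedDate > stampBefore));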

 

Your triggers should always be written to be bulk-safe anyway (for batches of at least 200), so that shouldn't factor into this decision.

 

All Answers

hwelch15

 

 

First, filter as much as possible at the query level.

 

[SELECT Id, Name FROM Account WHERE Name LIKE '%50%']

 

Then, for the actual update, it depends on which event and object fired your trigger. Is this a before or after trigger, on the Account or some other object? If it is a before trigger on the Account object, then you don't have to do an update call: just do the field assignment and the update is automatic. If you are in an after trigger, then go with #2, unless you are sure that you will be updating every single record returned by the query.
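For example, a minimal sketch of the before-trigger case (hypothetical trigger name; field assignments on Trigger.new are persisted automatically, with no explicit DML):

trigger AccountBeforeUpdate on Account (before update) {
    for (Account a : Trigger.new) {
        if (a.Name != null && a.Name.contains('50')) {
            a.BillingPostalCode = '54321'; // saved with the record; no update call needed
        }
    }
}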

Rahul Sharma

Hello Starz26,

 

In the second one you are using another list to collect the updated records, which will consume heap space, as you said.

The first option is better, since we should always avoid the use of extra variables.

Starz26

hwelch15 wrote:

 

 

First, filter as much as possible at the query level.

 

[SELECT Id, Name FROM Account WHERE Name LIKE '%50%']

 

Then, for the actual update, it depends on which event and object fired your trigger. Is this a before or after trigger, on the Account or some other object? If it is a before trigger on the Account object, then you don't have to do an update call: just do the field assignment and the update is automatic. If you are in an after trigger, then go with #2, unless you are sure that you will be updating every single record returned by the query.


The 50 was for example purposes only....showing that only a subset of the queried records actually gets updated after processing.

Starz26

Rahul Sharma wrote:

Hello Starz26,

 

In the second one you are using another list to collect the updated records, which will consume heap space, as you said.

The first option is better, since we should always avoid the use of extra variables.


I was thinking so, but this would also require that all triggers be written to be bulk-safe (a great use case for that argument)....

 

What I am getting is that it is better to use more server resources (firing the trigger events for all 200 records) than to increase the heap size.......

mulveling

(Best answer; quoted in full at the top of the thread.)

 

This was selected as the best answer
Starz26

Thank you for the detailed answer..... ****To answer your question - YES, the LastModifiedDate does get updated by the update of the entire recordset, even if nothing changed. Hence why I think the TBU list may be necessary.****

 

Let's expand this a bit....

 

Let's go to the max for a transaction, 10,000 records.....

 

Let's say 5,000 of them will be updated....

 

1. Create a list = [SELECT Id FROM Account LIMIT 10000];

Loop through the list, change 5,000 of them, putting each updated one in a list TBU, then at the end update the list TBU.

 

or

 

2. for (Account[] al : [SELECT Id FROM Account LIMIT 10000]) {

       for (Account a : al) {
           // do stuff to 5,000 records total, knowing that the SOQL
           // for-loop processes the records in batches of 200
       }

       update al; // one update per 200-record batch, touched or not
   }

 

So same basic question...

 

Good discussion so far......

mulveling

Definitely #1. I think it will be extremely rare to find that the addition of the TBU list causes the 3MB heap limit to be exceeded. As always, you can test this before deployment -- fortunately, the heap memory used by a reference should not vary with context.
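For reference, a sketch of what #1 could look like at that scale (the '50' criterion is carried over from the earlier example only, not prescribed by the answer):

// Stream records in 200-record chunks via the SOQL for-loop (keeps heap low),
// collect only the touched rows, and issue a single DML at the end.
Account[] tbuAcc = new Account[]{};
for (Account[] chunk : [SELECT Id, Name, BillingPostalCode FROM Account LIMIT 10000]) {
    for (Account a : chunk) {
        if (a.Name != null && a.Name.contains('50')) { // example criterion only
            a.BillingPostalCode = '54321';
            tbuAcc.add(a);
        }
    }
}
if (tbuAcc.size() > 0) {
    update tbuAcc; // a single DML statement; 10,000 rows is the per-transaction limit
}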