Wizrad

Trigger with 100+ inserts

Hi all,

 

I am writing an after insert trigger on an object called EBD__c.

 

The idea of the trigger is as follows: there is an EBD for each month/location combination, and EBDs are loaded via Data Loader one month at a time. The users may not load an EBD for every location each month; they load as many as needed, and for every location that does not get an EBD that month, a new one is created. So let's say the user is going to load 20 EBDs and there are 125 locations: we would be updating 20 EBDs and inserting 105 new ones.

Unfortunately, you can't insert more than 100 records via trigger. So I started chunking my list of more than 100 records into blocks of 100 and calling future methods to upsert each block, and this is where things get ugly. Future methods only take primitives. Creating a static variable holding the list I want the future method to upsert doesn't work either; the data seems to go poof when it goes out of scope. My only idea is to build a list of strings containing all the data I need to pass to the future method, but that seems arduous and stupid.
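For what it's worth, the usual workaround for the primitives-only restriction on `@future` parameters is to serialize the records to a JSON string and rehydrate them inside the future method. A sketch (untested, and subject to async limits on parameter size):

```apex
public class EbdFutureUpsert {
    // Entry point called from the trigger: serialize the records to a
    // JSON string, which is a primitive and therefore a legal @future
    // parameter.
    public static void upsertLater(List<EBD__c> ebds) {
        doUpsert(JSON.serialize(ebds));
    }

    @future
    private static void doUpsert(String ebdJson) {
        // Rehydrate the records in the async context and upsert them.
        List<EBD__c> ebds =
            (List<EBD__c>) JSON.deserialize(ebdJson, List<EBD__c>.class);
        upsert ebds;
    }
}
```

That said, as the answer below this question points out, chunking may not be necessary at all here.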

 

Any ideas?

Thanks

Best Answer chosen by Admin (Salesforce Developers) 
jhenning

Are you actually hitting a governor limit? Because if you are inserting 20 rows, then your trigger batch size is 20 and thus the "total number of records processed as a result of DML statements" governor limit should be 2,000 not 100. The limit is a factor of the trigger batch size.
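In other words, since the per-transaction DML row limit scales with the trigger batch size, the gap-filling inserts can happen directly in the trigger with a single DML statement. A rough sketch of that bulkified approach, where the `Location__c` object and the `Location__c`/`Month__c` fields on `EBD__c` are assumptions for illustration:

```apex
trigger EbdAfterInsert on EBD__c (after insert) {
    // Collect the months being loaded in this batch.
    Set<Date> months = new Set<Date>();
    for (EBD__c ebd : Trigger.new) {
        months.add(ebd.Month__c);
    }

    // Find which location/month combos already have an EBD.
    Set<String> covered = new Set<String>();
    for (EBD__c existing : [SELECT Location__c, Month__c FROM EBD__c
                            WHERE Month__c IN :months]) {
        covered.add(existing.Location__c + '|' + existing.Month__c);
    }

    // Create one EBD per uncovered location/month combo, then insert
    // them all with a single DML statement.
    // Note: inserting EBD__c here re-fires this trigger; a static
    // re-entry guard (omitted for brevity) is needed in practice.
    List<EBD__c> gaps = new List<EBD__c>();
    for (Location__c loc : [SELECT Id FROM Location__c]) {
        for (Date m : months) {
            if (!covered.contains(loc.Id + '|' + m)) {
                gaps.add(new EBD__c(Location__c = loc.Id, Month__c = m));
            }
        }
    }
    insert gaps;
}
```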

Wizrad
Ha, this is sort of embarrassing. Initially we were trying to load an entire year of data at once; that's when we were hitting limits. When we decided to do it a month at a time, I didn't even think about the fact that we weren't going to hit a limit anymore. Many thanks, jhenning.