MicahHoover.ax1233

gov limit - spacing records from trigger

I have a trigger that converts one record into a BUNCH of other records depending on the length of time between two date fields. If the dates are far apart (say, 20 months) and 150 records come in at once, I hit three governor limits: script statements, DML statements, and SOQL queries.
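To make the scenario concrete, here is a hypothetical sketch of the kind of trigger being described. The object and field names (Contract__c, Start_Date__c, End_Date__c, Installment__c, Due_Date__c) are invented for illustration only; they are not from the poster's actual code.

trigger GenerateInstallments on Contract__c (after insert) {
    List<Installment__c> toInsert = new List<Installment__c>();
    for (Contract__c c : Trigger.new) {
        // One child record per month between the two date fields
        Integer months = c.Start_Date__c.monthsBetween(c.End_Date__c);
        for (Integer i = 0; i < months; i++) {
            toInsert.add(new Installment__c(
                Contract__c = c.Id,
                Due_Date__c = c.Start_Date__c.addMonths(i)));
        }
    }
    // A single bulk insert, but 150 parents x 20+ months each is still
    // thousands of rows and a lot of script statements in one transaction.
    insert toInsert;
}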

 

Several experts have looked over the code with me, and the margin for optimization is low, so I'm weighing the alternatives below (please comment on how much value you see in these paths, or whether I'm leaving out any good ones):

 

1) Have the DB guy load the records with long date ranges only 20 or 30 at a time. (Q: How long does he need to wait between loads before SF puts them into separate trigger invocations?)

 

2) Get rid of the trigger entirely and use a workflow rule that checks a flag on each incoming record to see whether it has been processed yet, handling records until the governor limits get close. (Q: How nasty a hack is this in SF land?)

 

3) Drop SF for Heroku, EC2, Azure, etc.

Best Answer chosen by Admin (Salesforce Developers) 
Noam.dgani
1. You could post some of your code for us to have a look, because if your experts didn't know the answers to your questions below, I would rethink the definition of expert.

2. Using external tools (like the Apex Data Loader) you can set the batch size and even upload one record at a time. Each batch is treated as a standalone transaction, and the limits you mentioned are confined to a transaction. It's a hack, and an ugly one.

3. If your code is truly optimized, you can always use batch Apex to chunk up the data in code (instead of relying on everybody always remembering to upload in batches of one).

4. As for the workflow hack, Salesforce will roll back the entire transaction once you have an unhandled exception, so that won't work.
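To illustrate point 3, here is a minimal batch Apex sketch. The start/execute/finish structure is the standard Database.Batchable contract; the object and field names (Contract__c, Processed__c, Installment__c) are hypothetical, carried over from the earlier illustration rather than from the original post.

global class InstallmentBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Pick up only records that have not been processed yet
        return Database.getQueryLocator(
            'SELECT Id, Start_Date__c, End_Date__c FROM Contract__c WHERE Processed__c = false');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // Each execute() call runs in its own transaction with fresh limits
        List<Installment__c> toInsert = new List<Installment__c>();
        for (SObject s : scope) {
            Contract__c c = (Contract__c) s;
            Integer months = c.Start_Date__c.monthsBetween(c.End_Date__c);
            for (Integer i = 0; i < months; i++) {
                toInsert.add(new Installment__c(
                    Contract__c = c.Id,
                    Due_Date__c = c.Start_Date__c.addMonths(i)));
            }
            c.Processed__c = true;
        }
        insert toInsert;
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}

// Launch with a small chunk size so each transaction stays well under the limits:
// Database.executeBatch(new InstallmentBatch(), 20);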

All Answers

MicahHoover.ax1233

Thanks for helping me understand the platform a little better, Noam.

 

It looks like there is a way to periodically check how close a block of code is getting to the limits. The downside is that there is a lot to check, and the checks themselves count against the limits (aside from how nasty a hack that approach would be). See here:

 

http://www.bulkified.com/Monitoring+Salesforce.com+Governor+Limits
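Those checks presumably come down to the standard Limits methods. A minimal sketch of how such a guard might look (the 90% threshold here is arbitrary):

Boolean nearLimit =
    Limits.getQueries()       > Limits.getLimitQueries()       * 0.9 ||
    Limits.getDmlStatements() > Limits.getLimitDmlStatements() * 0.9 ||
    Limits.getDmlRows()       > Limits.getLimitDmlRows()       * 0.9;
if (nearLimit) {
    // Stop generating records here and flag the remainder for a later pass
}

And as noted above, each of these calls is itself a statement counted against the limit.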

 

I am disinclined to post the source code on the internet because that would be a victory for the FSF and potentially for my company's competitors.

 

I misunderstood the difference between bulkifying Apex and batch Apex, which I am now reading up on. Unfortunately, the docs have some warnings about calling batch Apex from triggers. *Sigh*.

 

http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_batch.htm
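One hedged sketch of how to respect that warning: launch the batch once per trigger invocation rather than once per record, and guard against re-entry (InstallmentBatch here is the hypothetical class sketched earlier in the thread):

trigger ContractTrigger on Contract__c (after insert) {
    // Trigger.new already holds up to 200 records, so one executeBatch
    // call covers them all; calling it per record would quickly exhaust
    // the limit on queued batch jobs.
    if (!System.isBatch()) {
        Database.executeBatch(new InstallmentBatch(), 20);
    }
}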

 

Thanks, Noam!