RiverChaser

Suspend Triggers to Mass Delete?

I need to do a mass delete of records for a custom object. I'm running into a problem with "Too many SOQL queries" though. I have a trigger on the custom object and that is causing me to exceed the limit.

Is there a way to suspend triggers for an object for an operation like this? There is an Is Active property on the trigger, but I see no way to change the setting.

Interestingly, there is no Edit/Delete link on the trigger when I look at the custom object, and when I look at the code there is no option to edit it. Is this because I created it using the Apex Toolkit for Eclipse? Is there a way to make it inactive using the toolkit?

My only idea at this point would be to delete the trigger and its tests and class methods, do the delete, then add the code back. But that is going to be a mess that I'd rather not deal with.

By the way, I'm doing the deletion through the Apex Data Loader. I don't see any option to work around the problem in that tool either. Have I missed something?

What options do I have?

Thanks!
Don
jgrenfell
Don,

I hit this same problem when doing imports, and my somewhat hacky solution was to make the trigger skip its logic when the batch size was greater than ten (if (Trigger.new.size() < 10)). You could do the same thing with Trigger.old.size() for deletes.
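In outline, the guard would look something like this (the object and trigger names here are just placeholders for whatever yours are, and the body is a sketch rather than a real trigger):

```apex
trigger MyObjectTrigger on MyObject__c (before delete) {
    // Skip the expensive per-record logic for bulk operations.
    // Manual deletes from the UI come through one at a time (or a
    // small handful), so they stay under the size threshold.
    if (Trigger.old.size() < 10) {
        // ...existing per-record logic and SOQL queries go here...
    }
}
```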

Jessie

Message Edited by jgrenfell on 10-29-2007 08:35 AM

RiverChaser
Hi Jessie,

Ah, lovely idea! When a user deletes records manually it's likely to be one at a time, or a small handful, so the guard shouldn't get in the way.

I'll give it a whirl. Thanks!

Don
RiverChaser
Rats. No, this isn't going to work. Well, it works for deleting the records, but then I bump up against the same SOQL limitation when re-importing the new records. I need to have the trigger execute for the imported records, otherwise data in another object will be incorrect.

Any other solutions? Surely I don't have to break up an 8,000+ record upload into chunks of 10 or 20 at a time?

Don
jgrenfell
None that I'm aware of. I had to recreate what the triggers did myself and import my new records plus the changes those records would have made had the trigger executed. This, in my mind, is the biggest limitation of Apex code: the governor limits are strict to the point of being prohibitive. I understand why they need them, but I'm hoping they increase those maximums at some point.
gokubi
When you deploy code to production, you can't modify it in production. See this thread on the subject:

http://community.salesforce.com/sforce/board/message?board.id=apex&message.id=1441

The best way to handle your situation is to write triggers that will work when called in bulk from a mass delete or insert. Here's a thread that talks about some of the issues with bulk triggers:

http://community.salesforce.com/sforce/board/message?board.id=apex&message.id=1374

It's much harder to write bulk triggers than it is to write triggers that aren't bulk-safe. But once you make them bulk-safe, they will always work, no matter how the data is created, modified, or deleted.
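The core pattern is to get SOQL out of the per-record loop. In sketch form (object and field names below are placeholders, not your actual schema), a bulk-safe trigger collects the foreign keys first and then issues a single query for the whole batch:

```apex
trigger MyObjectTrigger on MyObject__c (after insert) {
    // First pass: collect the related IDs from every record in
    // the batch, whether it holds 1 record or 8,000.
    Set<Id> parentIds = new Set<Id>();
    for (MyObject__c rec : Trigger.new) {
        if (rec.Parent__c != null) {
            parentIds.add(rec.Parent__c);
        }
    }
    // ONE query for the whole batch, instead of one query per
    // record inside the loop -- that per-record query is what
    // blows the "Too many SOQL queries" limit.
    Map<Id, Parent__c> parents = new Map<Id, Parent__c>(
        [SELECT Id FROM Parent__c WHERE Id IN :parentIds]);
    // Second pass: update the in-memory parents, then do a single
    // update(...) DML call on the collected list at the end.
}
```

The same query-outside-the-loop, single-DML-at-the-end shape applies to before/after delete triggers using Trigger.old.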