curiousguy_22

How many times will a trigger be executed when trying to insert 1,000 records?

How many times will a trigger be executed when trying to insert 1,000 records into a standard object like Account? How can I test this? Thank you very much in advance.

gm_sfdc_powerde

From my experience with bulk inserts, the trigger gets invoked once for every 200 records. This is probably because Salesforce processes bulk inserts in chunks of 200 records at a time. I'm not sure whether that's guaranteed or documented, though. The best way to test it is to enable debug logs for a particular user, add a trigger with a debug statement (like the sketch below), and then insert 1,000 records into a standard object using Data Loader.
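For example, a minimal test trigger on Account might look like this (the trigger name is just a placeholder); each debug entry marks one invocation and shows how many records it received:

trigger AccountInvocationLogger on Account (before insert) {
    // One debug entry per trigger invocation; with 1,000 records inserted
    // via Data Loader you would expect 5 entries of 200 records each.
    System.debug('Account trigger invoked with ' + Trigger.new.size() + ' record(s)');
}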

sfdcfox

A trigger will be executed at least once for every 200 records in a transaction. Note that recursive triggers are possible (for example, TriggerA fires TriggerB, which in turn causes TriggerA to run a second time), so you may need to plan for such occurrences. It is also possible for a trigger to be executed more than once through Workflow Rules and/or the Import Wizards. The 200-record chunk size is subject to change at any time (though it hasn't yet), so best coding practice dictates that you always use bulk processing that does not rely on hard-coded values (for example, for-each loops over Trigger.new, as in the sketch below).
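As a rough illustration (object and field names are placeholders only), a bulkified trigger iterates over whatever Trigger.new contains instead of assuming a fixed count:

trigger AccountDefaults on Account (before insert) {
    // Handles 1 record or 200 the same way: no hard-coded sizes,
    // and no SOQL or DML inside the loop.
    for (Account acc : Trigger.new) {
        if (acc.Industry == null) {
            acc.Industry = 'Other';
        }
    }
}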

Sunil Nandipati

Would it be the same if I use the Bulk API with a batch size of 2,000? Does the trigger still execute for every 200 records in that case?

sfdcfox

As far as I'm aware, even using the Bulk API with large batch sizes will result in the same processing. The Bulk API is virtually identical to the Import Wizard in terms of functionality; it is designed to handle a large number of records with relative ease, but the processing of those records won't change. Internally, the chunk size is still 200 records per trigger invocation. A batch size of 2,000 just instructs the bulk upload client to post 2,000 records to the asynchronous job queue per file upload; the file is effectively reconstructed on the server and processed as rows become available. The 200 records per trigger invocation will still apply.

Empl

Hey Ankit,

I pretty much have the same problem.

My trigger works if my batch size is 200 in Data Loader.

But if I use the Bulk API and make the batch size more than 200, it doesn't execute for some of the records, probably only 200 per batch.

Can you please give some direction on this?

Thanks,
Vishal

Guy_Keshet
The answer is 5: a trigger processes up to 200 records per invocation, so for 1,000 records inserted, the trigger fires 5 times (1,000 / 200 = 5).
Abhishek Pal 33
Hi Everyone,

I am inserting more than 1,000 records, which in turn causes a trigger to fire. Salesforce chunks the data into batches of 200.
Now I am hitting the SOQL governor limit because of this process.
One set of SOQL queries is executed for each batch of 200 records, and since the trigger fires more than five times in the same transaction, the accumulated queries reach the governor limit.

Is there any way to avoid this?

Thanks in advance.
-Abhishek
Guy_Keshet
Hi Abhishek - which governor limit are you hitting exactly? If your load is a one-off, here are two options for you: 1. Split your input into two files and run two separate loads (assuming each is under 1,000 records). 2. Switch off the trigger for the duration of the load and replace it with a custom data patch (i.e. create a script to do the work the trigger would normally do).
Abhishek Pal 33
Hi Keshet,

I am getting EXCEPTION_THROWN|[364]|System.LimitException: Too many SOQL queries: 101

Now the point is that I don't want to split my inputs. And point 2 can be done, but it's not feasible in our scenario.

Is there any way for the user to insert records through the data loader of his choice while, internally, we split them into different Apex transactions?

What causes this exception is that a single Apex transaction spans all of the data chunks Salesforce creates internally; the trigger fires for each chunk of 200 records, and some SOQL queries get executed every time.

For a large data set (more than 1,000 records), the above process repeats until the limit is reached.

-Abhishek
Guy_Keshet
Then your issue is different - it's related to the quality of your code. The trigger needs to be bulkified - read through the links that Ankit shared above and implement them. Most likely you'll need to review your code, move your SOQL queries out of loops, and use maps within the loops (see the sketch at the end of this post).
As the issue happens repeatedly, I have to assume the code is not written for bulk transactions, so the code should be fixed.

I don't quite understand your question about a "data loader of his choice but ... different apex transaction" - any data loading tool uses SFDC's APIs, and any record insert/update will cause an object trigger to fire.
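For reference, a rough sketch of what that refactoring usually looks like, assuming a before-insert trigger on Contact (object and field names are placeholders only):

// Anti-pattern: one query per record quickly exhausts the 100-SOQL limit
for (Contact c : Trigger.new) {
    Account acct = [SELECT OwnerId FROM Account WHERE Id = :c.AccountId];
    c.OwnerId = acct.OwnerId;
}

// Bulkified: one query for the whole 200-record chunk, then a map lookup
Set<Id> accountIds = new Set<Id>();
for (Contact c : Trigger.new) {
    if (c.AccountId != null) {
        accountIds.add(c.AccountId);
    }
}
Map<Id, Account> accountsById = new Map<Id, Account>(
    [SELECT Id, OwnerId FROM Account WHERE Id IN :accountIds]);
for (Contact c : Trigger.new) {
    Account acct = accountsById.get(c.AccountId);
    if (acct != null) {
        c.OwnerId = acct.OwnerId;
    }
}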
 
PreeSFDCDev
Hi Abhishek, I am facing exactly the same issue you have mentioned.
I have around 1,500 records to update, and there are other triggers which also get fired after the update.
Salesforce internally divides the records into batches of 200, so after the first batch executes my SOQL count is already around 20 (many other queries run before and after each update), which eventually hits the SOQL limit of 100 after a few batches, and the remaining records fail to update.
Were you able to identify a solution to this issue? Please let me know; it will help us proceed further.
Thanks in advance.
Abhishek Pal 33
Hi Preety,

What I did was create a Batch Apex class that is invoked whenever we update the records. Since each batch execution has its own limit of 100 SOQL queries, this avoids the governor limit error.
Each execution of a batch Apex job is considered a discrete transaction. For example, a batch Apex job that contains 1,000 records and is executed without the optional scope parameter of Database.executeBatch is considered five transactions of 200 records each. The Apex governor limits are reset for each transaction.
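A minimal sketch of that approach (the class name, query, and per-record processing are placeholders; the heavy lifting that used to live in the trigger moves into execute, where limits reset per call):

global class AccountUpdateBatch implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Placeholder query: select whatever records the trigger logic needs.
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<Account> scope) {
        // Governor limits (including the 100-SOQL limit) reset for each
        // execute() call, which receives up to 200 records by default.
        for (Account acc : scope) {
            // ... per-record processing that previously lived in the trigger ...
        }
        update scope;
    }
    global void finish(Database.BatchableContext bc) {
        // Optional post-processing or notification.
    }
}

// Started with, for example: Database.executeBatch(new AccountUpdateBatch(), 200);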
Please mark this as the solution if it resolved your issue.

-Abhishek
PreeSFDCDev
Hi Abhishek,
Thanks very much for your reply. I was thinking of writing Batch Apex; however, I tried an @future method and it worked. I made the method that updates the records @future and called it from the after trigger.
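For anyone looking at the same pattern, a rough sketch (class, method, and query are placeholders only): an @future method runs asynchronously in its own transaction, so it gets a fresh set of governor limits (keeping in mind the per-transaction limit on future calls and that a future method cannot call another future method):

public class ContactAsyncUpdater {
    @future
    public static void updateRelatedRecords(Set<Id> contactIds) {
        // Runs in a separate transaction with its own governor limits.
        List<Contact> contacts = [SELECT Id, AccountId FROM Contact
                                  WHERE Id IN :contactIds];
        // ... placeholder processing ...
        update contacts;
    }
}

// Called from the after trigger, e.g.:
// ContactAsyncUpdater.updateRelatedRecords(Trigger.newMap.keySet());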
Vivek S 42
Hi, 

I am also facing a similar issue. In our Salesforce environment we use Informatica to insert or update large numbers of records, up to 150,000 in parallel. These operations fire triggers that call each other and other triggers to perform additional updates, and some workflows also fire when their conditions are met. From the transaction logs I can see the insert job starting with a chunk of 200 records; once the full cycle for those 200 records completes (firing triggers and updating the different objects), the next set of records is inserted within the same transaction. Because every chunk is chained into the same transaction, after 3 or 4 chunks the job fails with an "Apex CPU time limit exceeded" error. Obviously we hit the CPU timeout, since all chunks are processed in a single transaction and governor limits are not reset for each chunk.

Can anyone tell me how a bulk load from Informatica behaves with Salesforce? For example, if I insert 20,000 records and Salesforce processes them in chunks of 200, then 100 chunks are required to complete the load. Do all 100 chunks fall under a single transaction?

Note: my code implements complex and critical logic, and code optimization is already done. I cannot go for future calls; wherever I could use them, I have already made all the necessary optimizations.
SFDC G
Hi Vivek,
Have you solved your problem?
How many API calls are used when inserting/updating 150k records?