Starting November 20, the site will be set to read-only. On December 4, 2023,
forum discussions will move to the Trailblazer Community.
Abhishek Pal 33

Getting System.LimitException: Too many SOQL queries: 101

Hello Everyone,
I am getting a SOQL exception whenever I insert more than 1000 records. The scenario is: when the records are inserted, only 3 SOQL statements execute at first, but then the triggers fire (Before Insert, After Insert, Before Update, After Update) in chunks of 200.

The same trigger code runs for each chunk of 200 records, so the SOQL query count keeps growing until it reaches 100 on the last chunk and the operation ultimately fails.

The issue is that it all executes in one Apex transaction (which has a limit of 100 SOQL queries).

NOTE: I cannot change the functionality inside the trigger.

Is there any solution for this scenario?

Thanks in advance.

-Abhishek
Sami Ullah Azam
Save the records in a list, then try to insert them using an unconditional for loop or via the trigger.
Abhishek Pal 33
Hello Sami,

I guess you didn't get my point. Salesforce batches mass updates into chunks of up to 200 records at a time (this commonly happens when using tools like Data Loader, which is what I am using).
I am already inserting the records from a list. The problem is that when the insert is chunked into batches of 200, the trigger fires for each whole batch of 200 records and runs some SOQL queries.

The point is that this happens for every batch of 200 records. Say I insert 1200 records: they are split into 6 batches, and all the triggers execute 6 times, once for each set of 200 records.
And each trigger calls its own set of methods, which contain SOQL queries.

This all happens in one Apex transaction.

NOTE: I cannot change the functionality of the trigger or its methods.

Hope you get my point now.

-Abhishek
David Hamburg
Post your code so we can take a look; it sounds like you have a SOQL query inside a loop.
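For reference, this is the classic pattern that hits the 101-query limit, shown with the bulkified fix. The Contact/Account field names here are illustrative, not from Abhishek's actual code, and the two trigger bodies below are alternatives (Salesforce only allows one definition per trigger name):

```apex
// Anti-pattern: one SOQL query per record.
// With a 200-record chunk, this runs 200 queries per trigger invocation.
trigger ContactTrigger on Contact (before insert) {
    for (Contact c : Trigger.new) {
        Account acc = [SELECT Id, Name FROM Account WHERE Id = :c.AccountId];
        c.Description = acc.Name;
    }
}

// Bulkified alternative: exactly one query per chunk, regardless of size.
trigger ContactTrigger on Contact (before insert) {
    // Collect all parent Ids first
    Set<Id> accountIds = new Set<Id>();
    for (Contact c : Trigger.new) {
        accountIds.add(c.AccountId);
    }
    // One query outside the loop, keyed by Id for O(1) lookup
    Map<Id, Account> accounts = new Map<Id, Account>(
        [SELECT Id, Name FROM Account WHERE Id IN :accountIds]
    );
    for (Contact c : Trigger.new) {
        if (accounts.containsKey(c.AccountId)) {
            c.Description = accounts.get(c.AccountId).Name;
        }
    }
}
```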
Ravi Dutt Sharma
Hi Abhishek,

Reduce the Data Loader batch size to get around this issue. The default batch size is 200. Try reducing it as low as possible, let's say 1, and see if the issue persists. Thanks.
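For the command-line Data Loader, the batch size is set in the process configuration; the GUI Data Loader exposes the same setting under Settings > Batch Size. A minimal sketch of the relevant entry in process-conf.xml (the rest of the process bean is assumed to already exist):

```xml
<!-- process-conf.xml excerpt: shrink the chunk size sent per API call,
     which also shrinks the chunk each trigger invocation operates on -->
<entry key="sfdc.loadBatchSize" value="50"/>
```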
Guy_Keshet
You state you can't change the trigger, so as I see it your best option is to create a script (outside Salesforce) to split your input file into chunks and run individual loads. Each load will generate its own transaction with its own set of governor limits. If the trigger runs 3 SOQL queries for each record (which is poor coding practice), then limit your upload file size to 33 records.
There are multiple ETL tools that can do the work for you (some chargeable, some free). Alternatively, write your own Unix script and use the SQLForce library (https://www.capstorm.com/sqlforce-project) to connect and insert your records, or write a Windows script (e.g. using Perl) and use the command-line Data Loader (https://developer.salesforce.com/docs/atlas.en-us.dataLoader.meta/dataLoader/using_the_command_line_interface.htm).
Abhishek Pal 33
Hi Ravi,

The issue is with the trigger, and the default chunk size it operates on seems to be 200. So even if I reduce the Data Loader batch size it won't affect my issue. I tested with 100 rows and the trigger still operates on a batch of 200.
I checked my logs and found that my Account trigger is doing recursive work. Whenever there is a before/after update from within the trigger methods, the trigger executes again and the SOQL queries repeat.

What I found is that I need to limit this recursive Account trigger invocation.

-Abhishek
Guy_Keshet
If your Account trigger is recursive, then there's nothing you can do other than fix the code. The best option is to evaluate the recursion and stop it from happening. The quickest option is to add a static flag and check whether it's set: if not, set it and continue; if it is set, stop processing. - Guy Keshet
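The static-flag pattern described above can be sketched like this (class and method names are illustrative). It works because a static variable in Apex lives for the duration of one transaction, so the flag is still set when the trigger re-fires on the update performed inside the handler:

```apex
public class AccountTriggerHandler {
    // Static state persists for the whole Apex transaction,
    // surviving across re-entrant trigger invocations.
    public static Boolean isRunning = false;

    public static void handleAfterUpdate(List<Account> accounts) {
        if (isRunning) {
            return; // re-entrant call detected: skip to break the recursion
        }
        isRunning = true;
        try {
            // ... original trigger logic (queries, field updates, DML) ...
        } finally {
            // Reset so later 200-record chunks in the same
            // transaction are still processed normally.
            isRunning = false;
        }
    }
}
```

The trigger itself then just delegates, e.g. `AccountTriggerHandler.handleAfterUpdate(Trigger.new);` inside an `after update` trigger on Account. Resetting the flag in `finally` blocks recursion within a chunk (the inner invocation sees the flag set before `finally` runs) while leaving subsequent chunks unaffected.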