System.LimitException: Apex heap size too large: Error in Trigger code
I have written a trigger on the DummyTransaction object where I build a list of Transaction records and insert that list after all the DummyTransaction records have been processed one by one in an enhanced for loop.
I also have a trigger on the Transaction object which inserts a list of Contacts after processing all the Transactions in an enhanced for loop. I loaded a large number of DummyTransaction records using the Data Loader.
I got this error:
DummyTransactionInsertUpdate: execution of BeforeInsert
caused by: System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, ManageContactsOnTransaction: execution of AfterInsert
caused by: System.LimitException: Apex heap size too large: 7522771
External entry point: []
Trigger.DummyTransactionInsertUpdate: line 274, column 1
Line 274 of the trigger is: insert listTx; (listTx is a List of Transaction records, and the Transaction object has the ManageContactsOnTransaction trigger).
The DummyTransactionInsertUpdate is a BEFORE INSERT trigger that creates a list of Transaction records and inserts that list at the end of the trigger.
The ManageContactsOnTransaction is an AFTER INSERT trigger that creates a list of Contact records and inserts that list; this trigger is generating the error.
Can anybody please help me resolve this issue?
What is the heap size limit in trigger code?
Hi,
With the Spring ’10 release, Salesforce.com removed the limit on the number of items a collection can hold. So now, instead of ensuring that your collections contain no more than 1000 items, you have to monitor your heap size. Here are some strategies on how to write Apex scripts that run within these limits.
First of all, what is the heap? Dynamic memory allocation (also known as heap-based memory allocation) is the allocation of memory storage for use by a computer program during its runtime. In Apex, the heap is the amount of memory used by the reachable objects in a given script and request. When you create objects in your Apex code, memory is allocated on the heap to store them.
As with many other things in Force.com, there are governors and limits that prevent you from hijacking the heap and degrading the performance of other running applications. The heap limit is calculated at runtime and differs on how your code is invoked:
• Triggers – 300,000 bytes
• Anonymous Blocks, Visualforce Controllers, or WSDL Methods – 3,000,000 bytes
• Tests – 1,500,000 bytes
These limits also scale with trigger batch sizes:
• For 1-40 records, the normal limits apply
• For 41-80 records, two times the normal limits apply
• For 81-120 records, three times the normal limits apply
• For 121-160 records, four times the normal limits apply
• For 161 or more records, five times the normal limits apply
Luckily Salesforce.com increased the heap size limits in Summer ’10 but you still may run into some issues. Here are a few things you can do to write heap-friendly code.
Watch the Heap
When your scripts run you can view the heap size in the debug logs. If you notice your heap approaching the limit, you will need to investigate why and try to refactor your code accordingly.
Use the Transient Keyword
Try using the “Transient” keyword with variables in your controllers and extensions. The transient keyword is used to declare instance variables that cannot be saved, and shouldn’t be transmitted as part of the view state for a Visualforce page.
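As a sketch of what that looks like, here is a hypothetical Visualforce controller (the class and field names are illustrative, not from the original post). The transient list is rebuilt on each request instead of being serialized into the page's view state, so it never counts against the view state size:

```apex
public class ReportController {
    // kept in view state across postbacks
    public String selectedAccountId { get; set; }

    // NOT saved in view state; recomputed when needed
    transient List<Account> cachedResults;

    public List<Account> getResults() {
        if (cachedResults == null) {
            cachedResults = [SELECT Id, Name FROM Account LIMIT 100];
        }
        return cachedResults;
    }
}
```

The trade-off is that a transient variable must be cheap to recompute, since it is rebuilt on every postback.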
Use Limit Methods
Use heap limits methods in your Apex code to monitor/manage the heap during execution.
• Limits.getHeapSize() – Returns the approximate amount of memory (in bytes) that has been used for the heap in the current context.
• Limits.getLimitHeapSize() – Returns the total amount of memory (in bytes) that can be used for the heap in the current context.
// check the heap size at runtime (note: getHeapSize is a method and needs parentheses)
if (Limits.getHeapSize() > 275000) {
    // implement logic to reduce the heap, e.g. clear collections you no longer need
}
One strategy to reduce heap size during runtime is to remove items from the collection as you iterate over it.
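A minimal sketch of that pattern (the variable names and the processing step are hypothetical): because you cannot remove elements from a list inside an enhanced for loop, pull items off the end of the collection in a while loop so each record becomes unreachable as soon as it is handled:

```apex
List<Contact> toInsert = new List<Contact>();
while (!sourceRecords.isEmpty()) {
    // take the last element and shrink the source list,
    // so the processed record can be garbage collected
    Account a = sourceRecords.remove(sourceRecords.size() - 1);
    toInsert.add(new Contact(LastName = a.Name, AccountId = a.Id));
}
insert toInsert;
```

This keeps only one full collection alive at a time instead of two.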
Did this answer your question? If not, let me know what didn't work, or if so, please mark it solved.
If you are breaching the heap limit you need to look at breaking up your transactions. If you are just over, you could use an @future method to handle some of the inserts. Failing that, you may need to look at firing off a batch process that breaks the records into smaller chunks.
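A rough sketch of the @future approach (class and field names are illustrative): the method takes only primitive arguments, as @future methods require, and does its inserts asynchronously in a fresh context with its own governor limits:

```apex
public class ContactInsertHelper {
    @future
    public static void insertContactsFor(List<Id> accountIds) {
        List<Contact> contacts = new List<Contact>();
        for (Account a : [SELECT Id, Name FROM Account WHERE Id IN :accountIds]) {
            // placeholder logic; build whatever Contact data you actually need
            contacts.add(new Contact(LastName = a.Name, AccountId = a.Id));
        }
        insert contacts;
    }
}
```

Your trigger would then pass a list of Ids to this method instead of building and inserting the Contact list inline.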
Hi Bob,
I have a scheduled job which updates records on the Account object. It had the same heap size problem, so I converted my scheduled class to a scheduled batch job, but I still get the error below:
System.LimitException: Batchable instance is too big
Thanks
Hari
One reason you might see this is if you are maintaining state between the execute calls, for example if that state is hanging on to a lot of records. Failing that, you may need to reduce your batch size.
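To illustrate (a hypothetical sketch, not your actual class): a batch class that does not implement Database.Stateful keeps no instance state between execute() calls, so nothing accumulates across chunks, and you can pass a smaller scope size to Database.executeBatch:

```apex
public class AccountUpdateBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        // work only on the records in this chunk; keep no instance fields
        update scope;
    }
    public void finish(Database.BatchableContext bc) {}
}
// Run with a smaller scope size, e.g. 50 records per execute() call:
// Database.executeBatch(new AccountUpdateBatch(), 50);
```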