ShashikantKulkarni

Update a large number of object records at a time

Hi All,

I am not expecting any code here, just the logic or at least an approach to solve my problem.
My requirement is:
I have a batch job that invokes a class which works on the records of a custom object. There could be more than 30,000 records. I want to go through each record, do some 'processing' based on values in other fields, and then update a value field on the same record. During the 'processing' step I may also look up Accounts and other standard objects and use their field values in my calculation.
The problem is that I am loading too much data into memory and getting an "Apex heap size too large" error. I have tried the suggestions given in other posts, such as updating records through a list and loading only the required data into memory, and believe me I did all of that, but I still get the error because I need to process a huge number of records. Is there any other way to do this without hitting the memory error or other governor limits?
Best Answer chosen by ShashikantKulkarni
StephenKenny
Hi there,

You can absolutely do this. Generally speaking, 30,000 records is not a very large amount of data to work with, so I suspect that with a little tweaking of your code you will be able to avoid these limits.

There is some useful information here: http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_batch_interface.htm
The first thing I would suggest is to reduce your batch size from the default 200 to, say, 100.
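For illustration, starting the job would look something like the snippet below (WorkItemProcessor is just a placeholder name standing in for your batch class; a fuller sketch of such a class follows the next paragraph):

// 100 records per execute() call instead of the default 200, so roughly
// half as much record data sits in the heap at any one time.
Database.executeBatch(new WorkItemProcessor(), 100);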

Alternatively, it might just be a case of tidying up your code to ensure best practices are followed - you would be surprised at the difference this could make. If you want to post your code, we can help.
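To make that concrete, here is a rough sketch of the kind of batch class you describe - the object and field names (Work_Item__c, Value__c, Account__c, Account.Industry) are only placeholders, and the 'processing' line is a stand-in for your real calculation. The key points are querying only the fields you need, fetching the related Accounts once per chunk through a Map, and issuing a single update per chunk:

// A rough sketch only: Work_Item__c, Value__c, Account__c and Account.Industry
// are placeholder names standing in for the real objects and fields.
global class WorkItemProcessor implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Query only the fields the calculation needs; the QueryLocator feeds
        // records to execute() in chunks rather than all 30,000+ at once.
        return Database.getQueryLocator(
            'SELECT Id, Value__c, Account__c FROM Work_Item__c');
    }

    global void execute(Database.BatchableContext bc, List<Work_Item__c> scope) {
        // Gather the Account Ids for this chunk and fetch the related Accounts
        // in a single query, instead of one SOQL call per record.
        Set<Id> accountIds = new Set<Id>();
        for (Work_Item__c item : scope) {
            if (item.Account__c != null) {
                accountIds.add(item.Account__c);
            }
        }
        Map<Id, Account> relatedAccounts = new Map<Id, Account>(
            [SELECT Id, Industry FROM Account WHERE Id IN :accountIds]);

        for (Work_Item__c item : scope) {
            Account acc = relatedAccounts.get(item.Account__c);
            // Placeholder for the real 'processing' calculation.
            item.Value__c = (acc != null && acc.Industry == 'Banking') ? 100 : 0;
        }
        update scope; // one DML statement per chunk
    }

    global void finish(Database.BatchableContext bc) {
        // Nothing to do after the last chunk in this sketch.
    }
}

Keeping the per-chunk working set small like this, combined with the reduced batch size, is normally enough to stay within the Apex heap limit.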

Please remember to mark this thread as solved with the answer that best helps you.

Kind Regards
Stephen

All Answers

ShashikantKulkarni
Thanks Stephen.
I will go through the link you have provided.