A.V.A.V. 

Copying large datasets (handling SOQL and DML Limits)

I am interested in finding out what approaches people have used for handling large record sets in Apex, where the number of rows returned by SOQL and the number of DML statements may hit the current governor limits.

My use case:
I need to copy child records from one object to another. 99% of the time there is no problem, but occasionally the number of child records may exceed 10,000, which is the current limit.

Right now my code looks something like this (just an example):
Code:
public static void cloneNotes(Id idFrom, Id idTo)
{
    Integer counter = 0;

    // SOQL for loop: records come back in batches
    for (Note[] sourceObjs : [SELECT Body, Division, IsPrivate, OwnerId, ParentId, Title
                              FROM Note
                              WHERE ParentId = :idFrom])
    {
        // Clone the batch without preserving record Ids
        Note[] targetObjs = sourceObjs.deepClone(false);

        // Re-parent each cloned note to the target record
        for (Note targetObj : targetObjs) {
            targetObj.ParentId = idTo;
            counter++;
        }

        // One DML statement per batch
        insert targetObjs;
    }

    System.debug('DEBUG - Total number of notes copied: ' + counter);
}

One solution I can think of is to add a custom field to the child object (Child.AlreadyCloned__c) and set it to true for each batch of copied records, then call cloneNotes() from an anonymous block until all records have been cloned. When done, reset all the flags back to false. This, however, will require some additional work to avoid overlapping if multiple users are cloning the same records, and it is also not efficient because of the extra updates...
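To make that idea concrete, here is a rough sketch of what one flag-based batch could look like. The object and field names (Child__c, Parent__c, AlreadyCloned__c, cloneChildBatch) are just placeholders for illustration, not my actual schema:

Code:
// Hypothetical sketch of the flag-based approach: clone one batch of
// not-yet-cloned child records and mark the originals as processed.
public static Boolean cloneChildBatch(Id idFrom, Id idTo, Integer batchSize)
{
    // Only pick up records that have not been processed yet
    Child__c[] sourceObjs = [SELECT Name, Parent__c, AlreadyCloned__c
                             FROM Child__c
                             WHERE Parent__c = :idFrom
                             AND AlreadyCloned__c = false
                             LIMIT :batchSize];

    if (sourceObjs.isEmpty()) {
        return false; // nothing left to clone
    }

    // Clone the batch and re-parent the copies
    Child__c[] targetObjs = sourceObjs.deepClone(false);
    for (Child__c targetObj : targetObjs) {
        targetObj.Parent__c = idTo;
    }
    insert targetObjs;

    // Flag the originals so the next call skips them
    // (this is the extra update I mentioned above)
    for (Child__c sourceObj : sourceObjs) {
        sourceObj.AlreadyCloned__c = true;
    }
    update sourceObjs;

    return true; // caller should invoke again for the next batch
}

Each call from Execute Anonymous runs in its own transaction, so every invocation gets a fresh set of governor limits; once cloneChildBatch() returns false, a separate pass would reset AlreadyCloned__c back to false.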



Any alternative suggestions on this one?