Cory 16

Best practice before mass data loads

Hi, I've been a Salesforce developer for a few years now, and one of the tasks I've had to carry out many times is performing an update on a set of thousands of records via Data Loader. For example, we want to update a field on all 10,000 opportunities in the org.

Now, in order to do that update safely, I will always back up the opportunities into a CSV before I run the data load. I'll also look through all the triggers, process builders, and workflow rules to make sure that updating the field won't trigger cascading changes throughout the system.
But even if you do all of that, you still can't be 100% sure that there were no unexpected changes in the system as a result of the data load... unexpected field updates, etc.
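
For example, a rough sanity check is to count what else got modified during the load window, with something like this anonymous Apex (the object names and the load window here are just placeholders):

// Rough anonymous Apex spot-check: count records on other objects that were
// modified during the load window. Swap in whatever your automation could touch.
Datetime loadStart = Datetime.newInstance(2023, 11, 20, 9, 0, 0);
Datetime loadEnd = Datetime.newInstance(2023, 11, 20, 10, 0, 0);
System.debug('Accounts touched: ' + [SELECT COUNT() FROM Account
    WHERE LastModifiedDate >= :loadStart AND LastModifiedDate <= :loadEnd]);
System.debug('Tasks touched: ' + [SELECT COUNT() FROM Task
    WHERE LastModifiedDate >= :loadStart AND LastModifiedDate <= :loadEnd]);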

I just want to start a discussion to see if I can learn what other people do as best practices when performing mass data loads. What steps do you take to give yourself peace of mind? Let me know. Thank you
Vishwajeet kumar
Hello,
There are only a limited number of things that can fire synchronously when a record is inserted or updated: workflows, triggers, flows, and process builders. If any of them act on the particular field you're changing, disable them before the load and you should be good.
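
For triggers, one common way to do the disabling is a bypass switch checked at the top of the trigger, so it can be turned off for the data-load user without deactivating the trigger itself, for example with a hierarchy custom setting (the setting and field names below are just examples, not standard objects):

trigger OpportunityTrigger on Opportunity (before update, after update) {
    // Hierarchy custom setting used as a per-user kill switch; the names
    // Automation_Settings__c / Bypass_Triggers__c are made up for the example.
    Automation_Settings__c s = Automation_Settings__c.getInstance();
    if (s != null && s.Bypass_Triggers__c == true) {
        return; // data-load user has the bypass on, so skip all trigger logic
    }
    // ... normal trigger handler logic ...
}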

Asynchronous processes can also run from scheduled jobs. Check what is scheduled and make sure nothing runs at the same time as your data load; if the timing of the two doesn't overlap, you should be OK.
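
You can see what is scheduled and when it will next run with a quick query on CronTrigger, for example from anonymous Apex:

// Anonymous Apex: list scheduled jobs and their next fire times,
// so the load can be timed to avoid them.
for (CronTrigger ct : [SELECT CronJobDetail.Name, CronJobDetail.JobType,
                              NextFireTime, State
                       FROM CronTrigger
                       WHERE State = 'WAITING'
                       ORDER BY NextFireTime]) {
    System.debug(ct.CronJobDetail.Name + ' (' + ct.CronJobDetail.JobType +
                 ') next runs ' + ct.NextFireTime);
}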


Thanks