Starting November 20, the site will be set to read-only. On December 4, 2023,
forum discussions will move to the Trailblazer Community.
Mahesh@sfworld

Data Loader for loading a large volume of data

Hi,
I have a load file with around 800,000 records that I am trying to UPSERT through Data Loader into the Invoice__c object. The load stopped partway through, at a record count of about 600,000. I want to load only the remaining records, but how can I identify which records in the load file were already loaded?
Does Data Loader load the records in any particular order? Does it sort the records by some key?

Please let me know if anyone has experienced this before.

Thanks,
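One way to identify the already-loaded records is the success log that Data Loader writes after each run, which keeps the original columns alongside the new Salesforce ID. Below is a minimal sketch of filtering the load file against that log; the file names and the external ID column `Invoice_Number__c` are assumptions, so substitute whatever key your upsert actually matches on.

```python
import csv

def remaining_records(load_file, success_log, out_file, key="Invoice_Number__c"):
    """Write only the rows of load_file whose key does not appear in success_log.

    Returns the number of rows written (excluding the header).
    """
    # Collect the external IDs that Data Loader reported as loaded.
    with open(success_log, newline="") as f:
        loaded = {row[key] for row in csv.DictReader(f)}

    with open(load_file, newline="") as src, open(out_file, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        remaining = 0
        for row in reader:
            if row[key] not in loaded:  # not yet loaded -> keep for the retry file
                writer.writerow(row)
                remaining += 1
    return remaining
```

The resulting CSV can then be fed back into Data Loader as a fresh upsert, since upsert is idempotent on the external ID even if a record slips through twice.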
Filikin
I found that it loaded the records in the order they appeared in the import file.
This happens to me frequently when I am uploading attachments and forget to check the file sizes.
Then I have to remove the offending file from the list in the CSV file, along with all the records before it.
Mahesh@sfworld
Hi Filikin,
Thank you for the reply. How do you separate the offending records (the error file records) from the import file? Do you do a VLOOKUP matching external IDs between the import file and the error record file? I am using a CSV import file.
The idea is not to load the whole load file again, but only the records that previously errored. I am doing a full load initially; going forward it will be delta loads (much smaller than the full load).
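For reloading only the previous errors, a spreadsheet VLOOKUP is usually unnecessary: the error log Data Loader produces keeps every original column and appends its own status columns, so stripping those columns yields a ready-made retry file. A rough sketch, assuming the appended columns are named ID and ERROR (check the header of your own error file, as the names may differ by version):

```python
import csv

def retry_file(error_log, out_file, extra_cols=("ID", "ERROR")):
    """Rebuild a loadable CSV from an error log by dropping the status columns.

    Returns the number of data rows written.
    """
    with open(error_log, newline="") as src, open(out_file, "w", newline="") as dst:
        reader = csv.DictReader(src)
        # Keep only the original columns; drop whatever the loader appended.
        fields = [c for c in reader.fieldnames if c not in extra_cols]
        writer = csv.DictWriter(dst, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        n = 0
        for row in reader:
            writer.writerow(row)  # extra keys are silently ignored
            n += 1
    return n
```

For recurring delta loads this keeps the retry cycle mechanical: run the load, feed the error file through this step, fix the data issue the ERROR column reports, and load the retry file.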
 
Filikin
Usually I just edit the CSV file and remove the records up to the one that failed. Sometimes I use the "start at row" field in the Data Loader settings. There is no need to do any matching, since Data Loader loads the records in the order they appear in the CSV file, and a batch is either loaded or it isn't.
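Because rows load in file order, the manual trim described above can also be done with a short script: keep the header and re-emit everything from a given 1-based data-row number onward. This is just an illustrative sketch of that slicing, not a Data Loader feature:

```python
import csv

def slice_from(load_file, out_file, start_row):
    """Copy load_file to out_file, keeping only data rows >= start_row (1-based).

    Returns the number of data rows kept.
    """
    kept = 0
    with open(load_file, newline="") as src, open(out_file, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        writer.writerow(next(reader))  # header passes through unchanged
        for i, row in enumerate(reader, start=1):
            if i >= start_row:
                writer.writerow(row)
                kept += 1
    return kept
```

One caveat: this mirrors the "remove everything before the failure" approach, so it assumes every batch before the stopping point succeeded. If some earlier batches failed, filtering against the success or error logs is the safer route.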