mweiss

Data loader error on upsert - Duplicate value found

I am using the Data Loader in batches to upsert records into five objects, using external IDs that need to be unique so that I reference the same Salesforce records every time I run the upsert, since each run some of the records will have different values.

The problem I'm having is that sometimes the records are updated when an existing match is found, and sometimes I get the error "duplicate value found". It's about a 50/50 split between the outcome I want and the error message. In theory, I should never get an error when I re-run the upsert, since it should just update the records to the new values.

Removing the unique restriction on the external ID is not an option for me; if more than one record can share the same external ID, it defeats the purpose of using one at all.

I haven't been able to find anything on this so far, so any help would be greatly appreciated.

Best Answer chosen by Admin (Salesforce Developers) 
aalbert

Set the Batch Size to 1 in the data loader settings. This will cause the import to take longer since only 1 record is being sent per API call, but it will guarantee duplicate external Ids are not being submitted in the same API call (per upsert call).

All Answers

aalbert

Does your dataset have duplicates in it? For example, does the external ID "abc" exist more than once in the dataset you are importing? That might cause this error to be thrown. Try setting the batch size to 1 (a setting in the Data Loader) to prove that this is the issue.
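
A quick way to check for this is to count the external IDs in the source file before loading. A minimal sketch in Python; the file name "upsert_data.csv" and the "External_ID__c" column are placeholders for your own file and field:

```python
import csv
from collections import Counter

# Placeholders: point this at your own export and external ID column.
with open("upsert_data.csv", newline="") as f:
    counts = Counter(row["External_ID__c"] for row in csv.DictReader(f))

# Any ID that appears more than once can land twice in the same
# Data Loader batch and trigger "duplicate value found".
dupes = {ext_id: n for ext_id, n in counts.items() if n > 1}
print(f"{len(dupes)} external IDs appear more than once: {dupes}")
```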

mweiss

I have run some tests with a couple of smaller data sets, and that looks like the issue.

I have multiple occurrences of the same external ID in my data set (phone numbers), so is there any way to make them all go through without errors and without manipulating the data before upserting it?

aalbert

Set the Batch Size to 1 in the data loader settings. This will cause the import to take longer since only 1 record is being sent per API call, but it will guarantee duplicate external Ids are not being submitted in the same API call (per upsert call).
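
To see why the batch size matters, here is a small illustration (plain Python with made-up data, not actual Data Loader code): the error can only occur when two rows with the same external ID fall into the same batch, and a batch size of 1 makes that impossible.

```python
# Made-up rows keyed by external ID; "555-1234" appears twice.
rows = [
    {"ext_id": "555-1234", "name": "A"},
    {"ext_id": "555-9999", "name": "B"},
    {"ext_id": "555-1234", "name": "C"},
]

def batches(rows, size):
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

for size in (200, 1):  # 200 is the Data Loader's default batch size
    for batch in batches(rows, size):
        ids = [r["ext_id"] for r in batch]
        ok = len(ids) == len(set(ids))
        print(f"size={size} {ids} -> {'ok' if ok else 'duplicate value found'}")
```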

This was selected as the best answer
mweiss

It seems that has taken care of it.

Thanks for the help!

Geisner@google.com

Thanks! Helped me too.

Sandeep123

Hi All,

Without setting the batch size to 1, is there any other option to get rid of this error when upserting on an external ID field?

I am doing the same thing here: an Account Key (external ID) is updated using upsert in a trigger, and it throws a duplicate Account Key error.

Thanks in advance. Please help; this is looking really bad for me right now.
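
The thread doesn't answer this directly, but the usual alternative to a batch size of 1 is to collapse duplicates yourself before calling upsert, keeping one row per external ID. A minimal sketch of the pattern in Python; in an Apex trigger the same idea is typically done with a Map keyed on the external ID field:

```python
rows = [
    {"account_key": "ACC-1", "revenue": 100},
    {"account_key": "ACC-2", "revenue": 250},
    {"account_key": "ACC-1", "revenue": 175},  # duplicate key
]

# Keying a dict by the external ID makes later rows overwrite
# earlier ones, so the upsert never sees the same key twice.
deduped = {row["account_key"]: row for row in rows}
print(list(deduped.values()))  # ACC-2 plus the last ACC-1 row
```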

Gnana Moses 4
The batch limit is 5000 per day, so setting the batch size to 1 may not be the correct solution if I load more than 5000 records.
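
If you can't simply drop duplicate rows because each one carries an update you need applied, another workaround (my suggestion, not from this thread) is to split the file into passes so that no pass repeats an external ID, then upsert each pass in order with a normal batch size:

```python
from collections import defaultdict

def split_into_passes(rows, key):
    """Send the n-th occurrence of each key to pass n, so no pass
    contains the same external ID twice."""
    passes, seen = defaultdict(list), defaultdict(int)
    for row in rows:
        passes[seen[row[key]]].append(row)
        seen[row[key]] += 1
    return [passes[i] for i in sorted(passes)]

rows = [
    {"ext_id": "555-1234", "name": "A"},
    {"ext_id": "555-9999", "name": "B"},
    {"ext_id": "555-1234", "name": "C"},
]
for i, p in enumerate(split_into_passes(rows, "ext_id")):
    print(f"pass {i}: {[r['ext_id'] for r in p]}")
# pass 0: ['555-1234', '555-9999']; pass 1: ['555-1234']
```

Running the passes in order preserves "last row wins" semantics for repeated IDs.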
Billy Silva 13

For anybody still looking for a solution, I'll mention below what worked for me:

  • Most likely the format of the start date of your forecast quote does not match your Data Loader settings.
  • If you are using the Data Loader, enable the European Date Format setting if you're uploading dates in a European format.
  • Or instead adjust the format of the date field in your source file to "YYYY-MM-DDT00:00:00Z" (see the sketch below).
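
If you go the source-file route, the conversion is easy to script. A sketch assuming US-style M/D/YYYY input; adjust the input format string to whatever your file actually uses:

```python
from datetime import datetime

def to_salesforce_datetime(value, in_fmt="%m/%d/%Y"):
    # Convert a source date string to the ISO 8601 form above,
    # e.g. "12/4/2023" -> "2023-12-04T00:00:00Z".
    return datetime.strptime(value, in_fmt).strftime("%Y-%m-%dT00:00:00Z")

print(to_salesforce_datetime("12/4/2023"))
```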