Starting November 20, the site will be set to read-only. On December 4, 2023,
forum discussions will move to the Trailblazer Community.
Abhilash Mishra 13

Upsert giving "duplicate ID Exception"

I am trying to upsert data through the Salesforce API, and I get a "duplicate ID exception" when passing the array.
How can I fix this? I know my array has records with the same external ID (duplicates), but that is exactly why I am upserting them. How can I overcome this limitation? It happens with Apex too. Here is a sample:

List<Account> li = new List<Account>();
for (Integer i = 0; i < 50; i++) {
    Account acc = new Account();
    acc.Name = 'tech corp';
    acc.Company_Email__c = 'example@teccorp.com'; // this field is an external ID
    acc.Updated_Count__c = i;
    li.add(acc);
}
upsert li Company_Email__c;
This also gives the same error.

Thanks 
Abhilash Mishra

 

Amit Chaudhary 8
Please try the code below.
List<Account> li = new List<Account>();
for (Integer i = 0; i < 50; i++) {
    Account acc = new Account();
    acc.Name = 'tech corp';
    acc.Company_Email__c = 'example' + i + '@teccorp.com'; // this field is an external ID
    acc.Updated_Count__c = i;
    li.add(acc);
}
upsert li Company_Email__c;

If you try to insert and update the same record in the same batch, you will get a duplicate ID error.

 
Abhilash Mishra 13
Well, in that case it will insert a new record every time, since "Company_Email__c" will be different for each object.
Is there another way to do this?
Amit Chaudhary 8


Just to confirm the behavior of the Data Loader/code and external IDs: the upsert tries to match records on the external ID you have specified. If it finds no match, it inserts; if it finds exactly one match, it updates.

If it finds more than one match, it throws an error. This can happen if you have records with the same IDs coming from an external system (you can avoid it by marking the external ID field as unique). In that case it makes sense that the upsert throws an error, since it would not know which record you want to update.
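A minimal sketch of that per-record behavior, using the Account records and Company_Email__c external ID from the thread: calling Database.upsert with allOrNone set to false lets the successful rows commit while the rows that hit the duplicate check report their errors individually.

```apex
// Two records sharing one external ID, as in the original question
List<Account> li = new List<Account>{
    new Account(Name = 'tech corp', Company_Email__c = 'example@teccorp.com'),
    new Account(Name = 'tech corp', Company_Email__c = 'example@teccorp.com')
};
// allOrNone = false: failures are reported per row instead of
// rolling back the whole batch
List<Database.UpsertResult> results =
    Database.upsert(li, Account.Company_Email__c, false);
for (Database.UpsertResult r : results) {
    if (!r.isSuccess()) {
        for (Database.Error e : r.getErrors()) {
            System.debug(e.getStatusCode() + ': ' + e.getMessage());
        }
    }
}
```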

Check the posts below for more info:
1) http://www.jitendrazaa.com/blog/salesforce/all-about-upsert-and-external-id-in-dataloader-and-apex-videos/
2) http://blog.jeffdouglas.com/2010/05/07/using-exernal-id-fields-in-salesforce/
 
Abhilash Mishra 13
I have gone through those already, and I understand what you are saying. But I have a situation where I have a batch (say, a list of 10 accounts) and all of them have the same external ID. What I actually want is for upsert to follow the array indexes: the first record is upserted against the existing one, then the second, then the third.
Isn't upsert supposed to work this way?
Amit Chaudhary 8
If you pass all 10 records with the same external ID in the same context or batch, it will not work.
Abhilash Mishra 13
What would be the best alternative in this situation? Can you suggest something?
Amit Chaudhary 8
1) Make your external ID unique. Then you will never be able to save two records with the same external ID in the Salesforce database.
2) If it is a batch job, make the batch size 1. (Not a good practice.)
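For option 2, a sketch of how the batch size would be set (MyUpsertBatch here is a hypothetical Database.Batchable implementation that upserts the records it receives in each execute() call):

```apex
// A scope size of 1 means each record is processed in its own
// execute() call, so duplicates never share a batch.
// Slow, as noted above: one transaction per record.
Id jobId = Database.executeBatch(new MyUpsertBatch(), 1);
```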
 
Sreedhar D
Hi Abhilash,

Do you have any workaround for this? I am also running into the same scenario. Thank you.
Don Kelly
You could use a Map with the external ID as the key instead of a List; then at upsert time you'd have only one record per external ID to upsert.
Something like this:
Map<String, Account> ma = new Map<String, Account>();
for (Integer i = 0; i < 50; i++) {
    Account acc = new Account();
    acc.Name = 'tech corp';
    acc.Company_Email__c = 'example@teccorp.com'; // this field is an external ID
    acc.Updated_Count__c = i;
    // putting by external ID keeps only the last record for each key
    ma.put(acc.Company_Email__c, acc);
}

upsert ma.values() Company_Email__c;
purna unnagiri
Hi Abhilash,
Do you have any workaround for this? I am also running into the same scenario while processing a list of records via a file-upload process in a Community. Thank you.