Shenwei Liu

Bulk Loading Data Using API Upsert

I’m prototyping my company’s SF app, which needs to load bulk data into SF through daily API calls. I’m using my developer account to test the data loading and have created a custom object with about 40 fields on the SF site. I have downloaded my enterprise WSDL and will use .NET code to call the API methods. The initial load will insert about 50K rows into the SF custom object/table. The daily load after that just inserts a small number of new rows and updates some existing ones.
 
My company would also like to use our internally developed application, rather than other tools, to export/import the initial data sets. Is there any problem with sending a bulk data set of 50K records with 40 fields to SF through the API Upsert call? Are there any performance or permission issues? I need to confirm this before making the API call. Thanks.
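For reference, here is roughly how I plan to batch the upsert from .NET. This is only a sketch assuming the proxy classes generated from my enterprise WSDL (SforceService, My_Object__c, UpsertResult) and an already-established session; External_Id__c and My_Object__c are placeholders for my external ID field and custom object:

using System;
using System.Linq;

public class UpsertLoader
{
    // The SOAP API accepts at most 200 records per upsert() call,
    // so the 50K rows have to be sent in chunks of 200.
    private const int BatchSize = 200;

    public static void UpsertAll(SforceService binding, My_Object__c[] records)
    {
        for (int offset = 0; offset < records.Length; offset += BatchSize)
        {
            My_Object__c[] batch = records.Skip(offset).Take(BatchSize).ToArray();
            UpsertResult[] results = binding.upsert("External_Id__c", batch);

            // Log row-level failures so a partially failed batch can be retried.
            for (int i = 0; i < results.Length; i++)
            {
                if (!results[i].success)
                {
                    string messages = string.Join("; ",
                        results[i].errors.Select(e => e.message).ToArray());
                    Console.WriteLine("Row {0} failed: {1}", offset + i, messages);
                }
            }
        }
    }
}

Would this approach hit any limits at 50K records, or is there a better API for the initial load?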
Shashank (Salesforce Developers)
If you are doing it with the Bulk API, it should not be a problem, as it is designed for large data sets. What you need to note is that it is a queued, asynchronous process, and the time it takes to finish depends on resource availability. You should also ensure that there are no network issues: since it is a batched process, network problems can sometimes leave jobs stuck without proceeding further.

Here is some insight on the limits: https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_concepts_limits.htm

This page points out some best practices: https://developer.salesforce.com/page/Loading_Large_Data_Sets_with_the_Force.com_Bulk_API
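If you want to drive it from your own .NET application, the Bulk API flow is just three REST calls: create a job, add the CSV batches, and close the job. Here is a minimal sketch; the object and external ID field names are placeholders matching your example, the API version is only an example, and it assumes you already have a session ID and instance URL from a prior login call:

using System;
using System.Net.Http;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

public class BulkApiLoader
{
    public static async Task RunAsync(string instanceUrl, string sessionId, string[] csvBatches)
    {
        string baseUrl = instanceUrl + "/services/async/29.0";
        using (var http = new HttpClient())
        {
            // The Bulk API authenticates with the session ID from a prior login.
            http.DefaultRequestHeaders.Add("X-SFDC-Session", sessionId);

            // 1. Create an upsert job for the custom object.
            string jobXml =
                "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
                "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">" +
                "<operation>upsert</operation>" +
                "<object>My_Object__c</object>" +
                "<externalIdFieldName>External_Id__c</externalIdFieldName>" +
                "<contentType>CSV</contentType>" +
                "</jobInfo>";
            HttpResponseMessage createResp = await http.PostAsync(baseUrl + "/job",
                new StringContent(jobXml, Encoding.UTF8, "application/xml"));
            string createBody = await createResp.Content.ReadAsStringAsync();
            string jobId = Regex.Match(createBody, "<id>([^<]+)</id>").Groups[1].Value;

            // 2. Submit each CSV chunk as a batch; Salesforce queues and
            //    processes them asynchronously, as noted above.
            foreach (string csv in csvBatches)
            {
                await http.PostAsync(baseUrl + "/job/" + jobId + "/batch",
                    new StringContent(csv, Encoding.UTF8, "text/csv"));
            }

            // 3. Close the job so no more batches are expected; processing
            //    continues on the server, so poll the batch status afterwards.
            string closeXml =
                "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
                "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">" +
                "<state>Closed</state>" +
                "</jobInfo>";
            await http.PostAsync(baseUrl + "/job/" + jobId,
                new StringContent(closeXml, Encoding.UTF8, "application/xml"));
        }
    }
}

At 50K rows you could send the whole set in a handful of batches (each batch can hold up to 10,000 records), which is well within the limits described in the first link.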