Shenwei Liu
Bulk Loading Data Using API Upsert
I’m building a prototype of my company’s Salesforce app, which needs to load bulk data into Salesforce via daily API calls. I’m using my developer account to test the data loading and have created a custom object with about 40 fields on the Salesforce site. I have downloaded my enterprise WSDL and will call the API methods from .NET code. The initial load will insert about 50K rows into the custom object/table; after that, each daily load only inserts a small number of new rows and updates some existing ones.
My company would also like to use our internally developed application, rather than other tools, to export/import the initial data sets. Is there any problem with sending a bulk data set of 50K records with 40 fields to Salesforce using the API Upsert call? Are there any performance or permission issues? I need to confirm this before making the API call. Thanks.
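One thing to plan for: the SOAP API caps create/update/upsert calls at 200 records per call, so a 50K initial load has to be split into batches on the client side. Below is a minimal Java sketch of that batching logic; the upsert call itself (shown in comments) assumes stub classes generated from the enterprise WSDL, and `MyObject__c` / `ExternalId__c` are placeholder names for your custom object and its external ID field:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchUpsert {
    // SOAP API limit: at most 200 records per create/update/upsert call.
    static final int BATCH_SIZE = 200;

    // Split the staged records into batches no larger than BATCH_SIZE.
    static <T> List<List<T>> chunk(List<T> records) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += BATCH_SIZE) {
            batches.add(records.subList(i, Math.min(i + BATCH_SIZE, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Stand-in for the 50,000 rows staged by the internal application.
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 50_000; i++) rows.add(i);

        List<List<Integer>> batches = chunk(rows);
        System.out.println(batches.size()); // 250 upsert calls for the initial load

        // With stubs generated from the enterprise WSDL, each batch would then
        // be sent roughly like this (binding/type names are placeholders):
        //
        // for (List<MyObject__c> batch : objectBatches) {
        //     UpsertResult[] results =
        //         binding.upsert("ExternalId__c", batch.toArray(new SObject[0]));
        //     // check results[i].isSuccess() / getErrors() for each record
        // }
    }
}
```

The same chunking applies whether the caller is .NET or Java; checking the per-record results lets the daily job retry only the rows that failed.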
Here is some insight on the limits: https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_concepts_limits.htm
This page points out some best practices: https://developer.salesforce.com/page/Loading_Large_Data_Sets_with_the_Force.com_Bulk_API
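If the daily volumes grow, the Bulk API covered in those links handles the same upsert semantics asynchronously: you create a job, add CSV batches to it, and close it. The job itself is described by a small XML document POSTed to the `/services/async/<version>/job` resource; the object and external ID field names here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <operation>upsert</operation>
    <object>MyObject__c</object>
    <externalIdFieldName>ExternalId__c</externalIdFieldName>
    <contentType>CSV</contentType>
</jobInfo>
```

Each CSV batch is then POSTed to the job's `batch` resource and processed server-side, which avoids holding 250 synchronous SOAP calls open.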