Sagar Hinsu
How to use the Bulk API in Python
I know there are docs where this is explained, but:
I have a CSV for a custom object with 10,000 records.
I want to upload this data to my org using the Bulk API.
I know there are tools like dataloader.io and the Apex Data Loader, but I want to build a custom tool in Python.
Thanks.
All Answers
I want to upload through a CSV; my file is custom_object__c.csv.
I have a CSV named user__c.csv.
user__c.csv has data like:
first_name__c,last_name__c,user_id__c
a,b,1
c,d,2
e,f,3
i,j,4 (and many more rows, around 20,000)
Now, can you give a small example of uploading with the Bulk API?
Thank you.
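Here is a minimal sketch, assuming the salesforce_bulk and simple-salesforce packages; the credentials are placeholders, and the 10,000-record chunking reflects the Bulk API's per-batch record limit:

```python
# Sketch only: assumes the salesforce_bulk and simple-salesforce packages,
# that user__c.csv sits next to this script, and placeholder credentials.
import csv

def read_csv_dicts(path):
    """Read a CSV file into a list of dicts, one per row."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def chunked(rows, size=10000):
    """Yield batches of at most `size` rows (the Bulk API batch limit)."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def upload(rows, username, password, security_token):
    # Imports deferred so the pure helpers above work without the packages.
    from simple_salesforce import Salesforce
    from salesforce_bulk import SalesforceBulk, CsvDictsAdapter

    # Log in once with simple-salesforce, then reuse its session for bulk calls.
    sf = Salesforce(username=username, password=password,
                    security_token=security_token)
    bulk = SalesforceBulk(sessionId=sf.session_id, host=sf.sf_instance)

    job = bulk.create_insert_job("user__c", contentType="CSV")
    for batch_rows in chunked(rows):
        batch = bulk.post_batch(job, CsvDictsAdapter(iter(batch_rows)))
        bulk.wait_for_batch(job, batch)
    bulk.close_job(job)

# Usage (with your real credentials):
# upload(read_csv_dicts("user__c.csv"), "me@example.com", "password", "token")
```

Your 20,000-row file would go up as two batches in a single insert job.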
I am getting the following error when I use this API:
RuntimeError: You must set SALESFORCE_CLIENT_ID, SALESFORCE_CLIENT_SECRET, SALESFORCE_REDIRECT_URI to use username/pass login
Any help on this is highly appreciated!
You need to get the client ID, client secret, and instance URL.
https://developer.salesforce.com/forums/?id=906F0000000AfcgIAC
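To make that concrete: the error message is asking for three environment variables that the username/password login path reads, so one fix is to set them before constructing the client. All three values below are placeholders for your own connected app's settings:

```python
# Sketch: set the three environment variables the RuntimeError above names.
# All values are placeholders; substitute your own connected app's settings.
import os

os.environ["SALESFORCE_CLIENT_ID"] = "<connected-app-consumer-key>"
os.environ["SALESFORCE_CLIENT_SECRET"] = "<connected-app-consumer-secret>"
os.environ["SALESFORCE_REDIRECT_URI"] = "https://login.salesforce.com/services/oauth2/callback"
```

Alternatively, if you already have a session (for example from simple-salesforce), you can skip the OAuth path entirely and pass sessionId and host to SalesforceBulk.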
I also need to do a bulk update in our Salesforce instance using Python, in the same scenario as above.
One question, though: where do I save the CSV file I'm going to use? The code above only shows the CSV filename; the path to the file is never mentioned. Please advise, as this will greatly help with the project I'm currently working on.
Hoping for your prompt response. Many thanks!
The code above assumes the CSV file is in the same directory as the running code. If you want a different location, update the open() call with the path to your CSV. Alternatively, you can build the CSV in memory and pass it to DictReader, or build the dicts in memory and use them directly.
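Those three options, sketched with the standard csv module (the path and data are illustrative):

```python
# Sketch of the three options above; path and data are illustrative only.
import csv
import io
import os

# 1) Explicit path to the CSV instead of relying on the working directory.
path = os.path.join("/data/exports", "user__c.csv")  # hypothetical location
# rows = list(csv.DictReader(open(path, newline="")))

# 2) Build the CSV text in memory and feed it to DictReader via StringIO.
csv_text = "first_name__c,last_name__c,user_id__c\na,b,1\nc,d,2\n"
rows_from_text = list(csv.DictReader(io.StringIO(csv_text)))

# 3) Skip CSV entirely and build the dicts directly.
rows_direct = [
    {"first_name__c": "a", "last_name__c": "b", "user_id__c": "1"},
    {"first_name__c": "c", "last_name__c": "d", "user_id__c": "2"},
]
```

Options 2 and 3 produce identical rows, so whichever is handiest for your data source works.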
Does anyone know how reliable the salesforce_bulk library is for production use? Does it support query jobs that export around 100 MB of data from Salesforce? I have been using simple_salesforce, which is easy to use, but for anything larger than about 15 MB its behavior is unpredictable: sometimes it extracts the data, sometimes it fails.
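I can't vouch for reliability guarantees, but for reference the query-job flow in salesforce_bulk looks like the sketch below, based on the library's documented calls; for very large extracts create_query_job also accepts a pk_chunking flag that splits the export into smaller batches. `bulk` is assumed to be an already-authenticated SalesforceBulk instance, and the SOQL is an example:

```python
# Sketch of a Bulk API export with salesforce_bulk. `bulk` is assumed to be
# an authenticated SalesforceBulk instance; object name and SOQL are examples.
import csv
import io

def parse_result_stream(raw):
    """Decode one batch result stream (bytes) into a list of dicts."""
    return list(csv.DictReader(io.TextIOWrapper(raw, encoding="utf-8")))

def export_object(bulk, object_name, soql):
    job = bulk.create_query_job(object_name, contentType="CSV")
    batch = bulk.query(job, soql)
    bulk.close_job(job)
    bulk.wait_for_batch(job, batch)

    rows = []
    for result in bulk.get_all_results_for_query_batch(batch):
        rows.extend(parse_result_stream(result))
    return rows

# Usage:
# rows = export_object(bulk, "Contact", "SELECT Id, Name FROM Contact")
```

Because results stream back batch by batch, memory use stays bounded even for large extracts, which is the main advantage over pulling everything through the REST API.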
I am having a similar problem:
The object I'm passing is 'Product2', and I can use the same SalesforceBulk object to successfully run a bulk query on Product2 with bulk.create_query_job. The code fails on create_insert_job with the following response:
[400] Bulk API HTTP Error: InvalidJob - No create access for object:Product2
Is this a permissions issue? I am using an admin account.
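It could be: the Bulk API checks object-level create access for the user the session actually belongs to, which may not be the admin you expect if the login goes through a connected app or integration user. One way to narrow it down is the sObject describe call, sketched here with simple_salesforce (`sf` is assumed to be your authenticated Salesforce instance):

```python
# Sketch: check whether the current session's user has create access on
# Product2. `sf` is assumed to be an authenticated simple_salesforce.Salesforce.
def has_create_access(describe_result):
    """True if the sObject describe metadata reports create access."""
    return bool(describe_result.get("createable"))

def check_product2(sf):
    # The describe result's "createable" flag reflects the running user's
    # object-level create permission, the same check the insert job fails on.
    return has_create_access(sf.Product2.describe())
```

If check_product2 returns False, look at the profile and permission sets of the user behind the session rather than the account you think you logged in with.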