JBerry
Dataloader command line = slow loading?
Hi,
I was wondering if this is typical or not. When I use the command line to update data in Salesforce, it runs extremely slowly.
I am using the "process ../conf csvupdate" command, with my process-conf.xml file located in my /conf directory.
The process starts off quickly, but when it reaches "Setting object reference types", this step alone takes 10 minutes on average. When I run the same update in the GUI with the same field mappings file, it is nearly instant, at a reported speed of 27,000 records per minute.
Any ideas? I'm not getting any errors; everything completes successfully, and there's nothing in my error logs.
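For reference, one way to narrow this down is to time the command-line run and keep its console output for comparison with the GUI run. A minimal sketch; the install path, config directory, and process name are assumptions taken from the post, so adjust them to your setup:

```shell
#!/bin/sh
# Sketch: time a Data Loader CLI run and capture its output.
# DL_BIN and CONF_DIR are assumptions -- point them at your install.
DL_BIN="${DL_BIN:-/opt/dataloader/bin/process.sh}"
CONF_DIR="${CONF_DIR:-../conf}"

start=$(date +%s)
if [ -x "$DL_BIN" ]; then
    # Run the "csvupdate" process and log everything, including the
    # timestamped "Setting object reference types" step, to run.log.
    "$DL_BIN" "$CONF_DIR" csvupdate 2>&1 | tee run.log
else
    echo "Data Loader not found at $DL_BIN" | tee run.log
fi
end=$(date +%s)
echo "Elapsed: $((end - start))s"
```

Comparing the timestamps in run.log against a GUI run of the same mapping should show whether the 10 minutes is spent entirely in that one step or spread across the run.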
I changed my process batch size down to 5 and imported only 5 records. It still took 10 minutes to process at the same spot.
Am I looking in the wrong place to set the buffer? I was under the impression that it's the process batch size setting inside my process-conf.xml file.
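For context, in a command-line setup the batch size is normally set as a config-override entry inside the ProcessRunner bean in process-conf.xml. A minimal sketch of such a bean is below; the endpoint, username, file paths, and values are placeholders, and exact keys can vary by Data Loader version:

```xml
<!-- Hypothetical "csvupdate" bean; credentials and paths are placeholders. -->
<bean id="csvupdate"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
  <property name="name" value="csvupdate"/>
  <property name="configOverrideMap">
    <map>
      <entry key="sfdc.endpoint"       value="https://login.salesforce.com"/>
      <entry key="sfdc.username"       value="user@example.com"/>
      <entry key="process.operation"   value="update"/>
      <entry key="process.mappingFile" value="conf/updateMap.sdl"/>
      <entry key="dataAccess.type"     value="csvRead"/>
      <entry key="dataAccess.name"     value="data/update.csv"/>
      <!-- Batch size used during the load phase itself -->
      <entry key="sfdc.loadBatchSize"  value="200"/>
    </map>
  </property>
</bean>
```

Note that a batch-size setting like this only governs how records are chunked during the load calls; if the delay happens in an earlier phase such as "Setting object reference types", that would be consistent with shrinking the batch size to 5 making no difference.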
Could you please send me your process-conf.xml file, so I can take a look?
Please make sure it doesn't include any confidential information before you send it.
Thanks,
Markus