mwendling

data loader batch size bug?

I'm using the AppExchange data loader 8.0 to upload from my DB2 database to sforce.
If I use an sfdc.loadBatchSize value that is not an exact factor of my
result set size, I end up with an error:

2007-02-07 16:11:49,155 ERROR [testObMasterProcess] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:51) - Error encounted trying to get value for column:  for row #75 (database execute query). Database configuration: queryJoborder.  Error: Invalid operation: result set closed.

My expected result set in this case is 74 rows.

Everything works fine with the following values:
sfdc.loadBatchSize=1
sfdc.loadBatchSize=2
sfdc.loadBatchSize=37
sfdc.loadBatchSize=74

It fails with the following values:
sfdc.loadBatchSize=10
sfdc.loadBatchSize=20
sfdc.loadBatchSize=100

Is this a bug in the Data Loader or am I missing something?
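
For reference, here is a rough key=value view of the relevant settings (100 and 200 are values from one failing run, queryJoborder is the database configuration named in the error above, and dataAccess.name is my best guess at the key that points at it in database-conf.xml -- the exact key names may differ slightly in your setup):

# rows sent to sforce per upload call -- fails whenever this is not an exact factor of the result set size
sfdc.loadBatchSize=100
# rows fetched from DB2 per database read
dataAccess.readBatchSize=200
# database configuration referenced in the error (defined in database-conf.xml)
dataAccess.name=queryJoborder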

-Matt

AlexWSFDC
This could potentially be a bug; please file a case with support. Do you know what your dataAccess.readBatchSize was set to? That setting controls the size of the database batch.

--Alex
mwendling
For dataAccess.readBatchSize I tried both 10 and 200.  The results were the same.
AlexWSFDC
This sounds like a legitimate bug. Please file a case with support and include the details you've posted here, plus your Data Loader version, your system information, and your Data Loader configuration.

Thank you for finding this problem; we'll do our best to address it.
knichols
Any updates on this?  I just ran into the same problem...
AlexWSFDC
I haven't seen a case filed for this. Does anyone have more information?