I'm using Data Loader CLI v50 with the Bulk API enabled to upload files to ContentVersion in my Salesforce sandbox.

I got the following error:
2020-10-09 10:53:26,046 INFO  [csvContentVersionInsertClosed_0-999] visitor.BulkApiVisitorUtil createJob (BulkApiVisitorUtil.java:115) - Created Bulk API Job: 7506C000003fFeKQAU
2020-10-09 10:53:34,267 ERROR [csvContentVersionInsertClosed_0-999] action.AbstractAction handleException (AbstractAction.java:222) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:147)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:148)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:137)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.visit(DAOLoadVisitor.java:105)
        at com.salesforce.dataloader.action.AbstractLoadAction.visit(AbstractLoadAction.java:85)
        at com.salesforce.dataloader.action.AbstractAction.execute(AbstractAction.java:131)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:173)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:156)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:107)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:266)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:340)
        at com.salesforce.dataloader.action.visitor.BulkApiVisitorUtil.createBatch(BulkApiVisitorUtil.java:136)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:275)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:189)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:167)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:134)
        ... 7 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
        at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:474)
        at com.sforce.async.BatchInfo.load(BatchInfo.java:310)
        at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:98)
        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:334)
        ... 12 more
2020-10-09 10:53:34,272 ERROR [csvContentVersionInsertClosed_0-999] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:58) - Failed to create batch
2020-10-09 10:53:34,284 ERROR [csvContentVersionInsertClosed_0-999] action.AbstractAction handleException (AbstractAction.java:222) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:147)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:148)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:137)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.flushRemaining(DAOLoadVisitor.java:125)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.flushRemaining(BulkLoadVisitor.java:281)
        at com.salesforce.dataloader.action.AbstractLoadAction.flush(AbstractLoadAction.java:92)
        at com.salesforce.dataloader.action.AbstractAction.execute(AbstractAction.java:138)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:173)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:156)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:107)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:266)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:340)
        at com.salesforce.dataloader.action.visitor.BulkApiVisitorUtil.createBatch(BulkApiVisitorUtil.java:136)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:275)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:189)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:167)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:134)
        ... 8 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
        at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:474)
        at com.sforce.async.BatchInfo.load(BatchInfo.java:310)
        at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:98)
        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:334)
        ... 13 more
2020-10-09 10:53:34,286 ERROR [csvContentVersionInsertClosed_0-999] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:58) - Failed to create batch

Please advise.

Thanks,

Tien
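For context, the bean I'm running follows the same pattern as my other jobs; a sketch is below. Paths and credentials are placeholders, and the `sfdc.bulkApiZipContent` key is an assumption on my part (taken from the GUI's "upload Bulk API batch as zip" option, since Bulk API v1 expects binary file content to be zipped), so it's worth verifying against your Data Loader version:

```xml
<bean id="ContentVersionInsert" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
   <description>Insert files into ContentVersion (sketch; values are placeholders)</description>
   <property name="name" value="csvContentVersionInsert"/>
   <property name="configOverrideMap">
      <map>
         <entry key="sfdc.endpoint" value="https://test.salesforce.com"/>
         <entry key="sfdc.username" value="myuser"/>
         <entry key="sfdc.password" value="mypassword"/>
         <entry key="sfdc.entity" value="ContentVersion"/>
         <entry key="process.operation" value="insert"/>
         <entry key="sfdc.useBulkApi" value="true"/>
         <!-- Assumption: binary content over Bulk API v1 must be sent as a zipped batch -->
         <entry key="sfdc.bulkApiZipContent" value="true"/>
         <entry key="dataAccess.type" value="csvRead"/>
         <entry key="dataAccess.name" value="C:\Jira_SF_Data\CSV_Data\ContentVersion.csv"/>
         <entry key="process.mappingFile" value="C:\Jira_SF_Data\Mappings\ContentVersion.sdl"/>
      </map>
   </property>
</bean>
```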

I am trying to migrate data from Jira to my Salesforce org. I have to import a large volume of tickets (~6,000 records), comments (~70,000 records), and attachments (~50,000 records). After some research, I found that I can use Data Loader with the Bulk API enabled.

I split my data and process 1,000 tickets at a time (importing 1,000 Cases, then the CaseComments and ContentVersions for those tickets). I successfully upserted 1,000 Cases, but when I tried to insert the 10,340 comments for those Cases, all of them failed.
Checking the error log, I saw many errors:
CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:CaseCommentTrigger: execution of AfterInsert

caused by: ConnectApi.RateLimitException: You have reached the Connect API's hourly request limit for this user and application.  Please try again later.

Class.ConnectApi.ChatterFeeds.postFeedElement: line 1458, column 1
Class.CaseCommentHandler: line 223, column 1
Trigger.CaseCommentTrigger: line 21, column 1
This is the settings I used:
<bean id="CaseCommentInsertClosed_0-999" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
      <description>description</description>
      <property name="name" value="csvCaseCommentInsertClosed_0-999"/>
      <property name="configOverrideMap">
         <map>
            <entry key="sfdc.timeoutSecs" value="600"/>
            <entry key="sfdc.bulkApiCheckStatusInterval" value="10000"/>
            <entry key="sfdc.debugMessages" value="true"/>
            <entry key="sfdc.debugMessagesFile" value="C:\Jira_SF_Data\DataLoader_Log\CaseCommentInsertClosed_0-999SoapTrace.log"/>
            <entry key="sfdc.endpoint" value="myenpoint"/>
            <entry key="sfdc.username" value="myuser"/>
            <entry key="sfdc.password" value="mypassword"/>
            <entry key="sfdc.entity" value="CaseComment"/>
            <entry key="process.operation" value="insert"/>
            <entry key="process.mappingFile" value="C:\Users\OPSWAT\OneDrive - OPSWAT\Jira2SF_Migration\Jira2SF\Mappings\CaseComment.sdl"/>
            <entry key="dataAccess.name" value="C:\Jira_SF_Data\CSV_Data\CaseComment_Closed_0-999.csv"/>
            <entry key="process.outputSuccess" value="C:\Jira_SF_Data\DataLoader_Log\CaseCommentInsertClosed_0-999_success.csv"/>
            <entry key="process.outputError" value="C:\Jira_SF_Data\DataLoader_Log\CaseCommentInsertClosed_0-999_error.csv"/>
            <entry key="dataAccess.type" value="csvRead"/>
            <entry key="dataAccess.readUTF8" value="true"/>
            <entry key="sfdc.externalIdField" value=""/>
            <entry key="sfdc.extractionSOQL" value=""/>
            <entry key="sfdc.useBulkApi" value="true"/>
            <entry key="sfdc.bulkApiSerialMode" value="true"/>
            <entry key="sfdc.truncateFields" value="true"/>
            <entry key="sfdc.writeBatchSize" value="1000"/>
            <entry key="sfdc.readBatchSize" value="100"/>
            <entry key="sfdc.loadBatchSize" value="200"/>
         </map>
      </property>
   </bean>
I found this document about Bulk API limits (https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_platform_bulkapi.htm). I am using Data Loader with Bulk API v1, but I don't think I exceed the limits with those settings; or maybe I am misunderstanding them.
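In case it helps, one workaround I'm considering is splitting each comment CSV into smaller files and running them an hour apart, so the Chatter feed posts from the trigger stay under the hourly Connect API limit. A rough sketch of the splitting step (the chunk size is arbitrary, not a documented limit):

```python
import csv
import io

def split_rows(rows, chunk_size):
    """Split a list of CSV data rows into chunks of at most chunk_size rows."""
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

def split_csv(text, chunk_size):
    """Split CSV text into smaller CSV texts, repeating the header in each chunk."""
    reader = csv.reader(io.StringIO(text))
    header, *rows = list(reader)
    chunks = []
    for chunk in split_rows(rows, chunk_size):
        buf = io.StringIO()
        writer = csv.writer(buf, lineterminator="\n")
        writer.writerow(header)   # every output file keeps the original header
        writer.writerows(chunk)
        chunks.append(buf.getvalue())
    return chunks
```

Each resulting file could then be pointed at by its own `dataAccess.name` entry and scheduled an hour apart.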

Please advise how to avoid exceeding the rate limit.

Thanks in advance.
 

Hi,

I am migrating all tickets from Jira to Salesforce. I have successfully imported cases and case comments. To upload a file to a case, I'm using the ContentVersion object with a trigger that sets FirstPublishLocationId to the case's Id after insert. The files are attached to the case, but the OwnerId field isn't affected: the owner defaults to the user running the API.

I read some posts saying that OwnerId is only updatable, not creatable.

I tried to export all records from the ContentVersion object, hoping to get the Id of each file and update its OwnerId. The problem is that the file I uploaded to Salesforce via the API doesn't appear in the report.
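If I could get the Ids, my plan was to join the export against a title-to-owner map and feed the result back to Data Loader as an update job. A sketch of that join (it assumes the export has `Id` and `Title` columns; the 068/005 Ids below are made up for illustration):

```python
import csv
import io

def build_owner_updates(export_csv, owner_by_title):
    """From a ContentVersion export with Id and Title columns, build
    (Id, OwnerId) update rows for files whose Title has a known target owner."""
    reader = csv.DictReader(io.StringIO(export_csv))
    updates = []
    for row in reader:
        owner = owner_by_title.get(row["Title"])
        if owner:
            updates.append({"Id": row["Id"], "OwnerId": owner})
    return updates
```

The returned rows could be written out as a CSV and run through an `update` operation on ContentVersion.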

I am really stuck. Can you please advise how to update the owner of the file, or suggest another approach for uploading files to a case?

Thanks in advance.

 


I have been tasked with building an API that updates Salesforce Account records with data from Beauhurst whenever the Salesforce record has a CH number.

I was thinking of a daily mass update using the Bulk API and a scheduler class, but I would also like the option for a user to click a button on a single Account record page to update just that record. I guess this is a separate REST API call, totally independent of the daily one?

I've never built an API from scratch before. Is there anything I need to be aware of when using the Bulk API?
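For the mapping step of the daily job, I imagine something like the sketch below; the Beauhurst field names and the `CH_Number__c` custom field are made up, since I don't know the real payload shape yet:

```python
def beauhurst_to_account(payload):
    """Map a (hypothetical) Beauhurst company payload to Salesforce Account
    fields, keyed on the CH number for matching. Field names are assumptions."""
    return {
        "CH_Number__c": payload["companies_house_id"],  # hypothetical custom field
        "Name": payload["name"],
        "NumberOfEmployees": payload.get("employee_count"),
        "Website": payload.get("website"),
    }
```

The daily job would map each payload this way and hand the batch to a Bulk update, while the per-record button could reuse the same function for a single account.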

I'm new to the Data Loader, so I'm testing its batch/command-line capabilities. In doing so, I got the Bulk API enabled and working within my sandbox, but my test insert job failed after 15,000 records. I checked the storage space, and that's not the issue.
 
The only thing I can see is the "unable to find end tag" error, and I can't find any documentation on what it means. Any help would be great. Thanks!
 

 
 
2010-03-25 10:45:52,973 INFO  [deleteAccounts] action.ActionFactory getActionInstance (ActionFactory.java:64) - Instantiating operation: delete
2010-03-25 10:45:53,254 INFO  [deleteAccounts] controller.Controller executeAction (Controller.java:114) - executing operation: delete
2010-03-25 10:45:53,254 INFO  [deleteAccounts] action.AbstractLoadAction execute (AbstractLoadAction.java:130) - Loading Using Bulk API: delete
2010-03-25 10:45:53,426 INFO  [deleteAccounts] visitor.BulkLoadVisitor createJobInfo (BulkLoadVisitor.java:207) - Created Bulk API Job: 750Q00000008TqMIAU
2010-03-25 10:45:53,738 INFO  [deleteAccounts] progress.NihilistProgressAdapter setSubTask (NihilistProgressAdapter.java:68) - Aborting job
2010-03-25 10:45:54,082 FATAL [deleteAccounts] action.AbstractLoadAction execute (AbstractLoadAction.java:172) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:165)
at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:169)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:107)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:91)
at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.visit(DAOLoadVisitor.java:125)
at com.salesforce.dataloader.action.AbstractLoadAction.visitRowList(AbstractLoadAction.java:202)
at com.salesforce.dataloader.action.AbstractLoadAction.execute(AbstractLoadAction.java:148)
at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:115)
at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:130)
at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:222)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

at com.sforce.async.RestConnection.createBatchFromStream(RestConnection.java:148)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:217)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:133)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:113)
at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:87)
... 6 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:398)
at com.sforce.async.BatchInfo.load(BatchInfo.java:200)
at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:75)
at com.sforce.async.RestConnection.createBatchFromStream(RestConnection.java:142)
... 10 more
2010-03-25 10:45:54,145 ERROR [deleteAccounts] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:51) - Failed to create batch