Starting November 20, the site will be set to read-only. On December 4, 2023,
forum discussions will move to the Trailblazer Community.
Tien Tran 29

Failed to create Bulk API job, caused by exception: Unable to find end tag at: START_TAG

I am using Data Loader CLI v50 with Bulk API enabled to upload files to ContentVersion in my Salesforce sandbox.
 
I got the following error:
2020-10-09 10:53:26,046 INFO  [csvContentVersionInsertClosed_0-999] visitor.BulkApiVisitorUtil createJob (BulkApiVisitorUtil.java:115) - Created Bulk API Job: 7506C000003fFeKQAU
2020-10-09 10:53:34,267 ERROR [csvContentVersionInsertClosed_0-999] action.AbstractAction handleException (AbstractAction.java:222) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:147)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:148)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:137)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.visit(DAOLoadVisitor.java:105)
        at com.salesforce.dataloader.action.AbstractLoadAction.visit(AbstractLoadAction.java:85)
        at com.salesforce.dataloader.action.AbstractAction.execute(AbstractAction.java:131)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:173)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:156)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:107)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:266)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:340)
        at com.salesforce.dataloader.action.visitor.BulkApiVisitorUtil.createBatch(BulkApiVisitorUtil.java:136)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:275)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:189)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:167)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:134)
        ... 7 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
        at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:474)
        at com.sforce.async.BatchInfo.load(BatchInfo.java:310)
        at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:98)
        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:334)
        ... 12 more
2020-10-09 10:53:34,272 ERROR [csvContentVersionInsertClosed_0-999] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:58) - Failed to create batch
2020-10-09 10:53:34,284 ERROR [csvContentVersionInsertClosed_0-999] action.AbstractAction handleException (AbstractAction.java:222) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:147)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:148)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:137)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.flushRemaining(DAOLoadVisitor.java:125)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.flushRemaining(BulkLoadVisitor.java:281)
        at com.salesforce.dataloader.action.AbstractLoadAction.flush(AbstractLoadAction.java:92)
        at com.salesforce.dataloader.action.AbstractAction.execute(AbstractAction.java:138)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:173)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:156)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:107)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:266)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:340)
        at com.salesforce.dataloader.action.visitor.BulkApiVisitorUtil.createBatch(BulkApiVisitorUtil.java:136)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:275)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:189)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:167)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:134)
        ... 8 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
        at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:474)
        at com.sforce.async.BatchInfo.load(BatchInfo.java:310)
        at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:98)
        at com.sforce.async.BulkConnection.createBatchWithInputStreamAttachments(BulkConnection.java:334)
        ... 13 more
2020-10-09 10:53:34,286 ERROR [csvContentVersionInsertClosed_0-999] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:58) - Failed to create batch

Please advise.

Thanks,

Tien

Tien Tran 29
I decreased the batch_size from 2000 to 100. It worked for 900 records, then failed. I need to process 12,691 records. Please advise.
Tien Tran 29
I also tested with batch_size = 10. I got 987 successes and 3 errors; after that, the number of failed batches kept increasing until I hit the same error and exception as above. I'm not sure what the proper settings are for uploading a large number of files to Salesforce. :(
Tien Tran 29
These are the settings I used:
<bean id="ContentVersionInsertClosed_0-999" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
      <description>description</description>
      <property name="name" value="csvContentVersionInsertClosed_0-999"/>
      <property name="configOverrideMap">
         <map>
            <entry key="sfdc.timeoutSecs" value="600"/>
            <entry key="sfdc.bulkApiCheckStatusInterval" value="10000"/>
            <entry key="sfdc.debugMessages" value="true"/>
            <entry key="sfdc.debugMessagesFile" value="C:\Jira_SF_Data\DataLoader_Log\ContentVersionInsertClosed_0-999SoapTrace.log"/>
            <entry key="sfdc.endpoint" value="my endpoint"/>
            <entry key="sfdc.username" value="my username"/>
            <entry key="sfdc.password" value="my password"/>
            <entry key="sfdc.entity" value="ContentVersion"/>
            <entry key="process.operation" value="insert"/>
            <entry key="process.mappingFile" value="C:\Users\OPSWAT\OneDrive - OPSWAT\Jira2SF_Migration\Jira2SF\Mappings\ContentVersion.sdl"/>
            <entry key="dataAccess.name" value="C:\Jira_SF_Data\CSV_Data\ContentVersion_Closed_0-999.csv"/>
            <entry key="process.outputSuccess" value="C:\Jira_SF_Data\DataLoader_Log\ContentVersionInsertClosed_0-999_success.csv"/>
            <entry key="process.outputError" value="C:\Jira_SF_Data\DataLoader_Log\ContentVersionInsertClosed_0-999_error.csv"/>
            <entry key="dataAccess.type" value="csvRead"/>
            <entry key="dataAccess.readUTF8" value="true"/>
            <entry key="sfdc.externalIdField" value=""/>
            <entry key="sfdc.extractionSOQL" value=""/>
            <entry key="sfdc.useBulkApi" value="true"/>
            <entry key="sfdc.bulkApiSerialMode" value="true"/>
            <entry key="sfdc.truncateFields" value="true"/>
            <entry key="sfdc.writeBatchSize" value="1000"/>
            <entry key="sfdc.readBatchSize" value="750"/>
            <entry key="sfdc.loadBatchSize" value="10"/>
            <entry key="sfdc.bulkApiZipContent" value="true"/>
         </map>
      </property>
   </bean>
Vinay (Salesforce Developers)
Hi Tien,

This might be related to the maximum overall size the Bulk API can handle in a single request. Larger individual attachments require smaller batch sizes; try setting the batch size to 1 and check.

https://salesforce.stackexchange.com/questions/251842/how-to-upload-contentversion-records-using-bulkapi
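In the process-conf.xml Tien posted, that suggestion corresponds to changing the sfdc.loadBatchSize override inside the configOverrideMap (shown here in isolation; the rest of the bean stays the same):

```xml
<!-- Bulk API batch size: one record (and hence one attachment) per batch -->
<entry key="sfdc.loadBatchSize" value="1"/>
```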

Thanks,
Tien Tran 29
Hi Vinay,

Thanks for your answer. I tried batch_size = 1; it ran for a while, then many batches failed.

For anyone who runs into the same issue, here is my suggestion: use the SOAP API for ContentVersion (disable the Bulk API), because the Bulk API has several limits:

- You have to enable bulkApiZipContent; then:
- The length of any file name can't exceed 512 bytes
- A zip file can't exceed 10 MB
- A zip file can contain a maximum of 1,000 files, which means you have to split your data into multiple CSV files
- The total size of the unzipped content can't exceed 20 MB

Note: even when you use the SOAP API, the size limit for a request is 52 MB, so you have to decrease your batch size if you get the error "Maximum size of request reached" when importing Attachment or Content records.

Also, if you are testing in a sandbox environment, the default limit on published content is 2,500 per 24 hours. You need to contact Salesforce support to increase this limit.

Hope it helps.
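The "split your data into multiple CSV files" step above can be sketched as follows. This is a minimal, hypothetical helper, not part of Data Loader itself; it assumes each CSV row maps to one file whose size you can look up (for example via os.path.getsize on the path in the VersionData column of a typical ContentVersion mapping), and it groups rows so each chunk respects the 1,000-files and 20 MB unzipped limits:

```python
import csv

MAX_FILES_PER_ZIP = 1000                 # Bulk API: max files per zip
MAX_UNZIPPED_BYTES = 20 * 1024 * 1024    # Bulk API: 20 MB unzipped content

def split_rows(rows, file_size,
               max_files=MAX_FILES_PER_ZIP, max_bytes=MAX_UNZIPPED_BYTES):
    """Group CSV rows into chunks that respect the per-zip limits.

    rows: list of dicts, one per ContentVersion record
    file_size: function mapping a row to its attachment size in bytes
               (hypothetical -- e.g. os.path.getsize(row["VersionData"]))
    """
    chunks, current, current_bytes = [], [], 0
    for row in rows:
        size = file_size(row)
        # Start a new chunk when adding this row would break either limit.
        if current and (len(current) >= max_files
                        or current_bytes + size > max_bytes):
            chunks.append(current)
            current, current_bytes = [], 0
        current.append(row)
        current_bytes += size
    if current:
        chunks.append(current)
    return chunks

def write_chunks(chunks, fieldnames, prefix="ContentVersion_part"):
    """Write each chunk to its own CSV for a separate Data Loader run."""
    paths = []
    for i, chunk in enumerate(chunks):
        path = f"{prefix}{i}.csv"
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(chunk)
        paths.append(path)
    return paths
```

Each output CSV then gets its own ProcessRunner bean (as in the configuration above), so no single Bulk API job exceeds the zip limits.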

 