Hello,


We are loading close to 10 million records, and to help Data Loader out I figured we could take advantage of a 64-bit JVM and allocate heap sizes north of 2 GB. However, I was disappointed to see that Data Loader throws a Spring error when I simply swap the packaged Data Loader JVM for the 1.6.0_21 JVM. I am attaching the stack trace. This is a real bummer and I would appreciate any help.

 

Here are the JVM options I was planning on using, since the packaged JVM does not allow a heap size greater than about 1–1.5 GB.

 

%JAVA_HOME%\bin\java.exe -Xms3g -Xmx3g -Xmn1g -XX:+UseConcMarkSweepGC -server
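For reference, here is roughly how I planned to wire the 64-bit JVM into the command line runner. This is only a sketch based on my reading of the packaged process.bat; the jar name, the salesforce.config.dir property, and the process name below are assumptions on my part, so please check them against your own install:

REM Point at the 64-bit JDK instead of the JVM bundled with the installer (path is an example)
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_21
REM Launch the command line loader directly with a 3 GB heap
"%JAVA_HOME%\bin\java.exe" -Xms3g -Xmx3g -Xmn1g -XX:+UseConcMarkSweepGC -server ^
  -cp ..\DataLoader.jar -Dsalesforce.config.dir=..\conf ^
  com.salesforce.dataloader.process.ProcessRunner process.name=extractrecordtypetarget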

 


scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
PropertyAccessExceptionsException (1 errors)
org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams'
        at org.springframework.beans.BeanWrapperImpl.doTypeConversionIfNecessary(BeanWrapperImpl.java:839)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:584)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:469)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:626)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:653)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:642)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1027)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:176)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:105)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1013)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at com.salesforce.dataloader.dao.database.DatabaseConfig.getInstance(DatabaseConfig.java:23)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:74)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:58)
        at com.salesforce.dataloader.dao.DataAccessObjectFactory.getDaoInstance(DataAccessObjectFactory.java:60)
        at com.salesforce.dataloader.controller.Controller.createDao(Controller.java:183)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:111)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:222)
2010-10-11 12:17:16,318 FATAL [extractrecordtypetargetProcess] controller.Controller createDao (Controller.java:185) - Error creating data access object
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetarget' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Can't resolve reference to bean 'insertrecordtypetargetSql' while setting property 'sqlConfig'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetargetSql' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetargetSql' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
PropertyAccessExceptionsException (1 errors)
org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams'
        at org.springframework.beans.BeanWrapperImpl.doTypeConversionIfNecessary(BeanWrapperImpl.java:839)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:584)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:469)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:626)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:653)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:642)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1027)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:176)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:105)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1013)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at com.salesforce.dataloader.dao.database.DatabaseConfig.getInstance(DatabaseConfig.java:23)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:74)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:58)
        at com.salesforce.dataloader.dao.DataAccessObjectFactory.getDaoInstance(DataAccessObjectFactory.java:60)
        at com.salesforce.dataloader.controller.Controller.createDao(Controller.java:183)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:111)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:222)
2010-10-11 12:17:16,355 FATAL [extractrecordtypetargetProcess] process.ProcessRunner run (ProcessRunner.java:147) - Error creating data access object
2010-10-11 12:17:16,355 FATAL [extractrecordtypetargetProcess] process.ProcessRunner run (ProcessRunner.java:147) - Error creating data access object

 

Thanks

We are moving data from one Salesforce org to another. We decided to use MySQL as an intermediate sink: we pull from the source org, load into MySQL, and then pump the data into the destination org using Data Loader version 19.

 

I read everything in the documentation regarding the Bulk API and built my SQL to expose column headers in the form "Relationship Name.External Id field", however it just fails with "failed to create batch".

 

Also, where does the SDL file fit into this? Nowhere in the Bulk API documentation is there any mention of SDL. If SDL is a Data Loader concept then someone needs to cover it in the Data Loader manual. Typically I have an SDL file and specify my relationships in the following manner, for example for the CreatedById field:

 

CreatedById = CreatedBy\:ExternalId__c

 

This works fine with the Bulk API turned off, but when I turn the Bulk API setting to true it does not like the colon. I then changed the colon to a period, changed my MySQL query to emit column names in the "Rel.ExtId field" form, and commented out the SDL mapping, but nothing works.
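To make the two attempts concrete, the relevant line of my SDL mapping file looked roughly like one of the two variants below (the relationship and external ID field names here are only examples, not our real schema):

# Variant 1: colon form - works with the Bulk API turned off
CreatedById=CreatedBy\:ExternalId__c
# Variant 2: dot form I switched to for the Bulk API - still fails with "failed to create batch"
CreatedById=CreatedBy.ExternalId__c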

 

What am I missing? Can someone point me in the right direction specifically related to relationships and SDL mappings?

 

We need to load a total of about 10 to 15 million records.

 

Thanks

Hello,

 

The command line Data Loader throws a NullPointerException at arbitrary points during file processing. We are trying to upsert Account records and have about 200,000 records in a MySQL database. We are using the command line Data Loader to query our MySQL DB and then push/upsert the data into Salesforce. We have set the Bulk API to false.

 

java.lang.NullPointerException
        at com.salesforce.dataloader.dyna.SObjectReference.addReferenceToSObject(SObjectReference.java:67)
        at com.salesforce.dataloader.dyna.SforceDynaBean.getSObject(SforceDynaBean.java:290)
        at com.salesforce.dataloader.dyna.SforceDynaBean.getSObjectArray(SforceDynaBean.java:258)
        at com.salesforce.dataloader.client.PartnerClient.loadUpserts(PartnerClient.java:172)
        at com.salesforce.dataloader.action.visitor.UpsertVisitor.executeClientAction(UpsertVisitor.java:53)
        at com.salesforce.dataloader.action.visitor.PartnerLoadVisitor.loadBatch(PartnerLoadVisitor.java:73)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.visit(DAOLoadVisitor.java:131)
        at com.salesforce.dataloader.action.AbstractLoadAction.visitRowList(AbstractLoadAction.java:202)
        at com.salesforce.dataloader.action.AbstractLoadAction.execute(AbstractLoadAction.java:148)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:122)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:126)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:229)
2010-10-01 19:06:26,034 FATAL [queryaccountProcess] action.AbstractLoadAction execute (AbstractLoadAction.java:172) - Exception occured durin

 

Can anyone throw some light on how to resolve this issue?

Hello,

 

I had launched a command line upsert of the Case object. It was running fine and then it suddenly failed with the following error. Is it something to do with the data?

 

2010-09-23 22:33:46,385 DEBUG [insertCaseProcess] visitor.BulkLoadVisitor writeSingleColumn (BulkLoadVisitor.java:193) - No value provided for field: Product_Family_del__c
2010-09-23 22:33:46,385 DEBUG [insertCaseProcess] visitor.BulkLoadVisitor writeSingleColumn (BulkLoadVisitor.java:193) - No value provided for field: Severity__c
2010-09-23 22:33:46,385 DEBUG [insertCaseProcess] visitor.BulkLoadVisitor writeSingleColumn (BulkLoadVisitor.java:193) - No value provided for field: acDate__c
2010-09-23 22:33:46,385 DEBUG [insertCaseProcess] visitor.BulkLoadVisitor writeSingleColumn (BulkLoadVisitor.java:193) - No value provided for field: SuppliedName
2010-09-23 22:33:46,606 INFO  [insertCaseProcess] progress.NihilistProgressAdapter setSubTask (NihilistProgressAdapter.java:68) - Aborting job
2010-09-23 22:33:46,897 FATAL [insertCaseProcess] action.AbstractLoadAction execute (AbstractLoadAction.java:172) - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: Failed to create batch
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.handleException(DAOLoadVisitor.java:171)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.handleException(BulkLoadVisitor.java:114)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:98)
        at com.salesforce.dataloader.action.visitor.DAOLoadVisitor.visit(DAOLoadVisitor.java:131)
        at com.salesforce.dataloader.action.AbstractLoadAction.visitRowList(AbstractLoadAction.java:202)
        at com.salesforce.dataloader.action.AbstractLoadAction.execute(AbstractLoadAction.java:148)
        at com.salesforce.dataloader.controller.Controller.executeAction(Controller.java:122)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:126)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:229)
Caused by: [AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to create batch'
]

        at com.sforce.async.RestConnection.createBatchFromStream(RestConnection.java:146)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatch(BulkLoadVisitor.java:252)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.doOneBatch(BulkLoadVisitor.java:156)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.createBatches(BulkLoadVisitor.java:133)
        at com.salesforce.dataloader.action.visitor.BulkLoadVisitor.loadBatch(BulkLoadVisitor.java:94)
        ... 6 more
Caused by: com.sforce.ws.ConnectionException: unable to find end tag at:  START_TAG seen ..."http://www.force.com/2009/06/asyncapi/dataload">\n <exceptionCode>... @3:17
        at com.sforce.ws.bind.TypeMapper.consumeEndTag(TypeMapper.java:399)
        at com.sforce.async.BatchInfo.load(BatchInfo.java:284)
        at com.sforce.async.BatchRequest.loadBatchInfo(BatchRequest.java:77)
        at com.sforce.async.RestConnection.createBatchFromStream(RestConnection.java:140)
        ... 10 more
2010-09-23 22:33:46,904 ERROR [insertCaseProcess] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:51) - Failed to create batch

Hello,

 

I am running Data Loader through the command line and everything appears to work, with the messages below, however when I open the CSV file I only see the headers and no data rows.

 

4467 [extractAsset] INFO com.salesforce.dataloader.process.ProcessRunner  - Checking the data access object connection
4468 [extractAsset] INFO com.salesforce.dataloader.process.ProcessRunner  - Setting field types
9407 [extractAsset] INFO com.salesforce.dataloader.process.ProcessRunner  - Setting object reference types
23059 [extractAsset] INFO com.salesforce.dataloader.process.ProcessRunner  - Creating Map
23062 [extractAsset] INFO com.salesforce.dataloader.action.OperationInfo  - Instantiating operation ui action: extract
23063 [extractAsset] INFO com.salesforce.dataloader.controller.Controller  - executing operation: extract
23070 [extractAsset] DEBUG com.salesforce.dataloader.client.PartnerClient  - Beginning web service operation: query
25817 [extractAsset] INFO com.salesforce.dataloader.action.progress.NihilistProgressAdapter  - Processed 10 of 10 total records. Rate: 18000000 records per hour. Estimated time to complete: 0 minutes and 0 seconds.  There are 10 successes and 0 errors.
25830 [extractAsset] INFO com.salesforce.dataloader.action.progress.NihilistProgressAdapter  - The extract has fully completed.  There were 10 successful extracts and 0 errors.

 

<bean id="extractAsset"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
  <description>Extracts Asset records from Salesforce and saves them to a CSV file.</description>
    <property name="name" value="extractAsset"/>
    <property name="configOverrideMap">
        <map>
            <entry key="sfdc.debugMessages" value="false"/>
            <entry key="sfdc.debugMessagesFile" value="C:\dev\work\xxx\logs\sfdcSoapTrace.log"/>
            <entry key="sfdc.timeoutSecs" value="600"/>
            <entry key="sfdc.loadBatchSize" value="200"/>
            <entry key="sfdc.entity" value="Asset"/>
            <entry key="sfdc.extractionRequestSize" value="2000"/>
            <entry key="sfdc.extractionSOQL" value="Select a.UsageEndDate, a.SystemModstamp, a.Status, a.SerialNumber, a.Quantity, a.PurchaseDate, a.Product2Id, a.Price, a.Name, a.LastModifiedDate, a.LastModifiedById, a.IsDeleted, a.IsCompetitorProduct, a.InstallDate, a.Id, a.Description, a.CreatedDate, a.CreatedById, a.ContactId, a.AccountId From Asset a LIMIT 10"/>
            <entry key="process.operation" value="extract"/>
            <entry key="dataAccess.type" value="csvWrite"/>
            <entry key="dataAccess.name" value="C:\dev\work\xxx\data\assetData1.csv"/>
        </map>
    </property>
</bean>

 

I have the other properties in config.properties.

 

Thanks

Hello All

 

Please help me with this issue. When I try to load data into salesforce.com through the command line (Data Loader), it shows me the error below, which seems to be related to a class. I am using Data Loader version 20.0:

 

 

C:\Program Files\salesforce.com\Apex Data Loader 20.0\bin>process ..\conf Insert RecordsinAccount
2010-10-01 12:42:24,887 INFO [main] controller.Controller initLog (Controller.java:375) - The log has been initialized
2010-10-01 12:42:24,887 INFO [main] process.ProcessConfig getBeanFactory (ProcessConfig.java:78) - Loading process configuration from config file: C:\Program Files\salesforce.com\Apex Data Loader 20.0\bin\..\conf\process-conf.xml
2010-10-01 12:42:24,966 INFO [main] xml.XmlBeanDefinitionReader loadBeanDefinitions (XmlBeanDefinitionReader.java:163) - Loading XML bean definitions from file [C:\Program Files\salesforce.com\Apex Data Loader 20.0\bin\..\conf\process-conf.xml]
2010-10-01 12:42:25,012 INFO [main] core.CollectionFactory (CollectionFactory.java:66) - JDK 1.4+ collections available
2010-10-01 12:42:25,028 INFO [main] core.CollectionFactory (CollectionFactory.java:71) - Commons Collections 3.x available
2010-10-01 12:42:25,106 INFO [InsertAccount] controller.Controller initConfig (Controller.java:336) - The controller config has been initialized
2010-10-01 12:42:25,122 INFO [InsertAccount] process.ProcessRunner run (ProcessRunner.java:91) - Initializing process engine
2010-10-01 12:42:25,122 INFO [InsertAccount] process.ProcessRunner run (ProcessRunner.java:94) - Loading parameters
2010-10-01 12:42:26,481 INFO [InsertAccount] config.LastRun load (LastRun.java:101) - Last run info will be saved in file: C:\Program Files\salesforce.com\Apex Data Loader 20.0\bin\..\conf\InsertAccount_lastRun.properties
2010-10-01 12:42:26,497 FATAL [main] process.ProcessRunner topLevelError (ProcessRunner.java:212) - Unable to run process InsertAccount
java.lang.RuntimeException: java.lang.IllegalArgumentException: No enum const class com.salesforce.dataloader.action.OperationInfo.
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:137)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:75)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:227)
Caused by: java.lang.IllegalArgumentException: No enum const class com.salesforce.dataloader.action.OperationInfo.
        at java.lang.Enum.valueOf(Enum.java:192)
        at com.salesforce.dataloader.config.Config.getEnum(Config.java:439)
        at com.salesforce.dataloader.config.Config.getOperationInfo(Config.java:975)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:98)
        ... 2 more
C:\Program Files\salesforce.com\Apex Data Loader 20.0\bin>

 

 

Below is my process-conf.xml:

 

Insert Records in Account Object from CSV file through Command Line

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans default-lazy-init="false" default-autowire="no" default-dependency-check="none">
   
  <bean
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false"  lazy-init="default" autowire="default" dependency-check="default">
    <description> Insert Records in Account Object from CSV file through Command Line </description>
 
  <property name="name" value="InsertAccount" />
  <property name="configOverrideMap">
    <map>
      
        <entry key="sfdc.connectionTimeoutSecs" value="60"/>
        <entry key="sfdc.username" value="mohit.bansal@hcl.in.test1" />
        <entry key="sfdc.password" value="9fbaf20cfb2a22e8ad8a3b39836bc71c" />
        <entry key="process.encryptionKeyFile" value="C:\Program Files\salesforce.com\Apex Data Loader 20.0\conf\KeyFile.txt"  />
        <entry key="process.outputSuccess" value="C:\Program Files\salesforce.com\Apex Data Loader 20.0\conf\SuccessFile.csv" />
        <entry key="sfdc.loadBatchSize" value="100" />
        <entry key="dataAccessName" value="C:\Program Files\salesforce.com\Apex Data Loader 20.0\conf\dumpydata1.csv" />
        <entry key="process.useEuropeanDates" value="true" />
        <entry key="dataAccess.readBatchSize" value="100" />
        <entry key="sfdc.maxRetries" value="3" />
        <entry key="process.outputError" value="C:\Program Files\salesforce.com\Apex Data Loader 20.0\conf\ErrorFile.csv" />
        <entry key="sfdc.endpoint" value="https://test.salesforce.com"/>
        <entry key="dataAccess.Type" value="csvRead" />
        <entry key="process.Operation" value="insert" />
        <entry key="sfdc.extractionRequestSize" value="200" />
        <entry key="truncateFields" value="true" />
        <entry key="sfdc.entity" value="Account" />
        <entry key="sfdc.enableRetries" value="true" />
        <entry key="sfdc.timeoutsecs" value="1000" />
        <entry key="process.enableLastRunOutput" value="true" />
        <entry key="process.mappingfile" value="C:\Program Files\salesforce.com\Apex Data Loader 20.0\conf\AccountMapping.sdl" />
   </map>
  
  </property>

 </bean> 

</beans>

 

Please let me know why I am getting this error.

Hi,

 

I am trying to load new users using Data Loader. I am able to load everything (all the required fields), but some of the boolean fields end up checked by default:

 

Receive Salesforce CRM Content Email Alerts, and Receive Salesforce CRM Content Alerts as Daily Digest.

 

I don't want these two fields to become True. How can I make these two fields false while loading the new users?
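For context, my insert file only contains what I understand to be the required User fields, roughly like the sample below, and the two Content checkbox columns are not in the file at all (every value here is invented, and the field list is my own assumption, so please double-check it):

Username,LastName,Email,Alias,ProfileId,TimeZoneSidKey,LocaleSidKey,EmailEncodingKey,LanguageLocaleKey
jdoe@example.com.test1,Doe,jdoe@example.com,jdoe,00e000000000000,America/Los_Angeles,en_US,UTF-8,en_US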

 

Any thoughts would be helpful.

 

Thanks

 

I have a query that returns information I need to paginate. I can return all records, or limit to however many records I'd like, but I want to set up a paging system that returns a specified number of records and then dynamically creates page-number links at the bottom of the report to go to additional pages of records. I have the whole system set up and working, but I cannot figure out how to retrieve records starting after the ones I've already retrieved.
 
For example,...
 
Page one shows the first 20 records.
 
Page two should show the next 20, but I need something equivalent to SQL's LIMIT 20,20 to start my query at record #20 and then retrieve 20 records. Does SOQL have a function like this?
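For illustration, this is the MySQL-style syntax I am referring to (the table and column names are just examples):

-- MySQL: skip the first 20 rows, then return the next 20 (i.e. rows 21-40)
SELECT Id, Name
FROM some_table
ORDER BY Name
LIMIT 20, 20;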
 
Thank you all very much in advance.