sforcenewbie

DL 18 gives an error when I use the latest JDK/JVM 1.6_21

Hello,


We are loading close to 10 million records, and to help Data Loader out I figured we could take advantage of a 64-bit JVM and allocate heap sizes north of 2 GB. However, I was disappointed to see that Data Loader throws a Spring error when I simply swap the JVM packaged with the SF Data Loader for the 1.6_21 JVM. I am attaching the stack trace. This is a real bummer and I would appreciate any help.

 

Here are the JVM options I was planning to use, since the packaged JVM does not allow a heap size greater than about 1 to 1.5 GB.

 

%JAVA_HOME%\bin\java.exe -Xms3g -Xmx3g -Xmn1g -XX:+UseConcMarkSweepGC -server
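For reference, roughly the full command I had in mind for launching the command-line Data Loader with that heap. This is only a sketch: the jar name, config directory, and process name are placeholders based on the install path in the log below, and -Dsalesforce.config.dir is assumed to be the property the stock command-line scripts pass to ProcessRunner, so adjust to your own setup.

rem Rough sketch only -- jar name, config dir, and process name are placeholders
rem taken from the install path and log output; -Dsalesforce.config.dir is assumed
rem to match what the stock command-line scripts pass to ProcessRunner.
%JAVA_HOME%\bin\java.exe -Xms3g -Xmx3g -Xmn1g -XX:+UseConcMarkSweepGC -server ^
    -cp "C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\DataLoader.jar" ^
    -Dsalesforce.config.dir="C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\conf" ^
    com.salesforce.dataloader.process.ProcessRunner process.name=extractrecordtypetarget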

 


scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
PropertyAccessExceptionsException (1 errors)
org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams'
        at org.springframework.beans.BeanWrapperImpl.doTypeConversionIfNecessary(BeanWrapperImpl.java:839)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:584)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:469)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:626)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:653)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:642)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1027)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:176)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:105)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1013)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at com.salesforce.dataloader.dao.database.DatabaseConfig.getInstance(DatabaseConfig.java:23)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:74)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:58)
        at com.salesforce.dataloader.dao.DataAccessObjectFactory.getDaoInstance(DataAccessObjectFactory.java:60)
        at com.salesforce.dataloader.controller.Controller.createDao(Controller.java:183)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:111)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:222)
2010-10-11 12:17:16,318 FATAL [extractrecordtypetargetProcess] controller.Controller createDao (Controller.java:185) - Error creating data access object
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetarget' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Can't resolve reference to bean 'insertrecordtypetargetSql' while setting property 'sqlConfig'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetargetSql' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'insertrecordtypetargetSql' defined in file [C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\xxx\scripts\..\conf\database-conf.xml]: Error setting property values; nested exception is org.springframework.beans.PropertyAccessExceptionsException: PropertyAccessExceptionsException (1 errors); nested propertyAccessExceptions are: [org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams']
PropertyAccessExceptionsException (1 errors)
org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.apache.commons.collections.map.LinkedMap] to required type [java.util.HashMap] for property 'sqlParams'
        at org.springframework.beans.BeanWrapperImpl.doTypeConversionIfNecessary(BeanWrapperImpl.java:839)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:584)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:469)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValue(BeanWrapperImpl.java:626)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:653)
        at org.springframework.beans.BeanWrapperImpl.setPropertyValues(BeanWrapperImpl.java:642)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1027)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:176)
        at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:105)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1013)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:824)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:345)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:226)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:147)
        at com.salesforce.dataloader.dao.database.DatabaseConfig.getInstance(DatabaseConfig.java:23)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:74)
        at com.salesforce.dataloader.dao.database.DatabaseWriter.<init>(DatabaseWriter.java:58)
        at com.salesforce.dataloader.dao.DataAccessObjectFactory.getDaoInstance(DataAccessObjectFactory.java:60)
        at com.salesforce.dataloader.controller.Controller.createDao(Controller.java:183)
        at com.salesforce.dataloader.process.ProcessRunner.run(ProcessRunner.java:111)
        at com.salesforce.dataloader.process.ProcessRunner.main(ProcessRunner.java:222)
2010-10-11 12:17:16,355 FATAL [extractrecordtypetargetProcess] process.ProcessRunner run (ProcessRunner.java:147) - Error creating data access object

 

Thanks

sforcenewbie

OK, I am not sure whether this is supported, but I opened DataLoader.jar and replaced the org.springframework folder with the latest release from Spring.

 

I then repackaged the jar and the error went away :)
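In case anyone wants to try the same workaround, this is roughly what I did. The jar name and folder names are placeholders, and it assumes a JDK with the jar tool on the PATH and that the classes from the newer Spring release have been extracted into a local spring-new folder.

rem Work on a copy so the original install stays untouched (paths are placeholders)
copy "C:\Program Files (x86)\salesforce.com\Apex Data Loader 18.0\DataLoader.jar" DataLoader-patched.jar
rem Overwrite the bundled Spring classes inside the jar with the newer ones;
rem spring-new\org\springframework\... holds the classes extracted from the new Spring release
jar uf DataLoader-patched.jar -C spring-new org/springframework
rem Then drop the patched jar back into the install folder (or point the scripts at it) and re-run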

 

I will post back if I run into any other errors due to this.

 

Cheers

 

dkador

You're certainly welcome to make any changes you'd like locally, with the caveat that if you run into additional problems we might not be able to provide the same level of support.  Glad you were able to work around this for now.

sforcenewbie

I am still listening for a solution if there is one :)

 

Thanks anyway for responding...

Superfell

Can you expand on why you want to do this at all? While you're loading a lot of records, the Data Loader does everything in chunks and is certainly not loading them all into memory at once, so 2 GB+ heaps shouldn't be needed. Have you run into a specific issue?

sforcenewbie

Our lead table has over 370 columns. For simplicity, and to avoid having to figure out column sizes, we have kept all the MySQL columns as LONGTEXT, except the relevant ID fields, which we kept at varchar(18).

 

When we limit it to 100K records (the lead table has 900K records), it loads grudgingly (10K batch size) after a big spike in memory, etc. When I raise this limit to 200K, the JVM fails with an out-of-memory error.

 

This forces us to split our lead table load into several chunks. We felt that using a server (64-bit, with more RAM) would at least speed up the process with less strain, but I was disappointed to see that a Spring error was thrown when we used the latest JDK.

 

Please let me know if there is a better way to approach this.