
Hi,

I have enabled the State and Country picklists feature in my org. Unfortunately, I have encountered some issues. For example, I use Marketo for lead management.

When Marketo tries to sync leads to Salesforce with no state specified, I receive this error message:

FIELD_INTEGRITY_EXCEPTION: There's a problem with this state, even though it may appear correct. Please select a state from the list of valid states.: State/Province

Does anyone know what this means and how I can fix it?
Also, if the country is United States, is a state value mandatory? Can I have a lead from the United States with an empty state? I really think this should be an option.

Thanks,
Shiran.

Hi,

 

I want to set a checkbox to true if my contact's time zone falls within a certain range.

So I created a time zone field on my Contact (a formula field that sets the time zone based on State) and specified some time ranges in my custom setting, from which I made a custom list variable.

Now I want to set my checkbox to true if my Contact's time zone is between my global list's range1 and range2.

I don't understand how to write the formula for this. Any help?

 

Thanks in advance!
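For the question above, one possible sketch of a checkbox formula field, assuming the Contact's time zone is stored as a numeric offset in a hypothetical Timezone__c field and the range boundaries live in a hierarchy custom setting named TZ_Range__c with number fields Range1__c and Range2__c (formula fields can read hierarchy custom settings through $Setup, but not list custom settings):

AND(
    Timezone__c >= $Setup.TZ_Range__c.Range1__c,
    Timezone__c <= $Setup.TZ_Range__c.Range2__c
)

If the ranges must stay in a list custom setting, the comparison would have to move out of the formula field and into automation such as a workflow, flow, or trigger.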

Hello friends,

Can you please tell me how many fields we can track through audit trails?

Thanks

I'm trying to load data into Salesforce.com using Data Loader v24 and the Bulk API.

My CSV file has 100K records, so I have configured Data Loader to use the Bulk API with a batch size of 10,000.

Executing the import, I noticed that it creates a single bulk job with many batches, but the number of records processed in each batch is not 10,000.

 

I do not understand why, but the split seems random and produces a large number of batches (some batches have only 1 record!).

 

I tried the same with version 23 and got the same result, but with version 19 everything works fine (10,000 records per batch).


Has anyone else had the same problem?
Is this a problem recognized by Salesforce.com?
Are we forced to use version 19?

 

Thanks
Simone
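For reference, a minimal sketch of the relevant settings as they would appear in a process-conf configuration, assuming the standard Data Loader configuration keys (the GUI's Bulk API and batch size options map to the same properties); this only shows where the 10,000 batch size is declared and does not explain the uneven splitting:

<!-- Enable the Bulk API and request 10,000 records per batch -->
<entry key="sfdc.useBulkApi" value="true"/>
<entry key="sfdc.loadBatchSize" value="10000"/>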

Hi,

I am automating a migration from a legacy CRM to Salesforce.com using the Apex Data Loader CLI. I want to schedule three jobs: J1, J2, and J3. J2 and J3 should run only after the successful execution of J1. I tried doing this by setting a time lag in Windows Task Scheduler. Is there any other way of going about this?
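One alternative to a fixed time lag is a single wrapper script that runs the jobs in sequence and only continues when the previous step exits cleanly; Task Scheduler then only needs to launch this one script. A rough Windows batch sketch, assuming J1, J2, and J3 are process names defined in process-conf.xml and that process.bat propagates the Java exit code (worth verifying on your Data Loader version):

@echo off
rem Run J1 first and stop if it fails
call process.bat "..\conf" J1
if errorlevel 1 exit /b 1

rem J2 and J3 run only after J1 succeeds
call process.bat "..\conf" J2
call process.bat "..\conf" J3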


Since I started using Data Loader v21.0, I appear to be getting bad error*.csv reports. I believe the number of errors returned is correct, and that the error messages are probably correct, but Data Loader is not returning the rows that actually generated the errors. It is returning rows, but I have yet to discover the relationship between the rows returned and the rows that actually generated the errors (it doesn't appear to be the previous row in the processed CSV, nor the row following the row reported).

 

This is very strange; I count on Data Loader to report the erroneous rows. I can often correct the errors in Excel and reload them quickly. But at this point, I don't have any way to identify the bad rows from the Data Loader error files.

 

I'm considering reinstalling v21.0; maybe there is a bug in my copy that has since been corrected?


Hi,

 

I am trying to upsert records from MySQL to Salesforce.com using the Data Loader Command Line Interface (CLI). There are 3 records in the MySQL table, and during the upsert 3 empty records get inserted into the Salesforce object; the custom field content is not inserted. Using a .csv file, I am able to do this.

 

Please find the process-conf and database-conf files below:

 

Process-Conf file:

<bean id="SFAProcess"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <description>AccountMaster job gets the Customer record updates from ERP (Oracle financials) and uploads them to salesforce using 'upsert'.</description>
        <property name="name" value="SFAProcess"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="c:\dataloader\samples\status\accountMasterSoapTrace.log"/>

                <entry key="sfdc.endpoint" value="https://www.salesforce.com"/>
                <entry key="sfdc.username" value="sam@co.com.beta"/>
                <entry key="sfdc.password" value="cf644d195b9f45cddf7054e686c35921c0497a0e96867fe689"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.externalIdField" value="SF_Id__c"/>
                <entry key="sfdc.entity" value="SFA_Test__c"/>
                <entry key="process.operation" value="upsert"/>
                <entry key="process.mappingFile" value="C:\Program Files\salesforce.com\Apex Data Loader 17.0\conf\accountMasterMap.sdl"/>
                <entry key="dataAccess.name" value="C:\Program Files\salesforce.com\Apex Data Loader 17.0\extract.csv"/>
                <entry key="dataAccess.type" value="csvRead"/>
            </map>
        </property>
    </bean>
   

database-conf file:

<bean id="dbDataSource" class="org.apache.commons.dbcp.BasicDataSource"
      destroy-method="close">
    <property name="driverClassName" value="com.mysql.jdbc.Driver"/>
    <property name="url" value="jdbc:mysql://10.195.70.101:3306/SFA_Test"/>
    <property name="username" value="SFA"/>
    <property name="password" value="SFA"/>
</bean>

<bean id="querySFATestAll"
      class="com.salesforce.dataloader.dao.database.DatabaseConfig"
      singleton="true">
    <property name="sqlConfig" ref="querySFA_testAllSql"/>
    <property name="dataSource" ref="dbDataSource"/>
</bean>

<bean id="querySFA_testAllSql"
      class="com.salesforce.dataloader.dao.database.SqlConfig" singleton="true">
    <property name="sqlString">
        <value>
            SELECT NAME, NUMBER, ID FROM SFA_Test.SFA_TEST
        </value>
    </property>
    <property name="columnNames">
        <list>
            <value>Name</value>
            <value>Number</value>
            <value>ID</value>
           
        </list>
    </property>
</bean>
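For comparison, a minimal sketch of the dataAccess entries that would point the process at the database bean above instead of a CSV, assuming the standard databaseRead access type and reusing the querySFATestAll bean id from this file:

<entry key="dataAccess.name" value="querySFATestAll"/>
<entry key="dataAccess.type" value="databaseRead"/>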

 

Mapping file:

 

#Mapping values
#Fri Mar 05 10:04:53 EST 2010
OWNERID=
CREATEDDATE=
ISDELETED=
SNAME__C=SName__c
LASTMODIFIEDBYID=
ID=Id
SF_ID__C=SF_Id__c
CREATEDBYID=
SYSTEMMODSTAMP=
NUMBER__C=Number__c
LASTMODIFIEDDATE=
NAME=
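For a database source, the left-hand side of the mapping would normally be the column names declared in columnNames above (Name, Number, ID) rather than CSV-style headers; a hypothetical sketch, with the target fields guessed from the existing mapping:

#Hypothetical mapping for the database columns
Name=SName__c
Number=Number__c
ID=SF_Id__c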
 

In the log file success.csv, it reports success and gives the Salesforce Id for each of the MySQL records.

 

Can someone please help me? Thanks in advance.