Is there any way of getting the field types dynamically by just having a String containing the name of an sObject type? (such as Contact or Account)

 

How would you do the following dynamically rather than putting in the object type statically (in this case, Contact)?

 

 

Contact.sObjectType.getDescribe().fields.getMap();

 So I pretty much need to do the following

 

<sObject type HERE>.getDescribe().fields.getMap();

while only having a String containing the object type name, such as 'Contact'.
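One thing that may do exactly this is Schema.getGlobalDescribe(), which returns a map of sObject type tokens keyed by API name. A minimal sketch (the variable names are just for illustration):

String objectName = 'Contact';

// Look up the sObject token for the name held in the String.
Map<String, Schema.SObjectType> globalDescribe = Schema.getGlobalDescribe();
Schema.SObjectType targetType = globalDescribe.get(objectName);
if (targetType == null) {
    // Key casing can matter, so fall back to the lower-case API name.
    targetType = globalDescribe.get(objectName.toLowerCase());
}

// Same field map as Contact.sObjectType.getDescribe().fields.getMap(),
// but driven entirely by the String.
Map<String, Schema.SObjectField> fieldMap = targetType.getDescribe().fields.getMap();
for (String fieldName : fieldMap.keySet()) {
    Schema.DisplayType fieldType = fieldMap.get(fieldName).getDescribe().getType();
    System.debug(fieldName + ' : ' + fieldType);
}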

I am trying to follow the link to Best Practice #6: Querying Large Data Sets, which states:

 

SOQL queries that return multiple records can only be used if the query results do not exceed 1,000 records, the maximum size limit of a list. If the query results return more than 1,000 records, then a SOQL query for loop must be used instead, since it can process multiple batches of records through the use of internal calls to query and queryMore.

For example, if the results are too large, the syntax below causes a runtime exception:

//A runtime exception is thrown if this query returns 1001 or more records.
Account[] accts = [SELECT id FROM account];

Instead, use a SOQL query for loop as in one of the following examples:

// Use this format for efficiency if you are executing DML statements 
// within the for loop
for (List<Account> accts : [SELECT id, name FROM account
                            WHERE name LIKE 'Acme']) {
    // Your code here
    update accts;
}

Let the Force.com platform chunk your large query results into batches of 1000 records by using this syntax where the SOQL query is in the for loop definition, and then handle the individual datasets in the for loop logic.

 

 --------------------------------------------------------------------------------------------------------------------

 

I have a test object with more than 1,000 test records in it. The test trigger is:

 

trigger testProfileEffect on ProfileBasedTest__c (before insert)
{
    for (List<ProfileBasedTest__c> pbt : [SELECT description__c FROM ProfileBasedTest__c])
    {
        // Update the description of every existing record in this query batch
        for (ProfileBasedTest__c rec : pbt)
        {
            rec.description__c = 'Hello world!';
        }
        update pbt;
    }
}

 

As a test, I am trying to update the existing records' description fields while inserting a new record, and I get this exception:

 

Error: Invalid Data.
Review all error messages below to correct your data.
Apex trigger Test.testProfileEffect caused an unexpected exception, contact your administrator: Test.testProfileEffect: execution of BeforeInsert caused by: System.Exception: Too many query rows: 1001: Trigger.Test.testProfileEffect: line 3, column 38

 

 

The exception is because of the list size limitation, but how can I make Best Practice #6: Querying Large Data Sets work in this scenario?
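One idea I am considering (untested, and the WHERE filter and LIMIT below are only placeholders, not a confirmed fix) is to narrow the query so the total number of rows it can return stays under the limit:

for (List<ProfileBasedTest__c> pbt : [SELECT description__c
                                      FROM ProfileBasedTest__c
                                      WHERE description__c = null
                                      LIMIT 1000])
{
    // Process one query batch at a time, as in the best-practice example
    for (ProfileBasedTest__c rec : pbt)
    {
        rec.description__c = 'Hello world!';
    }
    update pbt;
}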

 

Thanks in advance.

Hi,

 

Can anyone tell me how to get the system IP address (the logged-in user's IP) using Apex? Are there any DNS-related classes (like in Java), or any other approaches?
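One approach I have seen suggested (untested, and the header name below is an assumption to verify) is to read the inbound request headers when the Apex runs behind a Visualforce page:

// Sketch only: read the caller's IP from the current page's request headers.
// 'X-Salesforce-SIP' is an assumed header name and should be confirmed.
public String getClientIpAddress() {
    Map<String, String> headers = ApexPages.currentPage().getHeaders();
    return headers.get('X-Salesforce-SIP');
}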

 

thanks for your time and help,

SVD 

June 26, 2009
Would you please help me to display Notes & Attachments on a Visualforce page that uses a custom controller?
Is there any tag for it?
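For what it is worth, a minimal controller-side sketch (Attachment and Note are the standard objects involved; the class and getters below are only an illustration, not a confirmed answer) that fetches a record's notes and attachments for the page to iterate over:

// Illustrative custom controller exposing a record's Notes and Attachments.
// The page would iterate these lists, e.g. with <apex:repeat> or <apex:pageBlockTable>.
public class NotesAttachmentsController {
    public Id recordId { get; set; }

    public List<Attachment> getAttachments() {
        return [SELECT Id, Name, ContentType FROM Attachment WHERE ParentId = :recordId];
    }

    public List<Note> getNotes() {
        return [SELECT Id, Title, Body FROM Note WHERE ParentId = :recordId];
    }
}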

 

 
I am using Apex Data Loader 11.0 from the command line to extract data to a CSV file, and that works. Now I would like to extract data from Salesforce directly into MS SQL Server 2005. I have spent a lot of time trying and searching in this community but still have not been able to get this working. Below is my code in database-conf.xml:
Code:
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
<bean id="dbDataSource"
      class="org.apache.commons.dbcp.BasicDataSource"
      destroy-method="close">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
    <property name="url" value="jdbc:sqlserver://localhost;databaseName=CRMReport;"/>
    <property name="username" value="test"/>
    <property name="password" value="test"/>
</bean>


<bean id="tmpAccount"
      class="com.salesforce.lexiloader.dao.database.DatabaseConfig"
      singleton="true">
    <property name="sqlConfig" ref="queryAccount"/>
    <property name="dataSource" ref="dbDataSource"/>
</bean>
<bean id="queryAccount"
      class="com.salesforce.lexiloader.dao.database.SqlConfig" singleton="true">
    <property name="sqlString">
        <value>
 select Id, IsDeleted, MasterRecordId, 
  Name, Type, RecordTypeId, ParentId, 
  BillingStreet, BillingCity, BillingState, 
  BillingPostalCode, BillingCountry, ShippingStreet, 
  ShippingCity, ShippingState, ShippingPostalCode, 
  ShippingCountry, Phone, Fax, 
  Website, Industry, AnnualRevenue, 
  NumberOfEmployees, 
  Description, CurrencyIsoCode, 
  OwnerId, CreatedDate, CreatedById, 
  LastModifiedDate, LastModifiedById, 
  SystemModstamp, LastActivityDate, 
  Region__c, Annual_Procurment__c, 
  Trial_Balance_Code__c, Customer_Group__c, 
  SAP_VEGA_1_Account_Number__c, 
  SAP_VEGA_2_Account_Number__c, 
  QAD_Account_Number__c, 
  BAAN_Account_Number__c, Brief_Name__c,Customer_PBU__c
 from tmpAccount
        </value>
    </property>
</bean>
</beans>


 
and then I have process-conf.xml as well:
 
Code:
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <!-- Extract Account to CSV File -->
    <bean id="csvAccountExtractProcess"
          class="com.salesforce.lexiloader.process.ProcessRunner"
          singleton="false">
      <description>csvAccountExtract job gets account info from salesforce and saves info into a CSV file."</description>
        <property name="name" value="csvAccountExtract"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="process.operation" value="extract"/>
                <entry key="process.mappingFile" value="C:\Program Files\salesforce.com\Apex Data Loader 11.0\Delphi\Conf\ExtractAccountMap.sdl"/>
                <entry key="dataAccess.type" value="databaseWrite"/>
                <entry key="dataAccess.name" value="tmpAccount"/>
                <entry key="sfdc.extractionSOQL" value="Select Id, IsDeleted, MasterRecordId, Name, Type, 
                     RecordTypeId, ParentId, BillingStreet, BillingCity, 
                     BillingState, BillingPostalCode, BillingCountry, 
                     ShippingStreet, ShippingCity, ShippingState, ShippingPostalCode, 
                     ShippingCountry, Phone, Fax, Website, Industry, AnnualRevenue, 
                     NumberOfEmployees, Description, CurrencyIsoCode, OwnerId, 
                     CreatedDate, CreatedById, LastModifiedDate, LastModifiedById, 
                     SystemModstamp, LastActivityDate, Region__c, Annual_Procurment__c, 
                     Trial_Balance_Code__c, Customer_Group__c, 
                     SAP_VEGA_1_Account_Number__c, SAP_VEGA_2_Account_Number__c, 
                     QAD_Account_Number__c, BAAN_Account_Number__c, Brief_Name__c, 
                     Customer_PBU__c FROM Account"/>
            </map>
        </property>
    </bean>
</beans>

 

I downloaded the Microsoft SQL Server 2005 JDBC Driver 1.2 into my C:\MSJDBC folder and changed the original process.bat file to:
Code:
@echo off
if not [%1]==[] goto run
echo.
echo Usage: process ^<configuration directory^> ^[process name^]
echo.
echo      configuration directory -- directory that contains configuration files,
echo          i.e. config.properties, process-conf.xml, database-conf.xml
echo.
echo      process name -- optional name of a batch process bean in process-conf.xml,
echo          for example:
echo.
echo              process ../myconfigdir AccountInsert
echo.
echo          If process name is not specified, the parameter values from config.properties
echo          will be used to run the process instead of process-conf.xml,
echo          for example:
echo.
echo              process ../myconfigdir
echo.

goto end

:run
set PROCESS_OPTION=
if not [%2]==[] set PROCESS_OPTION=process.name=%2

..\_jvm\bin\java.exe -classpath ..\DataLoader.jar;C:\MSJDBC\sqljdbc_1.2\enu\sqljdbc.jar -Dsalesforce.config.dir=%1 com.salesforce.lexiloader.process.ProcessRunner %PROCESS_OPTION%

:end

 
and then when I execute the extract from the command line:
 
process.bat ./conf csvAccountExtractProcess
 
I got the errors below:
 

177924 [csvAccountExtract] DEBUG com.salesforce.lexiloader.client.PartnerClient - Beginning web service operation: query
320410 [csvAccountExtract] ERROR com.salesforce.lexiloader.dao.database.Database Reader  - Database error encountered while writing row#1 through row#215 (execute batch update). Database configuration: tmpAccount.  Sql error: The SELECT statement is not permitted in a batch..com.microsoft.sqlserver.jdbc.SQLServerException: The SELECT statement is not permitted in a batch.
 at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatementBatch(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtBatchExecCmd.doExecute(Unknown Source)
        at com.microsoft.sqlserver.jdbc.TDSCommand.execute(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(Unknown Source)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(Unknown Source)
        at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:294)
        at com.salesforce.lexiloader.dao.database.DatabaseWriter.writeRowList(DatabaseWriter.java:180)
        at com.salesforce.lexiloader.action.visitor.QueryVisitor.writeExtraction(QueryVisitor.java:205)
        at com.salesforce.lexiloader.action.visitor.QueryVisitor.visit(QueryVisitor.java:111)
        at com.salesforce.lexiloader.action.ExtractAction.execute(ExtractAction.java:108)
        at com.salesforce.lexiloader.controller.Controller.executeAction(Controller.java:126)
        at com.salesforce.lexiloader.process.ProcessRunner.run(ProcessRunner.java:136)
        at com.salesforce.lexiloader.process.ProcessRunner.main(ProcessRunner.java:228)
 at com.salesforce.lexiloader.process.ProcessRunner.main(ProcessRunner.java:228)

320410 [csvAccountExtract] INFO com.salesforce.lexiloader.action.progress.NihilistProgressAdapter  - Processed 214 of 214 total records. Rate: 24851000 records per hour. Estimated time to complete: 0 minutes and 0 seconds.  There are 0 successes and 214 errors.
320410 [csvAccountExtract] INFO com.salesforce.lexiloader.action.progress.NihilistProgressAdapter  - The extract has fully completed.  There were 0 successful extracts and 214 errors.

It looks like the SELECT statement has some problems. I am looking for a solution here; any leads would be highly appreciated. Thanks.
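My current guess (only a guess) is that a dataAccess.type of databaseWrite expects the sqlConfig bean to hold an INSERT statement with named parameters rather than a SELECT, and that the SELECT syntax belongs with databaseRead. An abbreviated, untested sketch of what that bean might look like, with the column list shortened and the parameter types assumed:

Code:
<bean id="insertAccountSql"
      class="com.salesforce.lexiloader.dao.database.SqlConfig" singleton="true">
    <property name="sqlString">
        <value>
            INSERT INTO tmpAccount (Id, Name, Type, Phone)
            VALUES (@Id@, @Name@, @Type@, @Phone@)
        </value>
    </property>
    <property name="sqlParams">
        <map>
            <entry key="Id"    value="java.lang.String"/>
            <entry key="Name"  value="java.lang.String"/>
            <entry key="Type"  value="java.lang.String"/>
            <entry key="Phone" value="java.lang.String"/>
        </map>
    </property>
</bean>

The tmpAccount bean's sqlConfig property would then point at insertAccountSql instead of queryAccount. Again, this is a sketch based on my reading of the error, not something I have working yet.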
 



Does anyone know why the following query line:

List<Contact> contacts = new List<Contact> ([Select Id, Email from Contact where IsPersonAccount = false]);

yields this error message?

Error: Invalid Data.
Review all error messages below to correct your data.
Apex trigger LeadAssignmentFromImagitas caused an unexpected exception, contact your administrator: LeadAssignmentFromImagitas: execution of BeforeInsert caused by: System.QueryException: Non-selective query against large object type (more than 100000 rows). Consider an indexed filter or contact salesforce.com about custom indexing. Even if a field is indexed a filter might still not be selective when: 1. The filter value includes null (for instance binding with a list that contains null) 2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times): Trigger.LeadAssignmentFromImagitas: line 18, column 9


The query returns only 58 rows.

Any help is greatly appreciated.  Thank you.
February 19, 2008