
Hi, everyone!

I want to update records from Oracle into Salesforce using the Data Loader command-line interface; my process-conf.xml bean is below.

Please help!

Thanks a lot!

 

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="accountUpdate" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
        <description>
            accountUpdate job reads account records from the database and updates them in Salesforce.
        </description>
        <property name="name" value="accountUpdate"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="C:\DLTest\Log\accountInsertSoapTrace.log"/>
                <entry key="sfdc.endpoint" value="https://login.salesforce.com/services/Soap/u/25.0"/>
                <entry key="sfdc.username" value="***@***"/>
                <entry key="sfdc.password" value="*************************"/>
                <entry key="process.encryptionKeyFile" value="C:\DLTest\key.txt"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="process.operation" value="update"/>
                <entry key="process.mappingFile" value="C:\DLTest\accountUpdateMap.sdl"/>
                <entry key="dataAccess.name" value="updateAccounts"/>
                <entry key="process.outputSuccess" value="c:\DLTest\Log\accountInsert_success.csv"/>
                <entry key="process.outputError" value="c:\DLTest\Log\accountInsert_error.csv"/>
                <entry key="dataAccess.type" value="databaseRead"/>
                <entry key="process.initialLastRunDate" value="2005-12-01T00:00:00.000-0800"/>
            </map>
        </property>
    </bean>
</beans>
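
For reference, because dataAccess.type is databaseRead, the CLI also needs a companion database-conf.xml (in the same configuration directory as process-conf.xml) that defines the "updateAccounts" bean named by dataAccess.name. A minimal sketch is below; the bean ids, Oracle JDBC URL, credentials, SQL, and column names are placeholders for my setup and have to line up with the columns used in accountUpdateMap.sdl.

<bean id="updateAccounts" class="com.salesforce.dataloader.dao.database.DatabaseConfig" singleton="true">
    <!-- Pairs a SQL statement with a JDBC data source -->
    <property name="sqlConfig" ref="queryAccountsSql"/>
    <property name="dataSource" ref="oracleDataSource"/>
</bean>

<bean id="oracleDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <!-- Placeholder Oracle connection details -->
    <property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
    <property name="url" value="jdbc:oracle:thin:@dbhost:1521:orcl"/>
    <property name="username" value="dbuser"/>
    <property name="password" value="dbpassword"/>
</bean>

<bean id="queryAccountsSql" class="com.salesforce.dataloader.dao.database.SqlConfig" singleton="true">
    <!-- Placeholder query; each entry in columnNames becomes a column that the
         .sdl mapping file can map to a Salesforce field -->
    <property name="sqlString">
        <value>SELECT ACCOUNT_ID, ACCOUNT_NAME FROM ACCOUNTS</value>
    </property>
    <property name="columnNames">
        <list>
            <value>ACCOUNT_ID</value>
            <value>ACCOUNT_NAME</value>
        </list>
    </property>
</bean>

The Oracle JDBC driver jar also has to be on the Data Loader classpath when the job runs.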

Hi,
I created a Visualforce page that uses Ajax, and it can read field values from Salesforce with JavaScript.
But when I put the same code into a custom code block (Page Elements) on a Site.com page, it cannot get any data.
Can anyone help?
Thanks

Can Ajax be used in Site.com custom code to access object data? If so, please give details. Thank you!
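
For reference, the Visualforce page reads the data roughly like this (a minimal sketch, not my exact page; the API version and query are placeholders):

<apex:page>
    <!-- AJAX Toolkit; the version should match the org's API version -->
    <script src="/soap/ajax/25.0/connection.js" type="text/javascript"></script>
    <script type="text/javascript">
        // On a Visualforce page the current user's session is available as a merge field
        sforce.connection.sessionId = '{!$Api.Session_ID}';

        // Placeholder query: fetch a few Account names and log them
        var result = sforce.connection.query("SELECT Id, Name FROM Account LIMIT 5");
        var records = result.getArray("records");
        for (var i = 0; i < records.length; i++) {
            console.log(records[i].Name);
        }
    </script>
</apex:page>

My guess is that a Site.com custom code block has no {!$Api.Session_ID} merge field and, for guest visitors, no API session at all, so sforce.connection has nothing to authenticate with; that would explain why the same JavaScript returns no data there.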

Hi

 

I am getting a popup error, "connection was cancelled here", when I run my class. The debug log is below; the relevant lines are (125755962000)|FATAL_ERROR|Internal Salesforce.com Error and (125756072000)|CUMULATIVE_LIMIT_USAGE.

 

Any idea why I am getting this error? I suspect it is some query issue.



14.0 APEX_CODE,DEBUG;APEX_PROFILING,INFO;CALLOUT,INFO;DB,INFO;SYSTEM,DEBUG;VALIDATION,INFO;VISUALFORCE,INFO;WORKFLOW,INFO
13:12:43.027 (27564000)|EXECUTION_STARTED
13:12:43.027 (27675000)|CODE_UNIT_STARTED|[EXTERNAL]|01p70000000CpqE|CheckoutService.checkOutNext
13:12:43.027 (27779000)|METHOD_ENTRY|[1]|01p70000000CpqE|CheckoutService.CheckoutService()
13:12:43.027 (27848000)|METHOD_EXIT|[1]|CheckoutService
13:12:43.028 (28034000)|METHOD_ENTRY|[3]|01p70000000CpqE|CheckoutService.getCheckoutQueue(String)
13:12:43.030 (30472000)|SOQL_EXECUTE_BEGIN|[39]|Aggregations:0|SELECT Object_Name__c, Conditions__c, Sorts__c FROM Checkout_Queue__c WHERE View_Id__c = :viewId
13:12:43.040 (40704000)|SOQL_EXECUTE_END|[39]|Rows:1
13:12:43.040 (40929000)|SYSTEM_METHOD_ENTRY|[40]|LIST.size()
13:12:43.040 (40972000)|SYSTEM_METHOD_EXIT|[40]|LIST.size()
13:12:43.041 (41040000)|SYSTEM_METHOD_ENTRY|[43]|LIST.get(Integer)
13:12:43.041 (41091000)|SYSTEM_METHOD_EXIT|[43]|LIST.get(Integer)
13:12:43.041 (41168000)|METHOD_EXIT|[3]|01p70000000CpqE|CheckoutService.getCheckoutQueue(String)
13:12:43.041 (41631000)|SYSTEM_METHOD_ENTRY|[15]|Database.query(String)
13:12:43.043 (43594000)|SOQL_EXECUTE_BEGIN|[15]|Aggregations:0|SELECT Id, Checked_Out_To__c FROM NSF_History__c WHERE Checked_Out_To__c = null AND SPA_Check_Date__c >= 2010-10-14
AND Status__c = NULL 
AND non_Priority__c = TRUE
AND IsNSF__c = 'Yes' 
AND Next_Step_Date__c = NULL
AND clientstatus__c = 'Client'
AND NSF_Queue_Ready__c = 'True' ORDER BY Last_Check_In_Time__c ASC NULLS FIRST, CreatedDate LIMIT 1
13:14:48.753 (125753523000)|SYSTEM_METHOD_EXIT|[15]|Database.query(String)
13:14:48.755 (125755962000)|FATAL_ERROR|Internal Salesforce.com Error
13:14:49.165 (125756072000)|CUMULATIVE_LIMIT_USAGE
13:14:49.165|LIMIT_USAGE_FOR_NS|(default)|
  Number of SOQL queries: 2 out of 100
  Number of query rows: 1 out of 50000
  Number of SOSL queries: 0 out of 20
  Number of DML statements: 0 out of 150
  Number of DML rows: 0 out of 10000
  Number of script statements: 11 out of 200000
  Maximum heap size: 0 out of 3000000
  Number of callouts: 0 out of 10
  Number of Email Invocations: 0 out of 10
  Number of fields describes: 0 out of 100
  Number of record type describes: 0 out of 100
  Number of child relationships describes: 0 out of 100
  Number of picklist describes: 0 out of 100
  Number of future calls: 0 out of 10
13:14:49.165 (125756072000)|CUMULATIVE_LIMIT_USAGE_END
13:14:48.756 (125756225000)|CODE_UNIT_FINISHED|CheckoutService.checkOutNext
13:14:48.756 (125756284000)|EXECUTION_FINISHED
13:14:48.755 (125755962000)|FATAL_ERROR|Internal Salesforce.com Error
Thanks
Sid

I’m the developer of a package that has a heavy dependence on a scheduled Batch Apex job. The package currently runs in a dozen or so orgs, some of which have fairly large amounts of data. One org in particular has over 3 million records that are processed by the Batch Apex job.

 

Over the past 3 months, we’ve been encountering a lot of stability problems with Batch Apex. We’ve opened cases for several of these issues, and they’ve been escalated to Tier 3 Support, but it consistently takes 2 weeks or more to get a case escalated, and then it can take several more weeks to get a meaningful reply from Tier 3.

 

We really need to talk with the Product Manager responsible for Batch Apex. We asked Tier 3 to make that introduction, but they said they couldn’t. We’re trying to work with Sales to set up a discussion with a Product Manager, but so far, we haven’t had any luck there either. We’re hoping that a Product Manager might see this post and get in touch with us. We need to find out whether Batch Apex is a reliable-enough platform for our application.

 

Here are a few examples of the problems we’ve been having:

 

  • The batch job aborts in the start() method. Tier 3 Support told us that the batch job was occasionally timing out because its initial  query was too complex. We simplified the query (at this point, there are no WHERE or ORDER BY clauses), but we occasionally see timeouts or near timeouts. However, from what we can observe in the Debug Logs, actually executing the query (creating the QueryLocator) takes only a few seconds, but then it can take many minutes for the rest of the start() method to complete. This seems inconsistent with the “query is too complex” timeout scenario that Tier 3 support described.  (Case 04274732.)
  • We get the “Unable to write to ACS Stores” problem. We first saw this error last Fall, and once it was eventually fixed, Support assured us that the situation would be monitored so it couldn’t happen again. Then we saw it happen in January, and once it was eventually fixed, Support assured us (again) that the situation would be monitored so it couldn’t happen again. However, having seen this problem twice, we have no confidence that it won’t arise again. (Case 04788905.)
  • In one run of our job, we got errors that seemed to imply that the execute() method was being called multiple times concurrently. Is that possible? If so, (a) the documentation should say so, and (b) it seems odd that after over 6 months of running this batch job in a dozen different orgs, it suddenly became a problem.

 

  • We just got an error saying, “First error: SQLException [java.sql.SQLException: ORA-00028: your session has been killed. SQLException while executing plsql statement: {?=call cApiCursor.mark_used_auto(?)}(01g3000000HZSMW)] thrown but connection was canceled.” We aborted the job and ran it again, and the error didn’t happen again.
  • We recently got an error saying, “Unable to access query cursor data; too many cursors are in use.” We got the error at a time when the only process running on behalf of that user was the Batch Apex process itself. (Perhaps this is symptomatic of the “concurrent execution” issue, but if the platform is calling our execute() method multiple times at once, shouldn’t it manage cursor usage better?)
  • We have a second Batch Apex job that uses an Iterable rather than a QueryLocator. When Spring 11 was released, that Batch Apex job suddenly began to run without calling the execute() method even once. Apparently, some support for the way we were creating the Iterable changed, and even though we didn’t change the API version of our Apex class, that change caused our Batch Apex job to stop working. (Case 04788905; a minimal sketch of the two start() styles appears after this list.)
  • We just got a new error, "All attempts to execute message failed, message was put on dead message queue."
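
For context, here is a minimal sketch (not our actual package code; object and field names are placeholders) of the two start() styles our jobs use. The QueryLocator form can feed up to 50 million records into execute(), while the Iterable form is, as far as we understand, still subject to the normal SOQL row limits and depends entirely on how the Iterable is built.

global class ExampleBatch implements Database.Batchable<sObject> {

    // QueryLocator style: the platform pages through the result set for us.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }

    // Iterable style (the alternative): return any Iterable<sObject> instead, e.g.
    //     global Iterable<sObject> start(Database.BatchableContext bc) {
    //         return [SELECT Id FROM Account LIMIT 10000];
    //     }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Process one chunk of records; update or insert as needed.
    }

    global void finish(Database.BatchableContext bc) {
        // Post-processing, notifications, or chaining the next job.
    }
}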

 

We really need to talk with a Product Manager responsible for Batch Apex. We need to determine whether Batch Apex is sufficiently stable and reliable for our needs. If not, we’ll have to find a more reliable platform, re-implement our package, and move our dozen or more customers off of Salesforce altogether.

 

If you’re responsible for Batch Apex or you know who is, please send me a private message so we can make contact. Thank you!

 
