Hi All,

 

I'm facing a weird scenario. My Account object contains around 2 to 3 million records. I fired a SOQL query on this object to get the total number of records, and it has been running for the past two hours. My org has been locked since then, and I'm not able to make any code changes now. Is there any way to forcefully stop the execution of the SOQL query and release the org from the locked state? It's OK even if I don't get the query result; I just want my org to be unlocked. Please help.

 

Thanks,
Anoop
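
(For context, the count itself is usually written as an aggregate query so that no rows are returned to the client; a minimal sketch, assuming it runs from anonymous Apex and offered only as an illustration, not as a fix for the locking issue:)

// Minimal sketch: COUNT() returns just the row count instead of Account records.
Integer total = [SELECT COUNT() FROM Account];
System.debug('Total accounts: ' + total);

// Note: in Apex the counted rows still count toward the query-row governor limit,
// so on an object this size a Batch Apex job (or a count via the API) is usually
// needed rather than a single synchronous query.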

Hello, I need to capture in a custom field the date of the most recent activity on a record, as contained in the Activity History. Further, I need to focus only on certain activities, such as a "Call" or "Email" from my inside sales reps. Does anyone have some guidance on this?

 

Regards,

-Chris
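
(One common approach is a trigger on Task that stamps a date field on the parent record. A minimal sketch, assuming a hypothetical Last_Sales_Activity__c date field on Account and that the activities of interest are Tasks whose Type is 'Call' or 'Email'; the field name and filter are illustrative assumptions, not from the original post:)

trigger StampLastSalesActivity on Task (after insert, after update) {
    // Hypothetical field and filter: stamp Account.Last_Sales_Activity__c with the
    // most recent qualifying activity date ("Call" or "Email" Tasks).
    Map<Id, Date> latestByAccount = new Map<Id, Date>();
    for (Task t : Trigger.new) {
        if ((t.Type == 'Call' || t.Type == 'Email')
                && t.WhatId != null
                && t.WhatId.getSObjectType() == Account.SObjectType
                && t.ActivityDate != null) {
            Date current = latestByAccount.get(t.WhatId);
            if (current == null || t.ActivityDate > current) {
                latestByAccount.put(t.WhatId, t.ActivityDate);
            }
        }
    }
    if (!latestByAccount.isEmpty()) {
        // Update the hypothetical custom field on the parent Accounts.
        List<Account> toUpdate = new List<Account>();
        for (Id accId : latestByAccount.keySet()) {
            toUpdate.add(new Account(Id = accId,
                Last_Sales_Activity__c = latestByAccount.get(accId)));
        }
        update toUpdate;
    }
}

Note that a sketch like this ignores Events and does not move the date backward if an activity is later deleted; a declarative automation or roll-up tool could cover the same requirement.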


Hi all;

Is there a way for an administrator to prevent a user from accessing certain opportunities (not all opportunities)? In other words, the user would be able to access only the opportunities that the administrator has allowed.
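
(Typically this is configured by setting the organization-wide default for Opportunity to Private and then granting access back through the role hierarchy, sharing rules, or manual sharing. Purely as an illustration, a manual share can also be created in Apex; a minimal sketch with placeholder names, assuming the org-wide default is already Private:)

// Grant one user read access to one specific opportunity (placeholder names).
// OpportunityShare rows can only be written when the org-wide default for
// Opportunity is Private or Public Read Only.
Opportunity opp = [SELECT Id FROM Opportunity WHERE Name = 'Big Deal' LIMIT 1];
User rep = [SELECT Id FROM User WHERE Username = 'rep@example.com' LIMIT 1];

OpportunityShare share = new OpportunityShare(
    OpportunityId = opp.Id,
    UserOrGroupId = rep.Id,
    OpportunityAccessLevel = 'Read'  // 'Read' or 'Edit'
);
insert share;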

Hello,

 

I've just recently jumped into trying to write Triggers and am having difficulty getting this trigger and test class to work.

 

It seems the test class doesn't like the Double I'm trying to add into the new record and the trigger doesn't like one of my variable statements.

 

Can someone help point me in the right direction?

 

Here's the error when trying to deploy the test class.  Note: the test class worked for another trigger until I added the "rPosition__c = 5".

 

Run Failures:

  resultsFetchTestClass.myUnitTest System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, calculatePointsOnResults: execution of BeforeInsert
  caused by: System.TypeException: Invalid double: 5.0
  Trigger.calculatePointsOnResults: line 4, column 17: []
  Average test coverage across all Apex Classes and Triggers is 66%, at least 75% test coverage is required

 

Here's the test class. You can see that Member__c and Test__c are lookups. This test class deployed fine for another trigger I wrote, but when I added rPosition__c it came back with the "Invalid double" error. rPosition__c is a Number(2,0) field.

 

 

@isTest
private class resultsFetchTestClass {
    static testMethod void myUnitTest() {
        // TO DO: implement unit test
        Results__c rResults = new Results__c(Member__c = 'a0K80000001EQri', Test__c = 'a0M80000003pvbD', rPosition__c = 5);
        insert rResults;
    }
}

 

Here's the trigger. In the code you can see one commented-out statement that worked (it replaces the rPos variable with the value 2). If I deploy the test class without rPosition__c, I get an error saying there were no items in the list.

 

 

trigger calculatePointsOnResults on Results__c (before insert, before update) {
    for (Results__c rRes : Trigger.new) {
        Double rPos = Double.valueOf(rRes.rPosition__c);
        String group = String.valueOf(rRes.Group__c);
        Points__c p = [Select s.Points__c From Points__c s where Position__c = :rPos AND Group__c = :group LIMIT 1];
        // Points__c p = [Select s.Points__c From Points__c s where Position__c = 2 AND Group__c = :group LIMIT 1]; THIS ONE WORKS
        rRes.Points__c = p.Points__c;
    }
}

 

I'm sure I'm doing a bunch of things wrong; I'm just trying to move along in my learning and take care of a few app issues.

 

Thanks!
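
(Not a confirmed fix for the "Invalid double" error, but one common way to restructure a trigger like this is to read rPosition__c directly as a Decimal, avoid the variable name group, which is a reserved word in Apex, and move the SOQL query out of the loop. A sketch under those assumptions, keying the Points__c lookup on position and group and assuming both position fields hold whole numbers:)

trigger calculatePointsOnResults on Results__c (before insert, before update) {
    // Collect the positions and groups needed, then query once outside the loop.
    Set<Decimal> positions = new Set<Decimal>();
    Set<String> groupNames = new Set<String>();   // "group" is reserved in Apex
    for (Results__c rRes : Trigger.new) {
        if (rRes.rPosition__c != null && rRes.Group__c != null) {
            positions.add(rRes.rPosition__c);     // Number(2,0) arrives as Decimal
            groupNames.add(String.valueOf(rRes.Group__c));
        }
    }

    // Build a "position:group" -> points map from a single query.
    Map<String, Decimal> pointsByKey = new Map<String, Decimal>();
    for (Points__c p : [SELECT Points__c, Position__c, Group__c
                        FROM Points__c
                        WHERE Position__c IN :positions AND Group__c IN :groupNames]) {
        pointsByKey.put(String.valueOf(p.Position__c) + ':' + String.valueOf(p.Group__c), p.Points__c);
    }

    for (Results__c rRes : Trigger.new) {
        Decimal pts = pointsByKey.get(
            String.valueOf(rRes.rPosition__c) + ':' + String.valueOf(rRes.Group__c));
        if (pts != null) {
            rRes.Points__c = pts;
        }
    }
}

Reading the field directly as a Decimal also sidesteps the Double.valueOf conversion that the error message points at.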