• runger
  • Member since 2009

According to the documentation, when you call getOrgDefaults on a Custom Setting it should return null if no Org Defaults are defined.  Since the Spring '12 release this method is not returning null but instead returns a blank object.  Try the following code from the Developer Console against a custom setting with no values defined.  This breaks code that falls back to hard-coded defaults when the Custom Setting has not been configured yet (i.e., when getOrgDefaults returns null).

 

System.debug(YourCustomSetting__c.getOrgDefaults());

 

Instead of null, the debug output will look like this:

 

10:05:08:098 USER_DEBUG [1]|DEBUG|YourCustomSetting__c:{}

 

Has anyone else noticed this?  I am going to log a case with Developer Support as well.
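In the meantime, a defensive check can treat the empty result like null. This is only a sketch, assuming the blank object comes back with no Id set (YourCustomSetting__c is a placeholder name, as above):

```apex
// Sketch: treat a blank org-default record the same as a missing one.
// Assumes the empty object returned since Spring '12 has a null Id.
YourCustomSetting__c defaults = YourCustomSetting__c.getOrgDefaults();
if (defaults == null || defaults.Id == null) {
    // No org defaults configured yet: fall back to hard-coded values.
    System.debug('Using fallback defaults');
}
```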

 

 

  • February 14, 2012

Just about every time I go to Setup | Develop | Apex Test Execution, I get the "Is Your Code Covered?" pop-up, with the buttons inviting me to take the tour of this functionality. I've taken the tour. Many times. And yet, with each new browser session, I get prompted to take the tour. Again.

 

I've tried clicking each of the buttons in the prompt window, to no avail. The page just can't seem to remember, at least not past the current browser session, that I've taken the tour. It's getting pretty annoying. 

 

Has anybody else seen this behavior? Is there any way to get it to stop asking me to take the tour?

 

Thanks!

  • October 14, 2011

Hi,

 

We have a managed package installed in our organization.

The managed package has a field called Status. I am writing a trigger on the managed-package object to send an email alert whenever the status is updated.

 

For some reason I am unable to reference the Status field in my trigger.

 

Error: Compile Error: Variable does not exist: echosign_dev1__Status__c at line 7 column 12

Can't I write a trigger on a managed-package object?

If I can, why can't I reference the field?

 

Any help/idea on this is highly appreciated.

 

Trigger:


 

trigger sendEmail on echosign_dev1__SIGN_Agreement__c (after Update) {
    List<Id> agrmtId=New List<Id>();
    
    for(echosign_dev1__SIGN_Agreement__c agrmt:Trigger.New){
        
        if(echosign_dev1__Status__c=='Signed')
            agrmtId.add(agrmt.Id);   
    }
}

 


 

Thanks,

Sales4ce

Hello.  The documentation for each DML statement says that "You can pass a maximum of 1000 sObject records to a single method."

 

Does this apply in Batch Apex as well?  I need to iterate through thousands of imported records and create a list of 2000+ opportunities that will be inserted on a monthly basis.

 

I understand that heap size could be an issue if my opps list gets even larger than 2000, so perhaps I'll just have to manually chunk the insert anyway, but I'd like to know if the 1000 obj DML limit applies in batch Apex.
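If the per-call cap does turn out to apply, the manual chunking mentioned above could be sketched like this (assumptions: the list was built earlier, and 1000 is the relevant cap for your API version):

```apex
// Sketch: insert a large list in slices of 1000 records per DML call.
List<Opportunity> opps = new List<Opportunity>(); // assume 2000+ built earlier
List<Opportunity> chunk = new List<Opportunity>();
for (Opportunity o : opps) {
    chunk.add(o);
    if (chunk.size() == 1000) {
        insert chunk;  // one DML call per full slice
        chunk.clear();
    }
}
if (!chunk.isEmpty()) {
    insert chunk; // remainder
}
```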

 

Thanks

David

When I try to execute this code snippet:

 

String strVal = '';
List <String> lstVals = strVal.split(',');
System.debug ('***** List: ' + lstVals + ', size: ' + lstVals.size ());
for (String s : lstVals)
  System.debug ('***** List element [' + s + ']');

 

 


 

I get the following output (some lines omitted for clarity):

 

15:24:15.186|USER_DEBUG|[3,1]|DEBUG|***** List: (), size: 1

15:24:15.186|USER_DEBUG|[5,3]|DEBUG|***** List element []

So, it seems that somehow I'm getting back an empty list of size '1'.  I would expect back a list of size 0, since there are no tokens in my string being split (it's an empty string).
Am I missing something, or is this a peculiar bug?  If so, is it a bug in 'split' or 'size'?
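For what it's worth, guarding for blank input sidesteps the surprise (a sketch; an explicit empty-string check is used here to avoid assuming newer String helper methods):

```apex
// Sketch: avoid splitting blank input so no spurious empty token appears.
String strVal = '';
List<String> lstVals = (strVal == null || strVal == '')
    ? new List<String>()
    : strVal.split(',');
System.debug('size: ' + lstVals.size()); // 0 for blank input
```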
Thanks!

 

At runtime I compose a query whose result comes back as List<sObject>, where the sObject type is also specified at runtime.

 

After this I need to do some manipulation of the data contained in those sObject fields, so I need to get the type of each field.

 

I use code along these lines:

 

List<sObject> queryResult = Database.query(....);

Schema.DescribeSObjectResult ObjDescribe = queryResult[0].getSObjectType().getDescribe();

Schema.DescribeFieldResult FieldDescribe =  ObjDescribe.fields.*Name Of Field*.getDescribe();

Schema.SOAPType FieldType = FieldDescribe.getSOAPType();

So the first question is: how can I get a field from the fields collection when the field name is held in another String variable and isn't known until runtime? (Maybe something like $ in PHP, or Type.getType(string typeName) in C#.)

 

And the second (closely tied to the first): how can I get a Schema.DescribeSObjectResult when the name of the object is held in another String variable? Having to run the query before I can gather any information about the fields being received works against how I'd like to structure the program.

 

Something like this:

 

Schema.DescribeSObjectResult ObjDescribe = Schema.SObjectType.*Name of sObject*
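For reference, a sketch of the describe-map route that addresses both questions (assuming the object and field names held in the String variables are valid; 'Account' and 'Industry' are just stand-ins):

```apex
// Sketch: look up object and field describes by name at runtime.
String objName = 'Account';      // held in a String variable
String fieldName = 'Industry';   // likewise
Schema.DescribeSObjectResult objDescribe =
    Schema.getGlobalDescribe().get(objName).getDescribe();
Schema.DescribeFieldResult fieldDescribe =
    objDescribe.fields.getMap().get(fieldName).getDescribe();
System.debug(fieldDescribe.getSOAPType());
```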


Hello,

 

I'm trying to avoid duplicate sobjects in my list of sobjects that I submit for update/delete/insert.  I realized that Sets are great for this.  The documentation states the following concerning uniqueness of sObjects in Sets:

 

http://www.salesforce.com/us/developer/docs/apexcode/index.htm

 

"Uniqueness of sObjects is determined by IDs, if provided. If not, uniqueness is determined by comparing fields. For example, if you try to add two accounts with the same name to a set, only one is added"

 

However, in my experience this is not the case.  In the following example I have provided the Id of the opportunity, but after changing one field, the same record is added to the Set a second time.  This is not the expected behavior, because the supplied Ids are identical.

 

 

Opportunity opp1 = [Select Id from Opportunity Where  Id = '006Q00000054J7u'];
Set<Opportunity> opps = new Set<Opportunity>(); 
opps.add(opp1); 
opp1.Name = 'Something new';
opps.add(opp1);
System.debug('SIZE: ' + opps.size()); //prints 2, expect 1

 

 

 

What am I doing wrong?  Is this an API version issue?  I believe I'm using api version 19.0.

 

I will need to rewrite a lot of code if the Set uniqueness does not work as advertised.
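If it helps, keying on the Id explicitly with a Map avoids the problem regardless of how Set compares sObjects (a sketch; the hard-coded Id from above is replaced with a LIMIT 1 query so it is self-contained):

```apex
// Sketch: a Map keyed by Id cannot hold two entries for the same record.
Opportunity opp1 = [SELECT Id, Name FROM Opportunity LIMIT 1];
Map<Id, Opportunity> opps = new Map<Id, Opportunity>();
opps.put(opp1.Id, opp1);
opp1.Name = 'Something new';
opps.put(opp1.Id, opp1); // overwrites, does not duplicate
System.debug('SIZE: ' + opps.size()); // 1
```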

 

Thanks for any help you might provide,

 

Andrew

I am working on a conversion tool for a custom object (Customer Bulk Density). The thought process is that if a user creates a bulk density record in Lbs/ft3, Salesforce will create a duplicate record in Kg/m3. Currently I am trying to write the trigger that creates the new record, but am running into the following error:

 

Error: Compile Error: Method does not exist or incorrect signature: Customer_Bulk_Density__c.put(Id, SOBJECT:Customer_Bulk_Density__c) at line 15 column 7

 

Below is the code:

 

trigger BulkDensityConversion on Customer_Bulk_Density__c (after insert) {
    Set<Id> bIds = new Set<Id>();
    for(Customer_Bulk_Density__c c:trigger.new){
    bIds.add(c.id);
    }
    
    System.debug('****1 : B Id size '+ bIds.size());
    List<Customer_Bulk_Density__c> c = [Select id, name, Customer_Product__c, Customer_Product_2__c, Customer_Product_3__c from Customer_Bulk_Density__c where id in:bIds];
    
    System.debug('****2 : b size '+ bIds.size());
    if (c.size() > 0) {
    List <Customer_Bulk_Density__c> design= new List <Customer_Bulk_Density__c>();
    Map<id, Customer_Bulk_Density__c> capacity = new map<id, Customer_Bulk_Density__c>();
    for(Customer_Bulk_Density__c m: c) {
      Customer_Bulk_Density__c.put(m.id, m);
    }
    
    //Loop through the records and create a Bulk Density Record

    Id n1;

        for(Customer_Bulk_Density__c y : trigger.new) {
        n1 = bulk.get(y.id).WhatId;
        
        Customer_Bulk_Density__c s = new Customer_Bulk_Density__c(
        Name = trigger.new[0].name,
        Customer_Product__c=trigger.new[0].Customer_Product__c,
        Customer_Product_2__c=trigger.new[0].Customer_Product_2__c,
        Customer_Product_3__c=trigger.new[0].Customer_Product_3__c
        );
        
        insert s;


       }
        
    }
    }

 

How do I solve this error?

So I'm working on a component which uses quotas (via RevenueForecast object), but I'd like the app to be able to operate without quotas.

 

I have a method in my controller like this:

  public RevenueForecasts[] getRevenueForecasts() { ... }

 

But if I try to deploy the app into an org that doesn't have customizable forecasting enabled I get a compile error because the org doesn't know about RevenueForecast.

 

Is there some way to code to a generic type (maybe sObject?) so that I can just return a null quota if the org doesn't have RevenueForecast objects available?
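One pattern that might work (a sketch, untested against an org without forecasting): keep the type out of the compiled code entirely by returning generic sObjects from a dynamic query, guarded by a describe check. Note that getGlobalDescribe keys are lowercase:

```apex
// Sketch: no compile-time reference to RevenueForecast, so the class
// still compiles in orgs where the object does not exist.
public List<SObject> getRevenueForecasts() {
    if (!Schema.getGlobalDescribe().containsKey('revenueforecast')) {
        return null; // quotas unavailable in this org
    }
    return Database.query('SELECT Id FROM RevenueForecast LIMIT 200');
}
```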

 

Thanks,

Mat

Hi Community,

 

who can provide a good and effective way to remove duplicates from a List of sObjects?

 

Of course without another SOQL ;)
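A common sketch, assuming every record in the list already has an Id: pour the list into a Map keyed by Id and take the values back out.

```apex
// Sketch: the Map<Id, SObject> constructor collapses duplicate Ids.
List<Account> accts = [SELECT Id, Name FROM Account LIMIT 10];
List<Account> withDupes = new List<Account>();
withDupes.addAll(accts);
withDupes.addAll(accts); // simulate duplicates
List<Account> deduped = new Map<Id, Account>(withDupes).values();
System.debug(deduped.size()); // one entry per unique Id
```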

 

Cheers,

//Hannes

Which for loop here is more efficient, is there even a difference when it comes to avoiding governor limits?

 

 

Task[] tasks = new Task[]{};
for(Integer i=0;i<tasks.size();i++) {
//some code
}

 

 

OR

 

 

Task[] tasks = new Task[]{};
for(Task activity: tasks) {
//some code
}

 

 

 

 

I have a few long lines of code that I need to wrap onto 2 lines rather than just using one line.  Is there a statement continuation character or something that will let me do that?
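For what it's worth, Apex has no continuation character because statements end at the semicolon, so a long statement can simply span lines:

```apex
// A long expression split across lines; the statement ends at the semicolon.
String message = 'this is a long piece of text '
    + 'continued on the next line '
    + 'with ordinary string concatenation';
System.debug(message);
```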

 

Thanks,

 

Doug

Hi All,

 

I am here below copying a paragraph from the Metadata API user guide that it is confusing me and I wonder if someone could shed some light.

 

You can modify metadata in test organizations on Developer Edition or sandbox, and then deploy tested changes to production organizations on Enterprise Edition or Unlimited Editions. You can also create scripts to populate a new organization with your custom objects, custom fields, and other components.

 

Does this mean that the create(), update(), and delete() calls of the Metadata API, made from a client running outside Salesforce, only work when calling test/sandbox or Developer Edition orgs, and will not work for Enterprise and Unlimited orgs?

 

Thank you very much in advance.

 

According to the documentation, you should be able to construct a Map of Id or String to sObject from a list of sObjects, and Database.query() returns a list of sObjects.  However, the following code will not compile/execute:

Map<Id, sObject> accounts = new Map<Id, sObject>( Database.query( 'SELECT Id FROM Account LIMIT 5' ) );

Also, map.putAll() should work similarly but gives the same error message.   Example of broken code:

Map<Id, sObject> accounts = new Map<Id, sObject>();
accounts.putAll( Database.query( 'SELECT Id FROM Account LIMIT 5' ) );

It looks like the Map constructor and putAll() expect a list of an explicit object type, not generic sObjects.  According to the documentation, a generic sObject list should work.

See the definition of Map.putAll() here:

http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_methods_system_map.htm

It states:

If the map is of IDs or Strings to sObjects, adds the list of sObject records l to the map in the same way as the Map constructor with this input.

 

Actually, with further testing, it doesn't appear that I can get putAll() to work at all on a Map of Id to sObject, whether the input is a list of an explicit type (say, List<Account>) or a list of sObjects.
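Until the constructor/putAll() behavior is sorted out, populating the map entry by entry works with generic sObjects (a sketch):

```apex
// Sketch: Map.put() accepts generic sObjects even where putAll() balks.
Map<Id, SObject> accounts = new Map<Id, SObject>();
for (SObject s : Database.query('SELECT Id FROM Account LIMIT 5')) {
    accounts.put(s.Id, s);
}
System.debug(accounts.size());
```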

  • February 11, 2010

Can we get a proper log without any limitations? I have a batch that works fine when the record count is low but fails at high counts. In the latter case I cannot see what the error was, because the debug log cuts off somewhere in the middle. Up until now I have been dealing with this by making guesses and trial-and-error changes, but I really don't think I should have to do that! :)

 

Is it possible to see what the error was? Thanks.

  • December 16, 2009

Hi All, I'm trying to package my release managed package, and I'm getting a lot of failures related to the same error:
"System.LimitException: Your runAllTests request is using too many DB resources."

I was able to package before, but it seems we have a new limitation now....
I really need this package to go live ASAP, and of course this is a BLOCKER.

Note that ALL MY TEST CASES HAVE 0 FAILURES if I run them individually.

Thanks in advance, 
J.


Greetings,

 

Not sure if this is expected behavior or not but:

 

A test method that looks at the share records for an object fails in v24 unless seeAllData is set to true, even if you create all records in the test method and use Test.startTest() and Test.stopTest() appropriately.

 

Here is sample code showing that no records are returned for the share (not even RowCause = Owner; in v23 this works). This means you must set seeAllData = true in your test methods to test the shares... One would think you would be able to see shares created from your own test records.

 

@isTest
private class TestBug {

    private static testmethod void testAccount() {
        Account a = new Account(
            Name = 'Testing',
            ShippingCity = 'test',
            ShippingState = 'OH',
            ShippingStreet = 'test street',
            ShippingPostalCode = '41111',
            OwnerId = UserInfo.getUserId()
        );
        Test.startTest();
        insert a;
        Test.stopTest();
        AccountShare[] ashare = [SELECT Id, RowCause, AccountId, UserOrGroupId
                                 FROM AccountShare WHERE AccountId = :a.Id];
        System.debug(ashare);
    }
}

 

 

Here's a fun one that I'm hoping someone may have some insight on...

 

Apex Test Execution in Spring runs a bunch of tests at once (compared to Winter, when it ran only three or four at a time).  I already logged a case about sporadic limit exceptions occurring during Apex Test Execution.

 

Well, today I noticed that while running tests under Apex Test Execution, many of our Visualforce pages start returning:

 

"To protect all customers from excessive usage and Denial of Service attacks, we limit the number of long-running requests that are processed at the same time by an organization. Your request has been denied because this limit has been exceeded by your organization. Please try your request again later. "

 

It looks like SFDC isn't excluding those tests from whatever limits it uses to detect a DOS attack.

 

Has anyone seen this?

 

Our application has about 130 unit tests in 25 classes - some include bulk testing and complex scenarios - so they can take a while to run even individually.

 

Dan

Hello,

 

I recently had a test start rendering an Internal Salesforce Error without any code changes on my end.  I have managed to write a simple test class that demonstrates the issue.  Basically, it seems to be related to deserializing JSON into a class and then attempting to insert a list of Contacts held in a variable on that class.

 

Here is a simple example which will render the error: 

 

public with sharing class InsertJSONContactList {
    
    public string someVar;
    public list<Contact> insertContacts;
    
    public InsertJSONContactList () {
        insertContacts = new list<Contact>();
        someVar = 'hello';
    }
    
    //Test
    public static testMethod void testSerialize() {
        // Instantiate the class
        InsertJSONContactList ext = new InsertJSONContactList();  
        
        // Make sure we are connected to reality
        system.assertEquals('hello',ext.someVar);
        
        // Add a contact to the list
        ext.insertContacts.add(new Contact(LastName='Rogerson', Email='rogerson@superfaketestemail.org'));
        
        // Insert the list, this will work
        insert ext.insertContacts;
        
        // Reset the list for later...
        ext.insertContacts = new list<Contact>();
        
        // Serialize the class
        string jsonExt = JSON.serialize(ext); 
        
        // Deserialize the string
        JSONParser parser = JSON.createParser(jsonExt);
        Type wrapperType = Type.forName('InsertJSONContactList'); 
        InsertJSONContactList nwExt = (InsertJSONContactList) parser.readValueAs(wrapperType); 
        
        // Make sure we are still connected to reality
        system.assertEquals('hello',nwExt.someVar); 
        
        // Add a contact to our list
        nwExt.insertContacts.add(new Contact(LastName='Davidson', Email='davidson@superfaketestemail.org'));  
        
        // Insert the list. Internal Salesforce Error
        insert nwExt.insertContacts;  
        
    }
}

 

This looks like a bug to me, especially since it only recently started happening after working fine for some months. However, I would love to hear of any suggestions for workarounds.  I will have to refactor quite a bit of code to get around this problem.

Hey Guys,

I've just been having terrible luck recently. Everything I do seems to cause an internal Salesforce error. What I am working on at this point is upserting a list of person accounts based on an external Id field (SSN__c). I've never done an upsert before (I know, I'm lame), so I am unsure whether the error is due to me getting the process wrong or whether I've found yet another bug. The list of accounts comes from parsing a CSV file, and if I change the upsert to just an insert it works fine, so I know the data it's trying to pass in is legit. Here is the relevant code.

//get a list of people to upsert by parsing the CSV file into a list of account objects.
 list<Account> accounts = csvTosObject(parseResult, 'Account');
 

 system.debug('========================= List of Accounts');
 system.debug(accounts); //shows all accounts parsed fine
                         
//upsert them (need to get a reference to the external field to use for id)
Schema.SObjectField f = Account.Fields.SSN__c;                        
List<Database.upsertResult> accountSaveResult = Database.upsert(accounts,f); // <---- Explodes

//if instead of an upsert I have an insert, it works fine. So i know the data is good.
//List<Database.saveResult> accountSaveResult = Database.Insert(accounts,false);

//iterate the person save results to create the import result log
for(database.UpsertResult r : accountSaveResult)
{
    if(r.isSuccess())
    {
        if(r.isCreated())
        {
            personCreates++;
        }
        else
        {
            personUpdates++;
        }
    }
    else
    {
       personErrors++;
        for(Database.Error e : r.getErrors())
        {
            personErrorText.add(e.getMessage());
        }
    }
}

 
Otherwise I can probably break the operation into two separate insert and update statements (seems hackish, but it would work), but I'd prefer to just upsert if I can. Any help is appreciated. Thanks!

 

When JSON support was added to Apex, I was one of those guys who jumped up and down. I started using it heavily for one of my integration projects, and everything was fine and dandy for a couple of weeks. Since yesterday, I have been noticing some weird behavior. The first time I noticed it, I thought it was one of those Apex issues I love to call "Apex weirdness" and hoped that it would fix itself (read: get fixed without us knowing). That hasn't happened. :(

 

Here is the issue. 

 

My JSON parsing code looks like this:

class SomeApexWrapper {

    public static SomeApexWrapper getInstance(String jsonString) {
        JSONParser parser = JSON.createParser(jsonString);
        SomeApexWrapper w = (SomeApexWrapper) parser.readValueAs(SomeApexWrapper.class);
        return w;
    }
}

 

This code was working fine until two days ago. It stops working if I change any class that uses this class to parse a JSON string. The error I get is "[Source: java.io.StringReader@21363a13; line: 1, column: 1] Don't know the type of the Apex object to deserialize".

 

Just saving the SomeApexWrapper class again fixes the issue.

 

Has anyone else had this issue? Is there a permanent solution for it?

  • November 07, 2011
