• Richie Landingham (Member since 2017)
Hello Everyone,

Right now I am working on an IRS Standards calculator for our internal users. It takes the client's ZIP code and then uses the Google Geocoding API to determine the correct county.
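For reference, the county normally appears in a Geocoding API response as an address component of type administrative_area_level_2. A minimal Python sketch of pulling it out (the response dict below is a trimmed, hypothetical example, not real API output):

```python
def extract_county(geocode_response):
    """Return the (county, state) pair from a Google Geocoding API
    response dict, or (None, None) if no county component is found."""
    county = state = None
    for result in geocode_response.get("results", []):
        for comp in result.get("address_components", []):
            # Counties are tagged administrative_area_level_2,
            # states administrative_area_level_1.
            if "administrative_area_level_2" in comp["types"]:
                county = comp["long_name"]
            elif "administrative_area_level_1" in comp["types"]:
                state = comp["short_name"]
    return county, state

# Trimmed, hypothetical response for a ZIP-code lookup:
sample = {
    "results": [{
        "address_components": [
            {"long_name": "Travis County",
             "short_name": "Travis County",
             "types": ["administrative_area_level_2", "political"]},
            {"long_name": "Texas",
             "short_name": "TX",
             "types": ["administrative_area_level_1", "political"]},
        ]
    }]
}

print(extract_county(sample))  # → ('Travis County', 'TX')
```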

The problem I am running into is how to structure the lookup table for the standards. Essentially, I need a row for every county in the United States (so about 3,222 counties), and each county has a standard for each household size up to 5 members. So there would be 7 columns:

State:
County:
Standard for 1 member:
Standard for 2 members:
Standard for 3 members:
Standard for 4 members:
Standard for 5 members:


But is this too many records to be using in a custom metadata type? What would be best practice for this? I tried to find an API I could reference for the standards, but unfortunately I couldn't find anything.
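Whatever the storage ends up being (custom metadata, a custom object, or a static resource), the lookup itself is just a map from (State, County) to the five per-household-size standards. A minimal Python sketch of that shape, with invented placeholder values:

```python
# Hypothetical standards table: (state, county) -> standards for
# household sizes 1 through 5. All dollar amounts here are invented.
STANDARDS = {
    ("TX", "Travis County"): [2100, 2467, 2600, 2900, 2949],
    ("TX", "Harris County"): [2050, 2400, 2533, 2825, 2874],
}

def lookup_standard(state, county, household_size):
    """Return the standard for a household of 1-5 members,
    or None if the county is not in the table."""
    row = STANDARDS.get((state, county))
    if row is None or not 1 <= household_size <= 5:
        return None
    return row[household_size - 1]

print(lookup_standard("TX", "Travis County", 3))  # → 2600
```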
Hello!

I was wondering if anyone has any experience with this, or knows if it is even possible.

Every day an Excel document is automatically created on our local server with information we have pulled via another API (one we can't integrate directly with Salesforce).

What we want to do is, every day at X time, take the Excel sheet and upload it into a custom object in Salesforce using a field mapping.

Has anyone encountered a need to upload a CSV automatically like this? I would love to hear some potential ideas as well.
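One common pattern is a small script on the same server, run on a schedule (cron or Windows Task Scheduler), that reads the file and pushes the rows to Salesforce through the Bulk API. A hedged sketch, assuming the sheet has already been exported to CSV; the column names, object name, and mapping are invented, and the upload step uses the third-party simple-salesforce library:

```python
import csv

# Hypothetical mapping from CSV column headers to Salesforce API field
# names on the target custom object.
FIELD_MAP = {
    "Client Name": "Client_Name__c",
    "Balance":     "Balance__c",
}

def rows_to_records(csv_path):
    """Read the CSV and remap its columns to Salesforce field names."""
    with open(csv_path, newline="") as f:
        return [
            {FIELD_MAP[col]: value
             for col, value in row.items() if col in FIELD_MAP}
            for row in csv.DictReader(f)
        ]

# The upload step would then look roughly like this (requires
# `pip install simple-salesforce`; credentials and the object name
# Daily_Import__c are placeholders):
#
#   from simple_salesforce import Salesforce
#   sf = Salesforce(username="...", password="...", security_token="...")
#   sf.bulk.Daily_Import__c.insert(rows_to_records("export.csv"))
```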
Hello,

I am running into the limit of 50 @future calls per transaction.

Currently, we have a portal that we built in-house for our clients, and we send a lot of information from our Salesforce database to our portal database.

In this particular instance we are sending multiple objects to our portal, and per client we can have up to 100 of these objects, which is obviously a problem given the 50-call limit on @future.

If anyone has an example of how they get around this, or knows an effective way of getting it done, I would appreciate it. Is it possible to make a queue that executes in batches of 10, waits for them to return, and then executes the next batch of 10? Or something similar?
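On the Salesforce side, the usual workaround is to replace the @future calls with a Queueable (or Batch Apex) job that chains itself, sending a fixed-size chunk of the work per execution. The chunking logic itself is simple; a language-neutral Python sketch of splitting 100 payloads into batches of 10, where each batch finishes before the next starts (the `send` callback is hypothetical):

```python
def chunk(items, size):
    """Split a list into consecutive batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def send_in_batches(payloads, send, size=10):
    """Send payloads batch by batch; each call to the hypothetical
    `send` callback returns before the next batch is dispatched."""
    for batch in chunk(payloads, size):
        send(batch)

# Example: 100 payloads become 10 batches of 10.
sent = []
send_in_batches(list(range(100)), sent.append)
print(len(sent), len(sent[0]))  # → 10 10
```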
Please help me add a line of code to my trigger.  I don't want the trigger to run if Date_Pushed_to_Production__c is populated.  This field resides on the custom object Project__c.  Thank you for looking at my question.  
Here is the trigger:


trigger MoveToProduction on Opportunity (before insert, before update) {

    List<Id> projIds = new List<Id>();

    for (Opportunity o : Trigger.new) {
        // NOTE: hard-coded RecordTypeId; safer to look it up by DeveloperName
        if (o.Move_to_Production__c == true && o.RecordTypeId == '012a0000001FqTn') {
            projIds.add(o.Project__c);
        }
    }

    List<Project__c> projList = [
        SELECT Id, Move_to_ProductionP__c, Date_Pushed_to_Production__c
        FROM Project__c
        WHERE Id IN :projIds
    ];

    for (Project__c proj : projList) {
        // Skip projects whose Date_Pushed_to_Production__c is already populated
        if (proj.Date_Pushed_to_Production__c == null) {
            proj.Move_to_ProductionP__c = true;
            proj.Date_Pushed_to_Production__c = System.today();
        }
    }

    // Single DML call, outside the loop
    update projList;
}
  • August 16, 2017