Hi,


We have an Apex job over which we would like some control whenever a user clicks the standard Abort link. Does anybody know if it is possible to execute some code as a result of an Apex job abort?
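
One workaround sometimes suggested is to replace the standard Abort link with a custom abort action, so that code can run immediately before and after the System.abortJob call. A minimal sketch, with hypothetical class and method names:

public with sharing class JobAbortService {
    // Hypothetical custom abort action: a custom button or Visualforce page
    // would invoke this method instead of the standard Abort link.
    public static void abortWithCleanup(Id asyncApexJobId) {
        // Any pre-abort logic goes here (e.g. flag a record, notify an owner).
        System.abortJob(asyncApexJobId);

        // Any post-abort logic goes here; the job status can be confirmed.
        AsyncApexJob job = [SELECT Id, Status FROM AsyncApexJob
                            WHERE Id = :asyncApexJobId];
        System.debug('Job ' + job.Id + ' is now ' + job.Status);
    }
}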


Thanks in advance.


Regards,

Jesus Arcas.



Hi,

 

I've just started to learn Salesforce. Our company already has some custom dashboard pages and an iframed page on a tab that a previous developer created.

 

Since I'm new to this API and figure I don't need to learn everything at once, I was wondering if someone could point me in the right direction on which modules to learn to accomplish the following task.

 

 

Build a report for a dashboard that takes each account and gives the last six months of sales history. Right now we don't enter sales data in Salesforce; it lives in our proprietary web-based system. We do enter our unique account number in Salesforce, so this number could be used to pull sales data from our system.

What would be the best way to push our sales data into some custom data types (whatever those types are) so that I can create a data source on them to pull six months' worth of data? I figure the custom data type would contain an account number, a sales date (Y-m-d), and a sales amount.
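
One option might be a small Apex REST endpoint that the proprietary system posts rows to, with each row upserted into a custom object on an external-ID key so that re-sending the same data updates rather than duplicates records. A sketch only: the object Sales_History__c and its fields Account_Number__c (Text), Sales_Date__c (Date), Sales_Amount__c (Currency), and Unique_Key__c (Text, marked as an External ID), along with the class and URL names, are all hypothetical.

@RestResource(urlMapping='/saleshistory/*')
global with sharing class SalesHistoryLoader {

    // Shape of each row the external system sends as a JSON array.
    global class SalesRow {
        public String accountNumber;
        public Date salesDate;
        public Decimal salesAmount;
    }

    @HttpPost
    global static Integer load() {
        List<SalesRow> rows = (List<SalesRow>) JSON.deserialize(
            RestContext.request.requestBody.toString(), List<SalesRow>.class);

        List<Sales_History__c> records = new List<Sales_History__c>();
        for (SalesRow row : rows) {
            records.add(new Sales_History__c(
                Account_Number__c = row.accountNumber,
                Sales_Date__c     = row.salesDate,
                Sales_Amount__c   = row.salesAmount,
                // Composite key so one record exists per account per day.
                Unique_Key__c     = row.accountNumber + ':' + String.valueOf(row.salesDate)
            ));
        }
        upsert records Unique_Key__c;
        return records.size();
    }
}

The same object could also be loaded on a schedule with the Data Loader or the SOAP/REST APIs; once the rows are in Salesforce, a summary report grouped by account and month can feed the dashboard.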

 

Any help here would be greatly appreciated.

 

Hi,

 

I am getting an Apex heap size error in my batch Apex class: System.LimitException: Apex heap size too large: 6030143.

The number of records that need to be processed is in the range of 10K to 50K. I think the error is caused by the maps I am using in my code.
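
If it helps to confirm that the maps are what is growing, the heap can be logged at the end of each execute() call (Limits.getHeapSize() and Limits.getLimitHeapSize() are standard methods):

         System.debug('Heap after this batch: ' + Limits.getHeapSize()
                      + ' of ' + Limits.getLimitHeapSize()
                      + '; purchase keys: ' + purchaseMap.size()
                      + ', sales keys: ' + salesMap.size());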

This is what my code looks like:

global class CreatePortfolioReport implements Database.Batchable<sObject>, Database.Stateful{
	public String query;
	global String email;

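	// Note: because this class is Database.Stateful, these maps keep their
	// contents across every execute() call, so they grow with the total
	// number of records processed rather than with the batch size.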
	Map<String, Purchase_Sales_Report__c[]> purchaseMap = new Map<String, Purchase_Sales_Report__c[]>();
	Map<String, Purchase_Sales_Report__c[]> salesMap = new Map<String, Purchase_Sales_Report__c[]>();
	
	
	global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator(query);
    }
    
    global void execute(Database.BatchableContext BC, List<sObject> scope){
    	 for(sObject s: scope){
    	 	Purchase_Sales_Report__c ps = (Purchase_Sales_Report__c)s;
    	 	if(ps.Transaction__c != null){
	    	 	if(ps.Transaction__c.equals('Purchase')){
		    	 	if(!purchaseMap.isEmpty() && purchaseMap.containsKey(ps.Unique_Name__c)){
		    	 		purchaseMap.get(ps.Unique_Name__c).add(ps);
		    	 	}
		    	 	else{
		    	 		List<Purchase_Sales_Report__c> newList = new List<Purchase_Sales_Report__c>();
		    	 		newList.add(ps);
		    	 		purchaseMap.put(ps.Unique_Name__c, newList);
		    	 	}
	    	   }
	    	   else if(ps.Transaction__c.equals('Sales')){
	    	   		if(!salesMap.isEmpty() && salesMap.containsKey(ps.Unique_Name__c)){
		    	 		salesMap.get(ps.Unique_Name__c).add(ps);
		    	 	}
		    	 	else{
		    	 		List<Purchase_Sales_Report__c> newList = new List<Purchase_Sales_Report__c>();
		    	 		newList.add(ps);
		    	 		salesMap.put(ps.Unique_Name__c, newList);
		    	 	}
	    	   }
    	 	}
    	 } 
    	 System.debug('Purchase Map size'+purchaseMap.size());
    	 System.debug('Sales Map size'+salesMap.size());
    }
    
    global void finish(Database.BatchableContext BC){
    	Map<String, Double> salesUnits = new Map<String, Double>();
    	Map<String, Double> purchaseUnits = new Map<String, Double>();
    	Map<String, Portfolio_Report__c> portfolioMap = new Map<String, Portfolio_Report__c>();
    	List<Temp_Purchase_Report__c> purchaseList = new List<Temp_Purchase_Report__c>();
    	for(String uniqueName: salesMap.keySet()){
  			Double units = 0;
  			Double purchaseAmount = 0;
  			for(Purchase_Sales_Report__c ps:salesMap.get(uniqueName)){
  				
  				if(ps.Units__c != null){
  					units += ps.Units__c;
  				}
  			}
  			salesUnits.put(uniqueName, units);
  		}
  		System.debug('Sales Map'+salesMap.size());
  		for(String uniqueName: purchaseMap.keySet()){
  			
  	        Double units;
  			if(salesUnits.containsKey(uniqueName)){
  				units = Math.abs(salesUnits.get(uniqueName));
  			}
  			Double pUnits = 0;
  			Double product = 0;
  			Double portUnits = 0;
  			Double portAmount = 0;
  			Double divReinvAmount = 0;
  			Double divAmount = 0;
  			Double stpAmount = 0;
  			Boolean entityFlag = true;
  			Id entity;
  			String folio;
  			String assetClass;
  			String schemeName;
  			for(Purchase_Sales_Report__c ps:purchaseMap.get(uniqueName)){
  				
  				if(units != null && pUnits != units){
  					if(ps.Units__c != null){
  						pUnits += ps.Units__c;
  					}
  					
  				}
  				else{
  					
  					if(ps.Units__c != null){
  						portUnits += ps.Units__c;
  					}
  					if(ps.Amount__c != null && ps.Type__c != null){
  						
  						if(ps.Type__c.equalsIgnoreCase('NOR') || ps.Type__c.equalsIgnoreCase('SIP')){
  							portAmount += ps.Amount__c;
  						}
  						else if(ps.Type__c.equalsIgnoreCase('DIR')){
  							divReinvAmount += ps.Amount__c;
  						}
  						else if(ps.Type__c.equalsIgnoreCase('STI') || ps.Type__c.equalsIgnoreCase('SWI')){
  							stpAmount += ps.Amount__c;
  						}
  						else if(ps.Type__c.equalsIgnoreCase('DVP')){
  							divAmount += ps.Amount__c;
  						}
  					}
  					if(ps.Product__c != null){
  						product += ps.Product__c;
  					}
  					if(entityFlag){
	  					entity = ps.Entity__c;
	  					folio = ps.Folio_Number__c;
	  					assetClass = ps.Asset_Class__c;
	  					entityFlag = false;
	  					schemeName = ps.Scheme_Name__c;
  					}
  					System.debug('Create Port Units'+portUnits+'Amount'+portAmount+'Product'+product);
  				}
  				
  			}
  			if(portUnits != 0 && product != 0 && (portAmount != 0 || divAmount != 0 || divReinvAmount != 0 || stpAmount != 0)){
  				Temp_Purchase_Report__c pr = new Temp_Purchase_Report__c(Entity__c= entity, 
  																		 Folio_Number__c = folio, 
  																		 Asset_Class__c = assetClass,
  																		 UniqueName__c = uniqueName, 
  																		 Purchase_Amount__c= portAmount, 
  																		 Units_Quanitity__c = portUnits, 
  																		 Product__c = product,
  																		 Dividend_Reinvested__c = divReinvAmount,
  																		 Dividend__c = divAmount,
  																		 STP_Switch__c = stpAmount,
  																		 Scheme_Scrip_Name__c = schemeName);
  				purchaseList.add(pr);
  			}
  		}
  		System.debug('Purchase List'+purchaseList.size());
  		
  		upsert purchaseList UniqueName__c;
  		  		
  		AsyncApexJob a = [Select Id, 
                                 Status,
                                 NumberOfErrors, 
                                 JobItemsProcessed,  
                                 TotalJobItems, 
                                 CreatedBy.Email 
                                 from AsyncApexJob 
                                 where Id =:BC.getJobId()];
        // Create and send an email with the results of the batch.
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        mail.setToAddresses(new String[] {email});
        mail.setReplyTo('');
        mail.setSenderDisplayName('Batch Processing');  
        mail.setSubject('Create Portfolio Report ' + a.Status);
        mail.setPlainTextBody('The batch apex job processed ' + a.TotalJobItems +   ' batches with ' + a.NumberofErrors + ' failures.');
    
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
  		
    }
}

Since I need the maps to process the data in my finish method, I cannot think of any other way to implement this.
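
One partial reduction might be possible wherever finish() only needs a sum. In the class above, salesMap is only ever totalled into salesUnits, so a running Map<String, Double> maintained in execute() could replace it, keeping one Double per key on the heap instead of every queried record. A rough sketch (the purchase side, which needs per-record matching, is left out):

global class CreatePortfolioReportSlim implements Database.Batchable<sObject>, Database.Stateful {
    public String query;

    // One running total per Unique_Name__c instead of a list of full records.
    Map<String, Double> salesUnits = new Map<String, Double>();

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        for (sObject s : scope) {
            Purchase_Sales_Report__c ps = (Purchase_Sales_Report__c) s;
            if (ps.Transaction__c != null && ps.Transaction__c.equals('Sales')
                    && ps.Units__c != null) {
                Double running = salesUnits.get(ps.Unique_Name__c);
                if (running == null) {
                    running = 0;
                }
                salesUnits.put(ps.Unique_Name__c, running + ps.Units__c);
            }
            // Purchase records would still need their own handling here.
        }
    }

    global void finish(Database.BatchableContext bc) {
        // salesUnits already holds the totals that the original finish()
        // computed from salesMap, with far less state carried between batches.
        System.debug('Distinct sales keys: ' + salesUnits.size());
    }
}

Querying only the fields the maps actually need would also shrink what Database.Stateful has to carry between batches.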

 

Can you please suggest ways to avoid the error?

 

Thanks,

Jina