Rolando Esteves


Hi,

 

I want to be able to pass the CSV file to another class so it can process more of the records from the file, since I'm hitting a governor limit on insert statements.

 

CODE:

 

public with sharing class UploadRecordUsingCSV {

    public UploadRecordUsingCSV(ApexPages.StandardController controller) {
    }

    public Blob FileRecords {get; set;}

    String[] LineNo = new String[]{};
    List<product_opportunity__c> AllUnit;

    public Pagereference UploadFile() {
        system.debug('Enter');
        String FileData = FileRecords.toString();
        LineNo = FileData.split('\n');
        AllUnit = new List<product_opportunity__c>();
        system.debug('Entered****' + AllUnit);
        for (Integer i = 1; i < LineNo.size(); i++) {
            product_opportunity__c pto = new product_opportunity__c();
            String[] ActualData = LineNo[i].split(',');
            pto.Name = 'TEST';
            pto.OpportunityID__c = ActualData[0];
            pto.Part_Number__c = ActualData[1];
            pto.Vendor_Unit_Cost__c = Double.valueOf(ActualData[2]);

            //AllUnit.add(pto);
            insert pto; // DML inside the loop - this is what hits the governor limit
        }
        //insert AllUnit;
        return null;
    }
}

 

 

Best Answer chosen by Admin (Salesforce Developers) 
JayNic

OK, I'm a little confused by the process at this point, but I think I get it.

 

You need to import a CSV with enough line items to hit governor limits if you do it synchronously, and you have a trigger on the object your CSV imports into, so that trigger needs to be bulkified as well.

 

Is this correct?

 

To bulkify your trigger you would do this:

trigger productsUploader on product_opportunity__c (before insert) {

    //Create a set of product Part Numbers
    set<string> PartNumbers = new set<string>();
    for (product_opportunity__c po : Trigger.New) {
        PartNumbers.add(po.Part_Number__c);
    }

    //Now query for PriceBookEntries
    if (!PartNumbers.isEmpty()) {
        PriceBookEntry[] pbes = [SELECT Id, ProductCode FROM PriceBookEntry WHERE ProductCode IN :PartNumbers];

        //Now create a map of product codes to record ids
        map<string,id> PriceBookProductCodes = new map<string,id>();
        for (PriceBookEntry p : pbes) {
            PriceBookProductCodes.put(p.ProductCode, p.Id);
        }

        //Now we can assign the proper pricebook entry id to the product opportunity records
        for (product_opportunity__c po : Trigger.New) {
            po.PricebookEntryID__c = PriceBookProductCodes.get(po.Part_Number__c);
        }
    }
}

 

Now that will bulkify your trigger. It uses only one query no matter how many records are inserted.

 

If you still hit limits (as in you are inserting too many rows at once), then you would need to move your CSV import to an asynchronous @future method.
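A minimal sketch of what moving the insert into an @future method might look like, based on the UploadFile code above. The class and method names here are made up, and this assumes the controller passes FileRecords.toString(), since @future parameters must be primitives or collections of primitives:

```apex
public with sharing class UploadRecordsAsync {

    // Hypothetical: called from the page controller as
    // UploadRecordsAsync.insertFromCsv(FileRecords.toString());
    @future
    public static void insertFromCsv(String fileData) {
        String[] lines = fileData.split('\n');
        List<product_opportunity__c> allUnits = new List<product_opportunity__c>();

        // Skip the header row, build all records in memory
        for (Integer i = 1; i < lines.size(); i++) {
            String[] cols = lines[i].split(',');
            product_opportunity__c pto = new product_opportunity__c();
            pto.Name = 'TEST';
            pto.OpportunityID__c = cols[0];
            pto.Part_Number__c = cols[1];
            pto.Vendor_Unit_Cost__c = Double.valueOf(cols[2]);
            allUnits.add(pto);
        }

        // One bulk DML statement, now running against the higher async limits
        insert allUnits;
    }
}
```

The @future context gets its own, larger set of governor limits, and the page request returns immediately instead of waiting on the inserts.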

All Answers

JayNic

Is there a reason you are not using the Custom Object Importer? This seems quite simple for users to do.

 

NEVER do DMLs inside for loops!

 

The commented-out List.add() approach you had going was the right one... How big is your CSV?
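For reference, the pattern being pointed at is: collect the records in the loop, then do a single insert after it. A sketch of the original UploadFile with only that change (same field names as the posted code):

```apex
public Pagereference UploadFile() {
    String FileData = FileRecords.toString();
    LineNo = FileData.split('\n');
    AllUnit = new List<product_opportunity__c>();

    for (Integer i = 1; i < LineNo.size(); i++) {
        String[] ActualData = LineNo[i].split(',');
        product_opportunity__c pto = new product_opportunity__c();
        pto.Name = 'TEST';
        pto.OpportunityID__c = ActualData[0];
        pto.Part_Number__c = ActualData[1];
        pto.Vendor_Unit_Cost__c = Double.valueOf(ActualData[2]);
        AllUnit.add(pto);   // collect, don't insert
    }

    insert AllUnit;         // one DML statement for all rows
    return null;
}
```

This turns one DML statement per CSV row into one DML statement per file, which is what keeps you under the per-transaction DML limit.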

 

 

Rolando Esteves

I've been trying to do a custom development, but no one seems to know the answer. I am well aware of the governor limits.

 

In order for me to eliminate the insert statement inside my for loop, I would have to bulkify this trigger:

 

trigger productsUploader on product_opportunity__c (before insert) {

    product_opportunity__c l_PTO = Trigger.new[0];
    l_PTO.PricebookEntryID__c = [SELECT Id FROM PriceBookEntry A
                                 WHERE A.ProductCode = :l_PTO.Part_Number__c].Id;
}

Basically, I have product codes in each record of the custom object, and I need to get the correct PriceBookEntry ID for every record. I can't hold my PriceBookEntry table in memory since I have 300,000 PriceBookEntry records. Think you can help?

JayNic


This was selected as the best answer