Irene Sless

Internal Salesforce Error

I have a batch job which used to run fine, but since the end of last week it has been throwing the error 'FATAL ERROR Internal Salesforce Error' every time.

Developer script from .... exception  : 'OrderItemCleanup' : common.exception.SfdcSqlException: ORA-20145:  ORA-06512: at "DOPEY.BORDERITEM", line 1830 ORA-06512: at line 1   SQLException while executing plsql statement: {call bOrderItem.check_delete_o...

Apex script unhandled exception by user/organization: .../...
Failed to process batch for class 'OrderItemCleanup' for job id '7079000005qLkrQ'

It's a batch job that runs through existing OrderItems and deletes those of a certain type, then updates the parent Order with text and a roll-up summary field.
I tried to log a case but it seems I can't do so anymore; I'm directed to this Community instead. Having read other posts, it seems to be something I need to log with Salesforce, but I can't.

All Answers

Mahesh D
Hi Irene,

If possible, can you paste your code here?

This kind of issue occurs when you have a bunch of records in the batch and one of them fails while being processed in trigger context. To resolve this, I would like to see the Batch Apex code so that it is easier to suggest options. But for now:

Reduce the batch size when calling the batch job from the scheduler and see how it works.

 
// Smaller scope size: each execute() call processes at most 50 records.
OrderItemBatch oib = new OrderItemBatch();
Database.executeBatch(oib, 50);

Also have a look at these similar posts:

https://developer.salesforce.com/forums/?id=906F0000000Au0TIAS

https://developer.salesforce.com/forums/?id=906F0000000ApG6IAK

https://developer.salesforce.com/forums/?id=906F000000091enIAA

http://salesforce.stackexchange.com/questions/89618/internal-salesforce-com-error-in-a-query-inside-a-batch

http://salesforce.stackexchange.com/questions/33629/spontaneous-internal-salesforce-com-error-what-do-i-do


These posts may not solve the issue directly, but they are worth a look, just FYI.

Please do let me know if it helps you.

Regards,
Mahesh
Irene Sless
Hi Mahesh
Thanks for your response. The query returns 2514 rows. Batch size is 200 at the moment, but I'll try it with 50. This is my batch code:
global class OrderItemCleanup implements Database.Batchable<SObject> {
    
    // start(): select Narrative line items on Orders with status code 'D' (draft),
    // ordered so items belonging to the same Order are contiguous.
    global Database.queryLocator start(Database.BatchableContext BC) {
        String query = 'Select Id, OrderId, OrderStatus__c, LineType__c, Description, LineNo__c from OrderItem' +
            ' where LineType__c = \'Narrative\' and Order_StatusCode__c = \'D\' ' +
            ' order by OrderId, LineNo__c' ;
        
        
        return Database.getQueryLocator(query);
        
    }
    
    global void execute(Database.BatchableContext BC, List<sObject> scope) {
        
        List<OrderItem> oLines = (List<OrderItem>)scope;    
        
        Map<Id, String> mapNarratives = processLines(oLines);
        if(!mapNarratives.isEmpty()){
            processHeader(mapNarratives);
        }
    }
    
    // processLines(): build a per-Order concatenation of line Descriptions,
    // then delete the processed OrderItems.
    global Map<Id, String> processLines(List<OrderItem> oLines)
    {
        Map<Id, string> mapNarratives = new Map<Id, string>();
        List<OrderItem> iTextLines = new List<OrderItem>();    
        List<OrderItem> iBlankLines = new List<OrderItem>();    
        
        if(oLines.size() > 0){
            Map<string, string> mapLineTyps = new Map<string, string>();
            string strNarr = '';
            Id ord = oLines[0].OrderId;
            
            for(OrderItem ordln : oLines)
            {
                if(ord != ordln.OrderId)
                {
                    mapNarratives.put(ord, strNarr);
                    ord = ordln.OrderId;
                    strNarr='';
                    if(ordln.Description != null){
                        strNarr = ordln.Description;
                        iTextLines.add(ordln);
                    }
                    else {
                        iBlankLines.add(ordln);
                    }
                }
                else 
                {
                    if(ordln.Description != null){
                        strNarr += '\r\n' + ordln.Description;
                        iTextLines.add(ordln);
                    }
                    else {
                        iBlankLines.add(ordln);
                    }
                }
                
            }
            //add the last entry
            mapNarratives.put(ord, strNarr);
        }        
        if(iTextLines.size() > 0){
            delete iTextLines;
        }
        if(iBlankLines.size() > 0){
            delete iBlankLines;
        }

        return mapNarratives;
 
    }
    
    // processHeader(): copy each concatenated narrative onto the parent Order's Description field.
    public static void processHeader(Map<Id, String> mapNarratives)
    {
        List<Order> ords = new List<Order>();
        Order updOrd;
        //Add the Narratives to the Order. Remove the blank entry in the map.
        mapNarratives.remove(null);
        for(Id ordId : mapNarratives.keySet())
        {
            updOrd = new Order(Id = ordId, Description = mapNarratives.get(ordId));
            ords.add(updOrd);
        }
        update ords;
    }
    
    global void finish(Database.BatchableContext BC) {
    }
}

 
Mahesh D
Hi Irene,

Here we need to consider a few things:

Option 1:
As you are using multiple DML operations, I would recommend going with a batch size of 1 or 2 records and proceeding that way. It all depends on how many records you expect in every scheduled run; if it is only around 100 / 200 records in total, you can proceed with this option.
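
For example, a minimal sketch reusing the OrderItemCleanup class from your post above (the scope size of 2 is just illustrative):

// Each execute() call now receives at most 2 OrderItem records,
// so a single failing record affects far less work per chunk.
OrderItemCleanup oic = new OrderItemCleanup();
Database.executeBatch(oic, 2);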

Option 2:

You can start using the Database class methods to perform the DML operations with the allOrNone flag set to false, but this may create inconsistent data: the first operation may succeed 100% while the second one does not, which leaves the data inconsistent.
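
A rough sketch of how that could look for the delete and update steps in your batch (Database.delete and Database.update with allOrNone set to false; the debug statements are only placeholders for whatever error handling you choose):

// Partial-success DML: failed records are reported individually instead of
// throwing an exception and rolling back the whole chunk.
Database.DeleteResult[] delResults = Database.delete(iTextLines, false);
for (Database.DeleteResult dr : delResults) {
    if (!dr.isSuccess()) {
        System.debug(LoggingLevel.ERROR, 'Delete failed: ' + dr.getErrors()[0].getMessage());
    }
}

Database.SaveResult[] updResults = Database.update(ords, false);
for (Database.SaveResult sr : updResults) {
    if (!sr.isSuccess()) {
        System.debug(LoggingLevel.ERROR, 'Order update failed for ' + sr.getId() + ': ' + sr.getErrors()[0].getMessage());
    }
}

As mentioned above, partial success means an Order's items may be deleted while its Description update fails (or the other way around), so the results still need to be checked and reconciled.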

Option 3:

Look at all possible exceptions and handle them properly before proceeding with the DML operations.

Option 4: 

Add exception handling for all the DML operations.
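
For Options 3 and 4, a rough sketch of what wrapping the deletes in exception handling could look like (what you do inside the catch block is up to you):

try {
    if (iTextLines.size() > 0) {
        delete iTextLines;
    }
    if (iBlankLines.size() > 0) {
        delete iBlankLines;
    }
} catch (DmlException e) {
    // Decide how to handle the failed chunk: log it, skip it, or rethrow.
    System.debug(LoggingLevel.ERROR,
        'OrderItem delete failed: ' + e.getDmlMessage(0) + ' (row ' + e.getDmlIndex(0) + ')');
}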

Please do let me know if it helps you.

Regards,
Mahesh

 
Irene Sless
Thanks Mahesh. I managed to find the problem: there was a trigger that automatically activated the Order under certain conditions, and once an Order is activated you cannot update, add, or delete an OrderItem. This batch job deletes items, so although the query only looks for Orders in draft mode, as soon as the batch updated an Order in a way that met the trigger's conditions, the trigger would activate it, and then the delete of that Order's items would fail. Phew. Thank you for your time and help; I'll have a look at your options to consider too!
This was selected as the best answer
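
For anyone who runs into the same pattern: one common workaround (purely a sketch; the class and flag names below are made up, not from Irene's org) is a static bypass flag that the batch sets before updating the parent Orders, so the Order trigger skips its auto-activation logic during the cleanup run:

// Hypothetical helper holding a per-transaction flag.
public class OrderActivationBypass {
    public static Boolean skipAutoActivation = false;
}

// In the Order trigger (hypothetical), guard the activation logic:
//     if (!OrderActivationBypass.skipAutoActivation) { /* activate the Order when conditions are met */ }

// In the batch's processHeader(), set the flag before the update:
//     OrderActivationBypass.skipAutoActivation = true;
//     update ords;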
Mahesh D
Irene,

Glad you were able to fix the issue. Please mark this as solved so that it will be helpful for others in the future.

Regards,
Mahesh