MJ09

Batch Apex - can execute() invocations run concurrently?

I have a Batch Apex process that runs for a long time (several hours) over a large set of records. One of the things the job's execute() method does is to query for a bunch of records, and then do an upsert on them. This job is the *only* bit of Apex code that modifies these records. Users can't modify these records through the UI -- the records are visible to users only in that they contribute to roll-up summary fields on a parent object.

 

I just started getting occasional "Unable to obtain exclusive access to this record" errors. Since the batch job's execute() method is the only code that tries to update (upsert) these records, I'm wondering if it's possible that multiple invocations of the execute() method are running at the same time. That's the only possible reason I can think of for this error to occur.

 

So:

 

1. Is it possible for the execute() method to be running more than once at the same time? (I've already confirmed that the batch job was launched only once.)

 

2. If the only Apex code that touches these records is in the execute() method, what else might be responsible for this error?

 

Thanks!

Best Answer chosen by Admin (Salesforce Developers) 
MJ09

Filed a Case to ask this question, and got this response from Tech Support:

 

1. Is it possible for the execute() method to be running more than once at the same time? (I've already confirmed that the batch job was launched only once.)

A:

All execute methods for batches within a batch job are synchronous so they will not "trip" over each other.

 

2. If the only Apex code that touches these records is in the execute() method, what else might be responsible for this error?

 

A:

Possible causes are actions which execute outside of the scope of the transaction for the batch operation:

: Roll-up summary fields. This is a common occurrence.

: Time-based workflow. This is not common.

 

 

For the rollup summary fields, a good practice to avoid this is to process the records grouped by the rollup parent record.
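
A minimal sketch of that grouping idea, assuming hypothetical object and field names (Child__c, Parent__c, Amount__c): ordering the start() query by the roll-up parent means each execute() chunk tends to contain children of the same parent, so fewer chunks contend for a lock on the same parent record.

```apex
// Sketch only: Child__c, Parent__c, and Amount__c are illustrative names.
global class ChildRecalcBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Ordering by the rollup parent keeps children of the same parent
        // together, reducing parent-lock contention across execute() chunks.
        return Database.getQueryLocator(
            'SELECT Id, Parent__c, Amount__c FROM Child__c ORDER BY Parent__c');
    }

    global void execute(Database.BatchableContext bc, List<Child__c> scope) {
        // ... upsert logic for this chunk ...
    }

    global void finish(Database.BatchableContext bc) {}
}
```

Note that ordering alone doesn't guarantee a chunk boundary falls exactly on a parent boundary, but it greatly reduces how many chunks touch any given parent.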

 

Also, an additional good practice is to first try the upsert in a try/catch block. If you receive an error, reissue the upsert using the Database method and then loop through the results to determine which records failed. You can either save the IDs for later processing or log the failures.
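
That retry pattern might look roughly like this (a sketch, with Child__c as a placeholder object): the plain upsert is the all-or-nothing fast path, and on failure the Database.upsert overload with allOrNone=false lets the rest of the chunk succeed while the failed rows are collected.

```apex
// Sketch of the suggested pattern; Child__c and recordsToUpsert are illustrative.
List<Child__c> recordsToUpsert = buildRecords(); // hypothetical helper

try {
    // Fast path: all-or-nothing DML.
    upsert recordsToUpsert;
} catch (DmlException e) {
    // Reissue with partial success so one locked row doesn't fail the chunk.
    Database.UpsertResult[] results = Database.upsert(recordsToUpsert, false);
    List<Id> failedIds = new List<Id>();
    for (Integer i = 0; i < results.size(); i++) {
        if (!results[i].isSuccess()) {
            // Save for later reprocessing, or write to a log object.
            failedIds.add(recordsToUpsert[i].Id);
        }
    }
}
```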

All Answers

Ritesh Aswaney

Fair enough that only your batch job has exclusive rights to these records; however, they may have relationships to other objects (master/detail or lookup) which others have access to.

 

If, as part of your batch job, you're trying to update these related records, then it could be that someone else is also trying to update them, e.g. via triggers or workflows.

 

MJ09

Thanks for your reply, but I'm about as sure as I can possibly be that only my batch process is updating these records. These records contribute to a roll-up summary, but they're really not exposed to users -- they're not shown on any related lists, and my users don't have any reason to see or update them, and don't even really know they exist. This particular org is very tightly controlled, so there are only a few users anyway, and I *know* they haven't been touching these records.

 

And I've used the IDE to scan all Apex code (classes and triggers), and the only DML statements for these records are in my batch class.

 

Any other ideas?

Ritesh Aswaney

In my experience (if memory serves me right!), records in the same batch can trip over each other if they roll up to / update the same parent records, as they are vying for an exclusive lock on that parent. It might be an idea, if possible, to check whether the batches that fail contain records related to the same parent / shared object.

paul-lmi

The Bulk Data webservices API supports this, but not Apex.  The whole point of batch apex is for Salesforce to better stabilize spikes in their resources, not to make concurrent data/programming operations work.  It would be excellent if it did though, because we're in the same boat.

 

What you could do is move your code to a trigger, and then use the Bulk Data API to fire this trigger with a "fake" object update. This is the only way I can think of to get concurrency built in.

MJ09

Paul, thanks for your reply. I'm actually hoping NOT to have concurrent execution -- I don't want two execute() method invocations to trip over each other.

 

The purpose of the batch job is to compute roll-up-summary-like values for fields that can't be defined as roll-up summaries because there's no master/detail relationship. Because there can be very large numbers of lookup child records, a trigger won't cut it -- I need a batch process.
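
A common shape for that kind of job, sketched here with hypothetical names (Parent__c, Child__c, Total__c): batch over the *parents* rather than the children, and aggregate each parent's lookup children inside execute(). Since each parent is then updated in exactly one execute() transaction, no two chunks contend for the same parent row.

```apex
// Illustrative sketch: Parent__c, Child__c, Amount__c, Total__c are
// hypothetical object/field names standing in for the real schema.
global class RollupBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Parent__c');
    }

    global void execute(Database.BatchableContext bc, List<Parent__c> scope) {
        // Aggregate the lookup children for just the parents in this chunk.
        Map<Id, Decimal> totals = new Map<Id, Decimal>();
        for (AggregateResult ar : [
                SELECT Parent__c p, SUM(Amount__c) total
                FROM Child__c
                WHERE Parent__c IN :scope
                GROUP BY Parent__c]) {
            totals.put((Id) ar.get('p'), (Decimal) ar.get('total'));
        }

        // Write the computed roll-up value onto each parent.
        List<Parent__c> updates = new List<Parent__c>();
        for (Parent__c p : scope) {
            updates.add(new Parent__c(
                Id = p.Id,
                Total__c = totals.containsKey(p.Id) ? totals.get(p.Id) : 0));
        }
        update updates;
    }

    global void finish(Database.BatchableContext bc) {}
}
```

This assumes the child counts per parent stay within SOQL aggregate limits; for extremely large child sets, the children themselves would need to be the batch scope, grouped by parent as discussed above.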

paul-lmi

Gotcha. In that case, in your execute() method, ensure you're checking for "duplicate"/overlapping records so you don't have overlap. Then each batch will ensure uniqueness in that calculation.

MJ09

So are you saying that you know for certain that the execute() method could be called in parallel, rather than always being called serially?

paul-lmi

I write assuming it can be parallel, but I don't see it documented either way. I always code extremely defensively on this platform to avoid gotchas, now or later on.

MJ09

(The reply quoted above as the best answer was posted here; see the top of the thread.)

This was selected as the best answer