+ Start a Discussion
Marry Stein

Create a batch job for a flexible number of records?

Hello Community,

I have created a class which handles customer activities. It checks all activities within the customer relationship. A separate object (Activity Template) defines which activities should be available. If an activity is missing, the class creates a new one. If there is an unneeded activity, the class marks it as "to be deleted". The class collects the records in a list and performs one upsert call (creating the new records and updating the unneeded ones). This class will be run once a week (scheduled).
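A simplified sketch of the class (the object and field API names like `Activity__c`, `Activity_Template__c` and `To_Be_Deleted__c` are placeholders for illustration, not the real schema):

```apex
public with sharing class ActivitySyncService {
    public void syncActivities() {
        List<Activity__c> toUpsert = new List<Activity__c>();

        // All templates defining which activity types should exist
        Set<String> requiredTypes = new Set<String>();
        for (Activity_Template__c t : [SELECT Type__c FROM Activity_Template__c]) {
            requiredTypes.add(t.Type__c);
        }

        for (Account customer : [SELECT Id,
                                        (SELECT Id, Type__c FROM Activities__r)
                                 FROM Account]) {
            Set<String> existingTypes = new Set<String>();
            for (Activity__c a : customer.Activities__r) {
                existingTypes.add(a.Type__c);
                // Activity no longer covered by any template: flag it
                if (!requiredTypes.contains(a.Type__c)) {
                    a.To_Be_Deleted__c = true;
                    toUpsert.add(a);
                }
            }
            // Template with no matching activity: create one
            for (String t : requiredTypes) {
                if (!existingTypes.contains(t)) {
                    toUpsert.add(new Activity__c(Customer__c = customer.Id, Type__c = t));
                }
            }
        }
        upsert toUpsert; // fails if the list exceeds the 10,000-row DML limit
    }
}
```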

The problem I have is that the number of records in the list is really flexible. As we know, it is not possible to insert more than 10,000 rows within one transaction. A batch job should process thousands or millions of records. In our case, the number of records could be anywhere between 0 and 15,000 per week; both numbers are the extremes.

I am thinking about two options and would appreciate your opinion:
1. Create a batch job which runs once a week and accept the sometimes small number of records.
2. Create a class which runs several times a week and upserts a maximum of 10,000 records per run. Since the job runs several times a week, the load is divided up.
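Option 2 could look roughly like this (a sketch only; the `Sync_Pending__c` flag and the class name are hypothetical):

```apex
public with sharing class ActivityUpsertJob implements Schedulable {
    private static final Integer MAX_ROWS = 10000; // DML row limit per transaction

    public void execute(SchedulableContext ctx) {
        // Process at most MAX_ROWS pending records; leftovers wait for the next run
        List<Activity__c> pending = [
            SELECT Id, To_Be_Deleted__c
            FROM Activity__c
            WHERE Sync_Pending__c = true
            LIMIT :MAX_ROWS
        ];
        for (Activity__c a : pending) {
            a.Sync_Pending__c = false;
        }
        upsert pending;
    }
}
// Scheduled several times a week, e.g.:
// System.schedule('Activity upsert Mon', '0 0 2 ? * MON', new ActivityUpsertJob());
```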

Option 1 is more comfortable because the batch job basically takes care of all the limit problems. Nevertheless, I tend toward option 2, because I think you should only use batch jobs when they are really needed.

Do you have any advice or experience? Is it OK to use a batch job for a few records to handle the extreme cases?
Best Answer chosen by Marry Stein

All Answers

Shivankur (Salesforce Developers)
Hi Marry,

As the number of records in your case is flexible, a scheduled batch job could be considered the perfect option. Batch Apex is used to run large jobs (think thousands or millions of records!) that would exceed normal processing limits. Using Batch Apex, you can process records asynchronously in batches (hence the name, "Batch Apex") to stay within platform limits. It is generally the better option, since you define it once and leave the job to the server to execute at a given time or on a weekly basis. It would also avoid the issues that might come up with option 2.
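A minimal Batch Apex skeleton for this kind of weekly job (the query and object names are placeholders; `Database.Batchable` and `Schedulable` are the standard interfaces):

```apex
public with sharing class ActivitySyncBatch implements Database.Batchable<SObject>, Schedulable {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can stream up to 50 million records
        return Database.getQueryLocator('SELECT Id FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        // Each chunk (default 200 records) runs in its own transaction,
        // so each chunk gets fresh governor limits.
        // ... build and upsert the activity records for this scope ...
    }

    public void finish(Database.BatchableContext bc) {
        // Optional: send a summary notification or chain another job
    }

    // Schedulable entry point so the batch can be run on a weekly schedule
    public void execute(SchedulableContext ctx) {
        Database.executeBatch(new ActivitySyncBatch(), 200);
    }
}
// System.schedule('Weekly activity sync', '0 0 3 ? * SUN', new ActivitySyncBatch());
```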

There are a few considerations and best practices which you should keep in mind while designing this implementation:
https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_batch_interface.htm

Hope above information helps. Please mark as Best Answer so that it can help others in future.

Thanks.
Marry Stein

Hi Shivankur,

thanks for the quick response! So you would recommend to "play it safe" and use a scheduled batch job? I was just not sure if it is OK to start a batch job for a few records :)

Thanks Shivankur!

Shivankur (Salesforce Developers)
Hi Marry,

Yes, the option to use scheduled Apex would be the simpler solution. But, as you can have a flexible number of records to process, you can design logic that checks the number of records before starting any further code execution, and based on that count call specific methods, which results in optimized logic for your use case.

So, with this logic in place, you will be able to save server resources for the larger runs, and for fewer records you could have a synchronous operation to update the records on a daily basis. This may also need some data analysis to see whether the number of records to be processed is consistently low or consistently high, so that a particular design approach can be applied.

Finally, it will depend on your org structure and the data behavior/flow within specific timeframes. After a good analysis of the trend and best fit, you can make a decision and go with it.
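As a rough sketch of this count-then-decide idea (the threshold, the `Sync_Pending__c` field, and the synchronous/batch classes it calls are all hypothetical):

```apex
public with sharing class ActivitySyncDispatcher implements Schedulable {
    // Below this threshold, process synchronously; above it, fall back to Batch Apex
    private static final Integer SYNC_THRESHOLD = 5000;

    public void execute(SchedulableContext ctx) {
        Integer pendingCount = [SELECT COUNT() FROM Activity__c
                                WHERE Sync_Pending__c = true];
        if (pendingCount == 0) {
            return; // nothing to do this week
        }
        if (pendingCount <= SYNC_THRESHOLD) {
            // Small volume: handle it in this transaction
            new ActivitySyncService().syncActivities();
        } else {
            // Large volume: let Batch Apex chunk the work
            Database.executeBatch(new ActivitySyncBatch(), 200);
        }
    }
}
```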

Hope above information helps. Please mark as Best Answer so that it can help others in future.

Thanks.
This was selected as the best answer
Marry Stein
Thanks Shivankur,

it is not a given that you get a specific answer rather than just a link to the docs! I will follow your advice ;)