
callout and scheduled job bulkification


From everything I have seen, if I want to make a REST-based callout, or schedule or abort a scheduled job programmatically, there is no good way to bulkify that process. I realize that Batch Apex could be used, but as far as I can tell there is no way to call System.schedule or System.abortJob with a list as an argument, or to run something like:
HttpRequest req = new HttpRequest();
req.setEndpoint(endpointUrl); // endpoint and method were omitted in the original snippet
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
req.setHeader('Content-Type', 'application/json');
Http h = new Http();
HttpResponse res;
try {
    res = h.send(req);
} catch (System.CalloutException e) {
    calloutError1 = e.getMessage();
}
using a list. So if I need to do something like this multiple times because of updates on multiple records, the best way I have found is to make the scheduled job or callout in a loop with a counter that allows only a maximum number of iterations, and to handle the exception if it goes over that number.
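For reference, the capped-loop pattern I am describing looks roughly like this (the names MySchedulable, the cron expression, and the cap of 10 are just illustrative):

```apex
// Illustrative sketch of the capped-loop pattern described above.
// MySchedulable is a hypothetical Schedulable implementation.
public static void scheduleForRecords(List<Account> records) {
    final Integer MAX_ITERATIONS = 10; // assumed cap; adjust as needed
    if (records.size() > MAX_ITERATIONS) {
        // error handling / notification would go here
        throw new IllegalArgumentException('Too many records to schedule: ' + records.size());
    }
    for (Account acct : records) {
        // System.schedule takes a single job name, cron expression, and Schedulable,
        // so each record needs its own call; there is no list-based overload.
        System.schedule('Job-' + acct.Id, '0 0 2 * * ?', new MySchedulable(acct.Id));
    }
}
```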

Am I missing anything here? Is there any way to bulkify these sorts of things or any best practices about this type of situation?


Hara Sahoo
Hey Chris,
You could try something like the below, where you define the batch size:
global class bulkCallouts implements Database.Batchable<Integer>, Database.AllowsCallouts {
    global Iterable<Integer> start(Database.BatchableContext context) {
        Integer[] values = new Integer[0];
        // Populate values with one Integer per callout to make; the list could be 10,000 long
        return values;
    }
    global void execute(Database.BatchableContext context, Integer[] values) {
        // values holds at most one scope's worth of the 10,000; do one callout per Integer here
    }
    global void finish(Database.BatchableContext context) { }
}

// To stay within the limit of 100 callouts per transaction, execute the batch job
// with a scope size of 100 (it can be lower as well)
Database.executeBatch(new bulkCallouts(), 100);


You could also consider Queueable with Database.AllowsCallouts. This would require chaining the jobs together or using a staging custom object to queue the required calls. 
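A minimal Queueable sketch with chaining might look like the below (the Named Credential endpoint and the one-callout-per-job design are assumptions, not requirements; a single Queueable execution can make up to 100 callouts):

```apex
// Sketch: a Queueable that makes one callout per execution and chains
// itself until the queued work is exhausted.
public class CalloutQueueable implements Queueable, Database.AllowsCallouts {
    private List<Id> remaining;

    public CalloutQueueable(List<Id> recordIds) {
        this.remaining = recordIds;
    }

    public void execute(QueueableContext context) {
        Id current = remaining.remove(0);
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:My_Named_Credential/records/' + current); // placeholder endpoint
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        // ... process res ...
        if (!remaining.isEmpty()) {
            System.enqueueJob(new CalloutQueueable(remaining)); // chain the next job
        }
    }
}
```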

Whatever approach you take, consider using Limits.getCallouts() and Limits.getLimitCallouts() to monitor the usage within a transaction.
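That guard could be as simple as:

```apex
// Check the governor limit before each callout so the transaction never exceeds it.
if (Limits.getCallouts() >= Limits.getLimitCallouts()) {
    // Defer the remaining work (e.g. enqueue another job) instead of failing here.
    System.debug(LoggingLevel.WARN, 'Callout limit reached: ' + Limits.getCallouts());
} else {
    // safe to issue the next callout
}
```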
Hi Hara,

Thanks for your reply! I definitely considered that, but I think it may be overkill. I am not anticipating updating many records, and I have measures in place to ensure that each callout or scheduled job runs on no more than 10 records per execution context (with error handling and notifications to deal with the scenario where it somehow goes over 10 iterations). It just feels wrong to be doing these callouts or scheduled-job initializations one by one in a loop rather than performing the operation on a collection. I am pretty sure your answer is the correct one for this type of scenario; I just want to confirm that I am not overlooking anything.

In your opinion, would it be best practice to implement Batchable or Queueable if I am already ensuring that there are no more than 10 instances of these?

Thanks again,