Christopher_J

Limiting Insert Batch Size in API

Hi All,

I've created two trigger flows that run when records are inserted into a Custom Object I've created. I've tested them with the Apex Data Loader and they work great *IF* I limit the batch size to 10. If I run it at 200, everything fails with an error saying the flow couldn't be launched.

Eventually this will be part of a 3rd party integration that is being developed. I'm wondering if there is a way, with either the Bulk API or the normal API, to define the batch size as 10 records at a time no matter what record set is being uploaded? Or will the 3rd party devs have to handle all of that on their end and divide the record set up?

Thanks,

Chris
Vinit_Kumar
Try using the Bulk API; you can set the batch size there. The minimum is 200 and the maximum is 10,000.

10K is the maximum batch size that you can set per batch. So say you have 50K records, then 5 batches is the minimum number of batches required. The limit is 2,000 batches per 24 hours (on a rolling basis).
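The batch arithmetic above can be sketched as follows; `min_batches` is just an illustrative helper name, not part of any Salesforce API:

```python
import math

def min_batches(total_records, max_batch_size=10_000):
    """Minimum number of Bulk API batches needed to upload all records,
    assuming each batch is filled to the maximum allowed size."""
    return math.ceil(total_records / max_batch_size)

# 50,000 records at the 10,000-record maximum -> 5 batches
print(min_batches(50_000))  # 5
```

Note that one record over a batch boundary adds a whole extra batch (e.g. 50,001 records would need 6 batches), which matters when budgeting against the 2,000-batches-per-24-hours limit.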

Go through the link below to learn more:

https://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_concepts_limits.htm

If this helps, please mark it as best answer to help others.
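If the 3rd party devs do end up dividing the record set on their end, as the question suggests, a minimal client-side chunking sketch might look like this. This is a generic illustration, not a Salesforce API: `upload_batch` is a hypothetical stand-in for whatever call actually performs the insert (SOAP API `create`, a REST composite request, etc.):

```python
def chunked(records, batch_size=10):
    """Yield successive batches of at most batch_size records."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

def upload_in_batches(records, upload_batch, batch_size=10):
    """Split records into small batches so each insert stays under the
    size the trigger flows can handle (10, per the original question).
    upload_batch is a hypothetical callable that inserts one batch."""
    results = []
    for batch in chunked(records, batch_size):
        results.append(upload_batch(batch))
    return results
```

With `batch_size=10`, a 200-record upload becomes 20 separate insert calls, mirroring the batch size that worked in the Data Loader test described above.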