vemuriki

BULK API V2 limitation

Hi,

I have a question about the data limits. Per the Bulk API v2 documentation, we can upload at most 100 MB per call to
/services/data/vXX.X/jobs/ingest/jobID/batches

https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/upload_job_data.htm

It also seems we can execute only one PUT call per job. When I executed a second PUT call against the same job, I received the following response:

[{"errorCode":"BULK_API_ERROR","message":"Found multiple contents for job: <jobID>, please 'Close' / 'Abort' / 'Delete' the current Job then create a new Job and make sure you only do 'PUT' once on a given Job."}]
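The error message reflects the documented flow: a Bulk API v2 ingest job accepts exactly one Upload Job Data (PUT) request, after which the job is closed by PATCHing its state to UploadComplete. A minimal sketch of the endpoints involved, where the instance URL, API version, and job ID are hypothetical placeholders:

```python
# Sketch of the Bulk API v2 ingest job lifecycle implied by the error:
# create job (POST) -> upload CSV once (PUT) -> close job (PATCH).
# The API version and all identifiers below are hypothetical placeholders.

API_VERSION = "v58.0"

def ingest_url(instance_url: str) -> str:
    """Endpoint for creating an ingest job (POST)."""
    return f"{instance_url}/services/data/{API_VERSION}/jobs/ingest"

def batches_url(instance_url: str, job_id: str) -> str:
    """Endpoint for the single allowed PUT of the job's CSV data."""
    return f"{instance_url}/services/data/{API_VERSION}/jobs/ingest/{job_id}/batches"

def close_body() -> dict:
    """PATCH body that moves the job to UploadComplete after the one PUT."""
    return {"state": "UploadComplete"}

if __name__ == "__main__":
    print(batches_url("https://example.my.salesforce.com", "750xx0000000001"))
```

Attempting a second PUT against `batches_url` for the same job is what produces the BULK_API_ERROR above.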

Does this mean that to upload 1 GB of data using Bulk API v2, I need to create 10+ jobs? Is my understanding correct?

Thank you.
Swetha (Salesforce Developers)
Hi vemuriki,
Since a job accepts only a single PUT call, creating multiple jobs looks like the only way to upload more than 100 MB.

Actually, the hard limit is 150 MB per call; the documentation recommends keeping the size close to 100 MB because the file size grows during Base64 conversion. So if you need to upload 1 GB of data, you do need to create multiple jobs to handle the upload. If you have an ETL tool in your architecture, this can easily be automated.
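One way to automate this is to split the CSV into chunks that each fit one job, repeating the header row in every chunk so each job receives a complete CSV. A minimal sketch (the 100 MB default is the documented recommendation, leaving headroom below the 150 MB hard limit for Base64 growth):

```python
# Sketch: split a large CSV into chunks that each fit one Bulk API v2 job.
# Each chunk starts with the header row so it is a valid standalone CSV.
# Each returned chunk would then get its own job: create -> one PUT -> close.

def split_csv(csv_text: str, max_bytes: int = 100 * 1024 * 1024) -> list[str]:
    """Return CSV chunks, each <= max_bytes and each beginning with the header."""
    lines = csv_text.splitlines(keepends=True)
    header, rows = lines[0], lines[1:]
    chunks, current = [], [header]
    size = len(header.encode("utf-8"))
    for row in rows:
        row_size = len(row.encode("utf-8"))
        # Start a new chunk if adding this row would exceed the limit.
        if size + row_size > max_bytes and len(current) > 1:
            chunks.append("".join(current))
            current = [header]
            size = len(header.encode("utf-8"))
        current.append(row)
        size += row_size
    if len(current) > 1:
        chunks.append("".join(current))
    return chunks
```

Looping over the result and creating one job per chunk gives the "10+ jobs for 1 GB" pattern from the question.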

Hope this helps you. Please mark this answer as best so that others facing the same issue will find this information useful. Thank you