Jon Sheldon
Is the Bulk API appropriate?
Here is my scenario:
- We will receive several X12 834 files per day via SFTP to an on-premises server. There could be dozens or hundreds of files that need to be imported.
- Each file needs to be imported into a custom Salesforce object.
- Some files may contain only 5,000 records, while others may contain 100,000+ records.
To me it seems the Bulk API is the best means to import such data.
The process would be roughly:
1. Transform an 834 file into a CSV file with upsert records for the custom object.
2. Create a Bulk API job for importing the CSV file.
3. Split the CSV upsert file into chunks (10,000 rows or 10 MB, whichever limit is reached first) and submit each chunk as a batch to the job.
4. Check the state of the job and the result of each batch.
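Step 3 can be sketched as a pure chunking helper. This is an illustrative sketch, not part of any Salesforce SDK: `split_csv_into_batches` and `_to_line` are hypothetical names, and the 10,000-row / 10 MB caps are the documented Bulk API per-batch limits.

```python
import csv
import io

MAX_ROWS = 10_000               # Bulk API limit: records per batch
MAX_BYTES = 10 * 1024 * 1024    # Bulk API limit: 10 MB per batch

def _to_line(row):
    """Serialize one row as a CSV line (hypothetical helper)."""
    out = io.StringIO()
    csv.writer(out).writerow(row)
    return out.getvalue()

def split_csv_into_batches(csv_text):
    """Split an upsert CSV into batch-sized CSV strings.

    Each returned string repeats the header row, so it can be
    submitted to the Bulk API as a standalone batch.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header_line = _to_line(next(reader))
    batches = []
    buf, rows, size = [header_line], 0, len(header_line.encode("utf-8"))
    for row in reader:
        line = _to_line(row)
        nbytes = len(line.encode("utf-8"))
        # Flush the current batch when adding this row would exceed a limit.
        if rows >= MAX_ROWS or (rows > 0 and size + nbytes > MAX_BYTES):
            batches.append("".join(buf))
            buf, rows, size = [header_line], 0, len(header_line.encode("utf-8"))
        buf.append(line)
        rows += 1
        size += nbytes
    if rows:
        batches.append("".join(buf))
    return batches
```

Each chunk is then POSTed to the job as a batch; keeping the header in every chunk means a failed batch can be retried independently.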
Is the Bulk API a good solution for this, or is there another API which might be better suited?
1. Are there any relationships on the custom object? Lookup or master-detail relationships can cause record locking when parallel batches touch the same parent records.
2. Error handling for failed records needs to be planned on the on-premises side: a batch can complete successfully while some of its individual records fail.
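That per-record check can be sketched like this. The `Id,Success,Created,Error` columns follow the Bulk API's documented CSV batch-result format; `failed_records` is a hypothetical helper name, and pairing by position relies on the Bulk API returning result rows in the same order as the request rows.

```python
import csv
import io

def failed_records(result_csv, request_rows):
    """Pair each failed result row with the request row that produced it.

    result_csv   -- batch result CSV returned by the Bulk API
    request_rows -- the data rows submitted in the batch, in order
    """
    failures = []
    reader = csv.DictReader(io.StringIO(result_csv))
    for i, result in enumerate(reader):
        # Success is "false" for a record that failed, even in a
        # batch whose overall state is Completed.
        if result["Success"].strip().lower() != "true":
            failures.append({
                "row": request_rows[i],
                "error": result["Error"],
            })
    return failures
```

The failed rows (with their error messages) can then be written back to a retry queue or dead-letter file on the SFTP server.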