Hi, 

Our team has run into a problem with an AWS S3 to Salesforce integration and we need some suggestions.

A CSV file is loaded from AWS S3 into Salesforce via the Bulk API. The S3 side uses this library >> https://github.com/heroku/salesforce-bulk to implement the integration. They also created their own mapping file (containing more than 150 fields; this single mapping file is used for every object mapping in Salesforce) to map the columns in the CSV file to Salesforce fields.
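
For reference, a minimal upsert with that library looks roughly like the sketch below. The credentials, the object name (Contact), and the external ID field (External_Id__c) are placeholders, not values from our actual integration:

```python
# Minimal upsert sketch with heroku/salesforce-bulk.
# Credentials, object name, and external ID field are hypothetical.
from salesforce_bulk import SalesforceBulk, CsvDictsAdapter

bulk = SalesforceBulk(
    username="user@example.com",
    password="secret",
    security_token="token",
)

# One upsert job per object; rows are matched on an external ID field.
job = bulk.create_upsert_job("Contact", external_id_name="External_Id__c",
                             contentType="CSV")

# Dicts keyed by Salesforce field API names; in the real integration these
# keys come from the shared 150-field mapping file.
records = [
    {"External_Id__c": "A-001", "LastName": "Doe", "Email": "doe@example.com"},
]
batch = bulk.post_batch(job, CsvDictsAdapter(iter(records)))
bulk.wait_for_batch(job, batch)
bulk.close_job(job)
```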

However, when we investigated the Bulk API jobs after the data load, only 90 columns were upserted to Salesforce, regardless of column order: we tried moving column 91 and the later columns to the front of the file, but they were still not updated.
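
To check whether Salesforce is rejecting rows rather than silently dropping columns, it may help to pull the per-row results of the batch. This continues the sketch above and assumes the same bulk and batch variables:

```python
# Sketch: inspect per-row results of the batch above to see whether rows
# were rejected. Result fields (id, success, created, error) follow the
# library's UploadResult records.
for result in bulk.get_batch_results(batch):
    if str(result.success).lower() != "true":
        # result.error carries Salesforce's error message, e.g. a field
        # that the mapping references but the job cannot write to.
        print(result.id, result.error)
```

If every row reports success there, the missing columns are more likely a mapping problem than a Bulk API rejection.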

Therefore, we still have no clue whether this is caused by the library or by a Salesforce Bulk API limitation. I have gone through some documentation, and it says the Bulk API limit is 5,000 fields per record, so 150 columns should be well within it. Do you have any ideas or suggestions on this issue?

Thank you very much in advance.

Best regards,
Hi,

I have a question regarding Salesforce and AWS S3 integration. Is there a standard approach for integrating Salesforce with Amazon S3? If there is, can you please specify it and suggest which method we should use?

Thank you very much in advance,