monsterdustin

SF Governor Limits work for this project? Suggested best practice?

Hi all. Really need some help with this one. We want to see whether SF can handle the multitude of API calls we need to complete a record of data in as near to real time as possible, without approaching or exceeding any Apex/batch governor limits.

I've referenced the image to give a general idea.

2) We already have a connector built in Apatar that sends just one transaction ID to Salesforce every 5 minutes. It ensures there are no duplicates and stores the ID in an object.

3-4) This is where we need SF to pick up. Based on the transaction ID that comes in from step 2, we need to complete the record's data by calling an external server via an API. Each transaction requires a separate API call/response. If this were run every 20 minutes, there are times when 25-100 transactions would each need a separate API call to "Data Server 2". The response for each transaction is returned to Salesforce and completes the object where the transaction ID lives.
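For steps 3-4, one common pattern is a chained Queueable job that makes the per-transaction callouts in capped batches (a single Apex transaction allows at most 100 callouts). A minimal sketch, where `Transaction__c`, `External_Id__c`, `Status__c`, and the endpoint URL are all placeholder names, not your actual schema:

```apex
// Hypothetical sketch: object/field names and the endpoint are placeholders.
public class TransactionEnrichQueueable implements Queueable, Database.AllowsCallouts {
    public void execute(QueueableContext ctx) {
        // One Apex transaction allows at most 100 callouts, so cap the batch.
        List<Transaction__c> pending = [
            SELECT Id, External_Id__c FROM Transaction__c
            WHERE Status__c = 'Pending' LIMIT 90
        ];
        for (Transaction__c t : pending) {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('https://dataserver2.example.com/tx/' + t.External_Id__c);
            req.setMethod('GET');
            HttpResponse res = new Http().send(req);
            // ... parse res.getBody() and populate the record's fields ...
            t.Status__c = 'Enriched';
        }
        update pending;
        // Chain another job if more work remains; each chained job
        // gets a fresh set of governor limits.
        if ([SELECT COUNT() FROM Transaction__c WHERE Status__c = 'Pending' LIMIT 1] > 0) {
            System.enqueueJob(new TransactionEnrichQueueable());
        }
    }
}
```

Because each chained job runs in its own transaction, 25-100 pending records per run stays comfortably under the callout limit.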

5-6) Based on the response in 3-4, if one column = "X", then we need to call another external server via API to add more details to the record based on this "X".
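The dependent call in steps 5-6 can simply follow the first response inside the loop that processes each transaction. A hedged fragment, where `txn`, `Flag__c`, `External_Id__c`, and the endpoint are hypothetical names:

```apex
// Hypothetical fragment for steps 5-6, intended to sit inside the loop
// that processes each transaction record (here called txn). Note this
// doubles the callouts per transaction, so one Apex transaction can then
// cover at most ~50 records (100-callout limit / 2 calls each).
if (txn.Flag__c == 'X') {
    HttpRequest req2 = new HttpRequest();
    req2.setEndpoint('https://dataserver3.example.com/detail/' + txn.External_Id__c);
    req2.setMethod('GET');
    HttpResponse res2 = new Http().send(req2);
    // ... merge the extra details from res2.getBody() into the record ...
}
```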

My worries are getting this into Salesforce in a timely manner given the scheduling limits/governors. The data is not large in size, so it fits fine; the worry is the callouts. Steps 5-6 depend on 3-4, and both 3-4 and 5-6 require a separate callout to Data Server 2/3 for each transaction we want to search.

Can anyone suggest where we should be looking in SF to do this? Previously, when we wanted this every hour, we couldn't do it because we could only schedule so many data sets per hour. The other issue was that in Scheduled Apex, we only saw how to make a schedule run every hour. Moving this to hourly, there could be 50-100 transactions needing two separate callouts each, meaning 100-200 individual API calls would need to be made, which seemed too complex. It also left the data only up to date within the hour.
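On the hourly-only scheduling point: a single Scheduled Apex cron expression does fire at most once per hour, but you can register several jobs offset by minutes to get sub-hourly runs. A sketch, assuming a hypothetical `TransactionEnrichScheduler` class implementing `Schedulable`:

```apex
// Register three jobs firing at minutes 0, 20, and 40 of every hour,
// giving an effective 20-minute interval. Job names must be unique.
for (Integer m = 0; m < 60; m += 20) {
    System.schedule('Enrich txns :' + m,
                    '0 ' + m + ' * * * ?',
                    new TransactionEnrichScheduler()); // hypothetical Schedulable
}
```

Each fire can then enqueue the callout work rather than doing callouts directly, since Scheduled Apex itself cannot make callouts synchronously.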

Any suggestions?

My thought was: in steps 3-4 and 5-6, could we somehow "bulk" an outbound API call so that it counts as only one outgoing call, but when submitted actually makes the individual calls/responses?
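Salesforce counts each `HttpRequest` as one callout regardless of how many transaction IDs it carries, so this bulking idea works *if* the external server exposes a bulk endpoint (that endpoint is an assumption here, not something Salesforce provides). A sketch with placeholder IDs and URL:

```apex
// Assumption: Data Server 2 accepts a bulk endpoint taking many IDs at once.
// One POST = one callout against the 100-callout governor limit.
List<String> idList = new List<String>{ 'tx-1001', 'tx-1002' }; // placeholder IDs
HttpRequest req = new HttpRequest();
req.setEndpoint('https://dataserver2.example.com/tx/bulk'); // hypothetical URL
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object>{ 'ids' => idList }));
HttpResponse res = new Http().send(req);
// The response would then contain one entry per ID to map back onto records.
```

If the server can't do bulk, the fan-out has to happen on the Salesforce side (e.g. chained Queueables), since the platform offers no way to turn one callout into many individual HTTP round trips.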

If that would work and we could shorten the round-trip interval to 30 minutes to complete the data, it would work for us.

Thank you so much!