Tom Simmons

Help with callout batches

Need help. I'm new to Apex (with decent experience in Java) and I'm working on a callout class to get data from an external system. The code below works fine, but when the callout returns a larger data set I get "Error: System.LimitException: Too many DML rows: 10001". I'm assuming this is due to the large amount of data being returned from the web service. Is there a way to process this data in batches? I know a Batch class is used for operations like this, but I'm not able to get it to work. How can I process the JSON response in batches? Can someone please give me a working example?

 
// 1. Authenticate against the external service and read the Auth-Token from the response
HttpRequest obj = new HttpRequest();
Http http = new Http();
String reqBody = '{ "user": "user_name", "pswd": "user_password" }';
obj.setMethod('POST');
obj.setHeader('Content-Type', 'application/json');
obj.setEndpoint('https://test.samplepoint.com/api/UserSrvs.svc/Login');
obj.setBody(reqBody);
HttpResponse res = http.send(obj);

authtoken objAuthenticationInfo = (authtoken) JSON.deserialize(res.getBody(), authtoken.class);
System.debug('objAuthenticationInfo: ' + objAuthenticationInfo);

String token = res.getHeader('Auth-Token');
System.debug('token: ' + token);

// 2. Call the account feed using the token
Http h1 = new Http();
HttpRequest req1 = new HttpRequest();
String reqBody2 = '{"Accountype" : "workforce"}';
req1.setHeader('Auth-Token', token);
req1.setHeader('Content-Type', 'application/json');
req1.setMethod('POST');
req1.setBody(reqBody2);
req1.setEndpoint('https://test.samplepoint.com/api/accservices.svc/accountfeed');
System.debug('req1: ' + req1);
HttpResponse res1 = h1.send(req1);
System.debug('res1: ' + res1.getBody());

// 3. Deserialize the feed (replaceJson is presumably derived from res1.getBody(); that step isn't shown here)
DataParser deserializeRes = (DataParser) System.JSON.deserialize(replaceJson, DataParser.class);
List<DataParser.cls_account> advisorList = deserializeRes.rut.accounts.account;

// 4. Map every account in the response onto a Funtional_Account__c record
List<Funtional_Account__c> lstAccount = new List<Funtional_Account__c>();
for (DataParser.cls_account c : advisorList) {
    Funtional_Account__c PFA = new Funtional_Account__c();
    PFA.payment_Unique_ID__c = c.account_id;
    PFA.Advisor_ID__c = c.advisor_id;
    PFA.Unique_ID__c = c.account_id;
    PFA.Financial_Account_Number__c = c.account_num;
    PFA.Account_Type__c = c.account_type;
    PFA.Client__c = '0015C000003VqWh';
    lstAccount.add(PFA);
}

// 5. Upsert everything in one transaction - this is where the
//    "Too many DML rows: 10001" limit is hit for large responses
Boolean isUpsertfirstTime = true;
try {
    upsert lstAccount Financial_Account_Number__c;
} catch (DMLException e) {
    System.debug('Re-trying');
    if (isUpsertfirstTime) {
        upsert lstAccount Financial_Account_Number__c;
        isUpsertfirstTime = false;
    }
}

 
Best Answer chosen by Tom Simmons
bob_buzzard
You won't be able to process the JSON response in batches, because it won't exist outside of your Salesforce transaction. Instead you need to request the data in chunks that you can process within governor limits, and it will be up to you to keep track of how many records you have retrieved and processed. Batch Apex allows you to process a large number of Salesforce records by breaking them up into smaller batches that are each processed inside their own transaction.

If you can't request the external data in chunks, then you probably want something in the middle (middleware or a Heroku app, for example) to process the big response and send it to Salesforce in small chunks.
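
To make the "request in chunks" idea concrete, here is a minimal sketch using a chained Queueable with Database.AllowsCallouts. It assumes the feed endpoint accepts a hypothetical "page" parameter and returns an empty account list once the data runs out; DataParser, Funtional_Account__c and the endpoint come from the code in the question, and everything else (class name, paging contract, field mapping) is illustrative rather than a definitive implementation.

// Sketch only: the "page" request parameter and the empty-list end condition are
// assumptions about the external API, not something shown in the original post.
public class AccountFeedChunkJob implements Queueable, Database.AllowsCallouts {

    private String token;
    private Integer page;

    public AccountFeedChunkJob(String token, Integer page) {
        this.token = token;
        this.page = page;
    }

    public void execute(QueueableContext ctx) {
        // Request one page of the account feed
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://test.samplepoint.com/api/accservices.svc/accountfeed');
        req.setMethod('POST');
        req.setHeader('Auth-Token', token);
        req.setHeader('Content-Type', 'application/json');
        req.setBody('{"Accountype" : "workforce", "page" : ' + page + '}');
        HttpResponse res = new Http().send(req);

        // Parse just this chunk with the same DataParser wrapper used in the question
        DataParser parsed = (DataParser) JSON.deserialize(res.getBody(), DataParser.class);
        List<DataParser.cls_account> accounts = parsed.rut.accounts.account;
        if (accounts == null) {
            accounts = new List<DataParser.cls_account>();
        }

        List<Funtional_Account__c> lstAccount = new List<Funtional_Account__c>();
        for (DataParser.cls_account c : accounts) {
            Funtional_Account__c fa = new Funtional_Account__c();
            fa.Unique_ID__c = c.account_id;
            fa.Financial_Account_Number__c = c.account_num;
            fa.Account_Type__c = c.account_type;
            lstAccount.add(fa);
        }

        // Each queueable run is its own transaction, so the 10,000-row DML limit
        // only applies to this chunk, not to the whole feed.
        upsert lstAccount Financial_Account_Number__c;

        // Chain the next page until the service stops returning accounts.
        if (!accounts.isEmpty()) {
            System.enqueueJob(new AccountFeedChunkJob(token, page + 1));
        }
    }
}

Started with System.enqueueJob(new AccountFeedChunkJob(token, 1));, each chunk is retrieved, parsed and upserted in its own transaction. If the service can't page its results at all, the middleware route described above is the fallback.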

All Answers

Tom Simmons
Thank you Bob! This helps...