
I have a scheduled process which runs every hour and calls a method of another class that makes a callout to an external WSDL. This method is annotated with @future(callout = true). When I get the data back from the WSDL I process it and insert it into Salesforce objects. Because the data is very large, I am hitting the error: Too many DML statements: 151.

Now I want each response record to be processed in a different transaction so that I don't hit the Salesforce limit. I tried annotating the processing method with @future so that a separate transaction is created every time, but now I get a different error: a future method cannot be called from another future method.

Code: Scheduled Class:

global class HourlySchedule implements Schedulable {
    global void execute(SchedulableContext sc) {
        Client.call();
    }
}

Class which does a callout to the external WSDL and gets the response:

public class Client {
    // @future(callout = true) has to be there, as the schedulable class
    // cannot do a callout to the external service directly
    @future(callout = true)
    public static void call() {
        // callout to the WSDL
        // get the response and process each client and its data;
        // here we can get hundreds of clients, and each client can have many child records
        String data = ''; // one client's portion of the parsed response
        ProcessClass.process(data);
    }
}

Class which processes the data:

public class ProcessClass {
    public static void process(String data) {
        // multiple insert statements
        // cannot reduce them further, as I have to insert the parent object first
        // and then insert the child objects to create the master-detail relationship
    }
}
DALJIT SINGH
  • Do you have 150 different objects you are inserting or 150 records? If it's records, just store them and insert them all at once. If it's 150 different objects, you could look to kick off a batch job to perform the inserts. – Psymn Oct 04 '22 at 14:45

1 Answer


You may have a bug in that code: you shouldn't do queries/inserts/updates/deletes in a loop. If you're parsing that service's response and updating data in SF, try adding your objects to a list and running one DML statement outside of the main work loop. It's hard to say more without seeing your code.

If you absolutely need "process what you can, then submit the next chunk of work in the next transaction"... Since you know how much data you got (X rows from that external service), it might be most "natural" to make a batch job. Database.Batchable can work on custom iterables, not just rows queried from the database. The standard examples don't show how start() could fetch data from your external service or how you'd pass in the data to iterate over, but nothing stops you from putting a public field with a list of objects on that class, passing the data in, and returning that list from start(). See the sketch below.
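A minimal sketch of that, assuming the @future method has already parsed the WSDL response and turned each client's portion into a String so it fits your existing ProcessClass.process(String) signature (class and variable names here are made up):

public class ClientDataBatch implements Database.Batchable<String> {

    // parsed response, one String per client, handed in by whoever starts the job
    private List<String> clientPayloads;

    public ClientDataBatch(List<String> clientPayloads) {
        this.clientPayloads = clientPayloads;
    }

    public Iterable<String> start(Database.BatchableContext bc) {
        // no QueryLocator here - just return the in-memory data to iterate over
        return clientPayloads;
    }

    public void execute(Database.BatchableContext bc, List<String> scope) {
        // each execute() chunk runs in its own transaction with its own 150-DML-statement limit
        for (String data : scope) {
            ProcessClass.process(data);
        }
    }

    public void finish(Database.BatchableContext bc) {
        // optional: send a summary email, chain another job, etc.
    }
}

You'd start it from Client.call() after parsing the response, e.g. Database.executeBatch(new ClientDataBatch(payloads), 10); the scope size (10 here) controls how many clients are processed per transaction.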

If it's an unknown amount of rows, you could also look into Queueable. Again, let's say you pass it 1000 of your records; each execute() inserts 150 of them and then submits (enqueues) another instance of itself to process the next chunk, as in the sketch below.
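A minimal sketch of that chaining, again assuming one String payload per client (all names here are placeholders):

public class ClientDataQueueable implements Queueable {

    private List<String> remaining;
    private Integer chunkSize;

    public ClientDataQueueable(List<String> remaining, Integer chunkSize) {
        this.remaining = remaining;
        this.chunkSize = chunkSize;
    }

    public void execute(QueueableContext qc) {
        // process the next chunk in this transaction
        Integer todo = Math.min(chunkSize, remaining.size());
        for (Integer i = 0; i < todo; i++) {
            ProcessClass.process(remaining.remove(0));
        }
        // anything left? enqueue another job, which gets a brand new set of governor limits
        if (!remaining.isEmpty()) {
            System.enqueueJob(new ClientDataQueueable(remaining, chunkSize));
        }
    }
}

Kick it off with System.enqueueJob(new ClientDataQueueable(payloads, 100)); from inside execute() you're allowed to enqueue exactly one child job, which is what makes the chaining work.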


Edit

It's hard without seeing the data, but what I'm suggesting is something like this. Let's say you have parent nodes, each with some unique identifier in them (or even without an identifier), and each with child nodes. Do something like this:

// group the parsed response by the parent's unique identifier
Map<String, Parent> parents = new Map<String, Parent>();
Map<String, List<Child>> childrenByParent = new Map<String, List<Child>>();

for(Parent p : yourDataParsedOrSomething){
    parents.put(p.uniqueClientNumber, p);
    List<Child> temp = new List<Child>();
    for(Child ch : p.children){
        temp.add(ch);
    }
    childrenByParent.put(p.uniqueClientNumber, temp);
}

insert parents.values(); // 1st DML: all parents at once, they now have Ids

List<Child> allChildren = new List<Child>();
for(String key : parents.keySet()){
   Parent p = parents.get(key);
   List<Child> children = childrenByParent.get(key);
   for(Child ch : children){
      ch.Parent__c = p.Id; // populate the lookup / master-detail field
   }
   allChildren.addAll(children);
}
insert allChildren; // 2nd DML: all children at once

Here you go, 2 inserts: all parents, then map the Ids back, then all children. You can pull something like that off even if you don't have a unique identifier. It'd be a bit messier, but a list index counts as a key too: you could have a List<Parent> and a List<List<Child>> and you know that the Id of the 7th parent has to be set on the 7th list of children... (see the sketch below).
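A rough sketch of that index-based variant, using the same placeholder Parent / Child names as above:

// two parallel lists, filled while parsing the response:
// parents[i] goes with childrenByIndex[i]
List<Parent> parents = new List<Parent>();
List<List<Child>> childrenByIndex = new List<List<Child>>();

// ... populate both lists, one entry per client ...

insert parents; // parents now have Ids

List<Child> allChildren = new List<Child>();
for (Integer i = 0; i < parents.size(); i++) {
    for (Child ch : childrenByIndex[i]) {
        ch.Parent__c = parents[i].Id; // i-th parent's Id goes on the i-th list of children
        allChildren.add(ch);
    }
}
insert allChildren;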

If you do have a unique id and have flagged that field as an External ID in Salesforce, this gets even easier; you'd need to read up about upsert by external id. See https://stackoverflow.com/a/60024078/313628 for example. Roughly, it lets you do something like the sketch below.
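A minimal sketch of that, assuming a Parent__c object with a text field UniqueClientNumber__c marked as External ID, and a Child__c object whose master-detail field is Parent__c (all names are placeholders):

Parent__c p = new Parent__c(UniqueClientNumber__c = 'CLIENT-001');
// insert-or-update, matched on the external id field, no query for existing records needed
upsert p Parent__c.Fields.UniqueClientNumber__c;

Child__c ch = new Child__c();
// relate the child via the parent's external id instead of its Salesforce Id
ch.Parent__r = new Parent__c(UniqueClientNumber__c = 'CLIENT-001');
insert ch;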

eyescream
  • Hi @eyescream, I updated the question with code. Can you please look into it and provide some insight? – DALJIT SINGH Oct 08 '22 at 08:02
  • This is too much of a skeleton. Can you post more of that process()? Do you have insert in a loop in there? What's the structure like: multiple parent nodes, each with multiple child nodes? Is there something unique in the parents (and, by any chance, do you know what an "external id" is in Salesforce)? – eyescream Oct 08 '22 at 17:03