
How to deal with datasets larger than DML limits

I'm hitting an API that returns a variable amount of data. In our initial testing, we were getting around 5-6k records back, and I would handle them all with a single DML upsert. Now that we are closer to production, some of the batches are coming back with more than 10k records, which means DML failure. Since we already have code that runs for all records, is there a simple fix that checks the number of records and loops until all of them are updated? Or do I have to refactor the whole thing to run as a batch job?
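To make the question concrete: what I have in mind is just splitting the result set into sub-lists under the limit and upserting each one. A sketch of that partitioning in plain Java rather than Apex (so the names here are illustrative only; note that in Apex the 10,000-row DML limit applies per transaction, so a loop like this inside one transaction may still fail and need to run as Batch Apex instead):

```java
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    // Split a list of records into sub-lists of at most chunkSize elements.
    // Each sub-list would then be handed to its own upsert call (or, in
    // Batch Apex, to its own execute() invocation with fresh limits).
    public static <T> List<List<T>> chunk(List<T> records, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < records.size(); i += chunkSize) {
            // Copy the view so each chunk is independent of the source list.
            chunks.add(new ArrayList<>(
                records.subList(i, Math.min(i + chunkSize, records.size()))));
        }
        return chunks;
    }

    public static void main(String[] args) {
        // 25,000 placeholder records split at the 10,000-row limit
        // yields chunks of 10,000 / 10,000 / 5,000.
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 25_000; i++) records.add(i);
        List<List<Integer>> chunks = chunk(records, 10_000);
        System.out.println(chunks.size());
        System.out.println(chunks.get(chunks.size() - 1).size());
    }
}
```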
daofu hu
I am suffering from a similar situation.