
Number of Total Batches changes during Batch Apex

During a Batch Apex job I have noticed that the number of Total Batches is changing: it starts high and then lowers as the job progresses. It also appears that the batch job is skipping some of the records that were originally queried; they are never processed by the batch class.


Here are some images showing the drop as the batch job progresses:

[Screenshot: 1670 Total Batches]

[Screenshot: 1623 Total Batches]

[Screenshot: 1527 Total Batches after complete]


The difference is 143 batches, or roughly 29,000 records, which is exactly how many records are not being processed.


This is bizarre. Any ideas?





Is there any chance you're selecting records that aren't part of the batch and modifying them so that they no longer match the query criteria, and Salesforce is recalculating how many records are left in the query after each batch?


That's the only thing I can think of. :-) Hopefully there's a logical explanation; otherwise it's extremely worrisome.
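If that hypothesis were the cause, it would look something like this hypothetical sketch (not anyone's actual code from this thread; class and field choices are made up for illustration), where execute() changes records so that they stop matching the WHERE clause used in start():

```apex
// Hypothetical illustration of the theory above.
global class StageCloser implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Query criteria: only open opportunities.
        return Database.getQueryLocator(
            'SELECT Id, StageName FROM Opportunity WHERE IsClosed = false');
    }
    global void execute(Database.BatchableContext bc, List<Opportunity> scope) {
        // Closing records here makes them stop matching the original
        // WHERE clause -- the theory is that this could shrink the
        // remaining result set as the job runs.
        for (Opportunity opp : scope) {
            opp.StageName = 'Closed Won';
        }
        update scope;
    }
    global void finish(Database.BatchableContext bc) {}
}
```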





The way I understand it, the batch class queries all matching records and stores them in a query locator (basically an index of the matches); the batch execute method then iterates through those matches. What you do inside the execute method should not remove records from the originally queried result set.


Even if some records have been edited since the start of the batch job, they should still be included, since they matched the criteria when the original query executed. But this does raise an interesting question: what happens if you have a long-running Apex job and a record that met the criteria at the beginning of execution no longer matches by the time it's that record's turn to be processed? I read the Batch Apex documentation but it doesn't say.


Regardless, in my case I am not using any filters; I am simply selecting all records from Opportunity.
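For reference, a minimal batch class of the shape being discussed might look like this (a sketch; the class name is a placeholder, and the query mirrors the unfiltered Opportunity select described above):

```apex
global class OpportunityBatch implements Database.Batchable<sObject> {
    // start() runs the query once; the resulting Database.QueryLocator
    // is a cursor over the records that matched at that moment.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Opportunity');
    }
    // execute() is then called once per chunk (200 records by default,
    // configurable via the scope parameter of Database.executeBatch).
    global void execute(Database.BatchableContext bc, List<Opportunity> scope) {
        // Work done here should not change which records were
        // captured by the locator above.
    }
    global void finish(Database.BatchableContext bc) {}
}
```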




That's what I would have thought as well - I don't think query locators dynamically change over time. If you find out what the cause is, I would be very interested to know.





A case has been opened and hopefully they are looking at this post. Will report back when I know more.


If you get a reply, can you please post the answer? I am having the same issue.








When did you start to see this issue? Was it in a new Batch Apex class you recently created, or something that had already existed for some time? Mine is a recent creation (yesterday), so I have no idea what past behavior was like.



David Schach

Not sure if it is related, but I was seeing AggregateResult queries using SUM in a @future method dropping quite a few records. Made it synchronous, removed "static" and it worked. 

Seems asynchronous stuff is having problems in more than one place.
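The workaround being described might look like the following sketch (a hypothetical reconstruction; the class, method, and field names are made up, since the actual code isn't shown):

```apex
// Before (reportedly dropped rows): the aggregate ran asynchronously.
// @future
// public static void summarize() { ... }

// After: the same aggregate query run synchronously as an instance method.
public class OpportunitySummary {
    public Decimal totalAmount() {
        // An aggregate query without GROUP BY returns exactly one row.
        AggregateResult ar = [SELECT SUM(Amount) total FROM Opportunity];
        return (Decimal) ar.get('total');
    }
}
```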


This was some sort of internal issue related to sandboxes and refreshed data. All is good for me now.




Hi, it seems I have the same or a similar issue. I start a Batch Apex job (from a command button in a controller, not scheduled) which does nothing but create a new record in a custom object for every existing opportunity and send me an e-mail for every batch. In "Monitoring Apex Jobs" I can see the correct number of batches to run, but only some of those batches are actually executed. Lots of records are missing, always in multiples of 200: say I have 1454 Opps, only 454 records are created in my custom object; the records of 5 batches are not. I have only tested in sandboxes because this code is not yet in production.

Can you give me any advice for narrowing down the reason?

What does x2od mean by "made it synchronous"?

Thanks - Elke



I see the same behaviour in some of my larger batch jobs: the number of batch items (TotalJobItems) changes during the processing of the batch. It looks like SF is changing the batch size dynamically during the process.


Very frustrating, since I use these numbers inside the loop, e.g. for checking whether we have reached the final batch item, or simply for adding '('+a.JobItemsProcessed+'/'+a.TotalJobItems+')' to the title of the files created. This now leads to a rather random-looking list.


Has anyone been able to figure out what is really happening here?


Regards, Marcel.
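The counters Marcel mentions come from the AsyncApexJob object, read inside the batch class roughly like this (a sketch of one execute() method, assuming the surrounding Database.Batchable class; note that, per this thread, TotalJobItems evidently cannot be trusted to stay constant while the job runs):

```apex
global void execute(Database.BatchableContext bc, List<sObject> scope) {
    // Look up the running job's own progress counters.
    AsyncApexJob a = [SELECT JobItemsProcessed, TotalJobItems
                      FROM AsyncApexJob
                      WHERE Id = :bc.getJobId()];
    // e.g. '(12/1670)' -- but if TotalJobItems shrinks mid-run,
    // titles built from it will look inconsistent across batches.
    String progress = '(' + a.JobItemsProcessed + '/' + a.TotalJobItems + ')';
}
```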

Prabha RG
Hi, I am facing a similar situation, where I am running a batch job to update millions of opportunities. I even tried with a subset of opportunities. When I start the batch, the Total Number of Batches is 660, but it processes only 220. The total batch count keeps reducing throughout the run. Any info on this is highly appreciated.

@TehNrd What's the outcome of the case with Salesforce?
mohammad tauseef 1
I am also having the same issue in my sandbox. When processing a batch, the Apex Job's total size shows 1003 and after some time it reduces to 14 with a Completed message. Any info on this is highly appreciated.
Prabha RG
My issue got resolved with the help of Salesforce Case.

My case with Salesforce produced the following findings:
There was a broken Roll-Up Summary field (broken for only certain records) on the object the batch job was working on. It was broken because of an internal error during a sandbox refresh. Whenever the batch job processed one of these particular records, Batch Apex skipped the whole batch. Salesforce support even gave me the list of the 10 affected records (they found it using their internal tool).
I modified the scope query to exclude those 10 records, and it worked fine.
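The workaround amounts to filtering the known-bad records out of the start() query, along these lines (a sketch; the Id values are placeholders standing in for the ten records support identified, and the query is simplified):

```apex
global Database.QueryLocator start(Database.BatchableContext bc) {
    // Placeholder Ids for the records whose broken roll-up summary
    // field caused their batches to be skipped.
    Set<Id> brokenIds = new Set<Id>{
        '006000000000001AAA', '006000000000002AAA'
    };
    // The bind variable is resolved against the local scope,
    // so it works in dynamic SOQL too.
    return Database.getQueryLocator(
        'SELECT Id FROM Opportunity WHERE Id NOT IN :brokenIds');
}
```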