Shree K

Does a trigger process records in chunks, or 10,000 in one transaction?

Hi all, I am aware that a trigger can perform DML operations on up to 10,000 records per transaction, but I am still not clear whether the trigger processes all 10,000 records at once or in separate chunks, similar to Batch Apex processing.

Below is my understanding so far of how records are processed in Salesforce. Please feel free to correct me if I am wrong.

For example, if 1 million records are inserted through Data Loader, it will push 10,000 records in one go, and if this fires a trigger that updates records, the trigger will be able to update 10,000 records per transaction (with no concept of chunks/batch size as in Batch Apex). Am I correct?

Finally, is there any way to perform a DML operation on more than 10,000 records in a transaction using synchronous Apex (triggers), other than asynchronous Apex (Batch Apex)?
David Zhu 🔥
Salesforce triggers execute on batches of up to 200 records at a time. So, for a DML operation on 10,000 records, the trigger will run 50 times, processing 200 records on each invocation.
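To see the arithmetic, here is a minimal Python sketch (illustrative only, not Apex) of how a 10,000-record DML operation is presented to a trigger in chunks of up to 200 records:

```python
# Illustrative simulation (Python, not Apex): a 10,000-record DML
# operation is handed to the trigger in chunks of up to 200 records.
CHUNK_SIZE = 200

def trigger_chunks(records, chunk_size=CHUNK_SIZE):
    """Yield successive chunks, mirroring how Trigger.new is sized."""
    for start in range(0, len(records), chunk_size):
        yield records[start:start + chunk_size]

records = list(range(10_000))          # stand-ins for sObjects
chunks = list(trigger_chunks(records))

print(len(chunks))      # 50 trigger invocations
print(len(chunks[0]))   # 200 records per invocation
```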

DML on more than 10,000 records in one transaction will hit the governor limit and fail.

 
Shree K
Hi David, thanks for your quick response. I have a question here: is it documented anywhere, or is there any reference available, that triggers execute on batches?
David Zhu 🔥
You brought up a good point. We are always told the default chunk size is 200; I remember being told the same thing when I attended a Salesforce training course.

Where is this documented by Salesforce? I did some searching and could not find a document that explicitly states that number.

But I did find a few references; the following is one of them.
https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/langCon_apex_dml_limitations.htm

For Apex, the chunking of the input array for an insert or update DML operation has two possible causes: the existence of multiple object types or the default chunk size of 200.
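The two causes quoted above can be sketched in Python (illustrative only, not Apex, and the sObject names in the example are hypothetical): a new chunk starts either when the object type in the input array changes or when the current chunk reaches 200 records.

```python
# Illustrative simulation (Python, not Apex) of the chunking rule quoted
# above: a new chunk begins when the sObject type changes, or when the
# current chunk reaches the default size of 200.
CHUNK_SIZE = 200

def dml_chunks(records):
    """records: list of (sobject_type, payload) tuples, in input order."""
    chunks, current = [], []
    for rec in records:
        type_changed = bool(current) and current[-1][0] != rec[0]
        if type_changed or len(current) == CHUNK_SIZE:
            chunks.append(current)
            current = []
        current.append(rec)
    if current:
        chunks.append(current)
    return chunks

# Hypothetical mixed input: 250 Accounts followed by 50 Contacts.
records = [("Account", i) for i in range(250)] + [("Contact", i) for i in range(50)]
sizes = [len(c) for c in dml_chunks(records)]
print(sizes)  # [200, 50, 50]
```

The 250 Accounts split at the 200-record boundary, and the switch to Contacts forces a third chunk even though the second one is not full.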