earlbarby
Trigger data chunking on large updates
Hi, it appears that when a trigger fires on a large update, Salesforce chunks the data into 200-record batches and executes the trigger once per batch, causing any queries in that trigger to run multiple times. Even with our triggers fully bulkified (a single query), we're hitting the query governor limit on a 6000-record update because that single query runs 30 times. Is this chunking behavior new, or has it always worked this way? Is there a workaround of some sort? A simplified example follows.
trigger CheckBusinessHours on Case (before update) {
    // this query runs for each of the 200-record chunks
    BusinessHours DefaultBH = [SELECT Id FROM BusinessHours WHERE IsDefault = true];
    /* ... other code, no more queries ... */
}
Try using a static variable.
Create a class like
public class Constants {
    // flag to block repeated execution of the SOQL query
    public static Boolean runSOQLQuery = true;
    // cached record, populated on the first trigger invocation
    public static BusinessHours DefaultBH {get; set;}
}
use it in your trigger like
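One way to wire this in is to guard the query with the static flag and cache the result in the Constants class above (a sketch; the exact body of the original trigger may differ):

trigger CheckBusinessHours on Case (before update) {
    // query only on the first 200-record chunk; later chunks reuse the cache
    if (Constants.runSOQLQuery) {
        Constants.DefaultBH = [SELECT Id FROM BusinessHours WHERE IsDefault = true];
        Constants.runSOQLQuery = false;
    }
    BusinessHours DefaultBH = Constants.DefaultBH;
    /* ... other code, no more queries ... */
}

Because static variables live for the whole transaction, the flag stays false across all chunks of the same update.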
Now this query will only run once
I hope this works for you; please ask if you run into any issues.
Hello Shashikant. Thanks for the reply. I considered that, but I'd prefer to avoid using a separate class to store variables that only need to be local; it makes for confusing, sloppy code, and the trigger already complies with Salesforce's own best practices on bulkification. I'm having trouble even finding a mention of data chunking in this context, though. Is this behavior documented anywhere? Can it be disabled? Or is it really necessary to write all triggers on the assumption that their dataset will be broken up into 200-record batches?
I don't think there is any solution for you other than the static class. Yes, it's true that your trigger is written according to best practices for bulkification, but if you follow the trigger execution, it runs again for each chunk of 200 and your query executes unnecessarily each time. I don't think you have any option other than using a static class.
To my knowledge, this is the way it has always been. I've written tons of triggers with SOQL queries in them and never run into this issue when doing bulk updates. Are you sure it's this trigger that is causing the issue and not another trigger down the line?
I'm positive; the trigger is definitely executing multiple times for batches larger than 200 records, rerunning its queries each time. I probably wouldn't have even noticed this if a client hadn't done something non-standard that caused a trigger to run on 6000 records when it typically handles only a few hundred.
I made a quick example in a clean dev org with only this trigger and a test class:
Inserting a batch of 1050 cases produces a log that looks like this:
As you can see, it's pretty easy to exhaust the 100-query governor limit if you have a batch of several thousand records and a trigger that needs to pull data from a few separate objects: a 6000-record update runs in 30 chunks, so even four queries per invocation totals 120 queries, well over the limit.
Interesting. Perhaps this is a change that came with the increased governor limits for triggers? The query limit used to be much smaller than 100, so I can't imagine it behaved the same way in those days.