
HELP: Getting a Maximum CPU Time Exception

I have a trigger on QuoteLineItem with some complex discount logic; it has been through multiple iterations of optimization. It runs well within governor limits when dealing with a batch of 200 records. The problem arises when I need to do Apex DML on all QuoteLineItems on a large quote (500+ items), e.g. `update [SELECT Id FROM QuoteLineItem WHERE QuoteId = :quoteId];`. I get the Maximum CPU time exception.

Since I believed triggers run in their own transaction, I was under the assumption that the CPU time of each trigger execution would not be aggregated into the transaction where the Apex DML on more than 200 records is performed, but this does not seem to be the case.

Can anyone shed some light on how this aggregation actually works, and suggest possible workarounds that don't use Batch Apex?
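One non-Batch workaround is to split the update across chained Queueable jobs, since each queued job runs in its own transaction with fresh governor limits. A minimal sketch, assuming the class and variable names are hypothetical and the per-record work stays in the existing trigger:

```apex
// Hypothetical sketch: process QuoteLineItems in chunks of 200,
// one Queueable job per chunk. Each job is a separate transaction,
// so each chunk's trigger work gets a fresh CPU time limit.
public class QuoteLineChunkUpdater implements Queueable {
    private List<Id> remainingIds;

    public QuoteLineChunkUpdater(List<Id> remainingIds) {
        this.remainingIds = remainingIds;
    }

    public void execute(QueueableContext ctx) {
        Integer chunkSize = Math.min(200, remainingIds.size());

        // Take the next chunk of up to 200 ids
        List<Id> chunk = new List<Id>();
        for (Integer i = 0; i < chunkSize; i++) {
            chunk.add(remainingIds[i]);
        }
        update [SELECT Id FROM QuoteLineItem WHERE Id IN :chunk];

        // Chain the remainder into a new job (new transaction, new limits)
        if (remainingIds.size() > chunkSize) {
            List<Id> rest = new List<Id>();
            for (Integer i = chunkSize; i < remainingIds.size(); i++) {
                rest.add(remainingIds[i]);
            }
            System.enqueueJob(new QuoteLineChunkUpdater(rest));
        }
    }
}
```

The trade-off is that the overall update becomes asynchronous and eventually consistent rather than completing in a single synchronous call.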
Shashank (Salesforce Developers)
Please see if this helps: http://cloudyworlds.blogspot.in/2013/10/battling-cpu-time-out-limit-in-apex-sfdc.html
Hi Shashank, thank you for your response. However, I have already made those optimizations as they apply to a single trigger execution (native Apex JSON serialization per record is the CPU culprit). This issue only comes into play when the trigger is invoked through an Apex DML call with more than 200 records; the CPU limit does not seem to reset between each batch of records sent to the trigger.
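For what it's worth, you can watch the aggregation happen by logging CPU usage inside the trigger. In a single transaction updating 500+ rows, the trigger fires in chunks of up to 200 records, and the reading keeps climbing across chunks because all of them share one transaction's limits. A sketch (the trigger name is hypothetical; `Limits.getCpuTime()` and `Limits.getLimitCpuTime()` are standard Apex methods):

```apex
// Hypothetical diagnostic trigger: log accumulated CPU time per chunk.
// Across chunks of one DML call, the "CPU used" value does not reset.
trigger QuoteLineItemDiag on QuoteLineItem (before update) {
    System.debug('Chunk size: ' + Trigger.new.size()
        + ', CPU used so far: ' + Limits.getCpuTime()
        + ' of ' + Limits.getLimitCpuTime() + ' ms');
    // ... existing discount logic ...
}
```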