
Batch Apex CPU limit exceeded

Hi All,
I need to update new field values on 10 million CampaignMember records. I have written a batch class for this, and it will take around 5,000 batches to update all of the records.

I noticed that each batch takes around 2 minutes to run. Is there a way I could optimize the code below?

global class batchclass implements Database.Batchable<sObject>, Database.Stateful {
    global Set<Id> allIds = new Set<Id>();
    Set<Id> failIds = new Set<Id>();
    global Set<Id> allFailedId = new Set<Id>();
    global Map<Id, String> errormsgMap = new Map<Id, String>();
    public List<Exception__c> exceptionLists = new List<Exception__c>();
    global Set<Id> allSuccessId = new Set<Id>();
    private String stringQuery;

    public batchclass(String strQuery, Set<Id> failedIds) {
        stringQuery = strQuery;
        failIds = failedIds;
    }

    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator(stringQuery);
    }

    global void execute(Database.BatchableContext BC, List<CampaignMember> scope) {
        List<CampaignMember> updateCampList = new List<CampaignMember>();
        Map<String, Double> campmatrxMap = new Map<String, Double>();
        List<Campaign_Member_Status_Matrix__mdt> campMatrixList =
            [SELECT Id, Order__c, MasterLabel FROM Campaign_Member_Status_Matrix__mdt];
        for (Campaign_Member_Status_Matrix__mdt mtrx : campMatrixList) {
            campmatrxMap.put(mtrx.MasterLabel, mtrx.Order__c);
        }
        for (CampaignMember cmpMember : scope) {
            // The field after the first valueOf was cut off in the original post;
            // ContactId is assumed here.
            cmpMember.CampaignAccountId__c = (cmpMember.Type == 'contact')
                ? String.valueOf(cmpMember.CampaignId) + String.valueOf(cmpMember.ContactId)
                : String.valueOf(cmpMember.CampaignId) + String.valueOf(cmpMember.CompanyOrAccount).toLowerCase();
            cmpMember.Status_Priority__c = campmatrxMap.containsKey(cmpMember.Status)
                ? campmatrxMap.get(cmpMember.Status) : 0;
            updateCampList.add(cmpMember);
        }
        if (!updateCampList.isEmpty()) {
            Constants.disableCampaignmembertrigger = true;
            Database.SaveResult[] myResult = Database.update(updateCampList, false);
            Constants.disableCampaignmembertrigger = false;
            for (Integer i = 0; i < myResult.size(); i++) {
                if (myResult.get(i).isSuccess()) {
                    allSuccessId.add(myResult.get(i).getId());
                } else {
                    Database.Error error = myResult.get(i).getErrors().get(0);
                    String errMsg = error.getMessage();
                    allFailedId.add(updateCampList.get(i).Id);
                    errormsgMap.put(updateCampList.get(i).Id, errMsg);
                }
            }
        }
    }

    global void finish(Database.BatchableContext BC) {
        for (Id recrdId : errormsgMap.keySet()) {
            Exception__c exe = new Exception__c();
            exe.Exception_Class_Name__c = 'batchclass';
            exe.Exception_Method_Name__c = 'Execute';
            exe.Exception_Details__c = recrdId + ': ' + errormsgMap.get(recrdId);
            exceptionLists.add(exe);
        }
        // Single DML statement outside the loop.
        insert exceptionLists;
        Integer successSize = allSuccessId.size();
        Integer failedSize = allFailedId.size();
        String resultSummary = 'SuccessCount: ' + successSize + ', FailedRecordCount: ' + failedSize;
    }
}
SwethaSwetha (Salesforce Developers) 
Hi Ashwin,
You can check with the Query Plan tool in the Developer Console whether any of the fields in this query can be indexed:

SELECT id,Order__c,MasterLabel FROM Campaign_Member_Status_Matrix__mdt 


Note that this error generally occurs when a transaction consumes too much CPU time. Salesforce enforces a per-transaction CPU time limit; a transaction that exceeds it is shut down as a long-running transaction.
Please adjust your code based on the points below.

Use the following Limits methods in your code to debug the amount of CPU time used so far in the transaction:

Limits.getCpuTime(): Returns the CPU time (in milliseconds) accumulated on the Salesforce servers in the current transaction.

Limits.getLimitCpuTime(): Returns the time limit (in milliseconds) of CPU usage in the current transaction.
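For example, these can be wired into the execute() method of the batch above (a sketch only; the debug statement placement is up to you):

    global void execute(Database.BatchableContext BC, List<CampaignMember> scope) {
        // Snapshot CPU time at the start of this batch's transaction.
        Integer cpuStart = Limits.getCpuTime();

        // ... existing processing ...

        // Log how much of the per-transaction CPU budget this batch consumed.
        System.debug('CPU used this batch: ' + (Limits.getCpuTime() - cpuStart)
            + ' ms of ' + Limits.getLimitCpuTime() + ' ms');
    }

Comparing this figure against the 2-minute wall-clock time per batch will show whether the time is going to your Apex or to other automation that fires during the update.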

Hope this helps. If it does, please mark this answer as best so that others facing the same issue will find this information useful. Thank you.
Hi Sweta,

Clicking Query Plan for the above-mentioned query did not return any results. This custom metadata type has only 15 records in it.
Is there a way we can reduce the time of each batch from 2 minutes to 1.5 minutes or 1 minute by optimizing the code?
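Not a verified fix, but one small optimization follows from the code as posted: the Campaign_Member_Status_Matrix__mdt query runs once in every execute() call, i.e. roughly 5,000 times across the job. Because the class implements Database.Stateful, instance state is carried across batches, so the map can be built once in the constructor instead (sketch; member names match the code above):

    // Sketch: build the status-to-order map once, in the constructor.
    // Database.Stateful serializes instance variables between batches,
    // so execute() can read campmatrxMap without re-querying.
    global Map<String, Double> campmatrxMap = new Map<String, Double>();

    public batchclass(String strQuery, Set<Id> failedIds) {
        stringQuery = strQuery;
        failIds = failedIds;
        for (Campaign_Member_Status_Matrix__mdt mtrx : [
                SELECT Order__c, MasterLabel
                FROM Campaign_Member_Status_Matrix__mdt]) {
            campmatrxMap.put(mtrx.MasterLabel, mtrx.Order__c);
        }
    }

That said, with only 15 metadata records the query itself is cheap, so the larger share of the 2 minutes is more likely spent in the update DML and in any triggers, flows, or workflow rules that still fire on CampaignMember despite the disabled trigger; those are worth auditing as well.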