Rahul Sharma

System.TypeException: Cannot have more than 10 chunks in a single operation.

Hi Board,

I'm facing an issue in a trigger where I'm trying to perform a DML statement using a List<sObject>.

It throws an exception when I have more than 900 records in my list.

Error message is stated below:

System.TypeException: Cannot have more than 10 chunks in a single operation. Please rearrange the data to reduce chunking.

 

Best Answer chosen by Rahul Sharma
WilliamD

The order of the items seems to play a part in the 'chunking'. A new chunk starts every time you switch object types. If you are passing 12 items of types A and B in a list, this list contains 2 chunks:

 

A, A, A, A, A, A, B, B, B, B, B, B

 

But this list contains 12 chunks:

 

A, B, A, B, A, B, A, B, A, B, A, B
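A hypothetical helper, sketched for illustration, makes the rule above concrete: the chunk count is just the number of type switches in the ordered list.

```apex
// Sketch: count the chunks the runtime would see in an ordered list.
// A new chunk starts whenever the sObject type differs from the previous record's type.
// (This ignores the additional sub-chunking that occurs per 200 records of one type.)
public static Integer countChunks(List<SObject> records) {
    Integer chunks = 0;
    Schema.SObjectType prev = null;
    for (SObject rec : records) {
        if (rec.getSObjectType() != prev) {
            chunks++;
            prev = rec.getSObjectType();
        }
    }
    return chunks;
}
```

For the two lists above, the grouped A...B list would count 2 chunks, while the alternating A, B, A, B, ... list would count 12.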

All Answers

SeAlVa

Hello,

 

How many types of objects are you putting into the List?

 

If fewer than 10, you should group them by type. (How? Just create a list per object type, for example, and when you are done, loop over the per-object lists, adding them to the merged list.)

 

If more than 10, consider performing multiple DML operations (one per group of up to 10 different object types).

 

Regards. (I'm just guessing; hope this works.)
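The grouping SeAlVa describes could be sketched like this (mixedList is a placeholder for the trigger's list, and this assumes fewer than 10 object types):

```apex
// Bucket records by sObject type so same-type records end up adjacent.
Map<Schema.SObjectType, List<SObject>> byType = new Map<Schema.SObjectType, List<SObject>>();
for (SObject rec : mixedList) {
    if (!byType.containsKey(rec.getSObjectType())) {
        byType.put(rec.getSObjectType(), new List<SObject>());
    }
    byType.get(rec.getSObjectType()).add(rec);
}
// Merge the per-type lists back into one list, type by type.
List<SObject> grouped = new List<SObject>();
for (List<SObject> part : byType.values()) {
    grouped.addAll(part);
}
update grouped; // roughly one chunk per object type (plus one per 200 records of a type)
```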

Rahul Sharma

Yup, I have 7 object types, and they are added to the sObject list in no particular order.

Actually, I was trying to avoid the grouping as my business logic is so huge :-( . But at least it will save DML statements, so I will give it a try.

 

Will mark the post resolved after trying. Thanks for shedding some light on the exception :-)

Rahul Sharma

Hi SeAlVa,

 

After debugging, I noticed that the lists are already grouped by type and there are only 7 of them, but I'm still facing the same exception.

I fear I will have to use a separate list instance for each object type when performing DML. :(

amaga

I was facing the same issue when trying to update a list of sObjects that contained 6 different object types.

 

The solution was to separate the objects into 2 lists of 3 types each and call separate updates.

Rahul Sharma

Yeah, I also implemented it in a similar way.

I had misunderstood the exception message as meaning 10 chunks = 10 different object types in a single list of sObjects.

I'm still not sure what exactly the exception message conveys.

WilliamD
This was selected as the best answer
Jeeedeee

Hello all,

 

A bit of an old topic, but I solved this using the following util method. I think this should work; any remarks or tips about the code?

 

private static void saveSobjectSet(Set<SObject> setToUpdate) {
    Integer SFDC_CHUNK_LIMIT = 10;

    // Developed this part due to System.TypeException: Cannot have more than 10 chunks in a single operation
    Map<Schema.SObjectType, List<SObject>> sortedMapPerObjectType = new Map<Schema.SObjectType, List<SObject>>();
    for (SObject obj : setToUpdate) {
        Schema.SObjectType objType = obj.getSObjectType();
        if (!sortedMapPerObjectType.containsKey(objType)) {
            sortedMapPerObjectType.put(objType, new List<SObject>());
        }
        sortedMapPerObjectType.get(objType).add(obj);
    }
    while (sortedMapPerObjectType.size() > 0) {
        // Create a new list, which can contain at most the chunking limit of per-type
        // sublists, grouped by type so we don't get any errors
        List<SObject> safeListForChunking = new List<SObject>();
        List<Schema.SObjectType> keyListSobjectType = new List<Schema.SObjectType>(sortedMapPerObjectType.keySet());
        for (Integer i = 0; i < SFDC_CHUNK_LIMIT && !sortedMapPerObjectType.isEmpty(); i++) {
            List<SObject> listSobjectOfOneType = sortedMapPerObjectType.remove(keyListSobjectType.remove(0));
            safeListForChunking.addAll(listSobjectOfOneType);
        }
        update safeListForChunking;
    }
}

 edited: added check for && !sortedMapPerObjectType.isEmpty()

Saravanan @Creation
Hi,

I want to create and insert a new list. If anybody knows how to do it, please let me know.

Thanks,
Saravanan @Creation
Hi all,

The post below resolved this issue.

http://bartoszborowiec.wordpress.com/2014/06/15/execution-failed-system-typeexception-cannot-have-more-than-10-chunks-in-a-single-operation-please-rearrange-the-data-to-reduce-chunking/

Thanks,
Praveen Kumar 443
Please use the List.sort method on the list before performing the DML operation.
Harendra Singh 19
Using the List.sort method before doing DML resolved this issue.
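For context on why sorting helps: per the Apex reference, List.sort on a list of sObjects orders records first by sObject type (and then by field values), so sorting groups same-type records together and minimizes type switches. A minimal sketch, where records is a placeholder List<SObject> of mixed types:

```apex
// Sorting a List<SObject> orders records primarily by sObject type,
// so same-type records become contiguous before the DML call.
records.sort();
update records; // the sorted list now switches types at most (number of types - 1) times
```

Note this does not help with the separate per-200-records sub-chunking within one type.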
julien Castelluci

Hello, 
Jeeedeee's code is great, but it misses an important scenario which generates chunks: if your list of sObjects contains more than 200 records of the same object type, it generates one chunk per 200 records. E.g. if you have 1600 A, 50 B, 50 C and 50 D, you will reach the 10-chunk limit even with only 4 object types.

I modified Jeeedeee's code a bit to cover this use case:

private static void saveSobjectSet(List<SObject> listToUpdate) {
    Integer SFDC_CHUNK_LIMIT = 10;

    // Developed this part due to System.TypeException: Cannot have more than 10 chunks in a single operation
    Map<String, List<SObject>> sortedMapPerObjectType = new Map<String, List<SObject>>();
    Map<String, Integer> numberOf200ChunkPerObject = new Map<String, Integer>();
    for (SObject obj : listToUpdate) {
        String objTypeREAL = String.valueOf(obj.getSObjectType());

        if (!numberOf200ChunkPerObject.containsKey(objTypeREAL)) {
            numberOf200ChunkPerObject.put(objTypeREAL, 1);
        }
        // Number of 200-record chunks so far for a given object type
        Integer numberOf200Records = numberOf200ChunkPerObject.get(objTypeREAL);
        // Key = object type + index of its current 200-record chunk
        String objTypeCURRENT = objTypeREAL + String.valueOf(numberOf200Records);
        // Current list for that key
        List<SObject> currentList = sortedMapPerObjectType.get(objTypeCURRENT);

        if (currentList == null || currentList.size() > 199) {
            if (currentList != null && currentList.size() > 199) {
                // The current 200-record chunk is full: start the next one.
                // The key must be rebuilt with the incremented counter; otherwise
                // the full list would be overwritten and its records lost.
                numberOf200Records++;
                numberOf200ChunkPerObject.put(objTypeREAL, numberOf200Records);
                objTypeCURRENT = objTypeREAL + String.valueOf(numberOf200Records);
            }
            sortedMapPerObjectType.put(objTypeCURRENT, new List<SObject>());
        }
        sortedMapPerObjectType.get(objTypeCURRENT).add(obj);
    }
    while (sortedMapPerObjectType.size() > 0) {
        // Create a new list, which can contain at most the chunking limit of
        // 200-record sublists, grouped so we don't get any errors
        List<SObject> safeListForChunking = new List<SObject>();
        List<String> keyListSobjectType = new List<String>(sortedMapPerObjectType.keySet());
        for (Integer i = 0; i < SFDC_CHUNK_LIMIT && !sortedMapPerObjectType.isEmpty(); i++) {
            List<SObject> listSobjectOfOneType = sortedMapPerObjectType.remove(keyListSobjectType.remove(0));
            safeListForChunking.addAll(listSobjectOfOneType);
        }
        update safeListForChunking;
    }
}
 

It might not be the best optimization; I'm happy to take comments on this.

Hope it helps.