J Bengel

Are rows from separate SOQL queries in the same Apex class combined for limit testing?

I'm getting a "Too many query rows: 50001" error in a queueable class that executes 5 separate SOQL queries, using data captured in the first one to filter the others. None of these queries individually returns anywhere near 50,000 records (the largest of them is under 19,000), but collectively, they retrieve around 53,000.

This is oversimplifying, but to illustrate what I'm talking about, we have:
Object0, which is only a vehicle to hold all the parts together. It stores the working parameters for the query, and serves as a record to attach CSV files to. (It's a long story.)

The live data is stored in several custom objects, with the starting point being Object1, which has lookups on Object2 and Object3, which in turn have lookups back from Object4, Object5, and Object6. I query Object1, and while processing the data from that, squirrel away the lookup fields in a Set of Ids for Object2 and Object3. Every other query can use those unduplicated sets as a filter on either the Id field or a lookup field I can constrain on. Then I process each recordset, turning the selected data into a CSV that I attach to the Object0 "base" record. That way the only DML operations I'm doing happen when I'm updating the data fields in the base record and when I'm inserting the ContentVersion record. This is a stripped-down version for illustration.

public Set<Id> obj2Ids = new Set<Id>(); // unduplicated sets of record Ids
public Set<Id> obj3Ids = new Set<Id>(); // on Object2 and Object3
public List<String> obj1Recs = new List<String>(); // arrays of strings that will become
public List<String> obj2Recs = new List<String>(); // the Version data on records in
public List<String> obj3Recs = new List<String>(); // ContentVersion
public List<String> obj4Recs = new List<String>();
public List<String> obj5Recs = new List<String>();
public List<String> obj6Recs = new List<String>();

public void execute(QueueableContext context){
	buildObj1();
	buildObj2();
	buildObj4();
}

public void buildObj1(){

	List<Object1> termEnr = [Select field1, field2, Object2Lookup, Object3Lookup, field3, fieldn
		From Object1
		Where field1 <= :parm1 AND statusField NOT IN ('In Progress','Submitted')
		AND (field2 = null OR field2 >= :parm2)];

// then I'm saving the lookups on Object2 and Object3 in an unduplicated collection like:

	for(Object1 enr:termEnr){
		String obj1Rec = enr.field1+'|'+enr.field2+'|'+enr.field3+'|'+enr.fieldn;
		obj1Recs.add(obj1Rec);
		obj2Ids.add(enr.Object2Lookup);
		obj3Ids.add(enr.Object3Lookup);
	}
	attachCSV(obj1Recs);
}

public void buildObj2(){
	List<Object2> termOcc = [Select field1, field2, field3, fieldn From Object2 Where Id IN :obj2Ids];

	for(Object2 occ:termOcc){
		String obj2Rec = occ.field1+'|'+occ.field2+'|'+occ.field3+'|'+occ.fieldn;
		obj2Recs.add(obj2Rec);
	}
		
	attachCSV(obj2Recs);
}

public void buildObj4(){
// Object4 has a lookup back to Object2, which allows me to filter Object4 based on the list of unique Ids I compiled way back when I was processing Object1
	List<Object4> termWS = [Select field1, field2, field3, Object2Lookup, fieldn From Object4 Where Object2Lookup IN :obj2Ids];
	for(Object4 ws:termWS){
		String obj4Rec = ws.field1+'|'+ws.field2+'|'+ws.field3+'|'+ws.fieldn;
		obj4Recs.add(obj4Rec);
	}
	attachCSV(obj4Recs);
}
There's more, but you get the idea. I select a bunch of records, loop over them putting together a delimited string that gets added to a list of such strings, then I join them all, turn the result into a blob, and attach the result as a file on the Object0 record.
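
Roughly, attachCSV boils down to something like this (baseRecordId and the file name here are placeholders, not the real names):

public void attachCSV(List<String> rows){
	String csv = String.join(rows, '\n');
	ContentVersion cv = new ContentVersion(
		Title = 'export',                      // placeholder title
		PathOnClient = 'export.csv',           // placeholder file name
		VersionData = Blob.valueOf(csv),
		FirstPublishLocationId = baseRecordId  // attaches the file to the Object0 record on insert
	);
	insert cv;
}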

So the query that extracts from Object1 returns 12,041 records.
The unduplicated list of Ids on Object2 is 838, so that query returns 838 records.
Object3 returns 857.
Object4 returns 2,832.
Object5 returns 18,755.
and Object6 returns 18,069.

In total, my manual queries add up to 53,392 records across all queries. But none of the 6 alone is remotely close to the limit.

I ran across the article "Working with very large SOQL Queries" (https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/langCon_apex_SOQL_VLSQ.htm), but none of these queries is even approaching the limit on its own, so I don't see how that helps.
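
(As I understand it, the SOQL for loop that article describes only changes how rows are delivered, in 200-record chunks; every row still counts against the same shared 50,000-row transaction limit, so something like this wouldn't buy me anything:)

// rows fetched by a SOQL for loop still accumulate in Limits.getQueryRows()
for(Object1 enr : [Select field1, field2 From Object1 Where field1 <= :parm1]){
	// process enr in 200-record batches -- same row count against the limit
}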

I've considered taking the methods that generate the largest queries and turning them into their own classes and chaining them as new jobs, passing the filter sets along from the main class. I'm not crazy about the idea, but if it will work, it would probably beat having to refactor the whole enterprise as a batch job. And I'm not entirely sure that that would solve my problem anyway, since the issue is in the size of the recordset returned by the queries.

Vishwajeet kumar
Hello,
Yes, the 50K limit is per transaction. You need to split the transactions (all of the above code runs in one transaction, hence the error).
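
You can watch the shared counter with the Limits class (illustration only):

// every query in the transaction adds to the same counter
List<Object1> a = [Select Id From Object1 Limit 100];
List<Object2> b = [Select Id From Object2 Limit 100];
System.debug(Limits.getQueryRows());      // cumulative rows retrieved so far
System.debug(Limits.getLimitQueryRows()); // 50,000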

Looks like the methods buildObj2() and buildObj4() are doing a similar job on different objects.
I would recommend writing a queueable class which can work based on a query or object API name and a list of record Ids to do the job that buildObj2() and buildObj4() are doing, then queue those processes from the current class's execute method.

Example: 
New Queueable Class :
public class ProcessContentVersions implements Queueable {
      private Set<Id> recordIds;
      private String objectApiName;

      public ProcessContentVersions(Set<Id> recordIds, String objectApiName){
           //Initialize/store variables data
           this.recordIds = recordIds;
           this.objectApiName = objectApiName;
      }

      public void execute(QueueableContext context){
           //Query records based on Ids, ObjectApiName
           //Create ContentVersion
      }
}


Existing Class updates:
public void execute(QueueableContext context){
    buildObj1();
    System.enqueueJob(new ProcessContentVersions(obj2Ids, 'Object2'));        // Replace call with new class, buildObj2();
    System.enqueueJob(new ProcessContentVersions(obj3Ids, 'Object2Lookup')); // Replace call with new class, buildObj4();
}


Thanks
J Bengel
Thanks,
That was the direction I was leaning when I posted the question, but I wasn't certain that enqueueing a new job would actually start a new transaction. I suspect that at some future point I'm going to have to refactor this whole enterprise for Batch Apex, but for now this seems like the most direct route from where I currently am to where I need to get.
J Bengel
This at least has a different problem. It's a little vague in the documentation, but apparently you can only chain one job from a given class. Else you get:
"Too many queueable jobs added to the queue: 2"

I think I knew this, but forgot it and tried to do it with the queueable jobs in parallel, which yielded the aforementioned error. (It makes sense that there would be a limit on this, since the whole purpose of governor limits in the first place is to make sure nobody's hogging resources. That doesn't make it less frustrating, just more understandable.)

This does solve one problem (after a fashion), which was how to fire a "summary job" after all of the other jobs had completed. When I was running them in parallel, even if it had worked I wouldn't have had a good way of knowing when they had all finished. Since I have to run them serially, I can simply chain the summary job from the last "detail" job.
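
So the shape of each "detail" job ends up something like this (a sketch; the class names are made up):

public class LastDetailJob implements Queueable {
	public void execute(QueueableContext context){
		// ... build and attach this job's CSV ...
		// only one System.enqueueJob() call is allowed per queueable execution,
		// so each detail job chains exactly one successor
		System.enqueueJob(new SummaryJob());
	}
}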

More news as it happens.
J Bengel
Solved (for now):
This actually required chaining serially, and forwarding along both Set<Id> collections so they could be used where needed.
So the first job initializes everything and runs the first query, building each table and saving the Ids for the two Set<Id> collections. Then it enqueues the second job as a separate process (new Job Id) and passes it the Id of the "control" record, and the two Sets.
The second job (which processes Object2) only uses the first of the two Set collections, but passes them on to the third process when it enqueues the next job. Each job reads the control record and calls a static routine in a utility class to create and attach the CSVs.

In total, there are 5 queueable classes, 3 of which can handle 2 tables each (either because both tables are derived from the same sObject or the query results are small enough to not cause a problem). The two monsters got classes of their own, and the whole enterprise took between 1 and 2 minutes to run, which for a fire-and-forget is well within acceptable limits. (That's the story I'm gonna stick with anyway.)
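
The skeleton of one link in the chain looks roughly like this (class and variable names are illustrative, not the real ones):

public class Object2Job implements Queueable {
	private Id controlRecordId;  // the Object0 "control" record
	private Set<Id> obj2Ids;     // used by this job
	private Set<Id> obj3Ids;     // carried forward for later jobs

	public Object2Job(Id controlRecordId, Set<Id> obj2Ids, Set<Id> obj3Ids){
		this.controlRecordId = controlRecordId;
		this.obj2Ids = obj2Ids;
		this.obj3Ids = obj3Ids;
	}

	public void execute(QueueableContext context){
		// query Object2 with obj2Ids, build the delimited rows, and call the
		// static utility to create and attach the CSV to the control record
		System.enqueueJob(new Object3Job(controlRecordId, obj2Ids, obj3Ids)); // next link in the chain
	}
}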