
First error: Too many DML rows: 10001
Hello,
I have a batch job that runs to populate two objects: Monthly Activity (MonAct) and URL Individual Monthly Activity (URLIndMonAct).
The problem is that I am querying for MonAct records and getting back 5 of them, and each of those MonAct records has thousands of related URLIndMonAct records.
I have limited my batch to just 2 records per scope, but I am still getting the "Too many DML rows: 10001" error.
If I reduce the scope size to 1 record, the batch runs for too long.
Can anyone please guide me on how to modify my code to prevent this error without losing any records while processing?
Any help is appreciated.
Thanks!
global class BatchToUpdateGARecords implements Database.Batchable<sObject>, Database.Stateful, Schedulable, Database.AllowsCallouts {
    private List<String> EMAIL_BATCH_RESULTS = new List<String>{System.Label.Email_List};
    global IndividualMonthlyGARecords.BatchResponse runningBatchResponse;
    private String query;
    private Integer queryYear;
    private Integer queryMonth;

    global BatchToUpdateGARecords(Integer year, Integer month){
        runningBatchResponse = new IndividualMonthlyGARecords.BatchResponse();
        // Validate user input, if request to run batch is not for todays date
        if(month != null && year != null && month >= 0 && month <= 12 && year > 1950){
            this.queryYear = year;
            this.queryMonth = month;
        } else {
            Date yesterdaysDate = Date.today().addDays(-1);
            this.queryYear = yesterdaysDate.year();
            this.queryMonth = yesterdaysDate.month();
        }
        this.query = 'SELECT Id, GID__c ';
        this.query += 'FROM Monthly_Activity__c ';
        this.query += 'WHERE Year__c = ' + queryYear + ' ';
        this.query += 'AND Month__c = ' + queryMonth + ' ';
        this.query += 'AND GID__c <> null ';
        this.query += 'AND GID__c > 0 ';
    }

    global BatchToUpdateGARecords(){
        this(null, null);
    }

    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<Monthly_Activity__c> currentBatchRecords){
        List<Monthly_Activity__c> MonthlyActivities = [
            SELECT Id, GID__c, Month__c, Year__c
            FROM Monthly_Activity__c
            WHERE Year__c = :queryYear
            AND Month__c = :queryMonth
            AND GID__c IN :Pluck.decimals('GID__c', currentBatchRecords)
        ];
        List<URL_Individual_Monthly_Activity__c> urlIndividualMonthlyActivities = [
            SELECT Id, GID__c, URL__c, Month__c, Year__c
            FROM URL_Individual_Monthly_Activity__c
            WHERE Year__c = :queryYear
            AND Month__c = :queryMonth
            AND GID__c IN :Pluck.decimals('GID__c', currentBatchRecords)
        ];
        if(MonthlyActivities.isEmpty()){
            return;
        }
        try {
            IndividualMonthlyGARecords.batchHandlerToUpdateRecords(
                runningBatchResponse,
                MonthlyActivities,
                urlIndividualMonthlyActivities,
                queryYear,
                queryMonth
            );
        } catch(Exception ex){
            System.debug('exception call :' + ex.getMessage());
            System.debug('exception call line :' + ex.getStackTraceString());
        }
        if(runningBatchResponse != null && !runningBatchResponse.getSuccessRecords().isEmpty()){
            List<Database.SaveResult> updateResults = Database.update(runningBatchResponse.getSuccessRecords(), false);
            for(Database.SaveResult updateResult : updateResults){
                if(!updateResult.isSuccess()){
                    for(Database.Error err : updateResult.getErrors()){
                        runningBatchResponse.addDatabaseError(err.getMessage());
                    }
                }
            }
        }
        runningBatchResponse.clearSuccessRecords();
        if(runningBatchResponse != null && !runningBatchResponse.getSuccessRecordsIMA().isEmpty()){
            List<Database.SaveResult> updateResults1 = Database.update(runningBatchResponse.getSuccessRecordsIMA(), false);
            for(Database.SaveResult updateResult1 : updateResults1){
                if(!updateResult1.isSuccess()){
                    for(Database.Error err : updateResult1.getErrors()){
                        runningBatchResponse.addDatabaseError(err.getMessage());
                    }
                }
            }
        }
        runningBatchResponse.clearSuccessRecords();
    }

    global void execute(SchedulableContext SC){
        Database.executeBatch(new BatchToUpdateGARecords(), 2); // Changed from 5
    }

    global void finish(Database.BatchableContext BC){
        AsyncApexJob apexBatchResult = [
            SELECT Id, Status, NumberOfErrors, JobItemsProcessed, TotalJobItems, CreatedBy.Email
            FROM AsyncApexJob
            WHERE Id = :BC.getJobId()
        ];
        // Generate email body
        String emailBody = 'Apex Batch to Update PageviewSessions processed ' +
            apexBatchResult.TotalJobItems + ' batches with ' + apexBatchResult.NumberOfErrors + ' failures.\n\n' +
            'Database errors (if any): ' + JSON.serialize(runningBatchResponse.getDatabaseErrors()) + '\n';
        // Extract error string from batch response
        //emailBody += runningBatchResponse.generateErrorString();
        // Send email
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        mail.setToAddresses(EMAIL_BATCH_RESULTS);
        mail.setSenderDisplayName('About.com Experts - Batch Results');
        mail.setSubject('About.com - Batch to Update PageviewSessions - status: ' + apexBatchResult.Status);
        mail.setPlainTextBody('Batch Process has completed\n\n' + emailBody);
        Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{mail});
    }
}
The issue happens while your records are being processed for update.
Database.update(runningBatchResponse.getSuccessRecords(), false); -- this line.
runningBatchResponse.getSuccessRecords() -- what is the return type of this method? If it is a List, kindly check its size. If it is more than 10,000, that is where the issue is. Make sure your list contains no more than 10,000 records for the update.
Note:
Keep in mind that the list you are updating must not contain more than 10,000 items. So when using batch processing, keep your batch size small enough that no more than 10,000 records can end up in the list being updated at one time.
This is a Salesforce limitation (you cannot execute DML on more than 10,000 rows in a single transaction), so you have to control how the list is built, as sketched below.
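A minimal sketch of that kind of check, assuming getSuccessRecords() returns a List of sObjects (that return type is my assumption, not something confirmed in your code):

// Sketch: guard the update against the per-transaction DML row limit.
List<SObject> recordsToUpdate = runningBatchResponse.getSuccessRecords(); // assumed to be List<SObject>
Integer remainingDmlRows = Limits.getLimitDmlRows() - Limits.getDmlRows();
if (recordsToUpdate.size() > remainingDmlRows) {
    // This scope would blow the 10,000-row limit -- log it so you can see which
    // scopes are too big, and reduce the batch scope size accordingly.
    System.debug(LoggingLevel.ERROR, 'Would exceed DML row limit: '
        + recordsToUpdate.size() + ' rows, only ' + remainingDmlRows + ' left in this transaction.');
} else {
    Database.update(recordsToUpdate, false);
}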
Hope this helps you resolve your issue.
Thanks
karthik
All Answers
Here is an example for Database.insert; just replace the code in the execute method of your batch class if you want to use Database.insert in Batch Apex. Keep governor limits in mind while processing bulk records via Batch Apex.
NOTE:
If we use the DML statement (insert), then in a bulk operation, if an error occurs, execution stops and Apex throws an exception, which can be handled in a try/catch block; none of the records in that statement are saved.
If the Database method (Database.insert with allOrNone set to false) is used, then if an error occurs the remaining records are still inserted/updated, meaning a partial DML operation is done.
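To make that concrete, here is a minimal sketch using throwaway Account records (the second record deliberately missing the required Name field):

// 1) DML statement: all-or-nothing. One bad record rolls back the whole list,
//    and the DmlException must be handled in try/catch.
List<Account> batchOne = new List<Account>{ new Account(Name = 'Acme'), new Account() };
try {
    insert batchOne;
} catch (DmlException e) {
    System.debug('Whole insert rolled back: ' + e.getMessage());
}

// 2) Database method with allOrNone = false: partial success. Valid records
//    are committed; failures come back per record as SaveResults.
List<Account> batchTwo = new List<Account>{ new Account(Name = 'Acme'), new Account() };
Database.SaveResult[] results = Database.insert(batchTwo, false);
for (Database.SaveResult sr : results) {
    if (!sr.isSuccess()) {
        for (Database.Error err : sr.getErrors()) {
            System.debug(err.getStatusCode() + ': ' + err.getMessage());
        }
    }
}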
Hope this clears it up.
Thanks
karthik
Here is my code:
global class batchExample implements Database.Batchable<sObject> {

    // Number of Account records to create in each execute() call.
    // ('n' was undefined in the original snippet; declared here so the class compiles.)
    global Integer n = 200;

    global Database.QueryLocator start(Database.BatchableContext BC) {
        // collect the batches of records or objects to be passed to execute
        String query = 'SELECT Id, Name FROM Account LIMIT 2';
        System.debug('get qry ' + query);
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<Account> scope) {
        List<Account> accList = new List<Account>();
        // build the records to insert in this batch
        for (Integer i = 0; i < n; i++) {
            Account a = new Account(Name = 'testA' + i, BillingCity = 'SA port' + i, Phone = '1212' + i);
            accList.add(a);
        }
        System.debug('what is scope: ' + scope);
        try {
            // insert the Account records, allowing partial success
            Database.SaveResult[] srList = Database.insert(accList, false);
            for (Database.SaveResult sr : srList) {
                if (sr.isSuccess()) {
                    // Operation was successful, so get the ID of the record that was processed
                    System.debug('Successfully inserted account. Account ID: ' + sr.getId());
                } else {
                    // Operation failed, so get all errors
                    for (Database.Error err : sr.getErrors()) {
                        System.debug('The following error has occurred.');
                        System.debug(err.getStatusCode() + ': ' + err.getMessage());
                        System.debug('Account fields that affected this error: ' + err.getFields());
                    }
                }
            }
        } catch (Exception e) {
            System.debug(e);
        }
    }

    global void finish(Database.BatchableContext BC) {
        // execute any post-processing operations
    }
}
Execution:
batchExample be = new batchExample();
database.executeBatch(be,60000);
execute(Database.BatchableContext BC, List<Account> scope) --> this scope is nothing but the list of records returned by the Database.QueryLocator, right?
Basically, I want to insert 50k/60k/70k new records through Apex code. I will pass the number in from the execution call, and the records will be created using i as an incremental parameter.
Please let me know if I am being confusing. I appreciate your inputs.
Your code works fine for me. Other than that:
Yes, it will insert 60K records, but it happens batch by batch; by default each batch processes 200 records at a time.
Database.getQueryLocator(query) is used so that up to 50 million records can be processed by the execute method through the scope variable.
Without a QueryLocator (for example, if start returns an Iterable), only 50,000 records can be processed, because the normal SOQL governor limit still applies.
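A minimal sketch of the two start() styles (hypothetical Account query, not your objects; a real batch class implements one or the other):

// A) QueryLocator -- Salesforce streams the query results, so the job can walk
//    through up to 50 million records, delivered to execute() in scopes.
global class QueryLocatorBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<SObject> scope) { /* work */ }
    global void finish(Database.BatchableContext bc) { }
}

// B) Iterable -- you build the collection yourself, so the normal SOQL limit
//    of 50,000 retrieved rows still applies to what start() can return.
global class IterableBatch implements Database.Batchable<SObject> {
    global Iterable<SObject> start(Database.BatchableContext bc) {
        List<SObject> records = [SELECT Id FROM Account LIMIT 50000];
        return records;
    }
    global void execute(Database.BatchableContext bc, List<SObject> scope) { /* work */ }
    global void finish(Database.BatchableContext bc) { }
}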
database.executeBatch(be,60000);
You can change the default batch size of 200 to a lower value.
Use Database.executeBatch(batchClassInstance, scopeSize) to set the new batch size; the first argument is an instance of your batch class, not an sObject.
Keep the scopeSize at 2,000 or below (the maximum when start returns a QueryLocator); for heavy per-record processing, use something smaller than the default 200.
For example, Database.executeBatch(batchClassInstance, 50) would set the batch size to 50 instead of 200.
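With the batchExample class posted above, that would look like:

batchExample be = new batchExample();
Database.executeBatch(be, 50); // each execute() call now receives at most 50 records from the start() query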
Hope this clears it up.
Thanks
karthik
1. Also, with Database.executeBatch(batchClassInstance, Integer scopeSize), when I pass an integer as the parameter it throws an exception saying it should be a list.
2. I am not clear on the details you mentioned before. Suppose I run the execution as below; how will it be broken up into batches?
Database.executeBatch(batchClassInstance, 60000)
3. If 60000 is what we pass as the scope size, then what is the purpose of the start method here, and what is the importance of that query in this insertion process?
Could you please explain? I am still a bit unclear. Thanks!