Vamsi D
Processing more than one million records
I have an object with more than one million records. For the same object, I receive a file for processing every other day. My requirement is to compare all of the records in the file (1M+) with the existing records in the object (assume both the object and the file have a unique key, AccountID, to compare on). If anything has changed, such as the value of any attribute, we have to update the existing record. What is the best way to achieve this? I know that we can process up to 50 million records using Batch Apex, but I have one more concern: how can I fetch more than 50,000 records in the execute method from the object (1M+) for comparison? I would also like to know whether there are any performance issues when processing this much data.
Let me know if it works.
thanks
shashi
global class Batch_AccountMatching implements Database.Batchable<SObject>, Schedulable {

    /* ********************* Batchable methods below ********************* */

    // Batch start: return a QueryLocator so the batch can cover up to 50 million records
    global Database.QueryLocator start(Database.BatchableContext bc) {
        String soql = getQuery();
        return Database.getQueryLocator(soql);
    }

    // Batch execute: called once per chunk (default 200 records, configurable up to 2,000)
    global void execute(Database.BatchableContext bc, List<Account> accountRecords) {
        // Comparison/update logic for this chunk goes here
    } // end of batch - execute

    // Batch finish
    global void finish(Database.BatchableContext bc) {
        if (bc != null) {
            System.debug('finish, job id --> ' + bc.getJobId());
        }
    }

    /* ********************* Schedulable method below ********************* */

    // Schedulable entry point: kicks off the batch with a scope size of 2,000
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new Batch_AccountMatching(), 2000);
    }

    // Query: note the trailing space before FROM so the concatenated SOQL is valid
    private static String getQuery() {
        String query = 'SELECT Id, Name '
            + 'FROM Account';
        return query;
    }
}