jason.bradley

Using Batch Apex to Insert a large number of records, instead of operating on existing records?

Hello,

 

How would I go about using Batch Apex to insert a massive number of records? It seems like it's built more for updating existing records, since the start method returns a QueryLocator based on a SOQL query, meaning the records operated on in the execute method would already have to exist in order to use Batch Apex to its full potential.

I've attempted to simply return a list of objects (the records I want to insert) from the start method instead of a QueryLocator, in the hope that the list would be divided up and processed by the batch, but it never seems to actually reach the execute method.
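For reference, the shape of what I tried was roughly this (a trimmed-down sketch; the class name and the records being built are just placeholders, not my actual code):

global class InsertOnlyBatch implements Database.Batchable<SObject> {
    global Iterable<SObject> start(Database.BatchableContext bc) {
        // Build the unsaved records here and return the list directly,
        // since List<SObject> implements Iterable<SObject>.
        List<SObject> toInsert = new List<SObject>();
        for (Integer i = 0; i < 10000; i++) {
            toInsert.add(new Account(Name = 'Placeholder ' + i));
        }
        return toInsert;
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        insert scope;
    }

    global void finish(Database.BatchableContext bc) {
    }
}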

Can I somehow return a fake QueryLocator based on the list of records I want to insert and have it divided up correctly, or will I have to resort to an inbound email handler to chain separate batches together that each handle only up to 5,000 inserts at a time? This seems like something that should be relatively simple to do; I just haven't found the simple method yet.

Also, the specific objects I'm trying to insert are FeedItem records, so I would like to avoid my previously mentioned strategy, as it involved inserting blank records strictly so that the query in the batch would pick up the correct number of objects, and that would mean blank Chatter posts showing up in clients' feeds.

Best Answer chosen by Admin (Salesforce Developers) 
sfdcfox

You can use a custom iterator. Here is an example:

 

global class BatchSObjectFeeder implements Iterator<SObject>, Iterable<SObject> {
    SObject[] source;

    global BatchSObjectFeeder(SObject[] source) {
        this.source = source;
    }

    // The class acts as its own iterator, so the batch's start method can return it directly.
    global Iterator<SObject> iterator() {
        return this;
    }

    // Hand back (and remove) the next record from the front of the list.
    global SObject next() {
        return source.remove(0);
    }

    global Boolean hasNext() {
        return source != null && !source.isEmpty();
    }
}

 

global class BatchProcessor implements Database.Batchable<SObject> {
    SObject[] source;

    global BatchProcessor(SObject[] source) {
        this.source = source;
    }

    // Return an Iterable instead of a QueryLocator so the batch can
    // process records that don't exist in the database yet.
    global Iterable<SObject> start(Database.BatchableContext bc) {
        return new BatchSObjectFeeder(source);
    }

    global void execute(Database.BatchableContext bc, SObject[] scope) {
        insert scope;
    }

    global void finish(Database.BatchableContext bc) {
    }
}

To use this, simply construct a new BatchProcessor with a list of unsaved records of any size (even millions of records), then run Database.executeBatch with the class. The only limit on the list size will be the heap. You can also use Database.Stateful if you need to record errors elsewhere, or you can catch errors and log them to a separate object used for recording this sort of information.
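For example, usage might look something like this (a hypothetical sketch; the parent ID collection and post body are placeholders):

// Build the unsaved FeedItem records to be inserted by the batch.
List<SObject> posts = new List<SObject>();
for (Id parentId : parentIds) { // parentIds is a placeholder collection of record IDs
    posts.add(new FeedItem(ParentId = parentId, Body = 'Placeholder post body'));
}

// Kick off the batch; the optional second argument sets the scope size
// (how many records each execute() call receives).
Id jobId = Database.executeBatch(new BatchProcessor(posts), 200);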

 

Edit: Note that for efficiency, this passes the list by reference, and that the code is destructive (the list will shrink to nothing as it consumes the data). If you don't want this behavior, implement an index instead.
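A non-destructive version of the feeder might look like this (a sketch that keeps the same interfaces but tracks a read position instead of removing elements):

global class NonDestructiveSObjectFeeder implements Iterator<SObject>, Iterable<SObject> {
    SObject[] source;
    Integer position = 0; // read position instead of a destructive remove(0)

    global NonDestructiveSObjectFeeder(SObject[] source) {
        this.source = source;
    }

    global Iterator<SObject> iterator() {
        return this;
    }

    global SObject next() {
        return source[position++];
    }

    global Boolean hasNext() {
        return source != null && position < source.size();
    }
}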

All Answers

jason.bradley

You are awesome, that is exactly the kind of solution I was looking for! Away with the cumbersome inbound email handlers!