Klaus Schgaguler

Problem with slicing of bulk inserts

I am doing a bulk insert of around 1000 entries.

I have a quite complicated before insert trigger (bulk-insert safe: no DML or SOQL statements within any loop) which needs around 25 SOQL statements.
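
A minimal sketch of such a bulk-insert-safe pattern, with invented object and field names (the real trigger would need around 25 such queries):

trigger MyObjectBefore on MyObject__c (before insert) {
    // Collect lookup keys in a first pass -- no SOQL inside the loop.
    Set<Id> accountIds = new Set<Id>();
    for (MyObject__c o : Trigger.new) {
        accountIds.add(o.Account__c);
    }

    // One query for the whole chunk, outside any loop.
    Map<Id, Account> accounts = new Map<Id, Account>([
        SELECT Id, Name FROM Account WHERE Id IN :accountIds
    ]);

    // Second pass: enrich each record from the pre-fetched map.
    for (MyObject__c o : Trigger.new) {
        Account acc = accounts.get(o.Account__c);
        if (acc != null) {
            o.Account_Name__c = acc.Name;
        }
    }
}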

Theoretically this should work. But I found out that the 1000 entries are sliced into chunks of 200 each. The trigger is therefore called 5 times, and at 5 × 25 = 125 SOQL queries in a single transaction I run into the 100 SOQL query limit during the last run.

 

Can I force the system to insert all 1000 entries at once? Or is there another workaround I can use?

Best Answer chosen by Admin (Salesforce Developers) 
Andy Boettcher

If you use Batch Apex, you can split those guys into individual batches...and get individual governor limit instances per batch...

 

-Andy
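
For illustration, a minimal Batch Apex sketch along those lines, assuming the records are already built in memory (the class name InsertInChunksBatch is invented for this example). Each execute() call runs with its own set of governor limits:

global class InsertInChunksBatch implements Database.Batchable<sObject> {
    private final List<sObject> records;

    global InsertInChunksBatch(List<sObject> records) {
        this.records = records;
    }

    // Returning an Iterable instead of a Database.QueryLocator lets the
    // batch run over records built in memory rather than queried via SOQL.
    global Iterable<sObject> start(Database.BatchableContext bc) {
        return records;
    }

    // Each execute() gets a fresh set of governor limits, so a trigger
    // needing ~25 queries per chunk stays well under the SOQL cap.
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        insert scope;
    }

    global void finish(Database.BatchableContext bc) {
        // Nothing to clean up in this sketch.
    }
}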

All Answers

Navatar_DbSup

Hi,

Yes, you are right: if your trigger works fine for 200 records, it should not generate any error. Are you using the Data Loader or the Import Wizard to upload the data? If possible, please share your code as well.

 

Did this answer your question? If not, let me know what didn't work, or if so, please mark it solved. 

Andy Boettcher

Ahhh - good point Navatar!

 

OP - if you're using Data Loader, you can go into settings and increase your batch size from the default 200 to a higher value.

 

-Andy

Klaus Schgaguler

I am using web services and Apex to create the records I am trying to insert. We are talking about an automated import.

This is a sketch of the functionality:

String data = '........'; // raw import payload (elided)

List<MyObject> ol = new List<MyObject>();

// One record per ';'-separated line; fields within a line are ','-separated.
List<String> dataLines = data.split(';');
for (String line : dataLines) {
    List<String> columns = line.split(',');
    MyObject o = new MyObject();
    // ...some mapping magic...
    ol.add(o);
}

insert ol; // this is the list of 1000 records

 

Thanks for your help.
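
If the Batch Apex route from the best answer is taken, the final insert could hand the list over instead, e.g. using the hypothetical InsertInChunksBatch sketched above:

// Each scope of 200 records then runs the trigger under its own limits.
Database.executeBatch(new InsertInChunksBatch(ol), 200);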

Klaus Schgaguler

The Batch Apex approach may work (I need to get more familiar with Iterables for this), thanks for your help. Would you consider this best practice for heavy triggers and a large number of entries?

 

I still cannot understand why this slicing is done.
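
For the record, the slicing itself is platform behavior: Apex triggers receive at most 200 records per invocation, so a single insert of 1000 records fires the trigger five times. A tiny illustration (MyObject__c is an invented name, as above):

trigger MyObjectChunkDebug on MyObject__c (before insert) {
    // For 'insert ol;' with 1000 records this prints five times,
    // each time with Trigger.new.size() == 200.
    System.debug('Records in this trigger invocation: ' + Trigger.new.size());
}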