Klaus Schgaguler
Problem with slicing of bulk inserts
I am doing a bulk insert of around 1000 entries.
I have a fairly complicated before-insert trigger (written bulk-safe: no DML or SOQL statements inside any loop) which needs around 25 SOQL queries.
In theory this should work, but I found out that the 1000 entries are sliced into chunks of 200 each. The trigger is therefore called 5 times within the same transaction (5 × 25 = 125 queries), and I run into the 100-SOQL limit on the last run.
Can I force the system to insert all 1000 entries at once? Or is there another workaround I can use?
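For context (not part of the original post): the standard Limits methods can confirm how query consumption accumulates across the 200-record chunks, since governor limits are counted per transaction. A hypothetical debug line inside the trigger (trigger and object names are illustrative):

```apex
// Hypothetical instrumentation, assuming a before-insert trigger on Account.
// Limits methods report consumption for the whole transaction, so the count
// keeps growing across the five 200-record chunks of a 1000-record insert.
trigger AccountBulkImport on Account (before insert) {
    System.debug('Chunk size: ' + Trigger.new.size());
    System.debug('SOQL used so far: ' + Limits.getQueries()
                 + ' of ' + Limits.getLimitQueries());
    // ... existing trigger logic (about 25 queries per invocation) ...
}
```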
If you use Batch Apex, you can split those guys into individual batches... and get an individual set of governor limits per batch...
-Andy
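Andy's suggestion could look roughly like the following sketch (class and variable names are hypothetical, and it assumes the records have already been built in memory from the web-service data). The key point is that each execute() call runs as its own transaction with a fresh set of governor limits:

```apex
// Hypothetical sketch of the Batch Apex approach. Each execute() call is a
// separate transaction, so the before-insert trigger gets fresh governor
// limits for every chunk.
global class BulkInsertBatch implements Database.Batchable<SObject> {
    private List<SObject> records;

    global BulkInsertBatch(List<SObject> records) {
        this.records = records;
    }

    // A List<SObject> already implements Iterable<SObject>, so it can be
    // returned from start() directly.
    global Iterable<SObject> start(Database.BatchableContext bc) {
        return records;
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        insert scope; // fires the before-insert trigger with fresh limits
    }

    global void finish(Database.BatchableContext bc) {}
}

// Usage (scope size of 200 records per execute call):
// Database.executeBatch(new BulkInsertBatch(recordsToInsert), 200);
```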
All Answers
Hi,
Yes, you are right - if your trigger works fine for 200 records, then it should not generate any error. For uploading the data, are you using the Data Loader or the Export wizard? If possible, please share your code as well.
Did this answer your question? If not, let me know what didn't work, or if so, please mark it solved.
Ahhh - good point Navatar!
OP - if you're using Data Loader, you can go into settings and increase your batch size from the default 200 to a higher value.
-Andy
I am using web services and Apex to create the records I am trying to insert. We are talking about an automatic import.
This is a sketch of the functionality:
Thanks for your help.
The Batch Apex approach may work (I need to get more familiar with Iterables for this) - thanks for your help. Would you consider this best practice for heavy triggers and a large number of entries?
I still cannot understand why this slicing is done.
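On the Iterables question above: a plain List&lt;SObject&gt; already implements Iterable&lt;SObject&gt;, so for records that are already in memory no custom class is needed. A custom Iterable/Iterator pair (hypothetical names below) only becomes necessary when the records come from a source that a list cannot represent up front:

```apex
// Hypothetical sketch of a custom Iterable for Batch Apex.
global class RecordIterable implements Iterable<SObject> {
    private List<SObject> records;

    global RecordIterable(List<SObject> records) {
        this.records = records;
    }

    global Iterator<SObject> iterator() {
        // Lists provide a built-in iterator; a custom Iterator class could
        // instead stream records from another source.
        return records.iterator();
    }
}
```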