Starting November 20, the site will be set to read-only. On December 4, 2023,
forum discussions will move to the Trailblazer Community.
aj4

Apex trigger governor limits when uploading/updating data through Data Loader or other tools

Hi !
 
We have a trigger on the Account object which works fine when you save a record.
 
But when we insert/update records through Data Loader or any third-party tool, it throws governor limit errors.
 
I don't understand why it throws an error on a mass update.
Isn't it the case that Data Loader (or any other tool) saves one record at a time? Can anyone clarify what is happening in this case?
 
 
 
Thanks in advance.
 
 
 
 
jpizzala
When you import with the Data Loader, you are not updating records individually - they are updated in chunks of 200 records.  This is why your trigger is breaching the governor limits.
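To illustrate why chunking matters, here is a hypothetical non-bulkified trigger (not the original poster's code) of the kind that commonly causes this error:

```apex
// Hypothetical non-bulkified trigger. Saving a single record works fine:
// Trigger.new has one entry, so the loop runs one SOQL query. But a Data
// Loader update arrives in chunks of up to 200 records, so this issues up
// to 200 queries in one transaction and exceeds the SOQL governor limit.
trigger AccountContactCount on Account (before update) {
    for (Account acc : Trigger.new) {
        // SOQL inside a loop: executes once per record in the chunk
        List<Contact> cons = [SELECT Id FROM Contact WHERE AccountId = :acc.Id];
        acc.Description = String.valueOf(cons.size()) + ' contacts';
    }
}
```

The same failure mode applies to DML statements inside the loop: 200 records means 200 DML calls in a single transaction.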

If you want to learn about the governor limits, check out the Apex Developer's Guide.
aj4
What is the solution if you have 100,000 or more records?
 
How do you avoid the governor limits in this case?
 
Any thoughts?
 
 
jpizzala
It's all about how you structure your Apex code.  It sounds like you are performing some sort of query or DML command inside of a loop.  This is a big no-no from Salesforce's standpoint.

If it is a query causing the problem, gather the Salesforce Ids of all the triggered records first and put them into a collection.  Query once using the collection of ids (outside of the loop) - you will get a list of objects as a result.

If it is a DML command causing the error, move the command outside of the loop.  Instead of performing the command for every iteration of the loop, add the finished object to a list.  After the loop has finished executing and has added all the objects to the list, pass the list through the DML command.
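Putting both points together, a bulkified version of a hypothetical field-count trigger (illustrative only, not the poster's actual code) might look like this:

```apex
// Hypothetical bulkified trigger: one SOQL query per chunk, no matter how
// many records arrive, and all per-record work happens in memory.
trigger AccountContactCount on Account (before update) {
    // 1. Gather the Ids of all triggered records into a collection
    Set<Id> accountIds = Trigger.newMap.keySet();

    // 2. Query once, outside the loop, using the collection of Ids
    Map<Id, Integer> contactCounts = new Map<Id, Integer>();
    for (Contact c : [SELECT Id, AccountId FROM Contact
                      WHERE AccountId IN :accountIds]) {
        Integer n = contactCounts.get(c.AccountId);
        contactCounts.put(c.AccountId, n == null ? 1 : n + 1);
    }

    // 3. The loop only touches in-memory data: no SOQL, no DML inside it
    for (Account acc : Trigger.new) {
        Integer count = contactCounts.get(acc.Id);
        acc.Description = String.valueOf(count == null ? 0 : count) + ' contacts';
    }
    // A before-update trigger saves field changes on Trigger.new
    // automatically. In other contexts, collect the modified records in a
    // List and pass that list to a single insert/update call after the loop.
}
```

With this structure, each 200-record chunk from the Data Loader consumes one query and at most one DML statement, well within the per-transaction limits.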

There are plenty of examples of avoiding governor limitations throughout these boards if you take a look around.

Also, you may want to check out this wiki post.
jgrenfell
You do want to make your code as efficient as possible to avoid the limits. Also be sure to test with bulk data loads: code that works as expected when triggered by a single record can have unexpected results when processing records in bulk if it hasn't been "bulk proofed".

That being said, I have some triggers where the business logic is too complex to get the code to not set off the governor limits with inserts/updates coming in chunks of 200.  In those cases, I go to the Settings in the Data Loader and reduce the Batch Size.  I just use trial and error to figure out what the maximum batch size can be for the trigger I'm working with.
aj4

Agreed,

But there is a limit on the number of objects you can store in a collection. Can you add 100,000 objects to a collection or list?

Say you have 10,000 records in the Salesforce Account object and the trigger updates a single field, x. The trigger will work fine in this situation if you save a single record.

But when you upload the records with a tool, it creates a problem. How would you break the records into chunks if there is a limitation on collections and lists?

If you have sample code, it would be greatly appreciated.

 

Thanks