
Trigger failed
Hi All,
I have got an error in a trigger. I am running a very simple test class that changes the name of an account:
public class InsertContact {
    static testMethod void myTest() {
        Account ac = [SELECT Name FROM Account WHERE IB_Code__c = 'AA'];
        ac.Name = 'Test';
        System.debug(ac);
        update ac;
    }
}

I have a trigger that is launched each time an account is updated and modifies the contacts that match the criteria in the SELECT:
trigger AccountAfterUpdate on Account (after update) {
    for (Account loopAccount : Trigger.new) {
        String ibCode = loopAccount.IB_Code__c;
        List<Contact> matchingContacts =
            [SELECT LastName FROM Contact WHERE Introducing_Broker__c = 'AA'];
        System.debug(matchingContacts);
        for (Contact loopContact : matchingContacts) {
            loopContact.IB_Code__c = 'aaa';
        }
    }
}

However, this is the error I get:
System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, AccountAftterUpdate: execution of AfterUpdate
caused by: System.QueryException: Non-selective query against large object type (more than 100000 rows). Consider an indexed filter or contact salesforce.com about custom indexing.
Even if a field is indexed a filter might still not be selective when:
1. The filter value includes null (for instance binding with a list that contains null)
2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)
Trigger.AccountAfterUpdate: line 4, column 14
I don't really understand what the problem is. Can someone help me solve it?
Thanks!
Message Edited by lopezc on 06-12-2008 07:19 AM
Hi,
I'm afraid your code is a little dangerous at the moment and will quite easily hit the governor limits. The first issue is that the SOQL query sits inside the loop over Trigger.new: if the trigger batch contains more than 20 accounts (for instance an update through the Data Loader), you would run more than 20 SOQL queries against the Contact table.
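To make the shape of the problem concrete, here is a sketch of the anti-pattern (a query inside the record loop), assuming the field names from the original trigger:

```apex
trigger AccountAfterUpdate on Account (after update) {
    for (Account loopAccount : Trigger.new) {
        // Anti-pattern: one SOQL query per account in the batch.
        // A 200-record Data Loader update would attempt 200 queries here.
        List<Contact> matchingContacts =
            [SELECT LastName FROM Contact
             WHERE Introducing_Broker__c = :loopAccount.IB_Code__c];
    }
}
```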
What I suggest is restructuring your code along these lines:
// in case we need to remember the Id that we came from
Map<Id, String> mapAccountToIBCode = new Map<Id, String>();
List<Contact> liContactsToUpdate = new List<Contact>();

for (Account sAccount : Trigger.new)
{
    if (sAccount.IB_Code__c != null) // add conditional logic to exclude unwanted accounts
    {
        mapAccountToIBCode.put(sAccount.Id, sAccount.IB_Code__c);
    }
}

// SOQL for-loop: records come back in chunks, which keeps the heap in check
for (Contact[] arrContact : [SELECT Id, IB_Code__c
                             FROM Contact
                             WHERE Introducing_Broker__c IN :mapAccountToIBCode.values()])
{
    for (Contact sContact : arrContact)
    {
        // watch the governor limits
        if (liContactsToUpdate.size() >= 200)
        {
            update liContactsToUpdate;
            liContactsToUpdate.clear();
        }
        // do your update processing here
        liContactsToUpdate.add(sContact);
    }
}

// flush whatever is left after the loops
if (!liContactsToUpdate.isEmpty())
{
    update liContactsToUpdate;
}
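One small addition worth considering (my suggestion, not part of the original reply): skip the Contact query entirely when no account in the batch had a code to match, since running the query against an empty collection still consumes one of your SOQL queries:

```apex
// Only query Contacts when there is at least one IB code to match.
if (!mapAccountToIBCode.isEmpty())
{
    // ... run the Contact query and the update loop from the snippet above
}
```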
Message Edited by Box on 06-13-2008 10:18 AM
Message Edited by lopezc on 06-13-2008 02:35 AM
Hmmm, interesting. I have never needed to process that sort of volume (yet). I guess the governor limits are kicking in to protect the platform's resources, and that will be the root cause of the error.
The only way to process large volumes of data is to initiate the call from a web service or an S-Control; a trigger has much smaller limits, but if you're dealing with more than 100,000 rows you will probably still have the problem.
Have you looked into the asynchronous Apex calls in Summer '08 yet? That could be your solution.
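For what it's worth, a minimal sketch of what that could look like, assuming the @future annotation from Summer '08 (the class and method names here are hypothetical):

```apex
public class ContactIBCodeUpdater {
    // Runs asynchronously, with its own (higher) governor limits.
    // The trigger would collect the account Ids and hand them off here;
    // @future parameters must be primitives or collections of primitives.
    @future
    public static void updateMatchingContacts(Set<Id> accountIds) {
        List<Account> accounts =
            [SELECT Id, IB_Code__c FROM Account WHERE Id IN :accountIds];
        // ... query the matching contacts and update them in batches here
    }
}
```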
I will have a look at it; I haven't tried it yet.
Thanks