lopezc

Trigger failed

Hi All,

I have got an error in a trigger. I am running a very simple test class that changes the name of an account:

public class InstertContact {
    static testMethod void myTest() {
        Account ac = [Select Name from Account where IB_Code__c = 'AA'];
        ac.Name = 'Test';
        System.debug(ac);
        update ac;
    }
}
I have a trigger that is launched each time an account is updated and modifies the contacts that match the criteria in the SELECT:

trigger AccountAfterUpdate on Account (after update) {
    for (Account loopAccount : Trigger.New) {
        String Ibcode = loopAccount.IB_Code__c;
        List<Contact> matchingContacts = [Select c.LastName From Contact c where c.Introducing_Broker__c = 'AA'];
        System.debug(matchingContacts);
        for (Contact loopContact : matchingContacts) {
            loopContact.IB_Code__c = 'aaa';
        }
    }
}
However this is the error I have got:

System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, AccountAfterUpdate: execution of AfterUpdate

caused by: System.QueryException: Non-selective query against large object type (more than 100000 rows). Consider an indexed filter or contact salesforce.com about custom indexing.
Even if a field is indexed a filter might still not be selective when:
1. The filter value includes null (for instance binding with a list that contains null)
2. Data skew exists whereby the number of matching rows is very large (for instance, filtering for a particular foreign key value that occurs many times)

Trigger.AccountAfterUpdate: line 4, column 14
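
[Editor's note: the error means the query optimizer refuses to run the trigger's SOQL because the filter matches too many Contact rows to be resolved through an index. A rough illustration using the field names from this thread; the remedies in the comments are the standard ones (an indexed/External ID field, a salesforce.com-created custom index, or extra narrowing criteria):]

// Non-selective: Introducing_Broker__c is a plain custom field (not indexed),
// and the constant 'AA' matches a large share of a 100,000+ row table, so the
// planner would have to scan the whole table. In a trigger context this throws
// System.QueryException: Non-selective query against large object type.
List<Contact> broad = [Select Id From Contact where Introducing_Broker__c = 'AA'];

// More workable: mark Introducing_Broker__c as an External ID (which indexes it)
// or ask salesforce.com support for a custom index, and/or narrow the filter so
// it matches a small fraction of the table. SystemModstamp is indexed by default.
List<Contact> narrower = [Select Id From Contact
                          where Introducing_Broker__c = 'AA'
                          and SystemModstamp = LAST_N_DAYS:7];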

I don't really understand what the problem is. Can someone help me solve this?

thanks!
lopezc
hi!

I also tried to do it like the following code, but it isn't working either:

for (List<Contact> matchingContacts : [Select c.LastName From Contact c where c.Introducing_Broker__c='AA']) {
............


Message Edited by lopezc on 06-12-2008 07:19 AM
Box

Hi,

I'm afraid your code is a little dangerous at the moment and will quite easily hit the governor limits. The initial issue would arise if the trigger contained more than 20 accounts (for instance, updating through the Data Loader), because the query inside the loop would then fire more than 20 SOQL queries against the Contacts table.

What I suggest is writing your code along the lines of:

// in case we need to remember the ID that we came from
map<Id, String> mapAccountToIBCode = new Map<Id, String>();
list<Contact> liContactsToUpdate = new List<Contact>();

for (Account sAccount : trigger.new)
{
    if (/* conditional logic to exclude unwanted accounts */)
    {
        mapAccountToIBCode.put(sAccount.Id, sAccount.IB_Code__c);
    }
}

for (Contact[] arrContact : ([ select  what, i, need
                               from    Contact
                               where   YourString in :mapAccountToIBCode.values()
                             ]))
{
    for (Contact sContact : arrContact)
    {
        // watch the gov limits
        if (liContactsToUpdate.size() >= 200)
        {
            update liContactsToUpdate;
            liContactsToUpdate.clear();
        }

        // Do your Update Processing here

        liContactsToUpdate.add(sContact);
    }
}

// flush whatever is left over after the loop
update liContactsToUpdate;
liContactsToUpdate.clear();



Message Edited by Box on 06-13-2008 10:18 AM
lopezc
Thanks very much for your feedback. I tried to do what you said. I modified the code just a little, but I still get the same problem when I run the query against the Contact table:
caused by: System.QueryException: Non-selective query against large
object type (more than 100000 rows). Consider an indexed filter or
contact salesforce.com about custom indexing.

I have a lot of Contacts that match the criteria in the query, and I think it won't let me run a query that retrieves more than a certain number of records. Do you know if there is any other way to do it?
My code is the following:

trigger AccountBeforeUpdate on Account (before update) {
    map<String, Account> mapAccountToIBCode = new Map<String, Account>();
    list<Contact> liContactsToUpdate = new List<Contact>();

    for (Account sAccount : trigger.new)
    {
        if (sAccount.IB_Code__c != null)
        {
            mapAccountToIBCode.put(sAccount.IB_Code__c, sAccount);
        }
    }

    for (Contact[] arrContact : ([ select IB_Code__c, Introducing_Broker__c
                                   from Contact
                                   where Introducing_Broker__c in :mapAccountToIBCode.keySet()
                                 ]))
    {
        for (Contact sContact : arrContact)
        {
            // watch the gov limits
            if (liContactsToUpdate.size() >= 200)
            {
                update liContactsToUpdate;
                liContactsToUpdate.clear();
            }

            // Do your Update Processing here
            Account ac = mapAccountToIBCode.get(sContact.Introducing_Broker__c);
            sContact.IB_Code__c = 'CHANGE';
            liContactsToUpdate.add(sContact);
        }
    }

    update liContactsToUpdate;
    liContactsToUpdate.clear();
}



Thanks a lot for your help


Message Edited by lopezc on 06-13-2008 02:35 AM
Box @ Home

Hmmm, interesting. I have never needed to process that sort of volume (yet). I guess the governor limits are kicking in to protect the resources on the platform, and that will be the root cause of the error.

The only way to process large volumes of data is to initiate the call from a web service or an S-Control; a trigger has much smaller limits, but if you're dealing with more than 100,000 rows you will probably still have the problem.

Have you looked into asynchronous Apex calls in Summer '08 yet? That could be your solution.
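
[Editor's note: a hedged sketch of that asynchronous route, for later readers. Summer '08's @future annotation lets the heavy Contact update run outside the trigger's limits. Field names are taken from this thread; the class and method names are made up for illustration, and the query selectivity issue still needs an indexed filter regardless:]

public class ContactIBCodeUpdater {
    // Runs asynchronously with its own, larger set of governor limits.
    // @future methods may only take primitives or collections of primitives,
    // so we pass the IB codes rather than the Account records themselves.
    @future
    public static void updateContactIBCodes(Set<String> ibCodes) {
        List<Contact> toUpdate = new List<Contact>();
        for (Contact c : [Select Id, IB_Code__c
                          From Contact
                          where Introducing_Broker__c in :ibCodes]) {
            c.IB_Code__c = 'CHANGE';
            toUpdate.add(c);
        }
        update toUpdate;
    }
}

// Called from the trigger, for example:
// ContactIBCodeUpdater.updateContactIBCodes(mapAccountToIBCode.keySet());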



lopezc
Ok, Thanks for your feedback.

I will have a look at it. I haven't done so yet.

Thanks