• MJ09
  • NEWBIE
  • 220 Points
  • Member since 2008


I am trying to create a new event using the following trigger. My logic seems right. I am pulling data from a custom object into a new calendar event. Can anyone tell me how to create a new calendar event passing the mandatory parameters from a custom object?
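No trigger code made it into the post as archived. A minimal sketch of what such a trigger could look like, with every object and field name hypothetical (an Event needs at least StartDateTime and EndDateTime, or ActivityDate plus DurationInMinutes):

```apex
// Hypothetical trigger: Booking__c and its fields are illustrative only.
trigger CreateEventFromBooking on Booking__c (after insert) {
    List<Event> events = new List<Event>();
    for (Booking__c b : Trigger.new) {
        events.add(new Event(
            Subject       = b.Name,            // event title from the custom record
            StartDateTime = b.Start_Time__c,   // required scheduling fields
            EndDateTime   = b.End_Time__c,
            WhatId        = b.Id               // relate the event back to the record
        ));
    }
    insert events;   // one bulk DML for all rows in the trigger batch
}
```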

Hello there,

 

I'm getting 70% coverage with the test below for a class working as a controller for a VF page. I don't know what I'm missing, and any help is appreciated! :)

 

Class:

public class KeyLeadership {

    private List<Leadership_Position_Code__c> leaders;

    public List<Leadership_Position_Code__c> getLeaders() {
        leaders = [SELECT l.Contact__c, l.Contact__r.AccountId, l.Contact__r.Email,
                          l.Contact__r.Name, l.Position__c, l.Start_Date__c, l.Thru_Date__c
                   FROM Leadership_Position_Code__c l
                   WHERE Contact__r.AccountId = :ApexPages.currentPage().getParameters().get('id')
                     AND (l.Thru_Date__c = null OR l.Thru_Date__c > :System.today())
                     AND (l.Position__c = 'President' OR l.Position__c = 'Education Director'
                          OR l.Position__c = 'Co-President' OR l.Position__c = 'Senior (or only) Rabbi'
                          OR l.Position__c = 'Administrator/Exec director')];
        System.debug('%%%%%%%%%55555' + leaders.size());
        return leaders;
    }
}

 

Test Class:

@isTest
private class TestKeyLeadership {
    static testMethod void testGetLeaders() {
        // insert a test account
        Account acc = new Account(Name='Test123');
        Database.SaveResult sr = Database.insert(acc);

        Contact con = new Contact(FirstName='Lester', LastName='DbigTester',
                                  AccountId=acc.Id, Email='123123123@888555.com');
        insert con;

        List<Leadership_Position_Code__c> LDS = new List<Leadership_Position_Code__c>{
            new Leadership_Position_Code__c(Contact__c=con.Id, Position_Type__c='Clergy',
                Position__c='Cantor', Start_Date__c=System.today()-2, Thru_Date__c=System.today()-1),
            new Leadership_Position_Code__c(Contact__c=con.Id, Position_Type__c='Officer',
                Position__c='President', Start_Date__c=System.today()-1),
            new Leadership_Position_Code__c(Contact__c=con.Id, Position_Type__c='Clergy',
                Position__c='Senior (or only) Rabbi', Start_Date__c=System.today()-1)};
        insert LDS;

        ApexPages.currentPage().getParameters().put('id', acc.Id);

        // instantiate the class under test
        KeyLeadership kl = new KeyLeadership();

        List<Leadership_Position_Code__c> leaders = kl.getLeaders();

        // change the number below to the number of matching leadership_position_code__c records created above
        System.assertEquals(2, leaders.size());

        // should also assert that the inserted records above match those retrieved
        Leadership_Position_Code__c[] LDSok = [SELECT Id, Position__c
                                               FROM Leadership_Position_Code__c
                                               WHERE Id IN :leaders];
        System.assertEquals('President', LDSok[0].Position__c);
        System.assertEquals('Senior (or only) Rabbi', LDSok[1].Position__c);
    }
}

 

Thanks in advance!

B

 

Hello All,

 

I am fairly new to Apex and Force.com in general. I have been going through the Force.com Workbook (the one that comes with the Eclipse Force.com IDE Help File), which has an example of adding an Apex trigger, with the following code:

 

trigger HandleProductPriceChange on Merchandise__c (after undelete) 
{
	// update invoice line items associated with open invoices
	List<Line_Item__c> openLineItems = 
		[SELECT j.Unit_Price__c, j.Merchandise__r.Price__c
		 FROM Line_Item__c j 
		 WHERE j.Invoice_Statement__r.Status__c = 'Negotiating'  
		 AND j.Merchandise__r.id IN :Trigger.new 
	 	 FOR UPDATE]; 
	 	 
	for (Line_Item__c li: openLineItems)
	{
		if (li.Merchandise__r.Price__c < li.Unit_Price__c)
		{
			li.Unit_Price__c = li.Merchandise__r.Price__c;
		}
	}
	
	update openLineItems;
}

The trigger saves and even shows up on the Web Interface under Triggers for the Merchandise Object. But when I change the price of the item, the line item never updates the amount. 

 

Can anyone tell me what I am missing? Everything else seems to be working like a charm.
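One likely cause: the trigger as pasted fires only on after undelete, so changing a price never runs it. If the goal is to react to price edits, the trigger events should include after update (hedged; check the listing in your copy of the workbook):

```apex
// Same body as above; only the trigger events differ.
trigger HandleProductPriceChange on Merchandise__c (after update) {
    // update invoice line items associated with open invoices
    List<Line_Item__c> openLineItems =
        [SELECT j.Unit_Price__c, j.Merchandise__r.Price__c
         FROM Line_Item__c j
         WHERE j.Invoice_Statement__r.Status__c = 'Negotiating'
         AND j.Merchandise__r.Id IN :Trigger.new
         FOR UPDATE];

    for (Line_Item__c li : openLineItems) {
        if (li.Merchandise__r.Price__c < li.Unit_Price__c) {
            li.Unit_Price__c = li.Merchandise__r.Price__c;
        }
    }
    update openLineItems;
}
```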

 

Thanks,

Pedram

I am trying to use Visualforce to show Notes and Attachments as separate lists, and I have the code below as the controller:

This works; however, it returns only the CreatedById (00520000001U7ZIAA0) and not the user's name.

 

  public List<Attachment> getAtt() {
      return [SELECT ID, Name, Description, Body, CreatedByID, CreatedDate
              FROM Attachment
              WHERE ParentID = :FeedbackID
              ORDER BY CreatedDate DESC];
  }

 

I then tried to build a map, but I'm missing a step or some logic somewhere to get the name of the user.

 

 

  public List<Attachment> getAtt() {
      List<Attachment> a = [SELECT ID, Name, Description, Body, CreatedByID, CreatedDate
                            FROM Attachment WHERE ParentID = :FeedbackID];
      Set<String> cList = new Set<String>();
      for (Attachment ac : a) {
          cList.add(ac.CreatedByID);
      }
      List<User> u = [SELECT ID, Name FROM User WHERE ID IN :cList];
      Map<String, String> CreatedByMap = new Map<String, String>();
      for (User cb : u) {
          CreatedByMap.put(cb.Id, cb.Name); // populate map
      }

      return [SELECT ID, Name, Description, Body, CreatedByID, CreatedDate
              FROM Attachment WHERE ParentID = :FeedbackID
              ORDER BY CreatedDate DESC AND CreatedByID in :CreatedByMap];
  }

 This gives an error: Error: Compile Error: IN operator must be used with an iterable expression at line 50 column 141 which is 

in :CreatedByMap ];

At the limit of my knowledge here so any help much appreciated.
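For reference, a hedged sketch of a version that compiles: SOQL binds iterate over a Map's keySet(), not the Map itself, and any filter must sit in the WHERE clause ahead of ORDER BY. Simpler still, the map is unnecessary, because a relationship query can fetch the user's name directly:

```apex
public List<Attachment> getAtt() {
    // WHERE must precede ORDER BY; a Map can't be bound to IN, but its keySet() can.
    // Simpler still: fetch the creating user's name via the CreatedBy relationship.
    return [SELECT Id, Name, Description, Body,
                   CreatedById, CreatedBy.Name, CreatedDate
            FROM Attachment
            WHERE ParentId = :FeedbackID
            ORDER BY CreatedDate DESC];
}
```

In Visualforce, {!att.CreatedBy.Name} then renders the user's name without any extra lookup.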

  • June 07, 2010
  • Like
  • 0

Hi,

 

I'm a newbie with Apex (sorry for my English). I want to write a SOQL query, but I don't know how to start!

Do I need an Apex class to run a query, or what?

I created a custom object named "Phonebook" that has the following fields: Name, Surname, Telephone Number.

Suppose I want to retrieve records whose Surname field is "Russel". How should I do that?

Thanks
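A hedged sketch of what that query might look like, assuming the custom object's API name is Phonebook__c with fields Surname__c and Telephone_Number__c (adjust to the actual API names in your org). It can be run from anonymous Apex or inside a class method:

```apex
// Hypothetical API names; run via anonymous Apex or embed in a class.
List<Phonebook__c> entries = [SELECT Name, Surname__c, Telephone_Number__c
                              FROM Phonebook__c
                              WHERE Surname__c = 'Russel'];
System.debug(entries.size() + ' entries found');
```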

Using the IDE, I can view and edit the XML metadata for an object that's not defined by a managed package. Specifically, I can find a publicly defined list view in the object's XML, copy/edit it to define a new list view, and just save the object's definition from the IDE.

 

Is there a way to do the same thing for an object that is defined by a managed package?

 

I've installed a managed package that defines a custom object, and I'd like to create a few dozen list views that are just slight variations on each other. I could do it manually through the UI, but it'd be so much easier if I could do it by editing the <listViews> tags in the XML. However, the file for that object's metadata seems to be read-only as far as the IDE is concerned, and even if I change it to be writable, saving the file in the IDE doesn't seem to save it to the server. I'm wondering if there's a way to edit the XML for a managed package object, or am I stuck with having to do it manually through the browser's UI.

  • December 09, 2011
  • Like
  • 0

I have a managed package that defines an object whose Name is an Auto Number field. A customer who has the managed package wants to import some data from another org into his current org, and maintain the Auto Number values that he had in his original org, instead of having new Auto Number values assigned. 

 

I know you can ask Salesforce to allow you to use the Data Loader to write to the CreatedDate and LastModifiedDate fields. Will they also allow us to write to an Auto Number field?

 

(Before you suggest I temporarily change the field type to Text, please note that the field is defined in a managed package, so I can't change its field type.)

  • December 05, 2011
  • Like
  • 0

Has anyone noticed whether inserting records into a list-type custom setting takes a long time?

 

I have a managed package that includes 4 custom settings, as well as a trigger that expects the custom settings to be populated with certain records. When the trigger starts, it checks whether the custom settings are populated, and if not, it populates them, inserting about 350 records total, split among the 4 custom settings. It inserts the custom settings using standard Apex techniques for inserting records -- it constructs a List<My_Custom_Setting__c> and then inserts the list with a single insert statement.

 

One of the custom settings gets populated with about 250 records. Checking the debug logs, I see that it often takes 5 minutes or more for that one insert operation to complete. That, combined with the time it takes to populate the other 3 custom settings, sometimes pushes the trigger over the 10-minute-per-transaction limit. 

 

I know I can use a Visualforce page to populate the custom settings, and just tell people who install the package to load that page to complete the installation. But I'm more interested in why it takes 5 minutes or more for a single insert statement to create 250 custom setting records. Has anyone else seen that kind of performance issue? Any thoughts on what I could do to avoid it?
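The populate-if-empty pattern described above, as a minimal sketch (the setting name and its Value__c field are hypothetical):

```apex
// Hypothetical list custom setting with a hypothetical Value__c field.
if (My_Custom_Setting__c.getAll().isEmpty()) {
    List<My_Custom_Setting__c> rows = new List<My_Custom_Setting__c>();
    for (Integer i = 0; i < 250; i++) {
        rows.add(new My_Custom_Setting__c(Name = 'Row' + i,
                                          Value__c = String.valueOf(i)));
    }
    insert rows;  // one DML statement for all rows
}
```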

  • November 17, 2011
  • Like
  • 1

Just about every time I go to Setup | Develop | Apex Test Execution, I get the "Is Your Code Covered?" pop-up, with the buttons inviting me to take the tour of this functionality. I've taken the tour. Many times. And yet, with each new browser session, I get prompted to take the tour. Again.

 

I've tried clicking each of the buttons in the prompt window, to no avail. The page just can't seem to remember, at least not past the current browser session, that I've taken the tour. It's getting pretty annoying. 

 

Has anybody else seen this behavior? Is there any way to get it to stop asking me to take the tour?

 

Thanks!

  • October 14, 2011
  • Like
  • 0

The LMA captures the Org Id associated with a License. Is there any way I can tell from that Org Id whether it's a Sandbox or Production? 

 

I've seen a few postings suggesting that you can, but when I check against a few known sandbox and Production Org Ids, the suggestions don't pan out. One such posting says, "The letter after "00D" determines the instance," and another says, "If the character after "00D" is a number, it's Production; if it's a letter, it's a sandbox," but I've seen exceptions to that. Specifically, I have a Developer Edition org that starts with 00DC, and another Developer Edition org that starts with 00D5.
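The prefix heuristics are indeed unreliable. From inside an org (though not from the bare Org Id the LMA captures), a later API version added a definitive flag on the Organization object; a hedged sketch, assuming IsSandbox is available at your API version:

```apex
// Hedged: the IsSandbox field on Organization was added in a later API version.
Boolean isSandbox = [SELECT IsSandbox FROM Organization LIMIT 1].IsSandbox;
System.debug('Running in sandbox? ' + isSandbox);
```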

 

Thanks!

  • August 24, 2011
  • Like
  • 0

Spring 11 includes a new "Calculate your organization's code coverage" feature. If I have a managed package that includes Apex code installed in my org, do the results of that new feature include coverage of the code in the managed package? Or does it tell me the coverage of only my own code?

 

If that number includes all coverage results for managed package code, how can I determine the coverage percent for only my own code?

  • June 08, 2011
  • Like
  • 0

I have a managed package that includes a schedulable Apex class. As part of the post-package-installation instructions, the customer schedules that class to run on a nightly basis.

 

I also have an unmanaged version of this package, developed in a completely separate org, that I have deployed via the IDE to a few test orgs. If the Apex job is scheduled in the target org, the deployment fails, because it won't allow me to deploy a new version of the Apex class while the previous version is still scheduled to execute. I have to unschedule the job, then deploy, and then re-schedule the new version of the class.

 

With the managed package, I was surprised to discover that I can install a new version of the package into an org in which the previous version was already installed, even if the previous version of the Apex class is still scheduled to execute. This behavior of the managed package upgrade process seems different from the behavior of the unmanaged deployment process.

 

What IS the expected behavior when I install an upgrade to a managed package, when a class for that package is already scheduled to execute? Should the upgrade fail? If it shouldn't fail (it doesn't), what version of the code will run the next time the scheduled job runs -- the original version or the upgraded version?

  • June 08, 2011
  • Like
  • 0

I just viewed a post that I had written a week or so ago, copied the subject line, and then pasted it into the Search box at the top right of the Developerforce site. But when I searched for that string, my post didn't show up in the results.

 

It's not just this one post -- I've been having problems for quite a while finding posts that I know exist. The discussion boards are a great resource, but not so much when I can't rely on the search results. Is somebody working on the search functionality?

  • April 12, 2011
  • Like
  • 0

I’m the developer of a package that has a heavy dependence on a scheduled Batch Apex job. The package currently runs in a dozen or so orgs, some of which have fairly large amounts of data. One org in particular has over 3 million records that are processed by the Batch Apex job.

 

Over the past 3 months, we’ve been encountering a lot of stability problems with Batch Apex. We’ve opened cases for several of these issues, and they’ve been escalated to Tier 3 Support, but it consistently takes 2 weeks or more to get a case escalated, and then it can take several more weeks to get a meaningful reply from Tier 3.

 

We really need to talk with the Product Manager responsible for Batch Apex. We asked Tier 3 to make that introduction, but they said they couldn’t. We’re trying to work with Sales to set up a discussion with a Product Manager, but so far, we haven’t had any luck there either. We’re hoping that a Product Manager might see this post and get in touch with us. We need to find out whether Batch Apex is a reliable-enough platform for our application.

 

Here are a few examples of the problems we’ve been having:

 

  • The batch job aborts in the start() method. Tier 3 Support told us that the batch job was occasionally timing out because its initial  query was too complex. We simplified the query (at this point, there are no WHERE or ORDER BY clauses), but we occasionally see timeouts or near timeouts. However, from what we can observe in the Debug Logs, actually executing the query (creating the QueryLocator) takes only a few seconds, but then it can take many minutes for the rest of the start() method to complete. This seems inconsistent with the “query is too complex” timeout scenario that Tier 3 support described.  (Case 04274732.)
  • We get the “Unable to write to ACS Stores” problem. We first saw this error last Fall, and once it was eventually fixed, Support assured us that the situation would be monitored so it couldn’t happen again. Then we saw it happen in January, and once it was eventually fixed, Support assured us (again) that the situation would be monitored so it couldn’t happen again. However, having seen this problem twice, we have no confidence that it won’t arise again. (Case 04788905.)
  • In one run of our job, we got errors that seemed to imply that the execute() method was being called multiple times concurrently. Is that possible? If so, (a) the documentation should say so, and (b) it seems odd that after over 6 months of running this batch job in a dozen different orgs, it suddenly became a problem.

 

  • We just got an error saying, “First error: SQLException [java.sql.SQLException: ORA-00028: your session has been killed. SQLException while executing plsql statement: {?=call cApiCursor.mark_used_auto(?)}(01g3000000HZSMW)] thrown but connection was canceled.” We aborted the job and ran it again, and the error didn’t happen again.
  • We recently got an error saying, “Unable to access query cursor data; too many cursors are in use.” We got the error at a time when the only process running on behalf of that user was the Batch Apex process itself. (Perhaps this is symptomatic of the “concurrent execution” issue, but if the platform is calling our execute() method multiple times at once, shouldn’t it manage cursor usage better?)
  • We have a second Batch Apex job that uses an Iterable rather than a QueryLocator. When Spring 11 was released, that Batch Apex job suddenly began to run without calling the execute() method even once. Apparently, some support for the way we were creating the Iterable changed, and even though we didn’t change the API version of our Apex class, that change caused our Batch Apex job to stop working. (Case 04788905.)
  • We just got a new error, "All attempts to execute message failed, message was put on dead message queue."

 

We really need to talk with a Product Manager responsible for Batch Apex. We need to determine whether Batch Apex is sufficiently stable and reliable for our needs. If not, we’ll have to find a more reliable platform, re-implement our package, and move our dozen or more customers off of Salesforce altogether.

 

If you’re responsible for Batch Apex or you know who is, please send me a private message so we can make contact. Thank you!

 

  • April 04, 2011
  • Like
  • 0

I'm managing an org that has a pretty large amount of data -- it has over 5GB of data for just one object, with about 3 million records for that object.

 

The org actually has a data storage limit (as reported by Setup | Data Management | Storage Usage) of 1GB, but the Storage Usage page says it has 5.8GB used, with a Percent Used of 582%.

 

I have a Batch Apex job that runs once a week to perform calculations based on those 3M records. The job starts by creating a QueryLocator for all the records, then iterates over them in the execute() method, 200 at a time. It can take the job 5 hours or more to run from start to finish.

 

Over the last few months, I've been getting many different intermittent errors from the Batch Apex job. Most of the errors seem platform-specific -- ACS errors, PLSQL exceptions, too many cursors in use (when the batch job is the only job running), among others. I'm wondering whether these errors could be related to the fact that we're so far over the data storage limit. Has anybody seen an org get so far over the limit? When it has, have you encountered any odd errors?

 

(Yes, I know, the owner of the org really should pay Salesforce to bump up their storage limits. I'm a consultant working for the owner of the org, not the org owner myself. We're working on getting the org owner to pay for a higher limit. But in the meantime, I'm wondering whether the overage could be responsible for some of our intermittent platform errors.)

  • April 04, 2011
  • Like
  • 0

Every now and then, when I save a Visualforce page's code in the IDE, I find that it has been re-formatted. For example, if the code originally defined a style like this:

 

.myClassName {border:none;width:200px;}

 

I'll suddenly find that the code looks like this:

 

.myClassName {
  border:none;
  width: 200px;
}

 

One could argue over which style is better, but the point is that I wrote it the first way, and I don't appreciate having something (the IDE? the compiler?) format it differently.

 

Any idea what's causing this to happen and what I can do to stop it?

 

FWIW, I know that I'm the only developer in this org -- there's no person who's changing this. I've seen it happen in several orgs - it's not in just one org. Also, it doesn't happen every time I save -- just sometimes.

  • April 03, 2011
  • Like
  • 0

I have a Batch Apex process that runs for a long time (several hours) over a large set of records. One of the things the job's execute() method does is to query for a bunch of records, and then do an upsert on them. This job is the *only* bit of Apex code that modifies these records. Users can't modify these records through the UI -- the records are visible to users only in that they contribute to roll-up summary fields on a parent object.

 

I just started getting occasional "Unable to obtain exclusive access to this record" errors. Since the batch job's execute() method is the only code that tries to update (upsert) these records, I'm wondering if it's possible that multiple invocations of the execute() method are running at the same time. That's the only possible reason I can think of for this error to occur.

 

So:

 

1. Is it possible for the execute() method to be running more than once at the same time? (I've already confirmed that the batch job was launched only once.)

 

2. If the only Apex code that touches these records is in the execute() method, what else might be responsible for this error?
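One way to probe this from the code side is to take explicit row locks in execute() before the upsert, which rules competing transactions in or out. A minimal sketch, with all object and field names hypothetical:

```apex
// Hypothetical names; lock the rows this chunk will touch before modifying them.
List<Child__c> rows = [SELECT Id, Amount__c
                       FROM Child__c
                       WHERE Parent__c IN :parentIds
                       FOR UPDATE];
// ... recompute values on rows ...
upsert rows;
```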

 

Thanks!

  • March 28, 2011
  • Like
  • 0

How can I get a VF page to render in such a way that I can open it in Word?

 

I know that renderAs supports only PDF, but per this blog article (http://blog.sforce.com/sforce/2008/12/visualforce-to-excel.html), I've been playing with contentType. I've tried a few contentTypes:

 

application/vnd.ms-word

application/rtf

vnd.ms-word.document.macroEnabled.12

 

The first two (especially when used with a #filename.doc suffix) create content that can be opened in Word, but the Word document contains HTML tags.

 

The third one came from http://www.iana.org/assignments/media-types/application/index.html, which has a list of valid media types. Word wouldn't open anything created with that content type.

 

How can I get a VF page to generate content that can be opened in MS Word?
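A hedged sketch of the contentType approach (controller name and content are illustrative). Word opening an HTML payload is inherent to this technique; the stray tags usually come from the standard page chrome, so turning off the header, sidebar, and standard stylesheets and keeping the markup Word-friendly (tables, inline styles) reduces them:

```apex
<!-- Sketch: serve the page as a Word-openable document.
     showHeader/sidebar/standardStylesheets off to minimize extra HTML. -->
<apex:page controller="MyDocController" showHeader="false" sidebar="false"
           standardStylesheets="false"
           contentType="application/msword#report.doc">
    <h1>Quarterly Report</h1>
    <p>Body content renders as HTML that Word imports.</p>
</apex:page>
```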

 

  • March 23, 2011
  • Like
  • 0

I have an application that deals with an extremely large number of records. It includes a child object that has a lookup relationship to a parent object, where a single parent record could have a hundred thousand (100,000) or more child records. 

 

The schema uses a lookup relationship, rather than a master/detail relationship, because a user may want to move a child record from one parent to another.

 

I need to compute several roll-up summary values on the parent object to count the number of child records, the number of distinct values in a particular field, the min and max of values in other fields, etc. Because I'm not using a master/detail relationship, I can't use roll-up summary fields, so I wrote a scheduled Batch Apex process to perform the calculations. Unfortunately, I'm finding that Batch Apex just can't handle this reliably. (See http://boards.developerforce.com/t5/Apex-Code-Development/Batch-Apex-problems/m-p/244937 for just some of the problems I've encountered.)

 

So now I'm considering whether to switch to a master/detail relationship. I can write some Apex code to handle the situation where a user wants to move a child to a different parent -- in that case, I'll clone the child record, change the clone to refer to the new parent, and then delete the old child record. But now I'm wondering whether roll-up summary fields can handle very large numbers of child records. 
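The move-a-child step described above can be sketched like this (object and field names are hypothetical; the two-argument clone() is per the Apex reference):

```apex
// Hypothetical names. clone(false, true): drop the Id, deep-copy field values.
Child__c moved = original.clone(false, true);
moved.Parent__c = newParentId;   // point the copy at the new parent
insert moved;
delete original;                 // remove the record under the old parent
```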

 

Has anyone worked with roll-up summary fields that compute counts, count distincts, sums, mins, and maxes over an extremely large number of child records? If so, what's your experience with it? Does it perform reasonably well? Or does it take a long time to compute new values when a child record is touched?

 

Thanks very much for your input!

 

  • March 14, 2011
  • Like
  • 0

Update Collision Detection is described here:  http://wiki.developerforce.com/index.php/Summer07:_Update_Collision_Detection

 

Is this feature enabled by default in all Salesforce Editions? Is there any way to turn it off (if I ever wanted to)?

  • February 15, 2011
  • Like
  • 0

In several projects, in several orgs, I've scheduled some Batch Apex jobs to run nightly to process large numbers of records. I've run into a couple of problems that are leaving me very uncertain about whether Batch Apex really can handle large jobs.

Every now and then, a job will fail with this error: Unable to write to any of the ACS stores in the alloted time. I first encountered this in September 2010. I filed a Case and created a discussion group posting (http://boards.developerforce.com/t5/Apex-Code-Development/Unable-to-write-to-any-of-the-ACS-stores-in-the-alloted-time/m-p/205908#M36022). After a few weeks, I was finally told that it was an internal issue that had been resolved. After months of running nightly Batch Apex jobs without this problem, it just recurred.

A second issue is that, every now and then, a Batch Apex job gets stuck in the queue in the "Queued" state. When you launch a Batch Apex job, it gets added to the queue in the "Queued" state, and when the system gets around to executing it, the job gets moved to a "Processing" state. Well, I have batch jobs that have been stuck in the "Queued" state since early January. I've had cases open on this problem for over a month, and while the Case finally found its way to Tier 3 Support, there's still no sign of a resolution.

In both cases, the issue is NOT an Apex coding problem. It's an issue with how the platform is queueing and processing Batch Apex jobs.

I'm wondering whether anybody else has run into these problems, or other problems executing Batch Apex jobs. What problems have you run into? How have you resolved or worked around them?

Thanks for your insights.

 
  • February 04, 2011
  • Like
  • 0

I have a batch Apex job (A) that, upon completion, needs to launch another batch Apex job (B). Since you can't launch a batch Apex job from from a batch Apex job, job A schedules job B to run 2 seconds later. (JobB is both batchable and schedulable. JobB's schedulable execute() method launches the JobB batch job.)

 

Here's the code from job A's finish() method:

 

 

global void finish(Database.BatchableContext bc) {
    // Seconds Minutes Hours Day_of_month Month Day_of_week Optional_year
    String str = Datetime.now().addSeconds(2).format('s m H d M ?');
    JobB cls = new JobB();
    System.schedule('Job B', str, cls);
    return;
}

 

Most of the time, this works just fine. However, every now and then, instead of being scheduled to run 2 seconds later, JobB gets scheduled to run 1 year later. For example, I'll see a job in the Scheduled Jobs queue that has a Submitted time of "11/10/2010 5:00PM" and a Next Scheduled Run time of "11/10/2011 5:00PM."

 

I just changed my code to wait 10 seconds instead of just 2 seconds to schedule JobB, which will hopefully eliminate the problem. However, I'm left wondering why 2 seconds isn't sufficient. Any thoughts? Has anybody else run into this kind of problem?
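For readers on later API versions: the platform eventually allowed launching a batch directly from finish(), which sidesteps the scheduling round-trip and its timing gamble entirely (hedged; verify against your API version):

```apex
global void finish(Database.BatchableContext bc) {
    // Supported in later API versions; no System.schedule() needed.
    Database.executeBatch(new JobB());
}
```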

 

  • November 11, 2010
  • Like
  • 0

I have a managed package that includes a Protected Hierarchy-type Custom Setting named Config__c that contains configuration values. The following code fetches the Config__c record for the current user:

    public static Config__c QueryConfigData() {
        Config__c config = Config__c.getValues(UserInfo.getUserId());
        if (config == null) {
            // The current user has no value for this custom setting - create one
            config = new Config__c();
            config.SetupOwnerId = UserInfo.getUserId();
            config.Show__c = '-All-';
            insert config;
        }
        return config;
    }

 

The following code saves a given Config__c record that was fetched using the method above:

    public static void SaveConfigData(Config__c config) {
        update config;
    }

 

Both methods are called from a custom controller. I don't know whether it's relevant, but the custom controller is defined "without sharing." The controller calls the query method in an apex:page action method, and when the user clicks a Save button, it calls the save method.

I've tested this in the development org, running as both a System Admin and as a Standard User, and it works great.

However, when I install the managed package in another org, while it works when I'm running as a System Admin, when I'm running as a Standard User, the save method generates the error "An error occurred when processing your submitted information."

Because the custom setting is defined as Protected in a managed package, I can't see whether custom setting data is being created for the non-System Admin user. The error doesn't happen on the insert in the query method, but there seems to be a problem updating the custom setting in the save method.

Any ideas?

 

  • November 02, 2010
  • Like
  • 0

I know that I can find the number of API requests over the last 24 hours in the UI via Setup | Company Profile | Company Information. However, that doesn't quite give me everything I need:

 

  • Is there any way to monitor the API Call Count programmatically through Apex? For example, can I schedule a process to run every hour, in which some Apex code checks the count, compares it to the organization's 24-hour limit, and emails me if I'm at a certain percentage of the limit?

 

  • Once the 24-hour limit has been reached, is there any way to count the number of API calls that are attempted and rejected?

 

I know that I can get a report of the API call counts over the last 7 days via Reports | Administrative Reports | API Usage Last 7 Days. However, this report only summarizes the API call count per day. Is there any way to get a report at a finer level of granularity, like per hour?
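On the programmatic-monitoring bullet: a much later API version exposed org limits directly to Apex. A hedged sketch, assuming System.OrgLimits is available at your API version:

```apex
// Hedged: System.OrgLimits was added in a later API version; verify availability.
Map<String, System.OrgLimit> limits = System.OrgLimits.getMap();
System.OrgLimit api = limits.get('DailyApiRequests');
Decimal pctUsed = 100.0 * api.getValue() / api.getLimit();
if (pctUsed > 80) {
    // notify yourself, e.g. via Messaging.SingleEmailMessage,
    // from a scheduled class that runs hourly
}
```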

  • September 24, 2010
  • Like
  • 0

I have a batch job that runs in several production orgs. It has run fine, twice a day, for several weeks now. However, in a few of the orgs in which it runs, I've started getting the error "Unable to write to any of the ACS stores in the alloted time." What does this error mean, why am I getting it, and what can I do about it?

 

Thanks--

  • September 14, 2010
  • Like
  • 0


I'm working with a standard object that's used by a managed package. The managed package includes some software that runs on the package author's own server and that calls into Salesforce using the web services API, to create records for that standard object. Other Salesforce apps also create records for that standard object. I need to write a trigger that can detect if a record is being added by the managed package's code (i.e., as a result of an API call) or by some other means.

 

I have no control over the way the external server calls into Salesforce to create records for this object. I can't modify that software.

 

I have no control over the UI. Because it's a standard object, many apps can create records for it. I can't change all of those apps to make them set some kind of flag that says "I'm not creating this record through the API."

 

Given those limitations, is there any way a trigger can determine whether it's being called as a result of an API call?

 

I've tried various UserInfo methods, without much luck. I thought if I called UserInfo.getUiThemeDisplayed() from a trigger invoked as a result of an API call, it might tell me that there's no UI Theme being displayed, but it doesn't.  

UserInfo.getUserId() doesn't help me because the external server logs into Salesforce using the same login credentials that a browser user would.

  

Is there anything in UserInfo.getSessionId() that might be useful? 

 

Thanks for your help!
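For anyone finding this thread later: much newer API versions added a Request class that reports how the current transaction was invoked, which did not exist when this was asked. A hedged sketch (the object name is illustrative):

```apex
trigger DetectOrigin on Account (before insert) {
    // Quiddity distinguishes SOAP/REST API calls from UI and other contexts.
    Quiddity q = System.Request.getCurrent().getQuiddity();
    if (q == Quiddity.SOAP || q == Quiddity.REST) {
        System.debug('These records are being created via an API call');
    }
}
```

Note this still can't distinguish the managed package's API calls from any other API client using the same credentials.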

  • June 10, 2010
  • Like
  • 1

I'd like to create a validation rule that ensures that if a lookup field in my custom object is non-null, it refers to a record with a particular record type. I know how to write that validation rule using RecordTypeId:

 

    myfield__r.RecordTypeId = "012800000006Rz1"

 

But the custom object containing this rule will eventually be packaged into an App Exchange offering, so I can't rely on the Id always being the same. How can I write this rule to check the field's RecordType.Name?

 

I've tried the following, but I get "Field RecordType does not exist"

 

    myfield__r.RecordType.Name = "My RecordType Name"

 

Any ideas?
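One workaround that gets suggested (a sketch, under the assumption that a formula field on the referenced object can expose RecordType.Name, and that the validation rule can then read that cross-object custom field; Record_Type_Name__c is an illustrative name):

```
/* Formula (Text) field on the referenced object: */
RecordType.Name

/* Validation rule (error condition) on the custom object: */
NOT( ISBLANK( myfield__c ) ) &&
myfield__r.Record_Type_Name__c <> "My RecordType Name"
```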

 

  • April 20, 2009
  • Like
  • 1

The Apex Language Reference says the "Total number of classes that can be scheduled concurrently" is 25. 

 

The wording is a little difficult to parse. Which of the following (if any) does this mean?

 

  • I can have at most 25 Schedulable Apex jobs scheduled, even if they're scheduled to run at a wide variety of times that don't overlap
  • I can have at most 25 Schedulable Apex jobs actually running at any one point in time

Thanks!
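Either way, you can check how many jobs are currently scheduled with a quick query (a sketch; note that CronTrigger rows may include job types other than Schedulable Apex):

```apex
Integer scheduledCount = [SELECT COUNT() FROM CronTrigger];
System.debug('Currently scheduled jobs: ' + scheduledCount);
```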

 

Hi,

 

I am inserting rates on daily basis. For this, I have scheduled apex class using following code:

 

scheduledGenerateRates m = new scheduledGenerateRates(); 
cronId = system.schedule('Update Rate Schedule', '0 0 * * 1-12 ? *', m);

 After that, if some error occurs while inserting records, I want to delete the existing scheduled job. So, I have used the abortJob method to do this:

 

List<CronTrigger> cron = [SELECT id, CronExpression, TimesTriggered, NextFireTime FROM CronTrigger WHERE id = :scheduledJobId];
for(CronTrigger CT:cron)
{ 
   try
   {
       System.abortjob(CT.id); 
   }catch(Exception e1)
   {
       System.debug('--- Unable to delete scheduled job with Scheduled_Geocoder_Job_Id = ' + CT.id + ' because ' + e1.getMessage());
    }
}

 And in the debug log, I am getting:

Aggregations:0|SELECT id, CronExpression, TimesTriggered, NextFireTime FROM CronTrigger WHERE id = :scheduledJobId
07:00:03.778 (2778266000)|SOQL_EXECUTE_END|[170]|Rows:1
07:00:03.778 (2778504000)|SYSTEM_METHOD_ENTRY|[175]|System.abortJob(String)

 But the job still appears in the list under Setup > Monitoring > Scheduled Jobs.

 

Please suggest a way to delete the scheduled job from there.

 

Thanks.
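One thing worth double-checking (a sketch, not a diagnosis): the id passed to System.abortJob must be the CronTrigger id returned by System.schedule, and you can re-query CronTrigger afterwards to confirm the row is gone:

```apex
String cronId = System.schedule('Update Rate Schedule', '0 0 12 * * ?',
                                new scheduledGenerateRates());
System.abortJob(cronId);
// Once the abort has taken effect, this should return 0 rows.
Integer remaining = [SELECT COUNT() FROM CronTrigger WHERE Id = :cronId];
System.debug('Rows left for this job: ' + remaining);
```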

I have a managed package that defines an object whose Name is an Auto Number field. A customer who has the managed package wants to import some data from another org into his current org, and maintain the Auto Number values that he had in his original org, instead of having new Auto Number values assigned. 

 

I know you can ask Salesforce to allow you to use the Data Loader to write to the CreatedDate and LastModifiedDate fields. Will they also allow us to write to an Auto Number field?

 

(Before you suggest I temporarily change the field type to Text, please note that the field is defined in a managed package, so I can't change its field type.)

  • December 05, 2011
  • Like
  • 0

Ok all, here is one that I am running into trouble with. I have a trigger that is before insert, before update.

It validates the address via the postal codes, as well as ensuring a proper grouping for reporting using the County and State returned from the Zip lookup.
It works great; however, I need to accommodate passing 2,300+ accounts at one time.
Has anyone done bulk/batch processing in a before insert, before update trigger? I keep getting a recursive issue.
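The usual pattern for the recursion symptom is a static flag so the trigger body runs only once per transaction (a sketch; the class and trigger names are illustrative):

```apex
public class TriggerGuard {
    public static Boolean hasRun = false;
}

trigger AccountAddressCheck on Account (before insert, before update) {
    if (TriggerGuard.hasRun) return;
    TriggerGuard.hasRun = true;
    // ... bulk zip/county validation over Trigger.new ...
}
```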


Hi All,

 

I'm new here, so hopefully I don't say anything too silly/dumb.

 

I am having a problem with testing an undelete trigger I have written. I am getting a DMLException for de-referencing a null object when trying to test my undelete trigger. It may be that I am completely misunderstanding how testing of Apex undeletes should work...

 

My basic requirements are that I have an order, and I have a field on that order called Needs Attn.  I need to set this field to True whenever the order is updated or when it is undeleted.  

 

My Trigger in JGOrderUpdate.trigger:

 

trigger JGOrderUpdate on JG__Order__c (before update, after undelete) {
    for (JG__Order__c o : Trigger.new) {
        JG__Order__c old = Trigger.oldMap.get(o.Id);
        if (!old.Needs_Attn__c) o.Needs_Attn__c = true;
    }
}

 

My Test Code JGTriggerTests.cls :

 

    // create a order
    JG__Order__c o = new JG__Order__c();
    o.Needs_Attn__c = false;
    insert o;

    // this kicks off the before update trigger for orders
    update o;

     // Confirms Actual update tests for orders
     JG__Order__c test_o = [Select Id, Needs_Attn__c From JG__Order__c Where Id =: o.Id];
     System.assert(test_o.Needs_Attn__c);

     // now we set the order back to needing attention false
     o.Needs_Attn__c = false;
	
     // now we delete the order so we can undelete it to make sure when we undelete it actually
     // changes to Needs_Attn__c == True
     delete o;
	
     // this kicks off the undelete trigger code
     // HAVING THE PROBLEM ON THE LINE BELOW
     undelete o;

     // did the trigger run properly
     System.assert(test_o.Needs_Attn__c);

 

 

Here's the error i get on the line above with "undelete o;"

 

 

Description Resource Path Location Type
 System.DmlException: Undelete failed. 
 first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, JGOrderUpdate: execution of AfterUndelete
 caused by: System.NullPointerException: Attempt to de-reference a null object
 Trigger.JGOrderUpdate: line 3, column 38: [] JGTriggerTests.cls

 

 

 

Any help would be greatly appreciated!

 

Thanks
j
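For anyone hitting the same error: the likely cause (a hedged guess from the posted code) is that Trigger.oldMap is null in the after undelete context, so old.Needs_Attn__c dereferences a null object. In after undelete, Trigger.new records are also read-only, so the flag has to be set via a separate DML on copies. One way to restructure:

```apex
trigger JGOrderUpdate on JG__Order__c (before update, after undelete) {
    if (Trigger.isUpdate) {
        for (JG__Order__c o : Trigger.new) {
            JG__Order__c old = Trigger.oldMap.get(o.Id);
            if (!old.Needs_Attn__c) o.Needs_Attn__c = true;
        }
    } else if (Trigger.isUndelete) {
        // Trigger.oldMap is null here, and Trigger.new is read-only.
        List<JG__Order__c> toFlag = new List<JG__Order__c>();
        for (JG__Order__c o : Trigger.new) {
            toFlag.add(new JG__Order__c(Id = o.Id, Needs_Attn__c = true));
        }
        update toFlag;
    }
}
```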

I’m the developer of a package that has a heavy dependence on a scheduled Batch Apex job. The package currently runs in a dozen or so orgs, some of which have fairly large amounts of data. One org in particular has over 3 million records that are processed by the Batch Apex job.

 

Over the past 3 months, we've been encountering a lot of stability problems with Batch Apex. We've opened cases for several of these issues, and they've been escalated to Tier 3 Support, but it consistently takes 2 weeks or more to get a case escalated, and then it can take several more weeks to get a meaningful reply from Tier 3.

 

We really need to talk with the Product Manager responsible for Batch Apex. We asked Tier 3 to make that introduction, but they said they couldn’t. We’re trying to work with Sales to set up a discussion with a Product Manager, but so far, we haven’t had any luck there either. We’re hoping that a Product Manager might see this post and get in touch with us. We need to find out whether Batch Apex is a reliable-enough platform for our application.

 

Here are a few examples of the problems we’ve been having:

 

  • The batch job aborts in the start() method. Tier 3 Support told us that the batch job was occasionally timing out because its initial  query was too complex. We simplified the query (at this point, there are no WHERE or ORDER BY clauses), but we occasionally see timeouts or near timeouts. However, from what we can observe in the Debug Logs, actually executing the query (creating the QueryLocator) takes only a few seconds, but then it can take many minutes for the rest of the start() method to complete. This seems inconsistent with the “query is too complex” timeout scenario that Tier 3 support described.  (Case 04274732.)
  • We get the “Unable to write to ACS Stores” problem. We first saw this error last Fall, and once it was eventually fixed, Support assured us that the situation would be monitored so it couldn’t happen again. Then we saw it happen in January, and once it was eventually fixed, Support assured us (again) that the situation would be monitored so it couldn’t happen again. However, having seen this problem twice, we have no confidence that it won’t arise again. (Case 04788905.)
  • In one run of our job, we got errors that seemed to imply that the execute() method was being called multiple times concurrently. Is that possible? If so, (a) the documentation should say so, and (b) it seems odd that after over 6 months of running this batch job in a dozen different orgs, it suddenly became a problem.

 

  • We just got an error saying, “First error: SQLException [java.sql.SQLException: ORA-00028: your session has been killed. SQLException while executing plsql statement: {?=call cApiCursor.mark_used_auto(?)}(01g3000000HZSMW)] thrown but connection was canceled.” We aborted the job and ran it again, and the error didn’t happen again.
  • We recently got an error saying, “Unable to access query cursor data; too many cursors are in use.” We got the error at a time when the only process running on behalf of that user was the Batch Apex process itself. (Perhaps this is symptomatic of the “concurrent execution” issue, but if the platform is calling our execute() method multiple times at once, shouldn’t it manage cursor usage better?)
  • We have a second Batch Apex job that uses an Iterable rather than a QueryLocator. When Spring 11 was released, that Batch Apex job suddenly began to run without calling the execute() method even once. Apparently, some support for the way we were creating the Iterable changed, and even though we didn’t change the API version of our Apex class, that change caused our Batch Apex job to stop working. (Case 04788905.)
  • We just got a new error, "All attempts to execute message failed, message was put on dead message queue."

 

We really need to talk with a Product Manager responsible for Batch Apex. We need to determine whether Batch Apex is sufficiently stable and reliable for our needs. If not, we’ll have to find a more reliable platform, re-implement our package, and move our dozen or more customers off of Salesforce altogether.

 

If you’re responsible for Batch Apex or you know who is, please send me a private message so we can make contact. Thank you!

 

  • April 04, 2011
  • Like
  • 0

I know that I can find the number of API requests over the last 24 hours in the UI via Setup | Company Profile | Company Information. However, that doesn't quite give me everything I need:

 

  • Is there any way to monitor the API Call Count programmatically through Apex? For example, can I schedule a process to run every hour, in which some Apex code checks the count, compares it to the organization's 24-hour limit, and emails me if I'm at a certain percentage of the limit?

 

  • Once the 24-hour limit has been reached, is there any way to count the number of API calls that are attempted and rejected?

 

I know that I can get a report of the API call counts over the last 7 days via Reports | Administrative Reports | API Usage Last 7 Days. However, this report only summarizes the API call count per day. Is there any way to get a report at a finer level of granularity, like per hour?
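For later readers: the platform eventually added an OrgLimits class that exposes the API request count to Apex (it did not exist when this was asked). A sketch of the hourly check described above:

```apex
System.OrgLimit apiLimit = OrgLimits.getMap().get('DailyApiRequests');
Integer used = apiLimit.getValue();
Integer max  = apiLimit.getLimit();
if (used * 100 >= max * 80) {
    // e.g., send a notification email from a scheduled job
    System.debug('API usage is at 80% or more of the 24-hour limit');
}
```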

  • September 24, 2010
  • Like
  • 0

If I issue a SOQL query that includes aggregate functions, I get back a list of AggregateResult objects. From each of those, I can use ar.get('fieldname') to get the various values. But that assumes I know the names of the fields. If I do a Dynamic SOQL query from a VF page controller, where the query is based on input from the user, I may not know the field names ahead of time. How can I determine the field names to use when I want to access the contents of an AggregateResult object?

 

Having read that an AggregateResult is a special type of SObject, I tried using Dynamic Apex to discover the fields:

 

List<AggregateResult> lst = database.query('select count(id), grp__c from myobj__c group by grp__c');

Schema.SObjectType sot = lst.getSObjectType();
Schema.DescribeSObjectResult res = sot.getDescribe();
Set<String> setFields = res.fields.getMap().keyset();
System.debug('Fields are: ' + setFields);

But the only field I get back is "id."  

 

Thanks for your help!
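Two things that may help (grounded in documented AggregateResult behavior): unaliased aggregate expressions are exposed under the generated keys expr0, expr1, ... in query order, and you can assign your own aliases when you build the query string. A sketch:

```apex
List<AggregateResult> results = Database.query(
    'SELECT COUNT(Id) cnt, grp__c grp FROM myobj__c GROUP BY grp__c');
for (AggregateResult ar : results) {
    // Aliased names are predictable regardless of user input...
    System.debug('group=' + ar.get('grp') + ' count=' + ar.get('cnt'));
    // ...and without aliases, ar.get('expr0') would return the COUNT.
}
```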

  • July 28, 2010
  • Like
  • 0

Hi,

 

I am implementing a trigger on Accounts. If certain conditions are true, I need to display an alert message to the user.

 

I searched the discussion boards, but I didn't find anything regarding this.

 

Is there any way to achieve this?

 

If so, please give some examples.

 

Thanks,
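For reference, the usual server-side option is addError(), which blocks the save and surfaces the message in the UI (a minimal sketch; the field and condition are illustrative):

```apex
trigger AccountAlert on Account (before insert, before update) {
    for (Account a : Trigger.new) {
        if (a.AnnualRevenue != null && a.AnnualRevenue < 0) {
            a.addError('Annual Revenue cannot be negative.');
        }
    }
}
```

Note that addError prevents the record from being saved; a purely informational popup isn't possible from a trigger alone.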


I have a VF page that I'm embedding in a standard page layout. I'd like to pass a parameter into the VF page. The VF page's controller will look at the parameter and the current record, then decide what to display on the embedded page.

 

I have the VF page and controller written. But how can I get the standard page layout editor to pass a parameter in to the VF page? Is that even possible? If not, do you have any other suggestions?

 

Thanks!

  • May 08, 2010
  • Like
  • 0

I'd like to develop a batch file that can export data using the Data Loader command line. I'm starting with a simple process-conf.xml file:

 

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
  <bean id="account"
        class="com.salesforce.lexiloader.process.ProcessRunner"
        singleton="false">
    <description></description>
    <property name="name" value="csvAccount"/>
    <property name="configOverrideMap">
      <map>
        <entry key="sfdc.endpoint" value="https://www.salesforce.com"/>
        <entry key="sfdc.username" value="my_user_name"/>
        <entry key="sfdc.password" value="my_80-char_password_string"/>

        <entry key="sfdc.debugMessages" value="false"/>
        <entry key="sfdc.debugMessagesFile" value="C:\Export.log"/>
        <entry key="sfdc.timeoutSecs" value="120"/>
        <entry key="sfdc.loadBatchSize" value="200"/>

        <entry key="process.operation" value="extract"/>
        <entry key="dataAccess.type" value="csvWrite"/>
        <entry key="dataAccess.writeBatchSize" value="1"/>
        <entry key="sfdc.extractionRequestSize" value="1"/>
        <entry key="process.enableExtractSuccessOutput" value="true"/>

        <entry key="process.outputError" value="C:\ERRORaccount.csv"/>

        <entry key="dataAccess.name" value="C:\account.csv"/>
        <entry key="sfdc.entity" value="account"/>
        <entry key="sfdc.extractionSOQL" value="select id, phone from account"/>
      </map>
    </property>
  </bean>
</beans>

 

When I run this from the command line, I get a bunch of DEBUG and INFO output, followed by:

 

2704 [account] INFO com.salesforce.lexiloader.dao.DataAccessObjectFactory - Instantiating data access object: C:\account.csv of type: csvWrite
2704 [account] INFO com.salesforce.lexiloader.process.ProcessRunner - Checkingthe data access object connection
2704 [account] INFO com.salesforce.lexiloader.process.ProcessRunner - Setting field types
3500 [account] INFO com.salesforce.lexiloader.process.ProcessRunner - Setting object reference types
Exception in thread "main" java.lang.NullPointerException
at com.salesforce.lexiloader.client.PartnerClient.setFieldReferenceDescribes(PartnerClient.java:462)
at com.salesforce.lexiloader.controller.Controller.setReferenceDescribes(Controller.java:140)
at com.salesforce.lexiloader.process.ProcessRunner.run(ProcessRunner.java:129)
at com.salesforce.lexiloader.process.ProcessRunner.main(ProcessRunner.java:228)

 

 It's logging in (if I change the username, I get a different error), and it creates the account.csv file, but it's empty.

 

If I change the sfdc.entity property to "xxx," I get an error indicating that no such SObject exists.

 

If I change the sfdc.extractionSOQL property to select from "xxx" instead of from "account," I get the same null pointer exception.

 

I'm logging in as a System Administrator, so I doubt there's a permission problem.

 

Any ideas?

 

Thanks,

 

MJ.

  • September 04, 2009
  • Like
  • 0

I'm in the process of packaging some code that includes callouts to Google Maps for geocoding. The code works just fine in my development org, but since the package could be installed on any Salesforce instance, I can't always use a Google Maps API key that's based on my development org's instance. Fortunately, in Google Map in Visualforce Page, I found that I can use a key based on this URL:  http://force.com

 

This seems to work, except now I get G_GEO_TOO_MANY_QUERIES.

 

Is Google's "too many queries" limit based on all requests coming from http://force.com? That would make sense from Google's perspective, but from the perspective of a single org on a single Salesforce instance, it doesn't seem fair that every other org's requests should count against my limit.

 

Is there any way to create a Google Maps API key that's specific to my own org (or an org my package is installed in), so that only requests made from my org count against my limit?

 

For reference, see my post from late 2008: http://community.salesforce.com/sforce/board/message?board.id=apex&message.id=10535

 

Thanks--

  • May 01, 2009
  • Like
  • 0

I'd like to override the standard Save button for Account objects. I know I can't do that directly, so I thought I'd override the Account Edit page, going to a Visualforce page instead, and then override the standard save action in a controller extension.

 

So I created a new VF page that looks like this:

 

 

<apex:page standardController="Account" extensions="AccountEditController">
<apex:detail subject="{!id}" relatedList="true"/>
</apex:page>

Then I overrode the Account Edit button to go to this page.

 

My first problem is that this page displays in view mode, not edit mode. If I click the Edit button, it (of course) just loads the same page again. How do I get the <apex:detail> component to display in edit mode?

 

My second problem is overriding the standard Save action. I think I can do it with the following in my controller extension:


public class AccountEditController {
    private final Account acct;
    private ApexPages.StandardController stdController;

    public AccountEditController(ApexPages.StandardController stdController) {
        this.acct = (Account)stdController.getRecord();
        this.stdController = stdController;
    }

    public PageReference save() {
        // Put my own stuff here

        // Do the standard save action
        return this.stdController.save();
    }
}

 

Is that correct?

 

Thanks!

 

MJ.

  • April 16, 2009
  • Like
  • 0

I'd like to add an onChange event handler to the standard Contact page layout, so that when the user changes the Contact's first or last name, some JavaScript code checks to see whether there's already a Contact with that name. I'd like this code to fire when the user changes the Contact's name, and not wait until the user clicks Save.

 

I've read this, so I know how to create an S-control that includes HTML input tags for the Contact's first and last names, and I know how to add the onChange event handler for them. What I don't know is how to get the values of those fields saved as the Contact's first and last names when the user clicks Save.

 

If this can't be done in the standard page, I can write a Visualforce page. But if I go that route, can I use the <apex:detail> tag, or do I have to re-create the standard page layout manually, using individual inputField tags?

 

Thanks!

 

  • February 18, 2009
  • Like
  • 0
Hi,
 

We have created a time-based workflow (to trigger, say, 2 days before a scheduled date, a custom datetime field). After activating this workflow, we are not able to convert leads to accounts, and the system throws the following error message on the convert lead page:

Error: Unable to convert lead that is in use by workflow

Is there a way to fix this issue, as we need to convert the leads even if the workflow has not been triggered for a particular lead?

We appreciate all your responses.

Thanks