I have a custom object with a field of type Master-Detail(Account), and several currency fields on that custom object. On Account I have created several roll-up summary fields based on the fields of my custom object. I have about 100,000 records which I need to re-insert daily (we delete all of the previous day's entries).

What I'm looking for is this: is there a way to disable the roll-up summary recalculations until my Apex/SOAP code has finished all of the insertions, and then force the recalculation?
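For context, a minimal hypothetical sketch of the detail insert (Daily_Entry__c, Account__c, and Amount__c are invented placeholder names, not the real schema):

```apex
// Hypothetical detail record; Account__c is the Master-Detail field
// and Amount__c feeds one of the roll-up summary fields on Account.
Daily_Entry__c entry = new Daily_Entry__c(
    Account__c = [SELECT Id FROM Account LIMIT 1].Id,
    Amount__c  = 100.00
);
insert entry; // each insert batch recalculates the parent Account's
              // roll-up summaries; the same happens again on delete
```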

Thanks in advance.

Hi -

I'm doing some work with the Bulk API. I am able to get my CSV file uploaded and the job processes successfully. I am, however, having a problem obtaining the status of the job. Here is the code I am using, based on the provided example:

BatchInfo[] statusList = connection.getBatchInfoList(job.getId()).getBatchInfo();

I am passing it a valid job ID that has been successfully submitted. The error I am receiving is:

java.lang.IllegalArgumentException: input stream can not be null
at com.sforce.ws.parser.MXParser.setInput(MXParser.java:522)
at com.sforce.ws.parser.XmlInputStream.setInput(XmlInputStream.java:64)
at com.sforce.async.RestConnection.parseAndThrowException(RestConnection.java:112)
at com.sforce.async.RestConnection.doHttpGet(RestConnection.java:283)
at com.sforce.async.RestConnection.getBatchInfoList(RestConnection.java:190)

Any help would be greatly appreciated.

Thanks,

Jeff Podlogar
Hi -

I am looking for the fastest way possible to mass delete all records for a given custom object. I developed the very simple Batch Apex class shown below. The code works, but it performs very slowly: yesterday it took 40 minutes to delete 34,000 records, although I did not run into any governor limits. It seems that the execute() call is only receiving a few records at a time. Is there any way to coax it to process larger groups of records?

Any help would be greatly appreciated.
global class MassDeleteCustomObject implements Database.Batchable<SObject>
{
    global final String query = 'SELECT Id FROM <my custom object>';

    global Database.QueryLocator start(Database.BatchableContext BC)
    {
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<SObject> records)
    {
        delete records;
    }

    global void finish(Database.BatchableContext BC)
    {
    }
}
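For what it's worth, one knob to try: Database.executeBatch takes an optional second argument that sets how many records each execute() call receives; the default is 200 and the platform caps it at 2,000. A sketch of launching the class above with a larger scope:

```apex
// Launch the batch with an explicit scope size so each execute()
// call receives up to 2,000 records instead of the default 200.
Id jobId = Database.executeBatch(new MassDeleteCustomObject(), 2000);
```

If the deleted rows do not need to be recoverable, following the delete with Database.emptyRecycleBin(records) inside execute() also hard-deletes them and frees storage.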

Hi -

I need to do some special processing on our system when a contact gets merged into another contact.

Currently, I have no way of detecting via the API that a merge has taken place. We are notified of the deletion of the contact that did not survive the merge (via the getDeleted() call) and of an update to the surviving contact (via the getUpdated() call), but I can see no way of knowing that an actual merge occurred, nor can I associate the deleted contact's Id with the surviving contact's Id. Is there any way of doing this?
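One thing that may be worth checking (a sketch, not verified here): when records are merged, Salesforce sets the MasterRecordId field on the merge loser to the Id of the surviving record, and that field is visible on deleted rows via a queryAll()-style query (SOQL ALL ROWS in Apex):

```apex
// Sketch: contacts deleted by a merge carry the survivor's Id in
// MasterRecordId; contacts deleted normally leave it null, which
// distinguishes merges from ordinary deletes.
List<Contact> mergeLosers = [
    SELECT Id, MasterRecordId
    FROM Contact
    WHERE IsDeleted = true AND MasterRecordId != null
    ALL ROWS
];
for (Contact c : mergeLosers) {
    System.debug('Contact ' + c.Id + ' was merged into ' + c.MasterRecordId);
}
```

The same MasterRecordId field comes back on records returned by a queryAll() call through the SOAP API, so the mapping could also be built outside of Apex.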

Thanks!

Jeff Podlogar