
Apex Callout to External Data


Let me preface this by saying that while I am comfortable coding Apex for triggers and test classes, I have never coded scheduled Apex or Apex callouts.  I have an external database that I want to bring into SFDC on a nightly basis.  I would like to set up a scheduled batch Apex job that fires early in the morning and updates either a standard or custom object with data from the database.  Can anyone point me to resources where I can see sample code on how to accomplish this?  Thanks,
SFDC Hedgehog
This is kind of an open-ended question, but I don't think this is the best solution to your problem.....  hear me out....

Are joins / star queries / SQL rollups required at the remote end?  
If so, you might want to consider doing all that on the remote side and pushing the data via the bulk loader.  
This also raises the question of whether inserts are required - if so, doing an upsert from the remote side seems to make more sense to me.

An HTTP callout is best (in my experience) when there is an established API on a remote system that Salesforce needs to synchronize with - like updating sales info to or from an SFDC account as it happens - either from SFDC or the remote system.   Which may be what you're trying to do, but it has limitations...
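For reference, the callout itself is only a few lines of Apex. A minimal sketch - the endpoint URL here is a placeholder, and whatever you use would first need a matching Remote Site Setting:

```apex
// Minimal HTTP callout sketch. The endpoint is hypothetical and must be
// registered under Setup > Remote Site Settings before it can be called.
public class ExternalDataService {
    public static String fetchData() {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/api/records'); // placeholder endpoint
        req.setMethod('GET');
        req.setTimeout(60000); // timeout in milliseconds

        Http http = new Http();
        HttpResponse res = http.send(req);
        return res.getBody(); // e.g. a JSON payload to parse and upsert
    }
}
```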

Also - nothing can touch the bulk loader in terms of mass-update speed of records.  
What is the database at the remote end?  What kind of system does it reside on? 
Assuming it's a Windows server system, it's easy to set up a daily data extract to CSV and push it via the bulk loader.

If it's Apple Server, you can use the Lexiloader with a process.bat file that has the commands to load the CSV formed data.  
Or you can use Jitterbit to set up a scheduled load.
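For the scheduled-load route, a minimal sketch of what the nightly job could look like on Windows - the install path and the process name ("accountUpsert") are assumptions here, and the actual mapping and credentials would live in the Data Loader CLI's process-conf.xml:

```bat
REM nightly_load.bat - run each night by Windows Task Scheduler (or cron on
REM other platforms). Assumes the Data Loader CLI is installed and that
REM C:\dataloader\conf\process-conf.xml defines a process named "accountUpsert"
REM pointing at the nightly CSV extract.
cd C:\dataloader\bin
call process.bat "C:\dataloader\conf" accountUpsert
```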

How much data are you talking about?  
You might run hard into governor limits with the pull approach.

I know it's tempting to write a scheduled batch utility - and it might be correct for your usage - but in my personal experience that is a favored approach for specific internal data functions, like updating a field in a group of records entirely within Salesforce.  A typical use-case would be to, say, update all accounts with an "Account processed" status at the end of the business day if there was a task created for them... something like that.

But if you're convinced that scheduled batch Apex is the way to go - I just have the following things on hand to remind you about:

You are limited to 10 callouts per task execution.
You can't make a callout after DML in the same transaction (you'll get an "uncommitted work pending" error).
Scheduled jobs are not guaranteed to run at the exact time they are scheduled - just when resources become available.
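If you do go that route anyway, the skeleton is a Schedulable that kicks off a Batchable marked with Database.AllowsCallouts. A sketch - the query, object, and field names are placeholders, and each global class would live in its own class file:

```apex
// Schedulable entry point - kicks off the batch job.
global class NightlySync implements Schedulable {
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new NightlySyncBatch(), 200);
    }
}

// Batch job allowed to make callouts. Each execute() is its own
// transaction, so the callout happens before any DML in that scope.
global class NightlySyncBatch implements
        Database.Batchable<SObject>, Database.AllowsCallouts {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Placeholder query - pick the records you need to refresh.
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }

    global void execute(Database.BatchableContext bc, List<Account> scope) {
        // Callout(s) go here - remember the per-execution callout limit.
        // ...fetch the external data, map it onto the records in scope, then:
        update scope;
    }

    global void finish(Database.BatchableContext bc) {
        // e.g. send a completion email or chain another job.
    }
}
```

You'd schedule it with something like `System.schedule('Nightly sync', '0 0 3 * * ?', new NightlySync());` for a 3 AM run.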

Hi Kevin,

Thanks for the great feedback.  Truth is I'm not sure at this point what the data looks like or even the system it's on.  I just took on a new role with a company and one of the main priorities is going to be to try to meld custom data they collect in this external system with the corresponding Account/Contact records in SFDC.  I'm not sure of the frequency of updates, but I imagine it would be daily.  The time that these updates would be processed does not matter to me as much as automating it.  I have used the Data Loader and know it is very useful, but in this case I want the updates to be automatic so I won't need a resource to spend time each day updating the records with the Data Loader.  Assuming I do go the batch Apex route, do you know of any sites where I could see examples and get information about them?  Thanks.
SFDC Hedgehog

Oh - OK.    Just some more suggestions - I'm sure you know all these, but I'm just throwing another pair of eyeballs on your task:

If you want it to be automatic, I would suggest ditching the scheduled batch job.  Instead, either have a trigger that does a callout performing the CRUD operation with the record data, synchronizing your external system with Salesforce (push), or code a web service that synchronizes the external data with Salesforce and have the external system consume that resource.  This latter option would obviously mean working with a developer on the external system.     The JSON classes and utilities are good for this kind of use case.  There are also free third-party JSON (and XML) parsing classes out there that are even more lightweight and easier to use than the Salesforce ones.
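As a sketch of that web-service option - the URL mapping, object, and field names below are all placeholders, and it assumes an external-ID field (here `External_Id__c`) exists on Account so the upsert can match records:

```apex
// Custom REST resource the external system can POST JSON to.
// Example payload: [{"externalId":"A-100","name":"Acme Corp"}]
@RestResource(urlMapping='/externalSync/*')
global with sharing class ExternalSyncResource {

    // Shape of each incoming JSON record (placeholder fields).
    global class RecordPayload {
        public String externalId;
        public String name;
    }

    @HttpPost
    global static String doPost() {
        RestRequest req = RestContext.request;
        List<RecordPayload> payload = (List<RecordPayload>)
            JSON.deserialize(req.requestBody.toString(), List<RecordPayload>.class);

        List<Account> accts = new List<Account>();
        for (RecordPayload p : payload) {
            accts.add(new Account(Name = p.name, External_Id__c = p.externalId));
        }
        // Matches on the assumed external-ID field; inserts or updates as needed.
        upsert accts External_Id__c;
        return accts.size() + ' records upserted';
    }
}
```

The external system would then call `/services/apexrest/externalSync` with an OAuth token whenever it has changes to push.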

I still think a scheduled batch interface will run into governor limits.  
10 callouts per scheduled task (with no transaction calls for data integrity) sounds like you are very limited in what you can do - unless you pass HUGE amounts of JSON on each call - but even that may or may not be sufficient.   But obviously you know more about the particular situation than me.