Hi All,

Here is my business requirement; I have never worked with web services before.

1. We have specific objects and fields that Oracle needs to consume.
2. Oracle will consume the web service and should then be able to update a few fields in SFDC.

My assumptions:
Do we simply generate the WSDL and provide it to Oracle so they can consume the web service?
Or do we have to write a class for each object to expose the data (see the sketch below)?

Please help me understand how to achieve this.
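
If a class turns out to be necessary, a minimal Apex web service sketch might look like the following. This is only an illustration, not the actual requirement: the object and field names (Invoice__c, Oracle_Status__c) are hypothetical placeholders, and the WSDL generated from the class (Setup > Develop > Apex Classes > Generate WSDL) is what would be handed to the Oracle team.

global class OracleIntegrationService {
    // Hypothetical example: allows the external system to update a couple of
    // fields on a record it identifies by Id. Adjust object and field names.
    webservice static String updateRecord(Id recordId, String newStatus) {
        Invoice__c inv = [SELECT Id, Oracle_Status__c FROM Invoice__c WHERE Id = :recordId LIMIT 1];
        inv.Oracle_Status__c = newStatus;
        update inv;
        return 'SUCCESS';
    }
}

Alternatively, for simple create/update operations on standard and custom objects, the Enterprise or Partner WSDL can be generated and consumed directly, without writing a class per object.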

regards,
Ajay

Hi,

I would like to automate my data uploads into Salesforce without using Data Loader. Please let me know if there is another tool I can use to perform the upload automatically, or a script I could use to automate it.

Thanks in advance

I need to query an Oracle database from my Salesforce1 app. Can someone suggest a way to do this using Apex?
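
Apex cannot connect to an Oracle database directly; a common pattern is to expose the Oracle data through an HTTP/REST service (middleware) and call it from Apex. A minimal callout sketch, assuming a hypothetical endpoint https://middleware.example.com/oracle/query that has been added as a Remote Site in Setup:

public with sharing class OracleQueryClient {
    // Hypothetical callout to a REST service that fronts the Oracle database.
    public static String fetchRows(String queryKey) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://middleware.example.com/oracle/query?key=' + EncodingUtil.urlEncode(queryKey, 'UTF-8'));
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        // The response body (e.g. JSON) can then be parsed and displayed in Salesforce1.
        return res.getBody();
    }
}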

Hi everyone,

Any help with this problem would be greatly appreciated. I am currently trying to do a custom CSV import in order to automate the process of importing a CSV every day. I have a Visualforce page that accepts the filename of the CSV file and calls my Apex class to do the upload (both are below). The CSV parser is working fine after testing. The problem I am having is with grabbing the related object IDs for the import. The error I am getting is:

"System.ListException: List index out of bounds: 17
Error is in expression '{!ReadFile}' in page fotaupload: Class.FOTAuploader.ReadFile: line 116, column 1"

I think my problem is around the section with the comment "//create a list of the related zip code ids".

About my objects:

I have a Zip Code object that is a related list on my Activations object.

***********************************************************

FOTAuploader.cls

 

 

 

public with sharing class FOTAuploader {

    public string nameFile{get;set;}
    public Blob contentFile{get;set;}
    public Integer rowCount{get;set;}
    public Integer colCount{get;set;}
    List<Activations__c> actvstoupload;
    public Zip_Code_Master__c tempZip;

    // taken from stackoverflow.com/questions/10425925/how-to-parse-a-csv-in-salesforce
    public static List<List<String>> parseCSV(String contents, Boolean skipHeaders) {
        List<List<String>> allFields = new List<List<String>>();

        // replace instances where a double quote begins a field containing a comma
        // in this case you get a double quote followed by a doubled double quote
        // do this for beginning and end of a field
        contents = contents.replaceAll(',"""', ',"DBLQT').replaceall('""",', 'DBLQT",');
        // now replace all remaining double quotes - we do this so that we can reconstruct
        // fields with commas inside assuming they begin and end with a double quote
        contents = contents.replaceAll('""', 'DBLQT');
        // we are not attempting to handle fields with a newline inside of them
        // so, split on newline to get the spreadsheet rows
        List<String> lines = new List<String>();
        try {
            lines = contents.split('\n');
        } catch (System.ListException e) {
            System.debug('Limits exceeded?' + e.getMessage());
        }
        Integer num = 0;
        for (String line : lines) {
            // check for blank CSV lines (only commas)
            if (line.replaceAll(',', '').trim().length() == 0) break;

            List<String> fields = line.split(',');
            List<String> cleanFields = new List<String>();
            String compositeField;
            Boolean makeCompositeField = false;
            for (String field : fields) {
                if (field.startsWith('"') && field.endsWith('"')) {
                    cleanFields.add(field.replaceAll('DBLQT', '"'));
                } else if (field.startsWith('"')) {
                    makeCompositeField = true;
                    compositeField = field;
                } else if (field.endsWith('"')) {
                    compositeField += ',' + field;
                    cleanFields.add(compositeField.replaceAll('DBLQT', '"'));
                    makeCompositeField = false;
                } else if (makeCompositeField) {
                    compositeField += ',' + field;
                } else {
                    cleanFields.add(field.replaceAll('DBLQT', '"'));
                }
            }

            allFields.add(cleanFields);
        }
        if (skipHeaders) allFields.remove(0);
        return allFields;
    }

    public Pagereference ReadFile() {
        //create a restore point incase the upload fails it can back out everything.
        Savepoint sp = Database.setSavepoint();

        actvstoupload = new List<Activations__c>();
        List<List<String>> parsedCSV = new List<List<String>>();
        List<String> zips = new List<String>();

        //fill up the parsedCSV table
        rowCount = 0;
        colCount = 0;
        if (contentFile != null) {
            String fileString = contentFile.toString();
            parsedCSV = parseCSV(fileString, false);
            rowCount = parsedCSV.size();
            for (List<String> row : parsedCSV) {
                if (row.size() > colCount) {
                    colCount = row.size();
                }
            }
        }

        //create a list of the related zip code ids
        for (Integer i = 1; i < parsedCSV.size(); i++) {
            zips[i] = parsedCSV[i][6].replaceAll('\"', '');
        }
        List<Zip_Code_Master__c> zipList = [select id from Zip_Code_Master__c where name = :zips];

        for (Integer i = 1; i < parsedCSV.size(); i++) {
            Activations__c a = new Activations__c();

            a.Zip_Code_of_Activation__c = zipList[i].id;

            //process quantity field
            a.ActQty__c = Double.valueOf(parsedCSV[i][18].replaceAll('\"', ''));

            //process date field -- filter out the hour and minutes from the Date field.
            Date dT = date.parse(parsedCSV[i][0].replaceAll('\"', '').trim().substringBefore(' '));
            a.Date_entry__c = dT; //get date from visualforce page

            actvstoupload.add(a);
        }
        try {
            insert actvstoupload;
        } catch (Exception e) {
            Database.rollback(sp);
            ApexPages.Message errormsg = new ApexPages.Message(ApexPages.severity.ERROR, 'An error has occured. Please check the template or try again later');
            ApexPages.addMessage(errormsg);
        }
        return null;
    }

    public List<Activations__c> getuploadedActivations() {
        if (actvstoupload != NULL)
            if (actvstoupload.size() > 0)
                return actvstoupload;
            else
                return null;
        else
            return null;
    }
}

 

**********************************************************************************

 

FOTAupload.page

 

<apex:page sidebar="false" controller="FOTAuploader">
    <apex:form >
        <apex:sectionHeader title="Upload FOTA data from CSV file"/>
        <apex:pageMessages />
        <apex:pageBlock >
            <center>
                <apex:inputFile value="{!contentFile}" filename="{!nameFile}" />
                <apex:commandButton action="{!ReadFile}" value="Upload File" id="theButton" style="width:70px;"/>
                <br/> <br/>
                <font color="red"> <b>Note: Please use the standard .CSV file from the Retail Activations Portal.</b> </font>
            </center>

            <apex:pageBlockTable value="{!uploadedActivations}" var="actv" rendered="{!NOT(ISNULL(uploadedActivations))}">
                <apex:column headerValue="Actv Date">
                    <apex:outputField value="{!actv.Date_entry__c}"/>
                </apex:column>
                <apex:column headerValue="Zip">
                    <apex:outputField value="{!actv.Zip_Code_of_Activation__c}"/>
                </apex:column>
                <apex:column headerValue="Qty">
                    <apex:outputField value="{!actv.ActQty__c}"/>
                </apex:column>
                <apex:column headerValue="City-State">
                    <apex:outputField value="{!actv.City_State_Concatenation__c}"/>
                </apex:column>
            </apex:pageBlockTable>
        </apex:pageBlock>
    </apex:form>
</apex:page>
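
A sketch of how the related-ID lookup (the "//create a list of the related zip code ids" section) could be restructured, since the error points there: zips is indexed into while still empty, and zipList[i] assumes the SOQL query returns exactly one row per CSV line in the same order, which is not guaranteed. The rework below is only an illustration and assumes column 7 of the CSV holds the Zip_Code_Master__c Name:

// Collect the zip names first, query once, then look Ids up by name.
Set<String> zipNames = new Set<String>();
for (Integer i = 1; i < parsedCSV.size(); i++) {
    zipNames.add(parsedCSV[i][6].replaceAll('"', ''));
}

Map<String, Id> zipIdByName = new Map<String, Id>();
for (Zip_Code_Master__c z : [SELECT Id, Name FROM Zip_Code_Master__c WHERE Name IN :zipNames]) {
    zipIdByName.put(z.Name, z.Id);
}

for (Integer i = 1; i < parsedCSV.size(); i++) {
    Activations__c a = new Activations__c();
    String zipName = parsedCSV[i][6].replaceAll('"', '');
    // Only set the lookup when a matching zip record actually exists.
    if (zipIdByName.containsKey(zipName)) {
        a.Zip_Code_of_Activation__c = zipIdByName.get(zipName);
    }
    // ... remaining field assignments as in ReadFile() ...
    actvstoupload.add(a);
}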

 

Hi,

I just want to know: is there any way to get a file from an FTP server in Salesforce, or do I have to use a third-party tool?

Thanks in advance.

Ankit Arora

Blog | Facebook | Blog Page

I'm working on a migration routine for legacy data into Salesforce. I must export the data after it gets into Salesforce in order to link objects with the Salesforce internal IDs. The data exporter only allows one export every 48 hours, and I need to iterate more often than that. The scheduler only lets me do it once a week! I'm only testing with 30 records, so I'm not talking about a big export here. Is there any way around this limitation of the data exporter? Can I use the Force.com IDE?