TJMaxx • NEWBIE • Member since 2008
Can there be an exception to case escalation? For example, December 25 is a holiday and falls on a weekday (Tuesday).
Can we set up a case escalation exception for that day?
  • November 04, 2008
Is there a feature available in Eclipse to migrate Reports from Sandbox to Production?
What are the steps if it is available?
  • October 28, 2008
We have an existing instance of Web-to-Case set up, but it is used on the intranet.
We plan to set up a similar instance for the internet.

The default Origin has not been set to Web; instead we are using the name Suggestion Box.
Is it possible to set up a new instance?

Regards
TJ

Hi there,

 

I tried doing a Bulk API upsert with the following job definition:

 

  <?xml version="1.0" encoding="UTF-8" ?>
  <jobInfo xmlns="http://www.salesforce.com/2006/04/async/dataload">
    <operation>upsert</operation>
    <object>Contact</object>
    <externalIdFieldName>External_Id__c</externalIdFieldName>
    <contentType>XML</contentType>
  </jobInfo>
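
The batch I then add to the job is shaped roughly like this (the field values here are made up for illustration; the point is that two rows deliberately share the same external id):

  <?xml version="1.0" encoding="UTF-8" ?>
  <!-- illustrative batch only: both sObject entries use the same External_Id__c value -->
  <sObjects xmlns="http://www.salesforce.com/2006/04/async/dataload">
    <sObject>
      <External_Id__c>test@test.com</External_Id__c>
      <LastName>Smith</LastName>
    </sObject>
    <sObject>
      <External_Id__c>test@test.com</External_Id__c>
      <Title>Updated Title</Title>
    </sObject>
  </sObjects>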

 

This works fine except when there are duplicate external ids in the data, in which case it simply ignores subsequent entries, and I get this error:

...

  <errors>
    <fields>External_Id__c</fields>
    <message>Duplicate external id specified: test@test.com </message>
    <statusCode>DUPLICATE_VALUE</statusCode>
  </errors>
  <success>false</success>
  <created>false</created>
  </result>
 
The External_Id__c field is set to unique, required and external id.
 
I thought the first record would be treated as an insert (which it is) and the subsequent records as updates, but instead they are ignored. How can I get the subsequent records in the batch to update the inserted contact?
 
thanks,
Dan.
 
  • March 22, 2011

I am using CLIq to build a CLI Data Loader job.  I keep getting exceptions telling me that the field mappings are invalid for the fields in the CSV that should not be mapped.  I get the following exception...

 

 

Exception occured during loading
com.salesforce.dataloader.exception.MappingInitializationException: Field mapping is invalid: Street2 => 


 

 

For some reason, it does not like the fields that aren't mapped.  I am really confused why this is happening, because the SDL file works in the Data Loader GUI but not in the CLI.
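
The entries it complains about are the ones I left with a blank right-hand side in the SDL further down, for example:

# unmapped CSV columns (blank right-hand side) -- these are the ones the CLI rejects
Street2=
MiddleName=
SearchKey=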

 

Here is my process-conf file...

 

 

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
	<bean id="paLeadInsert" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
		<description>Created by Dataloader Cliq.</description>
		<property name="name" value="paLeadInsert"/>
		<property name="configOverrideMap">
			<map>
				<entry key="dataAccess.name" value="C:\dataloader\cliq_process\paLeadInsert\read\paLeadInsert.csv"/>
				<entry key="dataAccess.readUTF8" value="true"/>
				<entry key="dataAccess.type" value="csvRead"/>
				<entry key="dataAccess.writeUTF8" value="true"/>
				<entry key="process.enableExtractSuccessOutput" value="true"/>
				<entry key="process.enableLastRunOutput" value="true"/>
				<entry key="process.lastRunOutputDirectory" value="C:\dataloader\cliq_process\paLeadInsert\log"/>
				<entry key="process.mappingFile" value="C:\dataloader\cliq_process\paLeadInsert\config\paLeadInsert.sdl"/>
				<entry key="process.operation" value="insert"/>
				<entry key="process.statusOutputDirectory" value="C:\dataloader\cliq_process\paLeadInsert\log"/>
				<entry key="sfdc.bulkApiCheckStatusInterval" value="5000"/>
				<entry key="sfdc.bulkApiSerialMode" value="5000"/>
				<entry key="sfdc.debugMessages" value="false"/>
				<entry key="sfdc.enableRetries" value="true"/>
				<entry key="sfdc.endpoint" value="https://test.salesforce.com/services/Soap/u/21.0"/>
				<entry key="sfdc.entity" value="Lead"/>
				<entry key="sfdc.extractionRequestSize" value="500"/>
				<entry key="sfdc.insertNulls" value="false"/>
				<entry key="sfdc.loadBatchSize" value="100"/>
				<entry key="sfdc.maxRetries" value="3"/>
				<entry key="sfdc.minRetrySleepSecs" value="2"/>
				<entry key="sfdc.noCompression" value="false"/>
				<entry key="sfdc.password" value="ENCRYPTEDPASSWORD"/>
				<entry key="sfdc.timeoutSecs" value="60"/>
				<entry key="sfdc.useBulkApi" value="false"/>
				<entry key="sfdc.username" value="username@test.com"/>
			</map>
		</property>
	</bean>
</beans>

 

And here is my SDL file....

 

 

# SDL Mapping File
SearchKey=
BorrowerBirthDate=Borrower_Birthdate__c
MiddleName=
FirstName=FirstName
LoanPurpose=Purpose__c
EmailHId=
HashedSSN=
Zip=PostalCode
LeadSystemId=
City=City
Email=Email
OwnerID=
Street2=
WorkPhone=Work_Phone__c
Street1=Street
CoBorrowerHomePhone=Co_Borrower_Home_Phone__c
Stat60Dt=
CoBorrowerBirthDate=Co_Borrower_Birthdate__c
CellularPhone=Mobile_Phone__c
County=County__c
Suffix=
CLTV=
CoBorrowerMaritalStatus=Co_Borrower_Marital_Status__c
LoanNumber=Loan_Number__c
WorkPhoneExtension=
CoBorrowerLastName=Co_Borrower_Last__c
GCId=GCId__c
LoanAmount=Loan_Amount__c
FICO=
Comments=Description
CoBorrowerFirstName=Co_Borrower_First__c
DOB=
BorrowerMaritalStatus=Marital_Status__c
Zip4=
HomePhone=Phone
State=State
CoBorrowerWorkPhone=Co_Borrower_Work_Phone__c
InterestRate=
LeadTypeCode=Lead_Type__c
EncryptedSSN=
LTV=
CoBorrowerMiddleName=
StatusId=
LastName=LastName
CoBorrowerCellPhone=Co_Borrower_Mobile_Phone__c
BankerName=Banker_Name__c

 

Anyone know how to fix this?

 

 

Hello,

 

I have a SQL Server with a database that is a backup of the data from a live website.

 

I want to get data from the SQL Server on demand. I have read the Data Loader documentation, but I am not sure how I should implement this:

 

 

http://wiki.developerforce.com/index.php/Using_Data_Loader_from_the_command_line
http://www.developerforce.com/media/Cheatsheet_Setting_Up_Automated_Data_Loader_9_0.pdf
https://na1.salesforce.com/help/doc/en/salesforce_data_loader.pdf

 

This is what I want to do:

 

1. Add a custom button on the Account page.

2. On click, pass the Account Id to a process that queries a single row from the SQL Server on the network.

3. If a matching account is found, push the data to a custom object in Salesforce.

4. Compare the values on the Salesforce Account with the new record created from the backup db.

5. Generate a mail merge document of the changed fields.

 

I do not know how to implement step #2 and would appreciate any suggestions.
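
From the database access example in the Data Loader command-line guide linked above, I imagine step #2 looking roughly like the database-conf.xml sketch below (the bean classes are taken from that guide's example; the connection details, table, and column names are made-up placeholders, and how to substitute the clicked Account Id at run time is exactly the part I am unsure about):

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
	<!-- JDBC connection to the backup SQL Server database (placeholder values) -->
	<bean id="backupDbDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
		<property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
		<property name="url" value="jdbc:sqlserver://dbhost:1433;databaseName=BackupDb"/>
		<property name="username" value="dbuser"/>
		<property name="password" value="dbpassword"/>
	</bean>

	<!-- data access bean that process-conf.xml would point at via dataAccess.type=databaseRead and dataAccess.name -->
	<bean id="queryAccountBackup" class="com.salesforce.dataloader.dao.database.DatabaseConfig" singleton="true">
		<property name="sqlConfig" ref="queryAccountBackupSql"/>
		<property name="dataSource" ref="backupDbDataSource"/>
	</bean>

	<!-- single-row query keyed by the Salesforce Account Id; the Id below is hard-coded as a placeholder -->
	<bean id="queryAccountBackupSql" class="com.salesforce.dataloader.dao.database.SqlConfig" singleton="true">
		<property name="sqlString">
			<value>SELECT sf_account_id, name, phone FROM account_backup WHERE sf_account_id = '001000000000001'</value>
		</property>
		<property name="columnNames">
			<list>
				<value>sf_account_id</value>
				<value>name</value>
				<value>phone</value>
			</list>
		</property>
	</bean>
</beans>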

 

Thanks for your time.

 

Hi, I want to export a Salesforce object from one Salesforce org to another. How can I achieve this?
 
 
Thanks
Anand.