Apex Data Loader Command Line Issue - Field Mapping Exception
I am using CLIq to build a CLI Data Loader job. I keep getting exceptions telling me that the field mappings are invalid for the fields in the CSV that should not be mapped. I get the following exception:
Exception occured during loading: com.salesforce.dataloader.exception.MappingInitializationException: Field mapping is invalid: Street2 =>
For some reason, it does not like the fields that aren't mapped. I am really confused about why this is happening, because the SDL file works in the Data Loader GUI but not in the CLI.
Here is my process-conf file...
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="paLeadInsert" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
        <description>Created by Dataloader Cliq.</description>
        <property name="name" value="paLeadInsert"/>
        <property name="configOverrideMap">
            <map>
                <entry key="dataAccess.name" value="C:\dataloader\cliq_process\paLeadInsert\read\paLeadInsert.csv"/>
                <entry key="dataAccess.readUTF8" value="true"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="dataAccess.writeUTF8" value="true"/>
                <entry key="process.enableExtractSuccessOutput" value="true"/>
                <entry key="process.enableLastRunOutput" value="true"/>
                <entry key="process.lastRunOutputDirectory" value="C:\dataloader\cliq_process\paLeadInsert\log"/>
                <entry key="process.mappingFile" value="C:\dataloader\cliq_process\paLeadInsert\config\paLeadInsert.sdl"/>
                <entry key="process.operation" value="insert"/>
                <entry key="process.statusOutputDirectory" value="C:\dataloader\cliq_process\paLeadInsert\log"/>
                <entry key="sfdc.bulkApiCheckStatusInterval" value="5000"/>
                <entry key="sfdc.bulkApiSerialMode" value="5000"/>
                <entry key="sfdc.debugMessages" value="false"/>
                <entry key="sfdc.enableRetries" value="true"/>
                <entry key="sfdc.endpoint" value="https://test.salesforce.com/services/Soap/u/21.0"/>
                <entry key="sfdc.entity" value="Lead"/>
                <entry key="sfdc.extractionRequestSize" value="500"/>
                <entry key="sfdc.insertNulls" value="false"/>
                <entry key="sfdc.loadBatchSize" value="100"/>
                <entry key="sfdc.maxRetries" value="3"/>
                <entry key="sfdc.minRetrySleepSecs" value="2"/>
                <entry key="sfdc.noCompression" value="false"/>
                <entry key="sfdc.password" value="ENCRYPTEDPASSWORD"/>
                <entry key="sfdc.timeoutSecs" value="60"/>
                <entry key="sfdc.useBulkApi" value="false"/>
                <entry key="sfdc.username" value="username@test.com"/>
            </map>
        </property>
    </bean>
</beans>
And here is my SDL file....
# SDL Mapping File
SearchKey=
BorrowerBirthDate=Borrower_Birthdate__c
MiddleName=
FirstName=FirstName
LoanPurpose=Purpose__c
EmailHId=
HashedSSN=
Zip=PostalCode
LeadSystemId=
City=City
Email=Email
OwnerID=
Street2=
WorkPhone=Work_Phone__c
Street1=Street
CoBorrowerHomePhone=Co_Borrower_Home_Phone__c
Stat60Dt=
CoBorrowerBirthDate=Co_Borrower_Birthdate__c
CellularPhone=Mobile_Phone__c
County=County__c
Suffix=
CLTV=
CoBorrowerMaritalStatus=Co_Borrower_Marital_Status__c
LoanNumber=Loan_Number__c
WorkPhoneExtension=
CoBorrowerLastName=Co_Borrower_Last__c
GCId=GCId__c
LoanAmount=Loan_Amount__c
FICO=
Comments=Description
CoBorrowerFirstName=Co_Borrower_First__c
DOB=
BorrowerMaritalStatus=Marital_Status__c
Zip4=
HomePhone=Phone
State=State
CoBorrowerWorkPhone=Co_Borrower_Work_Phone__c
InterestRate=
LeadTypeCode=Lead_Type__c
EncryptedSSN=
LTV=
CoBorrowerMiddleName=
StatusId=
LastName=LastName
CoBorrowerCellPhone=Co_Borrower_Mobile_Phone__c
BankerName=Banker_Name__c
Anyone know how to fix this?
What happens when you remove the non-mapped field references from the SDL file?
~ Clint
Tried that, but then I get an exception that says those fields aren't mapped. I think you have to have all of your CSV column headers in the SDL file.
That's right.
I've bumped into that issue before and ended up removing the unmapped fields from my source CSV file. Not sure if your particular case would allow you to do the same.
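If it helps, here is a small, hypothetical Python sketch of that workaround: it parses the SDL file to find which source columns actually map to a Salesforce field (non-empty right-hand side) and rewrites the CSV keeping only those columns. The function and file names are just illustrative, not part of Data Loader itself.

```python
import csv
import io

def mapped_columns(sdl_text):
    """Return the set of source columns in an SDL mapping file that
    actually map to a Salesforce field (non-empty right-hand side)."""
    mapped = set()
    for line in sdl_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        source, _, target = line.partition("=")
        if target.strip():
            mapped.add(source.strip())
    return mapped

def filter_csv(csv_text, keep):
    """Rewrite CSV text, keeping only the columns named in `keep`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    fields = [f for f in reader.fieldnames if f in keep]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for row in reader:
        writer.writerow({f: row[f] for f in fields})
    return out.getvalue()
```

You would then point `dataAccess.name` in process-conf.xml at the filtered CSV instead of the original.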
This is a known issue in Apex Data Loader 21. Try downloading and reinstalling the latest build, or use an earlier version.
Great, that makes me feel better. Will 20.0 work?
Yes, no issues with 20.
I am trying to map an nvarchar(255) field to the Opportunity Description using DL 20. I am getting the same error message.
Any thoughts?
I'm having the same issue and I'm willing to try the older Data Loader -- so, bone-headed question -- where can I find the older versions of the Data Loader these days?
I checked SourceForge, and they have only the Java portion (and version 19, at that). I'd like to have the Windows installer version that packages the JVM, as I am installing this on a server and would rather have the fully encapsulated package.
Thanks,
Dave
"This is a known issue in Apex Data Loader 21"... Salesforce currently offers version 23... and the bug is still there... not very serious of them...
I have found this site to have the Apex Data Loader archives. Version 20 worked great for me.
http://www.cloudsuccess.com/resource-centre/apex-data-loader-archive/
I ran into this problem today, and I noticed I had to remove the whitespace in my SDL file after the __c.
Sample:
Support_Level__c=Support_Level__c<I had whitespace here>
Once I removed the whitespace, I was good to go.
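Since that trailing whitespace is invisible in most editors, here is a hypothetical little Python check (the function name is my own invention) that flags any SDL mapping line ending in stray spaces or tabs:

```python
def find_trailing_whitespace(sdl_text):
    """Return (line_number, line) pairs for SDL lines that end in
    stray spaces or tabs -- the invisible cause of the
    'Field mapping is invalid' error discussed in this thread."""
    bad = []
    for n, line in enumerate(sdl_text.splitlines(), start=1):
        # ignore lines that are nothing but whitespace
        if line.strip() and line != line.rstrip():
            bad.append((n, line))
    return bad
```

Running it over the SDL file before kicking off the CLI job would catch this class of mapping error up front.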
Great!!! Worked for me too.
Thanks :)
This post is kind of old, but I encountered this problem just now.
For some reason, it worked when I matched the column headings in the CSV file with the field labels in my SFDC org.