paulcrawford

Problems with command line version of Data Loader 14.0

Hi,

I am having problems with the CLI on Apex Data Loader 14.0. In general, I can get it to work but there are certain issues. I have no problem with Insert and Update but I cannot get Upsert to work properly when it has to do an Insert nor can I get Extract to work properly. The Upsert problem is not so critical as I can always do Insert and Update separately but Extract is critical.

The following is the CLI output from running this batch file:

Batch file:

@echo off
c:
cd C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin
process ..\conf accountExtract


Output file (login info deleted):

2009-01-16 12:47:23,912 INFO [main] controller.Controller initLog (Controller.java:388) - The log has been initialized
2009-01-16 12:47:23,922 INFO [main] process.ProcessConfig getBeanFactory (ProcessConfig.java:78) - Loading process configuration from config file: C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\process-conf.xml
2009-01-16 12:47:23,972 INFO [main] xml.XmlBeanDefinitionReader loadBeanDefinitions (XmlBeanDefinitionReader.java:163) - Loading XML bean definitions from file [C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\process-conf.xml]
2009-01-16 12:47:24,022 INFO [main] core.CollectionFactory (CollectionFactory.java:66) - JDK 1.4+ collections available
2009-01-16 12:47:24,032 INFO [main] core.CollectionFactory (CollectionFactory.java:71) - Commons Collections 3.x available
2009-01-16 12:47:24,965 INFO [analysisExtract] controller.Controller initConfig (Controller.java:343) - The controller config has been initialized
2009-01-16 12:47:24,975 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:102) - Initializing process engine
2009-01-16 12:47:24,975 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:105) - Loading parameters
2009-01-16 12:47:24,975 INFO [analysisExtract] config.LastRun load (LastRun.java:101) - Last run info will be saved in file: C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\analysisExtract_lastRun.properties
2009-01-16 12:47:24,975 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:114) - Logging in to: https://www.salesforce.com
2009-01-16 12:47:24,985 INFO [analysisExtract] client.PartnerClient connectImpl (PartnerClient.java:156) - Beginning Partner Salesforce login ....
2009-01-16 12:47:25,006 INFO [analysisExtract] client.PartnerClient connectImpl (PartnerClient.java:165) - Salesforce login to https://www.salesforce.com/services/Soap/u/14.0 as user xxxxxxxxxxxx
2009-01-16 12:47:26,009 INFO [analysisExtract] dao.DataAccessObjectFactory getDaoInstance (DataAccessObjectFactory.java:51) - Instantiating data access object: C:\Program Files\salesforce.com\Apex Data Loader 14.0\conf\analysisExtract.csv of type: csvWrite
2009-01-16 12:47:26,009 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:119) - Checking the data access object connection
2009-01-16 12:47:26,009 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:124) - Setting field types
2009-01-16 12:47:26,822 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:128) - Setting object reference types
2009-01-16 12:47:30,715 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:132) - Creating Map
2009-01-16 12:47:30,715 INFO [analysisExtract] action.ActionFactory getActionInstance (ActionFactory.java:64) - Instantiating operation: extract
2009-01-16 12:47:30,715 INFO [analysisExtract] controller.Controller executeAction (Controller.java:125) - executing operation: extract
2009-01-16 12:47:31,136 INFO [analysisExtract] progress.NihilistProgressAdapter setSubTask (NihilistProgressAdapter.java:68) - Processed 500 of 621 total records. Rate: 0 records per hour. Estimated time to complete: 0 minutes and 0 seconds. There are 500 successes and 0 errors.
2009-01-16 12:47:31,337 INFO [analysisExtract] progress.NihilistProgressAdapter setSubTask (NihilistProgressAdapter.java:68) - Processed 621 of 621 total records. Rate: 11122000 records per hour. Estimated time to complete: 0 minutes and 0 seconds. There are 621 successes and 0 errors.
2009-01-16 12:47:31,337 INFO [analysisExtract] progress.NihilistProgressAdapter doneSuccess (NihilistProgressAdapter.java:55) - The extract has fully completed. There were 621 successful extracts and 0 errors.


Although it says it was successful, when you open the file analysisExtract.csv you get:

"ID"
""
""
""
""
""

with 621 empty strings (not just 5 as shown above).

Here is the bean that runs from process-conf.xml (login info deleted):



<bean id="analysisExtract"
      class="com.salesforce.lexiloader.process.ProcessRunner"
      singleton="false">
    <description>analysisExtract downloads Analysis data from salesforce using 'extract' and puts them into a csv file.</description>
    ...
</bean>

So apparently the code ran correctly (no errors) and generated a csv file with 621 entries as expected, but there was no data in any of them. Also, what happened to the other fields requested in the SOQL call, i.e. Account__c, Name, and Sampling_Date__c?

If I remove the "Id" field from the SOQL call then I get an error as in the following output file:

2009-01-16 13:00:58,796 INFO [main] controller.Controller initLog (Controller.java:388) - The log has been initialized
2009-01-16 13:00:58,806 INFO [main] process.ProcessConfig getBeanFactory (ProcessConfig.java:78) - Loading process configuration from config file: C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\process-conf.xml
2009-01-16 13:00:58,846 INFO [main] xml.XmlBeanDefinitionReader loadBeanDefinitions (XmlBeanDefinitionReader.java:163) - Loading XML bean definitions from file [C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\process-conf.xml]
2009-01-16 13:00:58,906 INFO [main] core.CollectionFactory (CollectionFactory.java:66) - JDK 1.4+ collections available
2009-01-16 13:00:58,906 INFO [main] core.CollectionFactory (CollectionFactory.java:71) - Commons Collections 3.x available
2009-01-16 13:00:59,829 INFO [analysisExtract] controller.Controller initConfig (Controller.java:343) - The controller config has been initialized
2009-01-16 13:00:59,839 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:102) - Initializing process engine
2009-01-16 13:00:59,839 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:105) - Loading parameters
2009-01-16 13:00:59,839 INFO [analysisExtract] config.LastRun load (LastRun.java:101) - Last run info will be saved in file: C:\Program Files\salesforce.com\Apex Data Loader 14.0\bin\..\conf\analysisExtract_lastRun.properties
2009-01-16 13:00:59,839 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:114) - Logging in to: https://www.salesforce.com
2009-01-16 13:00:59,849 INFO [analysisExtract] client.PartnerClient connectImpl (PartnerClient.java:156) - Beginning Partner Salesforce login ....
2009-01-16 13:00:59,869 INFO [analysisExtract] client.PartnerClient connectImpl (PartnerClient.java:165) - Salesforce login to https://www.salesforce.com/services/Soap/u/14.0 as user xxxxxxxxxxx
2009-01-16 13:01:01,093 INFO [analysisExtract] dao.DataAccessObjectFactory getDaoInstance (DataAccessObjectFactory.java:51) - Instantiating data access object: C:\Program Files\salesforce.com\Apex Data Loader 14.0\conf\analysisExtract.csv of type: csvWrite
2009-01-16 13:01:01,093 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:119) - Checking the data access object connection
2009-01-16 13:01:01,093 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:124) - Setting field types
2009-01-16 13:01:01,996 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:128) - Setting object reference types
2009-01-16 13:01:05,909 INFO [analysisExtract] process.ProcessRunner run (ProcessRunner.java:132) - Creating Map
2009-01-16 13:01:05,909 INFO [analysisExtract] action.ActionFactory getActionInstance (ActionFactory.java:64) - Instantiating operation: extract
2009-01-16 13:01:05,909 INFO [analysisExtract] controller.Controller executeAction (Controller.java:125) - executing operation: extract
2009-01-16 13:01:05,909 ERROR [analysisExtract] csv.CSVFileWriter setColumnNames (CSVFileWriter.java:239) - Error opening CSV file for writing: header row (with column names) has to be provided
2009-01-16 13:01:05,919 ERROR [analysisExtract] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:51) - Error opening CSV file for writing: header row (with column names) has to be provided

See http://forums.sforce.com/sforce/board/message?board.id=scontrols&message.id=1930&query.id=22853#M1930
for a similar but different problem with Data Loader 12.0.

Can anyone shed some light on this?

If we can solve this one I would like to attack the problem with the Upsert failing on Insert as well.

Thanks for your help.

Paul Crawford
gtuerk

Can you post your process-conf.xml? I've seen problems with the Data Loader output .csv when you don't have the fields named correctly in your .sdl mapping file.

 

<entry key="sfdc.extractionSOQL" value="Select Id, Name, Order_Stage__c from Service_Order__c where Order_Stage__c = 'Revenue Commitment'"/>
<entry key="process.operation" value="extract"/>
<entry key="process.mappingFile" value="c:\Program Files\salesforce.com\Apex Data Loader 14.0\samples\conf\soExtractMap.sdl"/>
<entry key="dataAccess.type" value="csvWrite"/>
<entry key="dataAccess.name" value="c:\temp\serviceOrders.csv"/>
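For reference, entries like these live inside a ProcessRunner bean in process-conf.xml. A minimal sketch of the surrounding bean (bean id, credentials, and endpoint are placeholders; the class name is the one posted earlier in this thread for v14.0):

```xml
<bean id="soExtract"
      class="com.salesforce.lexiloader.process.ProcessRunner"
      singleton="false">
    <property name="name" value="soExtract"/>
    <property name="configOverrideMap">
        <map>
            <entry key="sfdc.endpoint" value="https://www.salesforce.com"/>
            <entry key="sfdc.username" value="user@example.com"/>
            <entry key="sfdc.password" value="encrypted_password_here"/>
            <entry key="sfdc.entity" value="Service_Order__c"/>
            <entry key="sfdc.extractionSOQL" value="Select Id, Name, Order_Stage__c from Service_Order__c where Order_Stage__c = 'Revenue Commitment'"/>
            <entry key="process.operation" value="extract"/>
            <entry key="process.mappingFile" value="c:\Program Files\salesforce.com\Apex Data Loader 14.0\samples\conf\soExtractMap.sdl"/>
            <entry key="dataAccess.type" value="csvWrite"/>
            <entry key="dataAccess.name" value="c:\temp\serviceOrders.csv"/>
        </map>
    </property>
</bean>
```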

 

#Mapping values
#Mon Jan 23 17:07:24 PST 2006
Bandwidth__c=BANDWIDTH__C
Due_Date__c=DUE_DATE__C
Name=NAME
Id=ID
Account__c=ACCOUNT__C
Order_Stage__c=ORDER_STAGE__C
 

paulcrawford

Thanks for the suggestion. Actually, I had already figured it out for myself a few weeks ago. I was led a bit astray by the documentation in:

 

 http://www.apexdevnet.com/media/Cheatsheet_Setting_Up_Automated_Data_Loader_9_0.pdf

 

which would imply that you do not need an sdl file for an Extract. That, of course, is not the case. Once I put in a reference to the correct sdl in process-conf.xml, everything worked perfectly.

 

At this stage, I don't know if the situation was different for ADL v9.0, but it is certainly the case for ADL v14.0. Perhaps someone needs to update the documentation....
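For anyone hitting the same thing, the fix amounts to pointing process-conf.xml at an .sdl mapping file. A sketch of what that looks like (the file name here is hypothetical; the fields are the ones from my SOQL above):

```xml
<entry key="process.mappingFile"
       value="C:\Program Files\salesforce.com\Apex Data Loader 14.0\conf\analysisExtractMap.sdl"/>
```

and a matching analysisExtractMap.sdl along these lines:

```
#Mapping values
Id=ID
Name=NAME
Account__c=ACCOUNT__C
Sampling_Date__c=SAMPLING_DATE__C
```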

gtuerk

Paul

Glad you were able to figure this out. I had similar issues with the documentation and wrote up what I felt was deficient in a previous post. SDL was one of the key callout areas. Take care

g

GoForceGo

 

I struggled with extract today, since it isn't well documented.

 

Here is what I found:

 

1. You do not need an sdl map file to extract. If you don't provide one, the Data Loader just uses the column names from the SOQL query as the output headers, which of course might not look pretty, since they will include suffixes like __c. For example, [select id, Status__c from Mytable__c] with no mapping file will output two columns: id and Status__c.

2. If you create an sdl file, it had better be well formatted. The purpose of the sdl file is to give your columns different names from what is in the SOQL.

 

3. Formatting has to be the REVERSE of what it is on import (insert, upsert, update). The sequence of fields in the sdl does seem to matter; it is better to put them in the exact sequence as in the SOQL query. This will output id and Status.

 

#Mapping values
#Wed Apr 29 21:55:51 PDT 2009
id = id
Status__c = Status

4. If you use references, use the exact reference string from the SOQL.

SOQL = Select id, Status__c, QQQ__r.Custom_Name__c from MyTable__c

 

sdl

 

#Mapping values
#Wed Apr 29 21:55:51 PDT 2009
id = id
Status__c = Status
QQQ__r.Custom_Name__c = QQQ Name

 


5. This can throw you off, since on imports, if you do upserts with external id and references, the mapping file uses a different way of expressing it. E.g. it will use QQQ Name = QQQ__r:\Custom_Name__c if you were importing. Custom_Name__c is an external id in the QQQ__c table.
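The renaming behavior in points 1-4 can be sketched as follows. This is not Data Loader's actual code, just an illustration of how an extract-style .sdl mapping turns SOQL field names into CSV headers:

```python
def parse_sdl(text):
    """Parse 'sfdcField = CSV Column' lines, skipping comments and blanks."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        field, _, column = line.partition("=")
        mapping[field.strip()] = column.strip()
    return mapping

def csv_headers(soql_fields, mapping=None):
    """With no mapping file, headers are just the raw SOQL field names."""
    if not mapping:
        return list(soql_fields)
    return [mapping.get(f, f) for f in soql_fields]

sdl = """
#Mapping values
id = id
Status__c = Status
QQQ__r.Custom_Name__c = QQQ Name
"""

fields = ["id", "Status__c", "QQQ__r.Custom_Name__c"]
print(csv_headers(fields))                  # raw __c names, no .sdl
print(csv_headers(fields, parse_sdl(sdl)))  # ['id', 'Status', 'QQQ Name']
```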
Message Edited by GoForceGo on 05-08-2009 03:54 PM
jcalfee

I'm getting the same error using a csv extract and this aggregate function:

 

Select max(Date_Submitted__c) FROM Daily_Script_Stat__c

 

This does not work with or without an SDL. I'm not sure what to put in the mapping file for the aggregate function. I could not find an alias feature in the SOQL reference.