Data Loader Command Line Tool slow

I am trying to automate a data load using the command-line tools and have everything working smoothly, but it is taking about 10x longer than running the same job through the Data Loader GUI.

Below is an excerpt from my process-conf.xml:

>     <bean id="csvUpsertOrderItem"
>           class="com.salesforce.dataloader.process.ProcessRunner"
>           singleton="false">
>         <description>Upsert Transaction Headers into Orders standard object.</description>
>         <property name="name" value="csvUpsertOrderItem"/>
>         <property name="configOverrideMap">
>             <map>
>                 <entry key="sfdc.debugMessages" value="false"/>
>                 <entry key="sfdc.endpoint" value="CUSTOM ENDPOINT"/>
>                 <entry key="sfdc.username" value="USERNAME"/>
>                 <entry key="sfdc.password" value="ENCRYPTED PASSWORD"/>
>                 <entry key="process.encryptionKeyFile" value="C:\Program Files (x86)\Data Loader\bin\key.txt"/>
>                 <entry key="sfdc.timeoutSecs" value="540"/>
>                 <entry key="sfdc.loadBatchSize" value="2000"/>
>                 <entry key="sfdc.entity" value="OrderItem"/>
>                 <entry key="process.operation" value="upsert"/>
>                 <entry key="sfdc.useBulkApi" value="true"/>
>                 <entry key="sfdc.bulkApiSerialMode" value="true"/>
>                 <entry key="sfdc.externalIdField" value="SlId__c"/>
>                 <entry key="process.mappingFile" value="C:\Users\User\Google Drive\Automation\OrdersLine21Jan.sdl"/>
>                 <entry key="process.outputError" value="C:\Users\User\downloads\Logs\errorUpsertOrderItem.csv"/>
>                 <entry key="process.outputSuccess" value="C:\Users\User\downloads\Logs\successUpsertOrderItem.csv"/>
>                 <entry key="dataAccess.name" value="C:\Users\User\Google Drive\Automation\JAEG_TransactionDetails.csv" />
>                 <entry key="dataAccess.type" value="csvRead" />
>             </map>
>         </property>
>     </bean>

From my research, it seems to be related to either the debug log (most likely, I think) or the batch size.

I have set sfdc.debugMessages to 'false' so it is not writing the log files, but it still seems to write everything to the command screen. I feel this could be causing the problem. Is there a default log setting, or maybe a process command setting?

The Data Loader documentation says the max sfdc.loadBatchSize is 200, but the UI sets it to 2,000 when the Bulk API is enabled. If the CLI really does restrict it to 200, that could explain the difference.

I just can't find anything recent about this problem. Has anyone had any luck running this at full pace recently?
Daniel Ballinger
If you are using the Bulk API, then each batch is limited to 10,000 records. The 200/2,000 limits would have been coming from the Partner API. It's worth trying to increase the batch size as a first step.

I also see the sfdc.bulkApiSerialMode setting is true. At a guess, could that be set to false so that the Bulk API loads the records in parallel?
Great suggestion but unfortunately no luck.

                <entry key="sfdc.loadBatchSize" value="10000"/>
                <entry key="sfdc.entity" value="OrderItem"/>
                <entry key="process.operation" value="upsert"/>
                <entry key="sfdc.useBulkApi" value="true"/>
                <entry key="sfdc.bulkApiSerialMode" value="false"/>

It still takes over 12 hours, but through the Data Loader GUI it takes about 30 minutes, and that is with a batch size of 2,000.

Any other ideas?
It is writing all the warnings to the command line; I wonder if that is slowing it down. Does anyone know how to turn that off?
OK, so I think I am getting to the bottom of the problem but still don't know how to fix it. The <entry key="sfdc.debugMessages" value="false"/> entry should stop the process from writing debug messages to the command line... but it is not. It is still writing all the logs to the command line. The setting seems to be overridden somewhere, but I can't find where.
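Digging further: the console noise seems to come from the logging framework configuration that ships with Data Loader, not from sfdc.debugMessages. The builds I've seen bundle log4j and read a log-conf.xml from the install's bin directory. A sketch of raising the threshold so only errors reach the console — the appender name and exact layout are assumptions, so match them to whatever is actually in your log-conf.xml:

```xml
<!-- log-conf.xml sketch (log4j 1.x syntax): raise the root level so
     INFO/WARN lines such as "No value provided for field" are dropped.
     "STDOUT" is a guessed appender name; copy yours from the real file. -->
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
    <appender name="STDOUT" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%d %-5p [%t] %c %M (%F:%L) - %m%n"/>
        </layout>
    </appender>
    <root>
        <!-- was "info" in the file I looked at; "error" silences per-row warnings -->
        <priority value="error"/>
        <appender-ref ref="STDOUT"/>
    </root>
</log4j:configuration>
```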

Sumit Jain Celebal
Were you able to fix this issue, Kris?

I just noticed that you are running the bulkapi in serial mode.
 <entry key="sfdc.bulkApiSerialMode" value="true"/>

Try running in parallel mode and it may do the trick for you.

Marc Claessen

Hi, I'm facing the same problem when uploading a large number of records to SFDC using the Data Loader CLI.

During execution of the load, it is generating the following warning messages:

visitor.BulkLoadVisitor writeSingleColumn ( - No value provided for field: Visiting_Housenumber_Suffix__c

This is slowing down the load enormously and eventually the load fails with error:

action.AbstractAction handleException ( - Exception occured during loading
com.salesforce.dataloader.exception.LoadException: ApiBatchItems Limit exceeded.

So I'm desperately seeking a way to suppress these warning messages!

Kelly K
I was browsing around to see if there was a way to switch my CLI over to the Bulk API and noticed the same error messages coming through: "no value provided..." blah blah blah.

So here's what I found:

Essentially, there's no way to disable the warning message, even with sfdc.debugMessages set to false, because it's not a debug message. The Bulk API ignores blank values (does not update the field), whereas the default API will push null values. I guess they wanted to make sure this was blatantly obvious. The #N/A workaround appears to work based on the testing I did.
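For anyone trying the workaround: the idea is to put the literal token #N/A in the cell instead of leaving it blank, since with the Bulk API a blank is skipped while #N/A forces the field to null. Column names below are borrowed from this thread (SlId__c and the housenumber suffix field); the Quantity column is made up for illustration:

```csv
SlId__c,Visiting_Housenumber_Suffix__c,Quantity
1001,#N/A,2
1002,B,1
```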

Matt
You're right, those nags really slow down the CLI tool. You could pipe the messages somewhere other than the screen, or modify the JAR with the nag commented out.
Manminder Singh Thakur
Please vote for the Idea, I've merged all possible suggestions and discussions from various platforms.

Need Improvements in Dataloader and Dataloader CLI - Ideas - Salesforce Trailblazer Community