Manid
Confusion regarding Data Loader
I saw in the documentation that the maximum batch size is 200, or up to 10,000 if the Bulk API is enabled.
But how can we import more than that, say 50,000 up to 5 million records, if the file exceeds the batch size? Can anyone please explain the overall Data Loader process?
I am not sure which size limit you are referring to here.
Just to clarify a couple of things, Data Loader can perform data manipulation operations such as Insert, Update, and Delete on up to 5 million records.
- The batch size (200 by default, or up to 10,000 with the Bulk API enabled) limits how many records go into a single API call, not how many records you can load in total. Data Loader automatically splits your file into as many batches as needed; for example, a 5 million record file with a 10,000-record batch size is simply sent as 500 batches.
- Generally, Data Loader processes these batches in serial mode, i.e. one after another. When the Bulk API is used, however, the batches are processed in parallel by default, i.e. simultaneously, which gives better throughput and lower upload times for heavy record volumes. You can still explicitly enforce serial processing even with the Bulk API by checking Enable serial mode for Bulk API in the Data Loader Settings. The sketch below illustrates the difference.
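To make the batching model concrete, here is a minimal Python sketch of what a loader does conceptually: split the records into batches no larger than the limit, then submit them either serially or in parallel. The submit_batch function and the worker count are hypothetical stand-ins for illustration; this is not Data Loader's actual code or the Bulk API.

```python
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 10_000  # Bulk API-style limit; the default SOAP-based mode caps at 200

def submit_batch(batch):
    """Hypothetical stand-in for one API call that uploads a single batch."""
    print(f"submitted a batch of {len(batch)} records")

def load_records(records, serial=False):
    # Split the full record set into batches no larger than BATCH_SIZE.
    # e.g. 5,000,000 records / 10,000 per batch = 500 batches.
    batches = [records[i:i + BATCH_SIZE] for i in range(0, len(records), BATCH_SIZE)]
    if serial:
        # Serial mode: one batch after another ("Enable serial mode for Bulk API").
        for batch in batches:
            submit_batch(batch)
    else:
        # Parallel mode: several batches in flight at once (Bulk API default).
        with ThreadPoolExecutor(max_workers=8) as pool:
            list(pool.map(submit_batch, batches))

load_records([{"Name": f"Account {i}"} for i in range(25_000)])
```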
I believe this should clarify your confusion.
Does the above explanation clarify your doubt? Do you still need more clarification?
There are a couple of ways to accomplish this.
Please close this thread if your query has been addressed.
Hi Jigarshah,
The above response is very helpful. Thank you for posting the detailed information.
Regards.
I have a query about the same operation.
When performing a delete operation on multiple records (400 in this case), Data Loader gives this error: hed.TDTM_Affiliation: execution of AfterDelete caused by: System.QueryException: unexpected token: ',' (hed)
Here TDTM stands for Table-Driven Trigger Management, a framework for managing trigger code in Salesforce and controlling how Apex behaves. It is used in Salesforce EDA (Education Cloud): https://powerofus.force.com/s/article/EDA-TDTM-Overview
The error only appears when the delete operation runs on a file of 400 records; when the file has only 21 records, the delete succeeds.
The user said that after she changed the batch size in the Data Loader settings to 1, all 400 records were deleted, as sketched below.
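In other words, lowering the batch size shrinks each API call, so the AfterDelete trigger sees fewer records per invocation. A minimal Python sketch of that chunking pattern follows; delete_batch is a hypothetical stand-in for one delete call, not Data Loader's actual implementation.

```python
def delete_batch(ids):
    """Hypothetical stand-in for one delete API call; server-side triggers
    such as hed.TDTM_Affiliation fire once per batch like this."""
    print(f"deleting {len(ids)} records in one call")

def delete_records(record_ids, batch_size=200):
    # Data Loader-style chunking: a 400-record file with batch_size=200
    # becomes 2 calls, while batch_size=1 becomes 400 calls, so the
    # trigger only ever has to handle one record at a time.
    for i in range(0, len(record_ids), batch_size):
        delete_batch(record_ids[i:i + batch_size])

# The user's workaround: batch size 1 lets all 400 deletes succeed.
delete_records([f"001xx{i:013d}" for i in range(400)], batch_size=1)
```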
I am not sure what the cause of this error could be.
Can you please help me find a solution?
Nikita