
Custom data loader to load ~300,000 records - Errors in Split function & Regex

Hi All,

We have the following requirements:

1. Load ~300,000 records into a custom object. There are 22 fields.

2. Validations - no duplicates (unique combination of four fields), no null values and correct data type.

3. Need to capture the errors at the record level and display to the user.

4. Insertion of records should not happen at all even if there is a single error in the data.
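For what it's worth, requirements 2 and 4 can be expressed as a single validation pass followed by an all-or-nothing insert. A minimal sketch, assuming a custom object `My_Record__c` with four key fields `Field1__c`..`Field4__c` (all names hypothetical):

```apex
// Sketch of the validation pass; object and field names are hypothetical.
List<My_Record__c> toInsert = new List<My_Record__c>();
List<String> errors = new List<String>();
Set<String> seenKeys = new Set<String>();

Integer rowNum = 0;
for (My_Record__c rec : parsedRecords) {
    rowNum++;
    // Uniqueness: composite key built from the four business fields
    String key = rec.Field1__c + '|' + rec.Field2__c + '|'
               + rec.Field3__c + '|' + rec.Field4__c;
    if (seenKeys.contains(key)) {
        errors.add('Row ' + rowNum + ': duplicate key ' + key);
    }
    seenKeys.add(key);
    // Null check (repeat for each required field)
    if (rec.Field1__c == null) {
        errors.add('Row ' + rowNum + ': Field1__c is required');
    }
    toInsert.add(rec);
}

// Requirement 4: insert nothing if any row failed validation
if (errors.isEmpty()) {
    Database.insert(toInsert, true); // allOrNone = true rolls everything back on any DML failure
} else {
    // Surface `errors` to the user instead of inserting
}
```

The `errors` list gives the record-level messages for requirement 3, and `allOrNone = true` guarantees no partial inserts even if a DML-level error slips past the pre-checks.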

The system is financial in nature, and the users do not want to use the Apex Data Loader.


We built a custom data loader (using Apex code) to insert records into the custom object.

While loading records, we get multiple errors depending on file size, such as "Regex too complicated" and "Maximum view state size limit exceeded".


Is there an alternative to this process, or any way to optimise the current code? And even if it is fully optimised, will it be able to validate and upload all of the records?


Set the batch size to a smaller value, maybe even 1.
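In Batch Apex the scope size is the optional second argument to `Database.executeBatch`; a scope of 1 means each `execute()` call processes a single record under its own set of governor limits (the batch class name here is hypothetical):

```apex
// Scope size of 1: each execute() invocation handles one record,
// with governor limits reset between invocations.
Id jobId = Database.executeBatch(new MyLoaderBatch(), 1);
```

The trade-off is more round trips, so a run over 300,000 records will take longer, but it is far less likely to hit per-transaction limits.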

Andy Boettcher

There are governor limits in place that may prevent you from performing this kind of transaction.


However, if you first load the records into an interstitial (staging) object, errors and all, you can use Batch Apex to run through your custom validation logic. All the data stays in the system, and you can meet your requirements.
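That staging-object approach might look roughly like this (class, object, and field names are hypothetical). Because Batch Apex runs asynchronously with fresh governor limits per chunk, the view state problem disappears entirely:

```apex
// Hypothetical sketch: validate staged rows with Batch Apex.
// Staging_Record__c holds the raw upload; Error_Message__c captures row-level errors.
global class StagingValidationBatch implements Database.Batchable<SObject>, Database.Stateful {
    // Database.Stateful preserves this flag across chunks
    global Boolean anyErrors = false;

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, Field1__c, Error_Message__c FROM Staging_Record__c');
    }

    global void execute(Database.BatchableContext bc, List<Staging_Record__c> scope) {
        for (Staging_Record__c rec : scope) {
            if (rec.Field1__c == null) {
                rec.Error_Message__c = 'Field1__c is required';
                anyErrors = true;
            }
        }
        update scope; // persist row-level error messages for the user to review
    }

    global void finish(Database.BatchableContext bc) {
        // Promote staging rows to the real object only if no chunk flagged an error,
        // e.g. by chaining a second batch job here.
        if (!anyErrors) {
            // Database.executeBatch(new PromoteStagingBatch());
        }
    }
}
```

One caveat: the duplicate check spans the whole data set, not just one chunk, so a composite-key `Set<String>` would also need to live in a `Database.Stateful` member variable (or the keys could be written to an indexed unique field on the staging object).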

I developed a Batch Apex process with a custom iterator to process CSV files of 4 MB and even larger. For further information: http://developer.financialforce.com/customizations/importing-large-csv-files-via-batch-apex/
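The idea behind that pattern is to make the batch job's scope CSV *lines* rather than query rows, which avoids both the giant regex split and a huge in-memory list. A rough sketch of such a custom iterator (names hypothetical; a production version would also handle quoted fields):

```apex
// Rough sketch of a line-by-line CSV iterator usable as a Batch Apex scope.
global class CsvLineIterator implements Iterator<String>, Iterable<String> {
    private String csv;
    private Integer pos = 0;

    global CsvLineIterator(String csvBody) { this.csv = csvBody; }

    global Boolean hasNext() { return pos < csv.length(); }

    global String next() {
        // Advance to the next newline instead of splitting the whole file at once
        Integer eol = csv.indexOf('\n', pos);
        if (eol == -1) { eol = csv.length(); }
        String line = csv.substring(pos, eol);
        pos = eol + 1;
        return line.removeEnd('\r'); // tolerate CRLF line endings
    }

    // Batch Apex calls iterator() on the Iterable returned by start()
    global Iterator<String> iterator() { return this; }
}
```

A batch class implementing `Database.Batchable<String>` can then return this `Iterable<String>` from its `start` method, and each `execute` call receives a `List<String>` of CSV lines to parse and validate under its own governor limits.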