Custom data loader to load ~300,000 records - Errors in Split function & Regex
We have the following requirements:
1. Load ~300,000 records into a custom object. There are 22 fields.
2. Validations: no duplicates (a unique combination of four fields), no null values, and correct data types.
3. Errors must be captured at the record level and displayed to the user.
4. No records should be inserted if there is even a single error anywhere in the data.
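For context, here is a minimal sketch of the validation pass we are describing (the object name `Loan__c`, the field names, and the `parsedRows` variable are all hypothetical placeholders, not our actual code):

```apex
// Sketch only: hypothetical object/field names.
// parsedRows is assumed to hold the CSV already broken into columns.
List<Loan__c> toInsert = new List<Loan__c>();
List<String> errors = new List<String>();
Set<String> seenKeys = new Set<String>();

Integer rowNum = 0;
for (List<String> row : parsedRows) {
    rowNum++;
    // Null/blank check on required columns
    if (String.isBlank(row[0]) || String.isBlank(row[1])) {
        errors.add('Row ' + rowNum + ': required field is blank');
        continue;
    }
    // Duplicate check on the composite of the four key fields;
    // Set.add returns false if the key was already present
    String key = row[0] + '|' + row[1] + '|' + row[2] + '|' + row[3];
    if (!seenKeys.add(key)) {
        errors.add('Row ' + rowNum + ': duplicate of an earlier row');
        continue;
    }
    toInsert.add(new Loan__c(Field1__c = row[0] /* ... map remaining fields ... */));
}

// All-or-nothing: only attempt DML when every row passed validation
if (errors.isEmpty()) {
    Database.insert(toInsert, true); // allOrNone = true
}
```

The record-level errors collected in `errors` can then be rendered back to the user instead of performing the insert.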
The system is financial in nature and the users do not want to use the Apex Data Loader, so we built a custom data loader (in Apex, behind a Visualforce page) to insert the records into the custom object.
While loading records, we get multiple errors depending on the file size: "Regex too complicated", view state limit exceeded, etc.
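For reference, the "Regex too complicated" error typically comes from calling `String.split()` on the entire file body, since `split` compiles its argument as a regex. One workaround we have seen suggested (a sketch only, not tested at this scale) is to walk the string with `indexOf`/`substring` instead:

```apex
// Sketch: break a large CSV body into lines without String.split(),
// which can throw 'Regex too complicated' on very large inputs.
// fileBody is assumed to hold the uploaded file as a String.
List<String> lines = new List<String>();
Integer pos = 0;
while (pos < fileBody.length()) {
    Integer nl = fileBody.indexOf('\n', pos);
    if (nl == -1) {
        lines.add(fileBody.substring(pos)); // last line, no trailing newline
        break;
    }
    // Strip a trailing \r in case of Windows-style line endings
    lines.add(fileBody.substring(pos, nl).removeEnd('\r'));
    pos = nl + 1;
}
```

Even with this, the heap size and view state limits still apply to a file of this size, which is part of why we are asking the question below.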
Is there an alternative to this process, or a way to optimise the current code? And even if the code is fully optimised, will it actually be able to validate and upload all ~300,000 records within platform limits?