MJ09

Going over Data Storage limit

I'm managing an org with a pretty large amount of data -- one object alone holds about 3 million records, totaling over 5GB.


The org actually has a data storage limit (as reported by Setup | Data Management | Storage Usage) of 1GB, but the Storage Usage page says it has 5.8GB used, with a Percent Used of 582%.


I have a Batch Apex job that runs once a week to perform calculations based on those 3M records. The job starts by creating a QueryLocator for all the records, then iterates over them in the execute() method, 200 at a time. It can take the job 5 hours or more to run from start to finish.
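For reference, the job is structured roughly like this (the class, object, and field names here are placeholders, not the real ones):

    global class WeeklyCalcBatch implements Database.Batchable<SObject> {

        global Database.QueryLocator start(Database.BatchableContext bc) {
            // A QueryLocator lets the batch iterate over millions of records.
            // My_Object__c / Some_Field__c stand in for the real names.
            return Database.getQueryLocator(
                'SELECT Id, Some_Field__c FROM My_Object__c');
        }

        global void execute(Database.BatchableContext bc, List<SObject> scope) {
            // Each invocation receives up to 200 records (the default batch size).
            for (SObject rec : scope) {
                // ... perform the weekly calculations on each record ...
            }
            update scope;
        }

        global void finish(Database.BatchableContext bc) {
            // Nothing special on completion.
        }
    }

It gets kicked off once a week from a scheduled job with Database.executeBatch(new WeeklyCalcBatch(), 200).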


Over the last few months, I've been getting many different intermittent errors from the Batch Apex job. Most of them appear to be internal platform errors -- ACS errors, PL/SQL exceptions, and "too many cursors in use" (even when the batch job is the only job running), among others. I'm wondering whether these errors could be related to the fact that we're so far over the data storage limit. Has anybody seen an org get so far over the limit? If so, did you encounter any odd errors?


(Yes, I know, the owner of the org really should pay Salesforce to bump up their storage limits. I'm a consultant working for the owner of the org, not the org owner myself. We're working on getting the org owner to pay for a higher limit. But in the meantime, I'm wondering whether the overage could be responsible for some of our intermittent platform errors.)

sfdcfox

The intermittent errors are not related to your storage usage (several people have reported various problems with Batch Apex, to the point where some are trying to get a PM to talk to them... I think your organization is one of them). In any case, you're not supposed to be able to go that far over your limit: Salesforce normally cuts you off shortly after you reach 100% capacity (there's a small grace-period window that allows a few records past the limit). At any rate, it seems Salesforce needs to spend another release cycle on maintenance like they did a few years back...