Nevin O'Regan 3

Big Object Limits

If I create a Big Object and then delete it is that counted towards the 100 big object limit? 
Nagendra (Salesforce Developers)
Hi Nevin,

Big objects in Salesforce are very similar to custom objects.

The main difference is that big objects store and manage massive amounts of data on the Salesforce platform.

Big object storage doesn't count against your organization's storage limit. If you delete a big object and it is removed from the Salesforce servers, the limit is restored automatically.

Hope this helps.

Kindly mark this as solved if the reply was helpful.

Thanks,
Nagendra
Nevin O'Regan 3
Hi Nagendra,

Yes, this is what I understood as well, but it doesn't seem to be true. I have 8 big objects deployed in my org; when I try to deploy one more, I get an error stating that I have reached my maximum number of custom objects, even though I have only used 6% of my allocation. Do you know whether a big object that is created and then deleted still counts against the overall allocation of 100 big objects, even after it has been deleted?
Ajay K Dubedi
Hi Nevin,
Hope I understood your requirement correctly. You can create up to 100 big objects per org. The limits for big object fields are similar to the limits on custom objects and depend on your org's license type. Big objects are built to provide consistent performance whether there are 1 million records, 100 million, or even 1 billion. This scale is what gives big objects their power and defines the features that are provided. Big object storage should not affect governor limits.
And for more details you can go through this link:
https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object.htm

Hope this will be helpful to you.
Thanks.
Ajay Dubedi 
Nevin O'Regan 3
Hi Ajay,

Thank you for that information, but it doesn't really answer my question. We purchased the extra Big Object license, which gives us space for up to 50 million records. The issue I am facing is not with the data itself but with the limits on creating the Big Objects themselves. Salesforce states that you can create up to 100 Big Objects per org, so one would assume you can create 100 Custom_Big_Objects__b in a single org. I have created 8 Custom_Big_Objects__b, but when I try to deploy a 9th I get the error "problem: reached maximum number of custom objects". I can verify through System Overview that I have used only 6% of my custom object allocation, so I should have plenty of scope to create many more Custom_Big_Objects__b.
This has nothing to do with data; it is about limits. The confusion is that I can't see how I can be hitting the Custom_Big_Object__b limit when I have created only 8 of the 100 I should be allowed.
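For anyone hitting the same wall, one quick sanity check is to count how many custom big objects are actually visible in the org from Anonymous Apex. This is only a sketch: it counts objects exposed by the schema describe, so a deleted big object that is still being counted server-side (the scenario suspected here) would not show up in this number.

```apex
// Count deployed custom big objects; their API names end in "__b".
// Note: getGlobalDescribe() keys are lowercase API names.
Integer bigObjectCount = 0;
for (String apiName : Schema.getGlobalDescribe().keySet()) {
    if (apiName.endsWith('__b')) {
        bigObjectCount++;
    }
}
System.debug('Custom big objects visible in this org: ' + bigObjectCount);
```

If this prints 8 but deployment still reports the limit is reached, that discrepancy is worth raising in a support case, as described later in this thread.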
Deepali Kulshrestha
Hi Nevin,

If you create a big object and then delete it, it should not be counted towards the 100 big object limit.
A big object stores and manages massive amounts of data on the Salesforce platform.
You can archive data from other objects or bring massive datasets from outside systems
into a big object to get a full view of your customers. Clients and external systems
use a standard set of APIs to access big object data. A big object provides consistent
performance, whether you have 1 million records, 100 million, or even 1 billion. This
scale gives a big object its power and defines its features.
Available in: both Salesforce Classic and Lightning Experience
Available in: Enterprise, Performance, Unlimited, and Developer Editions for up to 1 million records
Additional record capacity and Async SOQL query is available as an add-on license.


There are two types of big objects.
Standard big objects—Objects defined by Salesforce and included in Salesforce products.
FieldHistoryArchive is a standard big object that stores data as part of the Field Audit
Trail product. Standard big objects are available out of the box and cannot be customized.
Custom big objects—New objects that you create to store information unique to your org.
Custom big objects extend the functionality that Lightning Platform provides. For example,
if you’re building an app to track product inventory, create a custom big object called
HistoricalInventoryLevels to track historical inventory levels for analysis and future
optimizations. This implementation guide is for configuring and deploying custom big objects.

I hope you find the above solution helpful. If it does, please mark as Best Answer to help others too.

Thanks and Regards,
Deepali Kulshrestha
 
Nevin O'Regan 3
Hi Deepali,

Thank you for the in-depth information and for taking the time to reply to this thread. However, I'm not sure you are correct about deleted big objects not counting against the total big object allocation. I encountered this scenario on a system I was working on, and after a lot of back and forth with Salesforce (I think the case eventually had to be handled by the product team in San Francisco) it was resolved: the tech team had to reset the count back to 0. It was a unique scenario, but one that I feel should be documented for anyone considering Big Objects.
Rinkle Kapur 7
Hello Nevin
I was curious about this issue as well. Was this a Salesforce bug?
Could you elaborate on the solution? It would be really helpful.