Wm Peck 1958
Options for Data Warehouses with Salesforce ERP
We are using Salesforce and planning to use Big Objects as the Data Warehouse, then use Einstein to pull from Big Objects and report fancy dashboards and such. (We are migrating from an Oracle environment; some components are in Production.)
But we don't have Big Data, we have little data, so I am confused about why we would need a "Big Data" solution like Big Objects. Also, as an Oracle developer, the 100-object limitation in Big Objects (seriously) perplexes me.
Couldn't our Data Warehouse simply be another Salesforce org? Just move the data there (and reshape it a bit), then report out of it normally (or use Einstein). All the industrial-strength data warehouse solutions (Azure, AWS, Google Cloud Platform, etc.) are aimed at big data.
Einstein Analytics only supports standard objects, custom objects, and .csv files. It does not support external objects, so you have to load all the data into Salesforce.
In your case, you could migrate some of the Oracle tables to big objects and keep some of them as custom objects (or maybe standard objects). That would avoid the 100-object limitation on big objects. But I don't think you can put them in a separate org.
When using custom objects, you can ask Salesforce Support to index some of the fields, which will significantly improve reporting performance.
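For illustration, here is a hedged sketch of what big-object data access typically looks like in Apex. The object and field names (`CourseGrade__b`, `Student__c`, `Term__c`, `Grade__c`) are hypothetical, not from this thread; the point is that big objects are written with `Database.insertImmediate` rather than regular DML, and queries must filter on the index fields in their defined order.

```apex
// Hypothetical big object CourseGrade__b with index fields
// Student__c and Term__c (in that order).
CourseGrade__b rec = new CourseGrade__b();
rec.Student__c = '005xx0000012345';
rec.Term__c    = '2019-FALL';
rec.Grade__c   = 'A';

// Big objects use Database.insertImmediate, not the regular insert DML.
Database.SaveResult sr = Database.insertImmediate(rec);

// Synchronous SOQL against a big object must filter on its index
// fields from left to right, with no gaps.
List<CourseGrade__b> grades = [
    SELECT Student__c, Term__c, Grade__c
    FROM CourseGrade__b
    WHERE Student__c = '005xx0000012345'
      AND Term__c = '2019-FALL'
];
```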
great, thank you! Very helpful
Big Objects is way different from regular Salesforce, although you could use Einstein Analytics for both.
Big Objects is focused on three main areas:
1. Adding 360 degree view (multiple data sets) of a customer (e.g., Student)
2. Audit and tracking (like maybe track who comes on your campus with IoT devices)
3. Historical Archiving
Of course, one of its main purposes is Big Data (billions of rows) with fast retrieval times.
It's not a relational data structure; it's denormalized a LOT. For example, you might have Student info in one object that flattens everything down to courses, grades, and instructors (and all related tables).
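A hedged sketch of what such a denormalized big object might look like as metadata (the object name `StudentHistory__b` and all field names here are illustrative assumptions, not from the thread). Big objects are defined with an explicit index; note that the student, course, grade, and instructor data all live flat on one object rather than in related tables:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- StudentHistory__b.object: hypothetical denormalized archive object -->
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <fields>
        <fullName>Student__c</fullName>
        <label>Student</label>
        <length>18</length>
        <type>Text</type>
    </fields>
    <fields>
        <fullName>Course__c</fullName>
        <label>Course</label>
        <length>80</length>
        <type>Text</type>
    </fields>
    <fields>
        <fullName>Grade__c</fullName>
        <label>Grade</label>
        <length>3</length>
        <type>Text</type>
    </fields>
    <fields>
        <fullName>Instructor__c</fullName>
        <label>Instructor</label>
        <length>80</length>
        <type>Text</type>
    </fields>
    <!-- The index defines which fields queries can filter on, in order. -->
    <indexes>
        <fullName>StudentHistoryIndex</fullName>
        <label>Student History Index</label>
        <fields>
            <name>Student__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Course__c</name>
            <sortDirection>ASC</sortDirection>
        </fields>
    </indexes>
    <label>Student History</label>
    <pluralLabel>Student Histories</pluralLabel>
</CustomObject>
```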
But the biggest thing he emphasized is that we need an Archive strategy; in other words, what is the data actually being used for?
And some training resources:
https://trailhead.salesforce.com/en/content/learn/modules/big_objects
https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object.htm
https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object_define.htm
https://medium.com/big-object/custom-use-cases-where-you-need-big-objects-8ec14f421a2f