Danny_Teng__c
Salesforce cannot handle 6 million records
Hi,
Early last year, we decided to purchase additional data storage to hold millions of data records. We debated whether to keep them in a regular relational database hosted on SQL Server or to load them into Salesforce. We chose the second option because we thought we could take advantage of Salesforce reports and dashboards, build additional custom logic in Apex, and so on.
Unfortunately, after spending days on integration, data cleanup, and ETL to bring the data into Salesforce, we discovered that Salesforce keeps timing out when we try to view the data in a list view, build reports, or run SOQL queries. The thing is, we are not trying to query all of the data. We were only trying to pull a subset, filtered by a specific zip code or city. No matter what, we kept getting a system timeout error message.
We logged a ticket and got escalated to Tier 3. Tier 3 support said this is a Salesforce limitation and told us to download the data somewhere else for viewing and reporting. I find that a very unacceptable answer. If this is a real limitation, Salesforce should have disclosed it before we purchased the additional data storage.
I am out of luck right now. I am working with our Account Executive, hoping we can get some refund. Now I am considering storing the data in Postgres on Heroku and using Heroku Connect to expose it in Salesforce.
What do you think? Should I go further with this idea, or should I just go with traditional database & web services?
Thanks for your help in advance.
http://www.salesforce.com/docs/en/cce/ldv_deployments/salesforce_large_data_volumes_bp.pdf
Please have a look at this document as it will be helpful in understanding how to deal with the issue you're facing. Hope it helps.
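One thing that document calls out, which may explain the timeouts: query filters on large objects need to be *selective*, meaning the WHERE clause hits an indexed field and matches a small fraction of rows; otherwise Salesforce falls back to a full object scan, which is what times out at millions of records. As a rough sketch (the object and field names `Address_Record__c` and `Zip_Code__c` are hypothetical placeholders for your own schema), compare:

```sql
-- Likely non-selective: a negative filter (!=) cannot use an index,
-- so Salesforce scans all ~6M rows and the query may time out.
SELECT Id, Name
FROM Address_Record__c
WHERE Zip_Code__c != null

-- Potentially selective: an equality filter on a field with a custom
-- index (Salesforce support can create one on request) lets the query
-- optimizer use the index instead of a full scan.
SELECT Id, Name
FROM Address_Record__c
WHERE Zip_Code__c = '94105'
LIMIT 200
```

If your zip code field is a plain text field with no index, asking support for a custom index on it (or marking it an External ID) is worth trying before writing off the platform, though Heroku Connect is also a reasonable path for data of this size.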