SONAL G

Wants to store more than 300,000 characters in a Salesforce object column.

Hi,
I want to store my serialized object in the database. For this I am using the Long Text Area (131,072 characters) data type. But my serialized object contains more than 300,000 characters, which throws the error STRING_TOO_LONG, Object Json: data value too large.
So is there any other option or data type available to store such a long string in the Salesforce database? Or what else can I do in this situation?

Thanks in advance!
bob_buzzard
You have a couple of choices: store the value as an attachment or file, or break it up into multiple fields. I use the latter myself, and then join the fields back into a single message string in code.
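A minimal Apex sketch of the split-and-rejoin approach. The Json_Store__c object and its Chunk_1__c/Chunk_2__c/Chunk_3__c Long Text Area fields are hypothetical names, each assumed to hold up to 131,072 characters:

public class JsonChunkStore {
    // Maximum characters a Long Text Area field can hold
    private static final Integer FIELD_SIZE = 131072;

    // Split the long string across three fields and save it
    public static Json_Store__c save(String longJson) {
        Json_Store__c rec = new Json_Store__c();
        rec.Chunk_1__c = piece(longJson, 0);
        rec.Chunk_2__c = piece(longJson, 1);
        rec.Chunk_3__c = piece(longJson, 2);
        insert rec;
        return rec;
    }

    // Returns the n-th FIELD_SIZE slice, or null if the string is too short
    private static String piece(String s, Integer n) {
        Integer start = n * FIELD_SIZE;
        if (start >= s.length()) return null;
        return s.substring(start, Math.min(s.length(), start + FIELD_SIZE));
    }

    // Join the fields back into a single string in code
    public static String load(Id recId) {
        Json_Store__c rec = [SELECT Chunk_1__c, Chunk_2__c, Chunk_3__c
                             FROM Json_Store__c WHERE Id = :recId];
        String result = '';
        if (rec.Chunk_1__c != null) result += rec.Chunk_1__c;
        if (rec.Chunk_2__c != null) result += rec.Chunk_2__c;
        if (rec.Chunk_3__c != null) result += rec.Chunk_3__c;
        return result;
    }
}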
SONAL G
Thanks for the reply.
It's true that I can break it up and store it in multiple fields! But the other problem is that I want to deserialize it and show it on a VF page, which gives me a CPU limit exceeded error, as there are multiple records in the JSON that I am binding to a wrapper class!
SONAL G
I am trying the following scenario:
I have a VF page where the user can select up to 500 products, and for each product can select up to 71 states. When the user selects any product or state, I iterate over the data, perform some calculations, and show the result to the user instantly. All of that data is kept in a wrapper class (in memory), and only when the user clicks the save button do I serialize the wrapper class data and store it in the database.

But when the user selects 200 products with 71 states each, it throws a CPU limit exceeded error while adding to the wrapper class!
I also tried manually storing 100 products with 71 states each in the database, just to check whether it would be accepted. It gave me the STRING_TOO_LONG, Object Json: data value too large error!

So I am stuck on the Salesforce limits and do not understand exactly what to do in this case.

Can anyone guide me on this?

Thanks in advance!
Naore Azenkut
Hi Sonal,

Quick and dirty:
Serialize on the client, split the result into two strings (maybe three, for future-proofing), and push each string to a different field via the REST API (avoiding any Apex limitations). Note that this will probably not work on mobile or in old browsers. This is bad practice, so please use this approach with extreme care. Or better yet, don't use it at all (:
A quick improvement to that approach would be, as Bob wrote above, to push one attachment .txt file instead of three fields.
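A sketch of that attachment variant in Apex; the parent record id and file name here are illustrative, but Attachment is a standard object whose Body is a Blob and is not subject to the 131,072-character field limit:

public class JsonAttachmentStore {
    // Store the serialized JSON as an Attachment on a parent record
    public static Id saveAsAttachment(Id parentId, String longJson) {
        Attachment att = new Attachment(
            ParentId = parentId,
            Name = 'serialized-data.json',
            ContentType = 'application/json',
            Body = Blob.valueOf(longJson)
        );
        insert att;
        return att.Id;
    }

    // Read the attachment back and turn its body into a String again
    public static String loadAttachment(Id attId) {
        Attachment att = [SELECT Body FROM Attachment WHERE Id = :attId];
        return att.Body.toString();
    }
}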

I would urge you to reconsider the serialization approach; using a wrapper class for all the data is probably a subpar solution. How about a custom data model representation instead? Perhaps two new objects (a master object and a detail object) could solve the issue by incrementally pushing data to the server (e.g. AJAX from the client creating a master record and then incrementally pushing the details), as sketched below.
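One possible shape for that master-detail model, as a hedged Apex sketch. Selection__c (master), Selection_Item__c (detail), and their fields are all hypothetical names; the VF page would call these via JavaScript remoting, one small batch of detail rows per request, instead of serializing everything at once:

public class SelectionService {
    // Create the master record once, up front
    @RemoteAction
    public static Id createSelection(String name) {
        Selection__c master = new Selection__c(Name = name);
        insert master;
        return master.Id;
    }

    // Called repeatedly from the client with small batches of detail rows,
    // so no single request has to process the whole data set
    @RemoteAction
    public static void addItems(Id selectionId, List<String> productIds, List<String> states) {
        List<Selection_Item__c> items = new List<Selection_Item__c>();
        for (Integer i = 0; i < productIds.size(); i++) {
            items.add(new Selection_Item__c(
                Selection__c = selectionId,
                Product__c = productIds[i],
                State__c = states[i]
            ));
        }
        insert items;
    }
}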

Hope this helps,
Naore
test 1971
Hi bob_buzzard,

Could you share the code showing how you implemented this to overcome the issue?
Gabriel "Ibeatdungeon" Millerd

One of the common patterns for storing data that exceeds a limited field size is to use an ordinal field, giving you a pagination-like pattern.

So if you have 10 characters to put into a field with a 2-character size limit, you would create 4 additional records (5 in total). The object would have an ordinal field that tells you this is 1 of 5, 2 of 5, and so forth.

{ docid__c: 1, ordinal__c: 1, content__c: "12"} ,
{ docid__c: 1, ordinal__c: 2, content__c: "34"} ,
{ docid__c: 1, ordinal__c: 3, content__c: "56"} ,
{ docid__c: 1, ordinal__c: 5, content__c: "90"} ,
{ docid__c: 1, ordinal__c: 4, content__c: "78"}
Then when you want to re-stringify the content, you run:
select content__c from obj where docid__c=1 order by ordinal__c asc
You can use docid as the parent id (e.g. the ordinal 1 record's id) or whatever identifier you have for your integration. It's also handy to store the max ordinal (e.g. the 5 above) in case your integration can use it as a helper at runtime.
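A hedged Apex sketch of this ordinal pattern, assuming a hypothetical Chunk__c object with DocId__c (number), Ordinal__c (number), and Content__c (Long Text Area) fields:

public class OrdinalStore {
    // Assume Content__c is a Long Text Area capped at 131,072 characters
    private static final Integer CHUNK = 131072;

    // Split the content into fixed-size chunks, one record per chunk
    public static void save(Integer docId, String content) {
        List<Chunk__c> rows = new List<Chunk__c>();
        Integer ordinal = 1;
        for (Integer pos = 0; pos < content.length(); pos += CHUNK) {
            rows.add(new Chunk__c(
                DocId__c = docId,
                Ordinal__c = ordinal,
                Content__c = content.substring(pos, Math.min(content.length(), pos + CHUNK))
            ));
            ordinal++;
        }
        insert rows;
    }

    // Re-stringify by reading the chunks back in ordinal order
    public static String load(Integer docId) {
        String result = '';
        for (Chunk__c c : [SELECT Content__c FROM Chunk__c
                           WHERE DocId__c = :docId ORDER BY Ordinal__c ASC]) {
            result += c.Content__c;
        }
        return result;
    }
}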

The limits here are heap size (string size, unless you stream) and the query row limit (50,000 rows), i.e. up to 131,072 * 50,000 characters in total; but you can still chunk and stream.