SONAL G
Want to store more than 300,000 characters in a Salesforce object column.
Hi,
I want to store my serialized object in the database. For this I am using the Long Text Area (131,072) data type, but my serialized object contains more than 300,000 characters, which throws the error STRING_TOO_LONG, Object Json: data value too large.
So is there any other option or data type available to store such a long string in the Salesforce database? Or what else can I do in this situation?
Thanks in advance!
It's true that I can break it up and store it in multiple fields! But the other problem is that I want to deserialize it and show it on a VF page, which gives me a CPU limit exceeded error, as there are multiple records in the JSON that I am binding to a wrapper class!
You can have a look at the Continuation class. Here are some links -
https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_continuation_overview.htm
https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_class_System_Continuation.htm
https://developer.salesforce.com/blogs/developer-relations/2015/02/apex-continuations-asynchronous-callouts-visualforce-pages.html
Thanks,
Sumit Kumar Singh
I have a VF page where a user can select up to 500 products, and for each product up to 71 states. When the user selects any product or state I iterate over the data, perform some calculations, and show the result instantly; all of that data I store in a wrapper class (in memory), and only when the user clicks the save button do I serialize the wrapper class data and store it in the database.
But when the user selects 200 products with 71 states each, it throws a CPU limit exceeded error while adding to the wrapper class!
I also tried manually storing 100 products with 71 states each to the database, just to check whether it would be accepted; it gave me the STRING_TOO_LONG, Object Json: data value too large error!
So I am stuck on the Salesforce limits and don't understand exactly what to do in this case!
Can anyone guide me on this ??
Thanks in Advance !!
Quick and dirty:
Serialize on the client, split it into two strings (maybe three for future proofing), and push each string to a different field via the REST API (avoiding any Apex limitations). Note: this will probably not work on mobile or in old browsers. This is bad practice; please use this approach with extreme care. Or better yet, don't use it at all (:
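A minimal sketch of that split-into-fields idea, in Python for illustration only (the field names `Chunk1__c`..`Chunk3__c` and the target object are made up; the real work would happen client-side before calling the REST sObject endpoint):

```python
# Split a long serialized string across three hypothetical Long Text Area
# fields (Chunk1__c, Chunk2__c, Chunk3__c), each capped at 131,072 chars,
# and build the JSON body for a REST sObject update.

FIELD_LIMIT = 131072  # Long Text Area maximum length

def build_update_payload(serialized, n_fields=3, limit=FIELD_LIMIT):
    if len(serialized) > n_fields * limit:
        raise ValueError("payload too large even for %d fields" % n_fields)
    payload = {}
    for i in range(n_fields):
        # Slicing past the end yields an empty string, which clears the field.
        payload["Chunk%d__c" % (i + 1)] = serialized[i * limit:(i + 1) * limit]
    return payload

# A 300,000-character string fills fields 1 and 2 and part of field 3.
data = "x" * 300000
body = build_update_payload(data)
print(len(body["Chunk1__c"]), len(body["Chunk2__c"]), len(body["Chunk3__c"]))
# prints: 131072 131072 37856
```

The resulting `body` would then be sent as the JSON payload of a PATCH to the record's REST sObject URL.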
A quick improvement to that approach would be, as Bob wrote above, to push one attachment (a .txt file) instead of three fields.
I would urge you to reconsider the serialization approach - using a wrapper class for all the data is probably a subpar solution. How about a custom data model representation? Perhaps two new objects (a master object and a detail object) could solve the issue by incrementally pushing data to the server (e.g. AJAX from the client creating a master record and then incrementally pushing the details).
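A rough sketch of that master/detail idea, in Python purely for illustration (the object and field names `Selection__c`, `Selection_Line__c`, etc. are invented): instead of one giant JSON blob, the in-memory wrapper data becomes one master record plus many small detail rows that can be pushed incrementally in batches.

```python
# Sketch: turn an in-memory structure (product -> selected states) into one
# master record and many small detail records, so they can be inserted
# incrementally instead of being serialized into one huge string.
# All object/field names here are hypothetical placeholders.

def to_master_detail(selections, user_id):
    master = {"sobject": "Selection__c", "OwnerId": user_id}
    details = []
    for product, states in selections.items():
        for state in states:
            details.append({
                "sobject": "Selection_Line__c",
                "Product__c": product,
                "State__c": state,
                # Master__c would be set to the master record's Id
                # once the master insert succeeds.
            })
    return master, details

def batches(records, size=200):
    # Push details in modest batches rather than all at once.
    for i in range(0, len(records), size):
        yield records[i:i + size]

master, details = to_master_detail({"ProductA": ["CA", "NY"], "ProductB": ["TX"]}, "005xx0000000001")
print(len(details))                      # 3 detail rows
print(sum(1 for _ in batches(details)))  # 1 batch
```

The worst case in this thread (200 products × 71 states) becomes 14,200 small detail rows, which is ordinary relational data instead of a 300,000-character string.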
Hope this helps,
Naore
You have a couple of choices - store the value as an attachment or file, or break it up into multiple fields. I use the latter myself, and then join the fields back into a single string in code.
Could you share the code showing how you implemented this to overcome the issue?
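The split/join pattern described above can be sketched like this (Python for illustration only; the field names `Body1__c`..`BodyN__c` are hypothetical, not the original poster's actual fields):

```python
# Split a long string across N Long Text Area fields on save, then join the
# fields back into one string on read. Body1__c..BodyN__c are invented names.

LIMIT = 131072  # Long Text Area maximum length

def split_to_fields(text, n_fields):
    fields = {}
    for i in range(n_fields):
        fields["Body%d__c" % (i + 1)] = text[i * LIMIT:(i + 1) * LIMIT]
    return fields

def join_from_fields(record, n_fields):
    # Concatenate in field order; a missing or None field contributes nothing.
    return "".join(record.get("Body%d__c" % (i + 1)) or "" for i in range(n_fields))

original = "a" * 250000
rec = split_to_fields(original, 3)
assert join_from_fields(rec, 3) == original  # round-trips losslessly
```

The key point is that the split is by character offset, so joining the fields in order always reconstructs the original string exactly.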
One of the common patterns for storing data that exceeds a limited field size is to use an ordinal, giving you a pagination-like pattern.
So if you have 10 characters to put into a field with a 2-character size limit, you would create 4 successive records (5 in total). The object would have an ordinal field that tells you 1 of 5, 2 of 5, and so forth. Then, when you want to re-stringify the content, you can use a doc id as the parent id (e.g., the ordinal 1 record's id) or whatever identifier you have for your integration. It's also handy to hold the max ordinal size (e.g., the 5 above) in case your integration can use it as a helper at runtime.
The limits here are heap size (string size, unless you stream) and the query row limit (50,000 rows), i.e. up to 131,072 × 50,000 characters; but you can still chunk and stream.
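The ordinal pattern above could be sketched like this (Python for illustration; the field names `Doc_Id__c`, `Ordinal__c`, `Max_Ordinal__c`, `Body__c` are hypothetical), using the same 10-characters-into-a-2-character-field example:

```python
# Ordinal/pagination pattern: store an oversized value as N successive
# records, each carrying its position (Ordinal__c), the total count
# (Max_Ordinal__c), and a shared document/parent id. Field names are invented.

import math

def to_chunks(doc_id, text, limit):
    total = max(1, math.ceil(len(text) / limit))
    return [{
        "Doc_Id__c": doc_id,
        "Ordinal__c": i + 1,       # 1 of N, 2 of N, ...
        "Max_Ordinal__c": total,   # handy helper for the integration at runtime
        "Body__c": text[i * limit:(i + 1) * limit],
    } for i in range(total)]

def restringify(records):
    # After querying records back by Doc_Id__c, order by ordinal and join.
    return "".join(r["Body__c"] for r in sorted(records, key=lambda r: r["Ordinal__c"]))

chunks = to_chunks("doc-1", "1234567890", limit=2)  # 10 chars, 2-char limit
print(len(chunks))                                  # 5 records: 1 of 5 .. 5 of 5
print(restringify(list(reversed(chunks))))          # prints: 1234567890
```

Sorting on the ordinal before joining means the records can come back from the query in any order and the content still re-stringifies correctly.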