Is it possible to disable caching during Lightning component development?  I'm running into an issue where I've fixed a bug but keep getting the same error when I test.  I know it's a caching issue because the error is reported on line 29, but my component only has 24 lines of code.  I've tried disabling caching in the org and refreshing the browser with the cache cleared.  Are there any other options to ensure that every time I load a Lightning component I'm getting the latest code?
I have a page layout that displays some basic information about a custom object.  I have a button on that detail page that fires off a Visualforce page, which runs some code from a custom controller to update the data for the record in question and then redirects the user back to the record detail page.  The problem is that after the redirect from the Visualforce page to the standard record detail page, none of the new data is there; only the old data is shown.  If I refresh the page (a simple F5 or Ctrl+F5), all of the new data shows up.  My question is this: how do I force a standard record detail page to refresh every time it's loaded rather than pulling from cache?
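One approach that often fixes this is to have the controller action return a PageReference with setRedirect(true), so the browser receives an HTTP redirect and requests the detail page fresh instead of re-rendering it from the cached view state. A minimal sketch, not the original controller; the class, object, and field names (RecordRefreshController, My_Object__c, Status__c) are placeholders:

public with sharing class RecordRefreshController {
    private final Id recordId;

    public RecordRefreshController(ApexPages.StandardController std) {
        // The record Id comes from the standard controller backing the page.
        recordId = std.getId();
    }

    public PageReference updateAndReturn() {
        // Placeholder update; substitute the real field changes here.
        My_Object__c rec = [SELECT Id, Status__c FROM My_Object__c WHERE Id = :recordId];
        rec.Status__c = 'Processed';
        update rec;

        // setRedirect(true) issues an HTTP redirect, so the standard detail page
        // is loaded as a new request rather than served from the cached view.
        PageReference detail = new PageReference('/' + recordId);
        detail.setRedirect(true);
        return detail;
    }
}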
Hi,
I have a problem when I try to run the Encrypt.bat file in the Data Loader using the command prompt. I installed Java 8 Update 202 (64-bit), Java(TM) SE Development Kit 11.0.2 (64-bit), and Zulu 11.29 (64-bit), and then I installed Data Loader version 45.0.0.
Since I want to use the Data Loader's command-line interface, I first tried to run encrypt.bat. The following problem occurs:
c:\Program Files (x86)\salesforce.com\Data Loader\bin>encrypt.bat -g test
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/salesforce/dataloader/security/EncryptionUtil has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.launcher.LauncherHelper.checkAndLoadMain(Unknown Source)


I tried different versions of the Java JDK and JRE, but the result seems the same. Do you have any suggestions?
Thank you.
<!--campingListItem.cmp-->
<aura:component >
	<aura:attribute name="item" type="Camping_Item__c" required="true"/>

	<ui:outputText value="{!v.item.Name}" />
	<ui:outputCheckbox value="{!v.item.Packed__c}" />
	<ui:outputCurrency value="{!v.item.Price__c}" />
	<ui:outputNumber value="{!v.item.Quantity__c}" />
	<ui:button label="Packed!" press="{!c.packItem}"/>
	
</aura:component>
<!--campingListController.js-->
({
	packItem : function(component, event, helper) {
		var button = event.getSource().get("v.disabled");
		component.set("v.item.Packed__c", "true");
		component.set(button, "true");
	}
})
What am I doing wrong?
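For comparison, here is a minimal corrected sketch of the controller, assuming the intent is to mark the item as packed and then disable the button that was pressed: Packed__c should be set to the boolean true rather than the string "true", and the disabled attribute belongs on the button component that fired the event.

//campingListController.js (corrected sketch, not the original)
({
	packItem : function(component, event, helper) {
		// Use a real boolean, not the string "true".
		component.set("v.item.Packed__c", true);
		// Disable the button that fired this event.
		event.getSource().set("v.disabled", true);
	}
})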

 

I am getting a ‘Regex too complicated’ error (shown below) when loading data into our org using the following process:

1) an email service to receive the CSV data,
2) an Apex class to split and validate the CSV data, and then
3) a set of @future calls to upsert the data.

The same data works in smaller volumes, but not beyond a certain threshold. This applies whether we reduce the number of rows or reduce the width of certain columns by truncating them to 3,000 characters (a small number of columns contain 10,000 characters of text). When we do either or both of these steps, in any combination, to reduce the file size, we don't get the problem. It isn't a problem with one specific badly formatted row either, because reducing the number of rows in various combinations always makes the problem go away.
So we don't believe it is actually a regex problem, because the regular expression is just finding commas to split up a comma-separated file/string; in other words, it's very simple.

This is why we think there's an undocumented storage or capacity limit somewhere within the Apex processing that is being exceeded, but one that doesn't have a governor limit associated with it, or indeed an accurate error message. We think the error message is erroneous (it's nothing to do with complicated regex) and that it is a symptom of another issue.

This error has occurred in code that has been stable to date, but it has appeared since the file size we're uploading grew beyond about 4,600-4,800 KB, which seems to be the threshold at which the problem occurs. There appear to be undocumented limits on the volume of data that can be processed using the solution architecture we've designed.

We want to be able to code around this problem, but unless we know exactly what the error is, any changes we make to our code may not actually fix it and would be wasted effort. So I don't want to start changing things until I know exactly which part of the solution needs to change!

I've raised this with Salesforce as a potential bug, and to ask whether they could clarify any undocumented limits on processing large datasets using the process we've designed, but they seem to have decided it's a developer issue, so they won't help.

The error message is below:
Apex script unhandled exception by user/organization:
Failed to invoke future method 'public static void PrepareCSV(String, String, String, Integer, Boolean)'
caused by: System.Exception: Regex too complicated
Class.futureClassToProcess.GetList: line 98, column 17
Class.futureClassToProcess.parseCSV: line 53, column 38
Class.futureClassToProcess.PrepareCSV: line 35, column 20 External entry point

The relevant code snippet is below:
public static List<List<String>> GetList(String Content)
{
    // Protect escaped double quotes so they survive the row splitting.
    Content = Content.replaceAll(',"""', ',"DBLQT').replaceAll('""",', 'DBLQT",');
    Content = Content.replaceAll('""', 'DBLQT');

    List<List<String>> lstCSV = new List<List<String>>();
    Boolean Cont = true;
    while (Cont == true) {
        // Split off up to 499 rows; with a limit of 500, the last element
        // holds the unsplit remainder of the string.
        List<String> lstS = Content.split('\r\n', 500);
        if (lstS.size() == 500) {
            Content = lstS[499];
            lstS.remove(499);
        } else {
            Cont = false;
        }
        lstCSV.add(lstS);
    }
    return lstCSV;
}

 

Any suggestions are gratefully received as to whether we're missing something obvious, whether 4 MB+ files simply can't be processed this way, or whether this might actually be a Salesforce Apex bug.
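Not from the original post, but one way to take the regex engine out of the row split entirely (String.split in Apex is regex-based, which is where 'Regex too complicated' is typically raised) is to walk the string with indexOf and substring. A minimal sketch under that assumption; splitRows is a hypothetical helper name and assumes rows are separated by '\r\n':

// Hypothetical alternative to Content.split('\r\n', ...): no regular expressions involved.
public static List<String> splitRows(String content) {
    List<String> rows = new List<String>();
    Integer startIdx = 0;
    Integer pos = content.indexOf('\r\n', startIdx);
    while (pos != -1) {
        rows.add(content.substring(startIdx, pos));
        startIdx = pos + 2;
        pos = content.indexOf('\r\n', startIdx);
    }
    // Keep any trailing characters after the last line break.
    if (startIdx < content.length()) {
        rows.add(content.substring(startIdx));
    }
    return rows;
}

The same idea could apply to the quote handling: String.replace performs a literal (non-regex) replacement, unlike replaceAll.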

 

 

 

public static List<List<String>> GetList(String Content)
{
    //Sanjeeb
    Log('GetList started.');
    Content = Content.replaceAll(',"""', ',"DBLQT').replaceAll('""",', 'DBLQT",');
    Log('Replacing DBLQT.');
    Content = Content.replaceAll('""', 'DBLQT');
    Log('Replacing DBLQT.');
    List<List<String>> lstCSV = new List<List<String>>();
    Boolean Cont = true;
    while (Cont == true) {
        List<String> lstS = Content.split('\r\n', 500);
        Log('Split up to 500 rows.');
        //List<String> lstS = Content.Split('\r\n',1000);
        if (lstS.size() == 500) {
            Content = lstS[499];
            lstS.remove(499);
        } else {
            Cont = false;
        }
        lstCSV.add(lstS);
    }
    Log('GetList ends.');
    return lstCSV;
}