
I am getting a ‘Regex too complicated’ error below when loading data into our org using the following process:

 

1) an email service to receive the CSV data,

2) an Apex class to split and validate the CSV data, and then

3) a set of @future calls to upsert the data.

 

The same data works in smaller volumes, but not beyond a certain threshold. This applies whether we reduce the number of rows, or reduce the width of certain columns of data by truncating them to 3000 characters (a small number of columns have 10,000 characters of text included). When we do either or both of these steps in any combination to reduce the file size, we don't get this problem. It’s not a problem with a specific badly formatted row either, because reducing the number of rows in various combinations always causes the problem to go away.

 

So we don’t believe it is actually a regex problem, because the regular expression is just finding commas to split up a comma separated file/string - i.e. it's very simple.

 

This is why we think there's an undocumented storage or capacity limit somewhere within the Apex processing that is being exceeded - but one that doesn't have a governor limit associated with it, or indeed an accurate error message. We think the error message is erroneous - i.e. it's not to do with complicated regex - and that it is a symptom of another issue.

 

This error has occurred in code that has been stable to date, but has appeared since the file size we're uploading increased beyond about 4600-4800KB, which seems to be the threshold at which this problem occurs. There seem to be some undocumented limits on the volume of data that can be processed using the solution architecture we've designed.

 

We want to be able to code around this problem, but unless we know exactly what the error is, any changes we make to our code may not actually fix the problem and result in wasted effort. So I don't want to start changing this until I know exactly which part of the solution needs to be changed!

 

I’ve raised this with Salesforce as a potential bug or to see if they could clarify any undocumented limits on processing large volume datasets using the process we’ve designed, but they seem to have decided it’s a developer issue so won’t help.

 

The error message is below:

 

Apex script unhandled exception by user/organization: 

Failed to invoke future method 'public static void PrepareCSV(String, String, String, Integer, Boolean)'

caused by: System.Exception: Regex too complicated

Class.futureClassToProcess.GetList: line 98, column 17
Class.futureClassToProcess.parseCSV: line 53, column 38
Class.futureClassToProcess.PrepareCSV: line 35, column 20 External entry point

 The relevant code snippet is below:

 

 

 

public static List<List<String>> GetList(String Content)
{
    // Escape doubled quotes so the later comma split doesn't break quoted fields
    Content = Content.replaceAll(',"""', ',"DBLQT').replaceAll('""",', 'DBLQT",');
    Content = Content.replaceAll('""', 'DBLQT');
    List<List<String>> lstCSV = new List<List<String>>();
    Boolean Cont = true;
    while (Cont == true) {
        // Split off at most 500 lines at a time; when the limit is hit,
        // the 500th element holds the unsplit remainder for the next pass
        List<String> lstS = Content.split('\r\n', 500);
        if (lstS.size() == 500) {
            Content = lstS[499];
            lstS.remove(499);
        } else {
            Cont = false;
        }
        lstCSV.add(lstS);
    }
    return lstCSV;
}

 

Any suggestions gratefully received as to whether we're missing something obvious, whether 4MB+ files just can't be processed this way, or whether this might actually be a SFDC APEX bug.

 

 

 

public static List<List<String>> GetList(String Content)
{
    //Sanjeeb
    Log('GetList started.');
    Content = Content.replaceAll(',"""', ',"DBLQT').replaceAll('""",', 'DBLQT",');
    Log('Replacing DBLQT.');
    Content = Content.replaceAll('""', 'DBLQT');
    Log('Replacing DBLQT.');
    List<List<String>> lstCSV = new List<List<String>>();
    Boolean Cont = true;
    while (Cont == true) {
        List<String> lstS = Content.split('\r\n', 500);
        Log('Split up to 500 rows.');
        //List<String> lstS = Content.Split('\r\n',1000);
        if (lstS.size() == 500) {
            Content = lstS[499];
            lstS.remove(499);
        } else {
            Cont = false;
        }
        lstCSV.add(lstS);
    }
    Log('GetList ends.');
    return lstCSV;
}
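One thing worth noting about the split above: Apex's String.split(delimiter, limit) treats the delimiter as a regular expression, so every call runs the regex engine over the whole remaining string. Since the delimiter here is the literal '\r\n', the regex engine isn't strictly needed at all. Below is a sketch of a plain delimiter scan using indexOf, written in runnable Java as a stand-in for the equivalent Apex (Apex has the same indexOf/substring methods on String); whether this actually sidesteps the internal limit on 4MB+ inputs is an assumption that would need testing.

```java
import java.util.ArrayList;
import java.util.List;

public class CsvLineSplitter {
    // Split on the literal "\r\n" delimiter by scanning with indexOf,
    // so the regex engine is never invoked, regardless of input size.
    public static List<String> splitLines(String content) {
        List<String> lines = new ArrayList<>();
        int start = 0;
        int idx;
        while ((idx = content.indexOf("\r\n", start)) != -1) {
            lines.add(content.substring(start, idx));
            start = idx + 2; // skip past the two-character delimiter
        }
        lines.add(content.substring(start)); // trailing remainder
        return lines;
    }
}
```

The chunking into lists of 500 could then be layered on top of this in the same way the original while loop does it.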

I am building a Force.com application using only custom objects. One of the custom objects is called 'Properties' and holds household address information, which I wish to use to drive a mass mail merge. For example, I may have 200 property records, and I would like to generate PDF informational letters to send to all of them. I don't need to send emails.

 

At the moment I export the data to a CSV file and do a merge using MS Word on my PC, but I'd like to move this functionality into Salesforce so I can deploy it to a wider group of users and enforce consistent merge templates held in Salesforce.

 

In Salesforce the functionality I would like to use is 'Mass Mail Merge', but this seems to be limited to the standard Contacts object. Since I'll always be mail merging in bulk (100+ letters), the individual merge functionality doesn't work either. There doesn't seem to be generic mail merge functionality that is usable within the Force.com platform, but this doesn't seem to me to be an unusual use case.

 

My options seem to be:

 

1. customise the whole process in Salesforce, using Visualforce templates for each letter template

    - taking this approach means we lose the ability to have users easily amend/rewrite Word templates, and extensive ongoing VF template maintenance will be costly

2. use a paid App like Conga Mail Merge or LOOP Merge

    - taking this approach solves most of the problems, but we'd like to trigger these merges from a workflow which is not possible using this approach

3. use an external merge service like LiveDocx and send/receive merged files using web callouts

    -  taking this approach involves building customisations which effectively replicate what SFDC does. Also, there are governor limits on the number of web callouts, which constrains this approach too.

4. use a hybrid mail provider who will do the merge and then also print, envelope, stamp & send the letters

    -  this solution is also constrained by the volume limits on sending data (e.g. the merge template and field data) outbound in a web callout (as far as I can tell, there's a 100kB limit on outbound data, whereas a 2 page Word letter merge template with a few images is 230kB). This also ties us to one hybrid mail provider and means we can't choose to print locally if we want.

 

Has anyone found a way to get the standard mail merge functionality to work within Force.com, or had to build or implement similar functionality?

 

Thanks.

Hello,

 

We have 2 custom objects: 'Property' and 'Marketed_Property'. These objects are related via a lookup field on Marketed_Property, so in this relationship Property is the parent object and Marketed_Property is the child object. We have a trigger on the Marketed_Property object called 'updateStatusOnProperty' which updates all parent Property records when fields on the child Marketed_Property object are updated. There are also 2 triggers on Property which amend fields on the Property object when it is updated. The Marketed_Property object is populated by 10 @future calls carrying out a bulk update, fed by an input file of approx 2000 rows of data.

 

While processing the data using the @future jobs, we get an "UNABLE_TO_LOCK_ROW" error for one of the Property record updates in one of the @future jobs. The other 9 @future jobs complete successfully. The error is reproducible, but only on our live org, only sporadically, and with the lock occurring against a different single record each time. We have cloned our live environment in a full-size test org but cannot recreate the problem there, nor in any sandbox or DE org.

 

The trigger code is 1) doing a select on Property for all records where there is a child Marketed_Property object, 2) doing some comparisons on the Marketed_Property data to determine which Property rows/fields should be updated, and 3) updating the relevant Property records....and it's this step that's failing.

 

The code is below:

 

if (mpIds.size() == 0) { return; }
List<Property__c> RecordsBatch = new List<Property__c>(); // 'private' is not valid on a local variable
List<Property__c> props = [
    Select id, Property_Status__c, Asking_Price__c, Estate_Agent__c, Beds__c,
           Weeks_On_Market__c, Date_Marketed__c, Property_Type__c, Type__c,
           Last_Update_Date__c, Matched__c,
           (Select id, Property_Status__c, Asking_Price__c, Estate_Agent__c, Beds__c,
                   Weeks_On_Market__c, Date_Marketed__c, Property_Type__c, Type__c,
                   Last_Updated__c
            from Properties__r
            order by LastModifiedDate desc)
    from Property__c where Id IN :mpIds];
for (Property__c p : props) {
    Property__c p1 = new Property__c(ID = p.Id);
    List<Marketed_Property__c> listMP = p.Properties__r;
    if (listMP.size() > 0) {
        if (listMP.size() == 2) {
            // Take the lower of the two asking prices
            if (listMP[0].Asking_Price__c < listMP[1].Asking_Price__c) {
                p1.Asking_Price__c = listMP[0].Asking_Price__c;
            } else {
                p1.Asking_Price__c = listMP[1].Asking_Price__c;
            }
            // Derive a combined status from the two marketed records
            if (listMP[0].Property_Status__c == 'For Sale' && listMP[1].Property_Status__c == 'For Sale') {
                p1.Property_Status__c = 'For Sale';
            } else if ((listMP[0].Property_Status__c == 'For Sale' && listMP[1].Property_Status__c == 'Sold STC') ||
                       (listMP[0].Property_Status__c == 'Sold STC' && listMP[1].Property_Status__c == 'For Sale')) {
                p1.Property_Status__c = 'Sold STC';
            } else if ((listMP[0].Property_Status__c == 'For Sale' && listMP[1].Property_Status__c == 'Sold') ||
                       (listMP[0].Property_Status__c == 'Sold' && listMP[1].Property_Status__c == 'For Sale')) {
                p1.Property_Status__c = 'Sold';
            } else if (listMP[0].Property_Status__c == 'Withdrawn' && listMP[1].Property_Status__c == 'Withdrawn') {
                p1.Property_Status__c = 'Withdrawn';
            }
            // Choose one of the two records by Date_Marketed__c,
            // breaking ties on the alphabetically first estate agent
            Marketed_Property__c MP = null;
            if (listMP[0].Date_Marketed__c == listMP[1].Date_Marketed__c) {
                List<String> forEA = new List<String>();
                forEA.add(listMP[0].Estate_Agent__c);
                forEA.add(listMP[1].Estate_Agent__c);
                forEA.sort();
                if (forEA[0] == listMP[0].Estate_Agent__c) {
                    MP = listMP[0];
                } else {
                    MP = listMP[1];
                }
            } else if (listMP[0].Date_Marketed__c > listMP[1].Date_Marketed__c) {
                MP = listMP[1];
            } else {
                MP = listMP[0];
            }
            p1.Estate_Agent__c = MP.Estate_Agent__c;
            p1.Beds__c = MP.Beds__c;
            p1.Weeks_On_Market__c = MP.Weeks_On_Market__c;
            p1.Date_Marketed__c = MP.Date_Marketed__c;
            p1.Property_Type__c = MP.Property_Type__c;
            p1.Type__c = MP.Type__c;
            p1.Last_Update_Date__c = MP.Last_Updated__c;
        } else {
            // Single marketed record: copy its fields straight across
            p1.Property_Status__c = listMP[0].Property_Status__c;
            p1.Asking_Price__c = listMP[0].Asking_Price__c;
            p1.Estate_Agent__c = listMP[0].Estate_Agent__c;
            p1.Beds__c = listMP[0].Beds__c;
            p1.Weeks_On_Market__c = listMP[0].Weeks_On_Market__c;
            p1.Date_Marketed__c = listMP[0].Date_Marketed__c;
            p1.Property_Type__c = listMP[0].Property_Type__c;
            p1.Type__c = listMP[0].Type__c;
            p1.Last_Update_Date__c = listMP[0].Last_Updated__c;
        }
    }
    if (p.Matched__c == false) {
        //p.Matched__c = true;
        p1.Matched__c = true;
    }
    RecordsBatch.add(p1);
    if (RecordsBatch.size() == 1000) { // flush in chunks
        update RecordsBatch;
        RecordsBatch.clear();
    }
}
if (RecordsBatch.size() > 0) {
    update RecordsBatch;
    RecordsBatch.clear();
}

 

 

 

The error message is below:

18:3:38.13|CODE_UNIT_FINISHED
18:3:38.631|CODE_UNIT_STARTED|[EXTERNAL]updateStatusOnProperty on Marketed_Property trigger event AfterUpdate for a0DA0000000ukYY, a0DA0000000ukYZ, <snip> 186 IDs </snip>
18:3:38.719|DML_BEGIN|[62,2]|Op:Insert|Type:MatchingProHistory__c|Rows:187
18:3:39.338|DML_END|[62,2]|
18:3:39.339|SOQL_EXECUTE_BEGIN|[112,28]|Aggregations:1|Select id,Property_Status__c,Asking_Price__c,Estate_Agent__c,Beds__c,Weeks_On_Market__c,Date_Marketed__c,Property_Type__c,Type__c,Last_Update_Date__c,Matched__c,(Select id,Property_Status__c,Asking_Price__c,Estate_Agent__c,Beds__c,Weeks_On_Market__c,Date_Marketed__c,Property_Type__c,Type__c,Last_Updated__c from Properties__r order by LastModifiedDate desc)from Property__c where Id IN : mpIds
18:3:39.427|SOQL_EXECUTE_END|[112,28]|Rows:139|Duration:88
18:3:39.605|DML_BEGIN|[280,5]|Op:Update|Type:Property__c|Rows:139
18:3:49.7|DML_END|[280,5]|
18:3:49.8|EXCEPTION_THROWN|[280,5]|System.DmlException: Update failed. First exception on row 0 with id a0CA0000000Y27WMAS; first error: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record: []
18:3:49.13|FATAL_ERROR|System.DmlException: Update failed. First exception on row 0 with id a0CA0000000Y27WMAS; first error: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record: []

We've been trying to rewrite this query but it isn't improving things. Our current theory is that because the asynchronous @future calls run in parallel across the Marketed_Property records, and some of those child records share the same parent, the same Property record is being updated by multiple jobs at once and locked as a result. Opinion is divided, however, as some of the team think Salesforce's execution controls prevent such a situation from occurring.
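If the parallel-update theory holds, one mitigation would be to make the partitioning deterministic: group the input rows by parent Property before handing them to the 10 @future calls, so that all children of a given parent are processed by the same job and no two jobs ever touch the same parent row. A minimal Java sketch of that grouping (the {parentId, payload} row shape is hypothetical, not from the post):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ParentPartitioner {
    // Partition child rows into nBatches batches such that all rows sharing
    // a parent key land in the same batch. If each parallel job receives one
    // batch, no two jobs ever update the same parent record, removing the
    // cross-job lock contention.
    public static List<List<String[]>> partition(List<String[]> rows, int nBatches) {
        // rows: each String[] is {parentId, payload} - hypothetical shape
        Map<String, List<String[]>> byParent = new LinkedHashMap<>();
        for (String[] row : rows) {
            byParent.computeIfAbsent(row[0], k -> new ArrayList<>()).add(row);
        }
        List<List<String[]>> batches = new ArrayList<>();
        for (int b = 0; b < nBatches; b++) batches.add(new ArrayList<>());
        int next = 0;
        for (List<String[]> group : byParent.values()) {
            batches.get(next % nBatches).addAll(group); // whole group to one batch
            next++;
        }
        return batches;
    }
}
```

Each batch would then feed one @future call; since the jobs update disjoint sets of parent records, their row locks can no longer collide with each other.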

 

Anyone seen this before and can see something we're missing?

 

Thanks.

Message Edited by davehilary on 03-17-2010 08:32 AM

Hi,

 

I would like to display the output of a dynamic SOQL query on a custom object using an enhanced list. I can't find anything in the documentation that explicitly says this is not possible, but a number of my developers say it can't be done... and it's only enhanced lists that have this problem.

Has anyone ever managed to work around this limitation, or is it an enhancement slated for later releases of SFDC beyond Spring 10?

 

Thanks.



Hi there!

 

1) I'm wondering if there is a good guide or tutorial about Mail Merge and custom objects.

 

2) What about the "MS Office Add-in" or "MS Word Add-in"? I've read some posts about it, but I can't find it anywhere. How do I get it?

 

3) I want to create custom Word documents that pull data from custom objects fields, using Salesforce's Mail Merge functionality. Does anyone have a working example to share?

 

Any successful implementations out there?

 

Greetings.... and happy new year!!!

 

I am using the Pattern and Matcher classes to search text from an email. Sometimes I get an exception that says "Regex too complicated". I can't find any information on this. Does anyone know what can cause it? I get the premise of the exception but don't know what to do to fix it. If I put my regular expression and sample text into the tester on this site, http://www.fileformat.info/tool/regex.htm, it works fine and returns what I want. From what I understand, Salesforce uses functionality similar to Java, which is what that site uses. Any ideas? Thanks.
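In Java-style regex engines (which Apex's Pattern and Matcher appear to mirror) the usual cause of excessive regex work is backtracking: an ambiguous pattern can force the engine to retry a huge number of ways to carve up the input, especially when the match ultimately fails. Apex seems to cap this work and throw "Regex too complicated" rather than just running long, which would explain why the same pattern completes on an external Java tester. A plain-Java illustration of removing a backtracking-prone construct; the quoted-field example here is hypothetical, not taken from the original post:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class QuotedField {
    // Greedy ".*" first consumes to end-of-input and then backtracks
    // character by character hunting for the closing quote; with many
    // quotes in a long email body that work multiplies quickly.
    static final Pattern BACKTRACKING = Pattern.compile("\"(.*)\"");

    // A negated character class can never cross a quote, so each quoted
    // field is matched in a single forward pass with no backtracking.
    static final Pattern LINEAR = Pattern.compile("\"([^\"]*)\"");

    public static String firstQuoted(String text) {
        Matcher m = LINEAR.matcher(text);
        return m.find() ? m.group(1) : null;
    }
}
```

The general move is the same for other patterns: replace ambiguous constructs (nested quantifiers, greedy dot-star) with explicit character classes so every input character can only be matched one way.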
I have the following requirement: I am implementing search functionality to query a custom object based on certain criteria, and I would like to display the result set in an <apex:enhancedlist>. Is this possible?
 
 
  • October 14, 2008
Hello,
 
I am trying to take advantage of the new StandardSetController functionality with a custom data set (one not driven through a predefined listview filter). I am having difficulty finding the right syntax within the controller to enable this.
 
I have the following page:
Code:
<apex:page standardController="Account" extensions="ListPOC_Controller" tabStyle="Account" recordSetVar="Accts">
  <apex:form id="theForm">
    <apex:sectionHeader title="Page: {!pageNumber}" subtitle="Total Results: {!resultSize}">
      <apex:pageBlock>
        <apex:pageBlockTable value="{!Accts}" var="acct">
          <apex:column value="{!acct.id}" />
          <apex:column value="{!acct.name}" />
        </apex:pageBlockTable>
      </apex:pageBlock>
    </apex:sectionHeader>
  </apex:form>
</apex:page>
 
And my most recent cut of the controller extension is as follows:

Code:
public class ListPOC_Controller {
    public ListPOC_Controller(ApexPages.StandardSetController stdSetController) {
        // What do I need to bind here so the page's set controller
        // uses my custom query instead of a listview filter?
    }
    public List<Account> getAccts() {
        return [select id, name from Account limit 25];
    }
}

Is this even possible?  Can anyone help me fill in the gaps on the controller extension?
 
Thanks in advance.


Message Edited by mtbclimber on 10-04-2008 12:10 PM