I'm using the Bulk API to retrieve records from Salesforce and load them into Microsoft SQL Server for data reporting as well as linking data sets to other databases.

Some of our custom Salesforce objects contain over 1 million records, and after reading the Bulk API documentation, it seemed to fit the bill. However, the actual file retrieval is taking far too long: even for 80,000 records, it takes 4+ hours to retrieve the result files from Salesforce after the batch requests have been processed. This doesn't make sense to me, since the Bulk API is described as being geared toward high record volumes compared to the SOAP API, yet in my tests the two take about the same amount of time — there is no time savings at all.

I added logic to send multiple file-retrieval requests to Salesforce asynchronously. With the SOAP API I could usually send 20 queries at a time in a Parallel.For loop and get the responses back, but when I tried the same with the Bulk API, it timed out.

I then reduced it to 5 concurrent requests, and it still timed out.

I'm now down to 2 requests at a time and it appears to be working, but I have over 1,460 result files to retrieve.

What other options are available to make the file retrieval quicker and more efficient? This is especially puzzling since Salesforce recommends the Bulk API for large dataset retrieval and uploads.
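For reference, the "N requests in flight at once" throttling described above can be sketched in Python (my actual tool is C#, so this is only an illustrative analog; fetch_one is a stand-in for whatever call downloads a single batch result file):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(result_ids, fetch_one, max_workers=2):
    """Download batch result files with at most max_workers requests in flight.

    result_ids  -- the list of result file IDs still to retrieve
    fetch_one   -- callable that downloads one result file (hypothetical here)
    max_workers -- concurrency cap; 2 is the level that stopped timing out for me
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the input order of result_ids
        return list(pool.map(fetch_one, result_ids))
```

Raising max_workers until timeouts reappear is a cheap way to find the concurrency ceiling for a given org.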
 
I have checked the user Profile to make sure the user has access to the page layout and record type(s) associated with the Lightning Application and Object. The user has full CRUD.

The odd part is that when I created a Quick Action on another object and added it to its Lightning record page, it showed up just fine, and that object has several different page layouts and record types associated with it.

So I am not sure what the problem is with this component / quick action, but for some reason my Lightning page will not show my custom Quick Action at all.
I'm having a lot of trouble getting System.debug() output after the Winter '16 upgrade. I have System.debug() calls throughout my Apex classes and triggers, but when I set up trace flags in the debug log console, none of the log lines show up. My classes and triggers are on API version 35.

My debug level on my user trace flag is set to FINEST for all categories except Validation.

Please do not ask me to copy any code to the community — there are far too many classes involved, and much of the code is only relevant to our org. However, if you have experienced issues with debug logs in Winter '16, please let me know what you did to work around System.debug() output not appearing.

I have two classes:
OpportunityLineItemTriggers
OpportunityLineItemTriggers_Test

In the Developer Console, I run the test class, which contains 5 test methods, and all of them pass (green). Yet the code coverage on the class remains very low. When I open the OpportunityLineItemTriggers class and select Code Coverage, it shows that only one test ran — the last one in the list.

Has anyone else run into this problem, and if so, how can I get the Code Coverage to be cumulative across all of the tests in the test class?
I'm referencing a Visualforce email template in an Apex class and have set replyTo="ar@domain.com", but for some reason when recipients reply to the emails being sent, the replies come back to my personal inbox. The replyTo value works when the template is used by a time-based workflow rule, so why doesn't it work from Apex?

By the way, I tested this in a sandbox before moving to production, and when I click Reply in Outlook, the correct email address is populated, so I'm not sure why it isn't working in production.
What SQL Server data type should I use when transferring a long text area field (32,000 characters) from Salesforce to my Microsoft SQL Server database?
(Ex. Account.Description)

Several posts suggest using nvarchar(max), but when I use it, my integration code throws errors stating that values will be truncated, because some of our Account Descriptions, for example, are longer than 4,000 characters. I looked into using a BLOB, but that means converting the data from binary, and I'm not sure I want to go that route.

Has anyone else run into this issue, and if so, what did you use to store large amounts of text in Microsoft SQL Server when the text exceeds the 4,000-character limit of nvarchar?
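As a quick diagnostic, a helper like this (a Python sketch; the field name is just an example) can list which rows actually exceed the 4,000-character threshold before the insert runs, to help confirm whether the truncation comes from a parameter bound as nvarchar(4000) rather than from the nvarchar(max) column itself:

```python
def find_overlong(rows, column, limit=4000):
    """Return (row index, length) for every row whose text exceeds limit.

    rows   -- list of dicts, one per record pulled from Salesforce
    column -- the text field to check, e.g. "Description" (example name)
    """
    return [
        (i, len(row[column]))
        for i, row in enumerate(rows)
        if row.get(column) and len(row[column]) > limit
    ]
```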

Thanks.

I have a trigger on Quote Line Items. I also have triggers on Opportunity Line Items and Opportunities, and I turn methods on and off while certain line items are being processed.

When I changed the amounts on an Opportunity Line Item that syncs with a Quote Line Item, the Opportunity Line Item triggers fired, and then other processes ran which ultimately errored out in my Quote Line Item triggers.

I can find the METHOD_EXIT in the debug logs, but I can't see a corresponding METHOD_ENTRY for the quote line item triggers. Has anyone run into this before, and if so, what can I do to fix it? I need to trace what is causing the quote line item triggers to fire.
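For context, the "turning methods on and off" I mention is the usual static-guard pattern; here is its general shape in Python (the Apex version would use a static Boolean or set on a helper class — this is only an analog showing the re-entrancy logic, not my actual code):

```python
class TriggerGuard:
    """Re-entrancy guard: the first caller to enter() a given name wins;
    nested calls for the same name are told to skip their work."""
    _active = set()

    @classmethod
    def enter(cls, name):
        if name in cls._active:
            return False   # already running: caller should skip
        cls._active.add(name)
        return True

    @classmethod
    def leave(cls, name):
        cls._active.discard(name)
```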

Thanks.
I am using a Visualforce page with a Quote standard controller and an extension to customize the values that end up on the Quote Line Items. However, intermittently when I attempt to save the quote, it errors out with UNABLE_TO_LOCK_ROW: unable to obtain exclusive access to this record. Has anyone run into this before? The error is thrown on a BEFORE INSERT of a quote record. The examples I've seen involve updating a record, but in this case there is no existing record and no reference to the quote in any other code, since it's in a BEFORE INSERT.

So I'm not sure how to troubleshoot this or where to start.
I've created a C# synchronization tool which uses the getDeleted() API call, and for some reason, the GetDeletedResult contains IDs of records which have not been deleted from Salesforce.

The object giving me trouble so far is CurrencyType. As I step through the C# logic, the code inserts all of the currency type records retrieved from Salesforce into SQL Server; then, when I call getDeleted(), it returns the IDs of all currency types except the corporate currency. These currency types are still active in our Salesforce org and should not show up in the getDeleted() results.

Has anyone else seen behavior like this with the CurrencyType object? Does getDeleted() return IDs of records which have not actually been deleted?

If so, I might have to go the route of using a temp table in Microsoft SQL Server to compare what is coming from Salesforce with what is already in our SQL Server database, but I was hoping getDeleted() would keep the logic simpler.
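The temp-table comparison boils down to a set difference; in Python terms (a sketch, with plain lists of IDs standing in for the two tables):

```python
def ids_to_delete(local_ids, remote_ids):
    """IDs present in the local SQL Server copy but absent from the
    current Salesforce extract are the ones that were really deleted."""
    return sorted(set(local_ids) - set(remote_ids))
```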
Our standard Case page layout is being overridden with a Visualforce page. In that page, we use apex:detail to display the fields from the page layout.

However, when I add an embedded Visualforce page to the standard layout, nothing is displayed. The embedded page works fine on its own and in other standard pages that are not overridden, but on this page it shows nothing, not even simple text.

Here is the Case page override:

<apex:page standardController="Case" action="{!if($Profile.Name != 'Community Partner User',null,urlFor('/apex/case'))}"> 
    <style type="text/css">
        .data2Col, .dataCell { max-width:600px; word-wrap:break-word; }
        .data2Col div, .dataCell { width:700px\9; }
    </style>
   
    <apex:includeScript value="{!$Resource.project_cloud__jquery_js}" />
    <script type="text/javascript">
        function esc$(id) {
            return jQuery('#' + id.replace(/(:|\.)/g,'\\\\$1'));
        }
    </script>
    <project_cloud:tree workableId="{!Case.Id}" projectId="{!Case.project_cloud__Project_Task__r.project_cloud__Project_Phase__r.project_cloud__Project__c}" />

    <apex:pageMessages />

    <apex:form >
        <apex:detail subject="{!Case.Id}" inlineEdit="true" relatedList="true" showChatter="true" />
    </apex:form>
</apex:page>


And here is my very simple Visualforce page:

<apex:page standardController="Case"
           applyHtmlTag="false"
           applyBodyTag="false"
           showHeader="false"
           sidebar="false"
           standardStylesheets="false">
                
<html>
                
    <head>
   
    </head>

    <body>
   
        <b>Hello!!! This is the embedded page.</b>
       
    </body>
   

</html>
</apex:page>


The text doesn't show. Any ideas on why the embedded page doesn't render under these conditions would be appreciated.

Thanks.
var Pricebookid = '{!Opportunity.Opp_PricebookId__c}';
var OppId = '{!Opportunity.Id}';
var acc_id = '{!Account.Id}';

if (Pricebookid != '') {
    window.parent.location.href = '/apex/addOppPdCorporate?id=' + OppId + '&acc_id=' + acc_id;
} else {
    window.parent.location.href = '/oppitm/choosepricebook.jsp?id=' + OppId + '&retURL=/apex/addOppPdCorporate?id=' + OppId + '&acc_id=' + acc_id;
}


Any help would be appreciated.
This seems to be a bug since the Winter '16 release. Click the Change Log Level item in the Debug drop-down menu, and the Change Log Level window comes up with a perpetual "Loading" message...

Anyone else experiencing this?


Hi All,

I am having a really tough time with Salesforce test classes.

My first problem is that when I write a test class, the class I am testing does not show up in Overall Code Coverage.

Then, when I click Test, the class does show up in Class Code Coverage with a coverage %, but when I click on it, it opens without the colors showing which lines are covered and which are not.

Please let me know how to resolve this.

Thanks.