samthakur74 · Newbie · 25 points · Member since 2011

Hello,

I have created a scheduler class. The job itself is registered with the following code:

 

public class TestInsertTaskScheduler
{
    public static testMethod void testInsertTaskScheduler()
    {
        scheduledInsert i = new scheduledInsert();
        Datetime now = Datetime.now();
        System.debug('Datetime ' + now);

        String sch = '00 1 * * * ?';  // intended to execute every minute
        System.schedule('Insert Task S3', sch, i);
        System.debug('After schedule');
    }
}

The job does not appear under Monitoring > Scheduled Jobs.

The job also never executes.

 

Could someone point out the obvious error? :)

 

Thanks
Sameer

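For reference, a sketch of the two things that usually trip this up, both stated as assumptions about the org rather than a confirmed diagnosis: System.schedule takes a cron expression of the form Seconds Minutes Hours Day_of_month Month Day_of_week [optional Year], so '00 1 * * * ?' fires once per hour at minute 1, not every minute (and "every minute" is not expressible in a single Apex cron schedule); and a job scheduled inside a test method runs in test context and is rolled back when the test ends, so it never appears under Monitoring > Scheduled Jobs. Wrapping the call in Test.startTest()/Test.stopTest() forces the job to execute synchronously inside the test. Assumes scheduledInsert implements the Schedulable interface:

```apex
@isTest
private class TestInsertTaskScheduler {
    static testMethod void testInsertTaskScheduler() {
        scheduledInsert i = new scheduledInsert();
        // Format: Seconds Minutes Hours Day_of_month Month Day_of_week [Year]
        String sch = '0 0 * * * ?';  // fires at the top of every hour
        Test.startTest();
        System.schedule('Insert Task S3', sch, i);
        Test.stopTest();  // the scheduled job executes here, then test data is rolled back
    }
}
```

To see the job under Monitoring > Scheduled Jobs, schedule it outside a test method (for example from an Execute Anonymous window), since test-context jobs are never committed.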

I defined a custom object (called Transaction) and populate it from a trigger that runs after insert of the Task object. After populating the Transaction records, I want to save them so that they show up as objects of type Transaction under Data Management > Storage Usage.

How do i do this?

The only way I have saved custom objects so far is by importing them with the Data Loader. I am not sure how to save them directly from Salesforce Apex code.

Any pointers would be appreciated.

regards Sameer
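Records of a custom object can be persisted directly from Apex with a DML insert; once committed they count toward storage usage like any other rows. A minimal sketch, assuming the object's API name is Transaction__c (custom object API names always carry the __c suffix) and that it has only the standard Name field:

```apex
trigger TaskToTransaction on Task (after insert) {
    List<Transaction__c> txns = new List<Transaction__c>();
    for (Task t : Trigger.new) {
        // Build one Transaction record per inserted Task (field mapping is illustrative).
        txns.add(new Transaction__c(Name = t.Subject));
    }
    insert txns;  // DML insert persists the records in a single bulk operation
}
```

Collecting the records into a list and inserting once, rather than inserting inside the loop, keeps the trigger within governor limits on DML statements.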

I have installed the AWS S3 toolkit in Salesforce. I have the following code, where AWS_S3_ExampleController is part of the installed toolkit. When I execute this code, the debug log shows a generic 500 "Internal Server Error".

The returned SOAP fault contains:

<soapenv:Fault>
  <soapenv:Code>
    <soapenv:Value>ns1:Client.InvalidArgument</soapenv:Value>
  </soapenv:Code>
  <soapenv:Reason>
    <soapenv:Text xml:lang="en">Invalid id</soapenv:Text>
  </soapenv:Reason>
  <soapenv:Detail>
    <ArgumentValue>SAmeer Thakur</ArgumentValue>
    <ArgumentName>CanonicalUser/ID</ArgumentName>
  </soapenv:Detail>
</soapenv:Fault>

 

I do not know how to resolve this "Invalid id" error.

 

The code is:

 

AWS_S3_ExampleController c = new AWS_S3_ExampleController();
c.constructor();
c.fileName = fileName;
c.OwnerId = 'Sameer Thakur';
c.bucketToUploadObject = bucketName;
c.fileSize = 100000;
c.fileBlob = Blob.valueOf(record);
c.accessTypeSelected = 'public-read-write';
System.debug('Before insert');
c.syncFilesystemDoc();
System.debug('After insert');

 

Any pointers would be appreciated. Thank you, Sameer
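One likely cause, offered as an assumption rather than a confirmed fix: the fault shows S3 rejecting 'SAmeer Thakur' for the CanonicalUser/ID element, and that element expects the AWS canonical user ID (a 64-character hex string shown on the AWS account's Security Credentials page), not the account holder's display name. A sketch of the relevant assignment, with a placeholder ID:

```apex
// Placeholder canonical user ID; replace with the 64-character hex ID
// from your AWS account's Security Credentials page. A display name
// such as 'Sameer Thakur' is rejected with Client.InvalidArgument.
c.OwnerId = '79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be';
```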

 

 

 

Hello,

Can anyone tell me how to create a Blob representation of a CSV file on my machine using Apex code? I am trying to upload a CSV file programmatically from a custom Apex class, using the classes installed by the AWS S3 product I installed into Salesforce. I am calling the function syncFilesystemDoc in AWS_S3_ExampleController.cls. To use it correctly I need to populate fileBlob, which is where I am stuck.

 

regards

Sameer
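One caveat worth noting: Apex runs entirely on Salesforce servers and cannot read a file from a local machine. The Blob has to come from content that already exists in the org, either a CSV string built in code or a file uploaded beforehand (for example as a Document). A minimal sketch of both, assuming fileBlob expects the raw CSV bytes; the document name below is hypothetical:

```apex
// Build CSV content in Apex and wrap it in a Blob.
String csv = 'First Name,Last Name\n' + 'Sameer,Thakur\n';
Blob fileBlob = Blob.valueOf(csv);

// Alternatively, pull a CSV that was uploaded to the org as a Document:
// Document d = [SELECT Body FROM Document WHERE Name = 'employees.csv' LIMIT 1];
// Blob fileBlob = d.Body;
```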

Hello,

I am attempting to add Task records into AWS S3. I have installed Force.com for Amazon Web Services.

I wrote the following trigger code

 

trigger S3Insert on Task (before insert) {
    Task[] TaskList = Trigger.new;
    for (Task t : TaskList) {
        AWS_S3_ExampleController c = new AWS_S3_ExampleController();
        // String credName = c.createTestCredentials();
        // c.AWSCredentialName = credName;
        c.constructor();
        c.createBucket();
    }
}

 

But I get this exception:

Class.AWS_S3_ExampleController.createBucket: line 239, column 1
Trigger.S3Insert: line 9, column 1
05:36:00.224 (224038000)|FATAL_ERROR|System.FinalException: ApexPages.addMessages can only be called from a Visualforce page

 

I also see "System.CalloutException: Callout from triggers are currently not supported"

 

Is there any reason why web service callouts cannot be made from Salesforce triggers?

 

Thank you

Sameer
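The CalloutException is the platform rule itself: a trigger runs inside a transaction whose data has not yet committed, so synchronous callouts are blocked there. (The FinalException is separate: the controller calls ApexPages.addMessages, which only works in a Visualforce context.) The usual workaround is to move the callout into an @future(callout=true) method, which runs asynchronously after the trigger's transaction commits. A sketch, assuming AWS_S3_ExampleController behaves as in the toolkit's own examples:

```apex
public class S3AsyncUpload {
    // Runs asynchronously after the trigger's transaction commits,
    // where callouts are allowed.
    @future(callout=true)
    public static void createBucketAsync() {
        AWS_S3_ExampleController c = new AWS_S3_ExampleController();
        c.constructor();
        c.createBucket();
    }
}

trigger S3Insert on Task (after insert) {
    // Enqueue once per transaction rather than once per record,
    // to stay within the limit on @future invocations.
    S3AsyncUpload.createBucketAsync();
}
```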


Hello,

I get the following error during plugin installation. I have ensured Eclipse is not under Program Files, and I tried running Eclipse as administrator.

Can someone tell me exactly what to configure in Windows 7 UAC to make this problem go away?

 

I did try adding my login user to the Administrators group, and I gave all authenticated users Modify permission on the Eclipse folder.

 

regards
Sameer

 

Cannot complete the install because of a conflicting dependency.
  Software being installed: Force.com IDE 23.0.2.201201091635 (com.salesforce.ide.feature.feature.group 23.0.2.201201091635)
  Software currently installed: Eclipse IDE for Java Developers 1.4.0.20110615-0550 (epp.package.java 1.4.0.20110615-0550)
  Only one of the following can be installed at once:
    International Components for Unicode for Java (ICU4J) 4.4.2.v20110208 (com.ibm.icu 4.4.2.v20110208)
    International Components for Unicode for Java (ICU4J) 4.0.1.v20090822 (com.ibm.icu 4.0.1.v20090822)
  Only one of the following can be installed at once:
    Structured Source Editor 1.1.102.v200910200227 (org.eclipse.wst.sse.ui 1.1.102.v200910200227)
    Structured Source Editor 1.3.0.v201105101529 (org.eclipse.wst.sse.ui 1.3.0.v201105101529)
  Cannot satisfy dependency:
    From: Force.com IDE 23.0.2.201201091635 (com.salesforce.ide.feature.feature.group 23.0.2.201201091635)
    To: org.eclipse.wst.html.ui [1.0.0,2.0.0)
  Cannot satisfy dependency:
    From: Eclipse IDE for Java Developers 1.4.0.20110615-0550 (epp.package.java 1.4.0.20110615-0550)
    To: org.eclipse.epp.package.java.feature.feature.group [1.4.0.20110615-0550]
  Cannot satisfy dependency:
    From: EPP Java Package 1.4.0.20110615-0550 (org.eclipse.epp.package.java.feature.feature.group 1.4.0.20110615-0550)
    To: org.eclipse.rcp.feature.group 3.7.0

Hello,

Does anyone have code that does a bulk export of a custom object? I have used the bulk upload sample code described at http://www.salesforce.com/us/developer/docs/api_asynch/. I want something similar for bulk export.

regards
Sameer


We are implementing SSO for Salesforce using OpenAM. We followed the steps at http://blogs.oracle.com/rangal/entry/saml2_salesforce_com.

There are two scenarios

1. Idp (OpenAM) initiated SSO.

2. Service provider (Salesforce) initiated SSO.

Scenario 1 works fine. Scenario 2 does not.

I read in the SSO best practices for Salesforce that scenario 2 cannot be implemented for Salesforce SSO. Is this correct?

regards

Sameer

Hello,

I have two custom objects, Person and Address. Person is the master and Address is the detail; Address carries the master-detail relationship field pointing to Person.

How do I define the data in CSV for bulk upload (using the reference bulk upload Java code)? Is there a reference example I can use to see how a master-detail relationship is populated during bulk upload in Salesforce?

 

Thank you

Sameer
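A sketch of how the detail file can look, under the assumption that the API names are Person__c and Address__c and that the master-detail field on Address is named Person__c. Each detail row references its master by the parent's record ID (the IDs below are placeholders); alternatively, if Person has an external ID field, the header can use relationship dot notation such as Person__r.EmployeeNumber__c and carry the external ID values instead:

```csv
Name,Person__c
"221B Baker Street",a01x00000012AbC
"742 Evergreen Terrace",a01x00000012AbD
```

The parents have to be loaded first so their IDs (or external ID values) exist before the detail batch runs.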

Hello,

When I try to bulk upload 25,000 records of a custom (Employee) object, I get errors stating that storage limits were exceeded. I have ensured that there are no Employee records in Salesforce (deleted via the Data Loader) and that the Recycle Bin has been emptied. My batch size in the Data Loader is 200.

 

Now, when I try the same operation using a Java client and the Bulk API, the upload works perfectly, with no errors up to 100,000 records.

 

Does anyone know why I get this error only while using the Apex Data Loader?

regards

Sameer

Hello,


I am using the bulk upload code described at http://www.salesforce.com/us/developer/docs/api_asynch/.

 

The only difference is that I am uploading a custom object type. I can access Employee_c, but now I get a different error:

 stateMessage='InvalidBatch : Field name not found : First Name'

 

First Name is the first column in the CSV.

 

While debugging I can see that the temporary CSV is being created correctly. However, I get this error when checkResults executes. The code is exactly the same as in the sample Java code for the Bulk API using REST.

 

I am using the free Developer Edition of Salesforce. I did check the field permissions on Employee; they are all at maximum visibility and read/write.

 

Any pointers would be appreciated

 

Thanks

Sameer

 

Edit: I created a new permission set where I have given the custom object Employee read/create/edit/delete/view all/modify all permissions. All fields are given edit permission. The permission set is associated with the Salesforce user license. The programmatic login uses a user with the System Administrator profile, which has a Salesforce user license.

But still the error persists!
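One thing worth checking, stated as an assumption about the org's schema: the Bulk API matches CSV header columns to field API names, not field labels. A label such as First Name would be rejected with exactly this "Field name not found" message, because the header needs the API name with the __c suffix, for example:

```csv
First_Name__c,Last_Name__c
Sameer,Thakur
```

The exact API names are shown on the custom object's field list in Setup.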

Hello,

I am using the bulk upload code described at http://www.salesforce.com/us/developer/docs/api_asynch/.

 

The only difference is that I am uploading a custom object type. The object has been defined in Salesforce, but when I refer to the object (named Employee) I get the error "Unable to find object: Employee". I tried Employee_c as well, with the same result.

Any pointers would be appreciated

thank you

Sameer
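A possible cause, offered as an assumption: custom object API names carry a double-underscore suffix, so the object would be Employee__c (two underscores), not Employee or Employee_c. The exact API name can be confirmed from an Execute Anonymous window:

```apex
// Print the API names of all objects whose label mentions 'Employee'.
// Custom objects end in __c.
for (Schema.SObjectType t : Schema.getGlobalDescribe().values()) {
    Schema.DescribeSObjectResult d = t.getDescribe();
    if (d.getLabel().contains('Employee')) {
        System.debug(d.getName());
    }
}
```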

Hello,

We are trying to implement SSO between a web application and Salesforce. The scenario: the user logs into the web application with that application's user ID. If the login succeeds, the user can click a link on the web application page which sends a REST request to Salesforce. The REST request carries the Salesforce username and password (and anything else required to log into Salesforce).

Salesforce has been configured to use SSO, so this login request is redirected to the IdP, in our case OpenAM. I would like to retrieve the Salesforce username in OpenAM (where I have written a custom authentication module). How can I preserve the parameters passed to Salesforce when Salesforce redirects to OpenAM? I believe the parameters are lost in an HTTP redirect.

 

The web application is inside the network, Salesforce is on a public IP, and OpenAM is in the DMZ.

 

Thanks
Sameer
