hamberg

proper syntax for Hard Delete?

This should be an easy one, but it doesn't seem to be in the documentation:

When executing a hard delete with the Bulk API via the Data Loader command line, what is the right syntax?

<entry key="process.operation" value="hard delete"/> throws an error, as do 'Hard Delete' and 'harddelete'. So I'm not quite sure what to put. Regular 'delete' works just fine.

Thanks!

Best Answer chosen by Admin (Salesforce Developers) 
derekdev

I'm not sure if anything has changed in newer versions of the Data Loader since this post, but I have found it working in Apex Data Loader 21.0. The key is that you need to use both the bulkApi flag and the process operation in the correct form; otherwise you will get some very uninformative error messages back out of the Data Loader.

Inside my process-conf.xml, I am using the following config entry lines to enable this:

<entry key="sfdc.useBulkApi" value="true"/>
<entry key="process.operation" value="hard_delete"/>

These are properly hard deleting several thousand records and bypassing the Recycle Bin. I also added this response to your Idea since it seems to be working for me. Good luck!
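
For anyone wiring this up for the first time: with those two entries inside a process-conf.xml bean, the job is launched like any other command-line Data Loader process. A minimal sketch, where both the conf directory and the bean id HardDeleteJob are placeholder values:

process.bat "C:\Program Files\salesforce.com\Apex Data Loader 21.0\conf" HardDeleteJob

The first argument is the directory containing process-conf.xml, and the second is the bean id of the process to run.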

 

All Answers

mhamberg1

Is everyone else just using regular delete for this? Hard Delete seems like it would be so much better.

 

Any help would be appreciated.

Mycodex

Make sure the "Bulk API Hard Delete" option is enabled on the profile of the user you are attempting to hard delete with. This might require you to clone the Administrator profile if you haven't already; I had this issue myself. The syntax for the rest of the bean is similar to delete.
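
In Profile metadata terms, that checkbox is a user permission on the profile. A minimal sketch of what the relevant stanza looks like in a Profile metadata file; the API name BulkApiHardDelete is an assumption here, so verify it against your org's metadata before relying on it:

<Profile xmlns="http://soap.sforce.com/2006/04/metadata">
    <!-- Assumed API name for the "Bulk API Hard Delete" checkbox -->
    <userPermissions>
        <enabled>true</enabled>
        <name>BulkApiHardDelete</name>
    </userPermissions>
</Profile>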

mhamberg1

I definitely have the profile set up (from a clone) for hard delete. I am also using the exact same bean for the hard delete as I am for the delete, except my process.operation value says "Hard Delete" instead of "delete."

Is that the only difference I should have? Does anyone else have this working?

I was told by a Tier 2 rep today that the Data Loader documentation is wrong and that hard deletes aren't allowed from the command line. I've been told the wrong thing by Salesforce support in the past, so before I give up I want to know if anyone else is able to execute the hard delete successfully.

Mycodex

Workbench (http://developer.force.com/codeshare/apex/ProjectPage?id=a0630000002ahp3AAA) implemented it, and that's a tool built on PHP and the Salesforce API. It's possible...

I'll test it out and post back here.

mhamberg1

Interesting. Just for the record, this is what my bean looks like. If anyone notices anything odd about it, let me know; like I said, it's the same as a delete operation, just with the value changed (the line in question is the process.operation entry).

<bean id="AOPDelete"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <description>Deletes all AOP records from the extract file.</description>
        <property name="name" value="AOPDelete"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="false"/>
                <entry key="sfdc.debugMessagesFile" value="D:\Salesforce\INT\sfdcSoapTrace.log"/>
                <entry key="sfdc.endpoint" value="https://na2.salesforce.com"/>
                <entry key="sfdc.username" value="my_username"/>
                <entry key="sfdc.password" value="my_pass_hash"/>
		<entry key="process.encryptionKeyFile" value="C:\Program Files\salesforce.com\Apex Data Loader 19.0\conf\key.txt" />
                <entry key="sfdc.timeoutSecs" value="600"/>
		<entry key="sfdc.noCompression" value="false"/>
		<entry key="sfdc.enableRetries" value="true"/>
		<entry key="sfdc.maxRetries" value="3"/>
		<entry key="sfdc.minRetrySleepSecs" value="2"/>
                <entry key="sfdc.loadBatchSize" value="2000"/>
		<entry key="sfdc.useBulkApi" value="true"/>
		<entry key="sfdc.bulkApiCheckStatusInterval" value="10000"/>
                <entry key="sfdc.entity" value="AOP__c"/>
                <entry key="sfdc.extractionRequestSize" value="1000"/>
                <entry key="process.operation" value="Hard Delete"/>
                <entry key="process.mappingFile" value="D:\Salesforce\INT\AOP\AOP_Delete.sdl"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="dataAccess.name" value="D:\Salesforce\INT\AOP\AOP_extract.csv"/>
		<entry key="process.outputSuccess" value="D:\Salesforce\INT\AOP\AOP_Delete_Success.csv" />
		<entry key="process.outputError" value="D:\Salesforce\INT\AOP\AOP_Delete_Error.csv" />
		<entry key="process.enableLastRunOutput" value="false"/>
            </map>
        </property>
    </bean>

 

The error that I get back is this: 

java.lang.IllegalArgumentException: No enum const class com.salesforce.dataloader.action.OperationInfo.Hard Delete
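
In hindsight, that stack trace is the clue: the Data Loader maps the process.operation value directly onto a Java enum constant, and an enum constant name can never contain a space, so "Hard Delete" has no chance of matching. The spelling that does match, as derekdev confirms further down this thread, is the underscore form:

<entry key="process.operation" value="hard_delete"/>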

 

ShwetaS

Hi,

 

The administrative permission for this operation, "Bulk API Hard Delete", is disabled by default and must be enabled by an administrator. A Salesforce user license is required for hard delete.

 

Shweta
Salesforce Developer Support

If my answer solved your question, please mark it solved so I can help as many community members as possible!

Mycodex

I've tried and I still get an error. I have an open case with SF to review. Will report back when I hear something...

mhamberg1

I also have a case in with our CSM, straight to the product manager of this. It appears it's not cut and dried, which is surprising; either this is a feature or it isn't, in my opinion.

It would be nice if Salesforce were a little more transparent about this.

mhamberg1

After much discussion with internal resources at Salesforce, I was told definitively that a hard delete using the Data Loader command-line interface is not possible (even though it is included in the documentation).

So I have given up my pursuit of an answer on this. If anyone else out there knows this to be incorrect, please let me know.

I have created an idea on the IdeaExchange to have this functionality delivered: https://sites.secure.force.com/ideaexchange/ideaView?c=09a30000000D9xt&id=08730000000Ik8w&returnUrl=/apex/ideaList%3Fc%3D09a30000000D9xt%26sort%3Drecent

derekdev

I'm not sure if anything has changed in newer versions of the Data Loader since this post, but I have found it working in Apex Data Loader 21.0. The key is that you need to use both the bulkApi flag and the process operation in the correct form; otherwise you will get some very uninformative error messages back out of the Data Loader.

Inside my process-conf.xml, I am using the following config entry lines to enable this:

<entry key="sfdc.useBulkApi" value="true"/>
<entry key="process.operation" value="hard_delete"/>

These are properly hard deleting several thousand records and bypassing the Recycle Bin. I also added this response to your Idea since it seems to be working for me. Good luck!

 

This was selected as the best answer
Mycodex

I'll give it a try this week. It seems Salesforce is sneaking a lot of functionality into their releases lately without putting it in the release notes.

@derekdev - do you notice any performance gains with using hard delete? I delete 200k+ records per week as part of a refresh process, and the deletes take the longest, especially if I have to use serial mode.

derekdev

@Mycodex - It is very fast once you have the configuration correct. I am deleting about 14,000 records due to my dev environment size (before I move it to a sandbox), and I'm seeing about 15 seconds on average for the bulk job to complete on Salesforce's side; with client-side time included, you're probably looking at a little less than 30 seconds.

One issue I ran into is that when you use the process.bat file, it seems to apply some default logging settings that display millions of DEBUG lines in the command prompt window and cause a very significant performance slowdown. Being a Java developer, I can see they are using log4j with this application, but Salesforce's developers do not seem to have a logging configuration packed with the application's jar. Instead, you can add the -Dlog4j.configuration parameter to the java executable line inside process.bat so the application picks up a logging configuration; this significantly speeds up the Data Loader, since it shuts off all of the DEBUG console messages.

Inside bin\process.bat, your updated line should look something like this (assuming you have the Data Loader installed in the same location I do):

 

..\_jvm\bin\java.exe -cp ..\DataLoader.jar -Dlog4j.configuration="file:/C:/Program Files (x86)/salesforce.com/Apex Data Loader 21.0/conf/log-conf.xml" -Dsalesforce.config.dir=%1 com.salesforce.dataloader.process.ProcessRunner %PROCESS_OPTION%
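
The log-conf.xml referenced there is the logging configuration that ships in the Data Loader's conf directory. If you need to supply your own, a minimal log4j 1.x sketch of the same idea follows; the appender name and layout pattern are placeholders, and the important part is a root level of INFO rather than DEBUG:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
    <!-- Single console appender; the pattern below is just an example -->
    <appender name="console" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%d %-5p [%t] %c - %m%n"/>
        </layout>
    </appender>
    <root>
        <!-- INFO here, rather than DEBUG, is what stops the flood of console messages -->
        <priority value="INFO"/>
        <appender-ref ref="console"/>
    </root>
</log4j:configuration>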

 

 

mhamberg1

Thanks for the update on this. I bet SF just fixed it in one of the newer API revisions, and I'm really glad they did.

I'm very interested in using the hard delete, so I will eventually put this in, but I'm not excited about doing the software upgrade...

derekdev

@mhamberg - I just tried this config against Data Loader 19.0 as well, since I have that installed for a different job, and it works!

Here is my full working bean config. Note again the sfdc.useBulkApi value set to "true" and the process.operation value set to "hard_delete". I could not get this to work otherwise...

 

<bean id="salesforceLogDelete" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
	<description>Deletes the Transactional Log data that will be refreshed in salesforce</description>
	<property name="name" value="salesforceLogDelete"/>
	<property name="configOverrideMap">
		<map>
			<entry key="sfdc.debugMessages" value="false"/>
			<entry key="sfdc.debugMessagesFile" value="c:\apextest\logs\gpSalesDataToSalesforceSoapTrace.log"/>
			<entry key="sfdc.endpoint" value="https://login.salesforce.com"/>
<entry key="sfdc.useBulkApi" value="true"/>
<entry key="sfdc.username" value="my_username"/>
<!-- password below has been encrypted using key file, therefore it will not work without the key setting: process.encryptionKeyFile
the password is not a valid encrypted value, please generate the real value using encrypt.bat utility -->
<entry key="sfdc.password" value="my_password"/>
<entry key="process.encryptionKeyFile" value="c:\apextest\pass.key"/>
<entry key="sfdc.timeoutSecs" value="600"/>
<entry key="sfdc.loadBatchSize" value="2000"/>
<entry key="sfdc.entity" value="Log_Transactions__c"/>
<entry key="process.operation" value="hard_delete"/>
<entry key="process.mappingFile" value="c:\apextest\idtodelete.sdl"/>
<entry key="dataAccess.type" value="csvRead"/>
<entry key="dataAccess.name" value="c:\apextest\idtodelete.csv"/>
</map>
</property>
</bean>
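
With that bean saved in process-conf.xml, the run itself is a one-liner. A sketch, assuming c:\apextest is the directory holding process-conf.xml (inferred from the paths above, so adjust to your layout):

process.bat "c:\apextest" salesforceLogDelete

The second argument must match the bean id, which is how one process-conf.xml can hold several jobs with one picked per invocation.
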
mhamberg1

Agreed. I was able to get this working. Thanks! I'm so glad you chimed in to let us know this is now available.