Dekor

Dataloader CLI timestamped log file creation and import from folder rather than file

As someone who often has to import new assets into our system via the Data Loader tool, I've decided to try to automate the process a bit, as the GUI is long-winded for importing multiple files a day.

So I've been reading the documentation and I've created my config files/batch file.  I've finally got it working, but there are a couple of limitations I'm hoping to work around.

Here is my process-conf.xml file:
 
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="AssetUpsertProcess"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <description>Asset Upsert job gets the inventory asset record updates from a CSV file and uploads them to salesforce using 'upsert'.</description>
        <property name="name" value="AssetUpsertProcess"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.endpoint" value="https://login.salesforce.com"/> 
                <entry key="sfdc.username" value=""/>
                <entry key="sfdc.password" value=""/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.externalIdField" value="Oracle_Id__c"/>
                <entry key="sfdc.entity" value="Asset"/>
                <entry key="process.operation" value="upsert"/>
                <entry key="process.mappingFile" value="C:\Salesforce Imports\Mapping\1.3.sdl"/>
                <entry key="dataAccess.name" value="C:\Salesforce Imports\Inventory Imports\Ready to import\import.csv"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="process.initialLastRunDate" value="2006-12-01T00:00:00.000-0800"/>
                <entry key="process.outputError" value="C:\Salesforce Imports\errors\error.csv"/>
                <entry key="process.outputSuccess" value="C:\Salesforce Imports\errors\success.csv"/>
                <entry key="process.useEuropeanDates" value="true"/>
            </map>
        </property>
    </bean>
</beans>

My queries relate to three settings: dataAccess.name, process.outputError, and process.outputSuccess.

Now, for dataAccess.name you have to specify a file name.  Is there a way to specify a folder name instead, so it imports all the .csv files in that folder?  This would let me import multiple files at once, rather than having to save each file into the specified folder with the same file name after each run.

Regarding the output error/success settings, I'd like the results files to be timestamped rather than constantly overwritten.  The GUI does this, so I assume there is a way to do it from the command line.  I like to keep an audit of all imports, and having these log files is crucial to that.
 
Lalit Mistry 21
Hi Dekor,

A bean in process-conf.xml is specific to a single data load operation on one object.
To load data for multiple objects, you'll need multiple beans in your process-conf.xml, one for each object, each with its own data file and error/success files.
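For example, a second bean for Accounts might look like the sketch below. It mirrors the structure of the AssetUpsertProcess bean above; the bean id, mapping file, and CSV paths are made-up placeholders you'd replace with your own.

```xml
<!-- Hypothetical second bean: same ProcessRunner class, different object -->
<bean id="AccountInsertProcess"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
    <description>Inserts Account records from their own CSV file.</description>
    <property name="name" value="AccountInsertProcess"/>
    <property name="configOverrideMap">
        <map>
            <entry key="sfdc.endpoint" value="https://login.salesforce.com"/>
            <entry key="sfdc.entity" value="Account"/>
            <entry key="process.operation" value="insert"/>
            <entry key="process.mappingFile" value="C:\Salesforce Imports\Mapping\account.sdl"/>
            <entry key="dataAccess.name" value="C:\Salesforce Imports\Account Imports\import.csv"/>
            <entry key="dataAccess.type" value="csvRead"/>
            <entry key="process.outputError" value="C:\Salesforce Imports\errors\account_error.csv"/>
            <entry key="process.outputSuccess" value="C:\Salesforce Imports\errors\account_success.csv"/>
        </map>
    </property>
</bean>
```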
To keep a history of the success/error files, you can move them into a folder stamped with the date and time, as demonstrated in the batch file below.
:: Navigate to the bin folder of Data Loader
cd /d "C:\Program Files (x86)\salesforce.com\Data Loader\bin"

:: Assuming the process-conf.xml file is present at C:\DataLoaderConfig,
:: invoke Data Loader as below:
:: CALL process.bat <folder containing process-conf.xml> <bean id>
:: For multiple files you can invoke process.bat once per bean (assuming
:: AssetUpsertProcess, AccountInsertProcess and ContactUpdateProcess are bean ids)
CALL process.bat "C:\DataLoaderConfig" AssetUpsertProcess
CALL process.bat "C:\DataLoaderConfig" AccountInsertProcess
CALL process.bat "C:\DataLoaderConfig" ContactUpdateProcess

:: Build a yyyyMMdd_hhmmss stamp; pad single-digit values with a leading zero.
:: The %date% substring offsets assume a US-style date such as "Thu 11/16/2023";
:: adjust them for your locale.
set hour=%time:~0,2%
if "%hour:~0,1%" == " " set hour=0%hour:~1,1%

set min=%time:~3,2%
if "%min:~0,1%" == " " set min=0%min:~1,1%

set secs=%time:~6,2%
if "%secs:~0,1%" == " " set secs=0%secs:~1,1%

set DATETIME=%date:~10,4%%date:~4,2%%date:~7,2%_%hour%%min%%secs%

:: Navigate to the error/success file location
cd /d "C:\Salesforce Imports\errors\"
mkdir %DATETIME%
:: Move all csv files into the folder stamped with the datetime
move *.csv %DATETIME%

cls

Hope this helps.
Dekor
Ah, that's a shame.  It seems silly that you have to hard-code a specific file rather than just having a folder you can drop your CSVs into before running the bat file.
Edgars Everts 6
Hi,

Just put all your .CSV files in a single INPUT location.

Then use a script (schedule it for automation) that looks into INPUT and takes the files one by one: copy each file to /read and run the Data Loader job.
After each job run, clean /read and move the input file to a /Processed location.
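The steps above can be sketched as a batch wrapper around the single AssetUpsertProcess bean. This is a sketch only: the INPUT and Processed folder names are assumptions, and it assumes dataAccess.name in the bean still points at the fixed "Ready to import\import.csv" path from the original config.

```bat
@echo off
:: Hypothetical folder layout; adjust these paths to match your own setup
set INPUT=C:\Salesforce Imports\Inventory Imports\Input
set READ=C:\Salesforce Imports\Inventory Imports\Ready to import
set PROCESSED=C:\Salesforce Imports\Inventory Imports\Processed

:: Take each CSV from INPUT, copy it to the fixed file name the bean expects,
:: run the job, then archive the original input file
for %%f in ("%INPUT%\*.csv") do (
    copy /y "%%f" "%READ%\import.csv"
    CALL "C:\Program Files (x86)\salesforce.com\Data Loader\bin\process.bat" "C:\DataLoaderConfig" AssetUpsertProcess
    del "%READ%\import.csv"
    move "%%f" "%PROCESSED%"
)
```

Scheduled via Task Scheduler, this effectively gives you the drop-folder behaviour Dekor asked about, without changing the bean itself.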

Enjoy IT!