URVASHI

Need a Bulk API Example to start

Hi,

Can anyone explain the Bulk API with a small code sample?

I know the Bulk API is used for processing large amounts of data and is used with the Apex Data Loader, but what if someone wants to implement the Bulk API without the Apex Data Loader?

 

Please help with an example.

 

Thanks.

Best Answer chosen by Admin (Salesforce Developers) 
Dhaval Panchal

Hi Urvashi,

 

1) The Bulk API always returns CSV files as the result.
2) The getBulkConnection function creates the connection with Salesforce; you need to provide your
    user name and password (password + security token).
3) The createJob function creates the job which will execute the bulk operation.
4) The createBatchesFromCSVFile function reads the CSV file at the path given in the "csvFileName" parameter and splits it into upload batches.

Below is the example I used for my requirement.

 

//This is the first function I call when I need to execute my query using the Bulk API.
//(closeJob and createCSVFile below are my own helper methods, not shown in this snippet.)

public void retrieveBulkData(String userName, String password,
		String objectName, String query, String fileName, String environment)
		throws ConnectionException, AsyncApiException,
		FileNotFoundException, IOException {
		
	long start=System.currentTimeMillis();	
	BulkConnection bulkconnection = getBulkConnection(userName,password,environment);
	JobInfo job = createJob(objectName,bulkconnection);
	//String query = "SELECT Name,Id FROM contact";
	ByteArrayInputStream byteArrayInputStream =getData(query,bulkconnection,job);
	closeJob(bulkconnection, job.getId());
	System.out.println("FileName::"+fileName);
	createCSVFile(byteArrayInputStream, fileName);
	long end=System.currentTimeMillis();
	
	long total = end - start; // elapsed milliseconds
	System.out.println("total:"+total);
	int seconds=(int) ((total/1000)%60);
	int minutes=(int) ((total / (1000 * 60)) % 60);
	System.out.println("Time To Execute is::"+minutes+":"+seconds);
}

 //The function below creates the connection with Salesforce.

public BulkConnection getBulkConnection(String userName, String password,
		String environment) throws ConnectionException, AsyncApiException {
	BulkConnection bulkConnection = null;
	try {
		ConnectorConfig config = new ConnectorConfig();
		config.setUsername(userName);
		config.setPassword(password);
		System.out.println("BulkAPI.java file environment: "+environment);
		if(environment.trim().equalsIgnoreCase("Production")){
			System.out.println("conn for production type");
			config.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/26.0");
		}
		else if(environment.trim().equalsIgnoreCase("Sandbox")){
			System.out.println("conn for sand box type");
			config.setAuthEndpoint("https://test.salesforce.com/services/Soap/u/26.0");
		}
		else{
			System.out.println("conn else case");
			config.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/26.0");
		}
		config.setCompression(true);
		config.setTraceFile("traceLogs.txt");
		config.setTraceMessage(true);
		config.setPrettyPrintXml(true);
		System.out.println("AuthEndpoint: " + config.getAuthEndpoint());
		// Creating a PartnerConnection performs the SOAP login and stores
		// the session id on the ConnectorConfig.
		new PartnerConnection(config);
		System.out.println("SessionID: " + config.getSessionId());
		String soapEndpoint = config.getServiceEndpoint();
		String apiVersion = "26.0";
		String restEndpoint = soapEndpoint.substring(0,
				soapEndpoint.indexOf("Soap/"))
				+ "async/" + apiVersion;
		config.setRestEndpoint(restEndpoint);
		bulkConnection = new BulkConnection(config);
		
	} 
	
	catch (AsyncApiException aae) {
		aae.printStackTrace();
	} catch (ConnectionException ce) {
		ce.printStackTrace();
	} catch (FileNotFoundException fnfe) {
		fnfe.printStackTrace();
	}
	return bulkConnection;
	
}
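The only non-obvious step in getBulkConnection above is deriving the async (Bulk API) endpoint from the SOAP service endpoint returned after login. Here is a standalone sketch of just that string manipulation (the org URL below is made up for illustration; it is not a real endpoint):

```java
public class AsyncEndpointDemo {
    // Derive the Bulk (async) API endpoint from a SOAP service endpoint,
    // the same way getBulkConnection does: keep everything up to "Soap/",
    // then append "async/" and the API version.
    static String toRestEndpoint(String soapEndpoint, String apiVersion) {
        return soapEndpoint.substring(0, soapEndpoint.indexOf("Soap/"))
                + "async/" + apiVersion;
    }

    public static void main(String[] args) {
        String soap = "https://na1.salesforce.com/services/Soap/u/26.0/00Dx00000000001";
        System.out.println(toRestEndpoint(soap, "26.0"));
        // prints https://na1.salesforce.com/services/async/26.0
    }
}
```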

 
//The function below creates the job.

public JobInfo createJob(String sobjectType, BulkConnection bulkconnection)
		throws AsyncApiException {
	JobInfo job = new JobInfo();
	job.setObject(sobjectType);
	job.setOperation(OperationEnum.query);
	job.setConcurrencyMode(ConcurrencyMode.Parallel);
	job.setContentType(ContentType.CSV);
	job = bulkconnection.createJob(job);
	assert job.getId() != null;
	job = bulkconnection.getJobStatus(job.getId());
	System.out.println(job);
	return job;
}

 //The function below submits the query batch and retrieves the result data.

public ByteArrayInputStream getData(String query,
		BulkConnection bulkConnection, JobInfo job) {
	ByteArrayInputStream inputStream = null;
	try{
		BatchInfo info = null;
		ByteArrayInputStream bout = new ByteArrayInputStream(
				query.getBytes());// convert query into ByteArrayInputStream
		info = bulkConnection.createBatchFromStream(job, bout);// creates batch from ByteArrayInputStream
		String[] queryResults = null;// holds the result ids from the QueryResultList
		QueryResultList list = null;
		int count=0;
		for (int i = 0; i < 10000; i++) // maximum number of polling attempts
		{
			count++;
			Thread.sleep(30 * 1000); // wait 30 seconds between status checks
			
			info = bulkConnection.getBatchInfo(job.getId(), info.getId());
			// If Batch Status is Completed,get QueryResultList and store in
			// queryResults.
			if (info.getState() == BatchStateEnum.Completed) {
				list = bulkConnection.getQueryResultList(job.getId(),
						info.getId());
				queryResults = list.getResult();

				break;
			} else if (info.getState() == BatchStateEnum.Failed) {
				System.out.println("-------------- failed ----------"
						+ info);
				break;
			} else {
				System.out.println("-------------- waiting ----------"
						+ info);
			}
		}
		System.out.println("count::--"+count);
		System.out.println("QueryResultList::" + list.toString());
		if (queryResults != null) {
			//For each result id, retrieve the result stream for the given
			//job id, batch id and result id. Note: if the query returns more
			//than one result file, only the LAST result stream is kept here;
			//write each stream out as it is retrieved if you need all of them.
			for (String resultId : queryResults) {
				inputStream=(ByteArrayInputStream) bulkConnection.getQueryResultStream(job.getId(),
						info.getId(), resultId);
			}
		}
		
	} catch (AsyncApiException aae) {
		aae.printStackTrace();
	} catch (InterruptedException ie) {
		ie.printStackTrace();
	}
	return inputStream;
}
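The waiting loop inside getData is a generic poll-until-done pattern. Here is the same pattern sketched on its own, decoupled from the Salesforce classes (the BatchState enum and the Iterator below are stand-ins for BatchStateEnum and getBatchInfo, not real WSC types, and the sleep is shortened for the demo):

```java
import java.util.Iterator;
import java.util.List;

public class PollDemo {
    enum BatchState { InProgress, Completed, Failed }

    // Poll a status source until it reports Completed or Failed, or until
    // maxAttempts checks have been made. Returns the last observed state.
    static BatchState waitForBatch(Iterator<BatchState> statusSource,
                                   int maxAttempts, long sleepMillis) {
        BatchState state = BatchState.InProgress;
        for (int i = 0; i < maxAttempts && statusSource.hasNext(); i++) {
            try {
                Thread.sleep(sleepMillis); // getData waits 30 s between checks
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
                break;
            }
            state = statusSource.next(); // stands in for getBatchInfo(...)
            if (state == BatchState.Completed || state == BatchState.Failed) {
                break; // finished, or permanently failed
            }
        }
        return state;
    }

    public static void main(String[] args) {
        // Simulate a batch that completes on the third status check.
        List<BatchState> states = List.of(BatchState.InProgress,
                BatchState.InProgress, BatchState.Completed);
        System.out.println(waitForBatch(states.iterator(), 10, 1L));
        // prints Completed
    }
}
```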

 //I use the function below when I need to do an insert/update operation to Salesforce from a CSV file.

private List<BatchInfo> createBatchesFromCSVFile(BulkConnection connection,JobInfo jobInfo, String csvFileName) throws IOException,	AsyncApiException {
	List<BatchInfo> batchInfos = new ArrayList<BatchInfo>();
	BufferedReader rdr = new BufferedReader(new InputStreamReader(
			new FileInputStream(csvFileName)));
	// read the CSV header row
	// Note: CommonPropertyManager is an internal utility from my own project
	// (not part of the Salesforce jars); dirName is unused below and can be removed.
	CommonPropertyManager cpm = PropertyManagerFactory.getCommonPropertyManager();
	String dirName=cpm.getPath().getTemporaryFilesPath() + java.io.File.separator +"salesforce.com"+ java.io.File.separator;
	byte[] headerBytes = (rdr.readLine() + "\n").getBytes("UTF-8");
	int headerBytesLength = headerBytes.length;
	//File tmpFile = File.createTempFile("bulkAPIInsert", ".csv");
	File tmpFile = File.createTempFile("bulkAPIUpdate", ".csv");
	// Split the CSV file into multiple batches
	try {
		FileOutputStream tmpOut = new FileOutputStream(tmpFile);
		int maxBytesPerBatch = 10000000; // 10 million bytes per batch
		int maxRowsPerBatch = 10000; // 10 thousand rows per batch
		int currentBytes = 0;
		int currentLines = 0;
		String nextLine;
		while ((nextLine = rdr.readLine()) != null) {
			System.out.println("In while");
			byte[] bytes = (nextLine + "\n").getBytes("UTF-8");
			// Create a new batch when our batch size limit is reached
			if (currentBytes + bytes.length > maxBytesPerBatch
					|| currentLines > maxRowsPerBatch) {
				createBatch(tmpOut, tmpFile, batchInfos, connection,
						jobInfo);
				currentBytes = 0;
				currentLines = 0;
			}
			if (currentBytes == 0) {
				tmpOut = new FileOutputStream(tmpFile);
				tmpOut.write(headerBytes);
				currentBytes = headerBytesLength;
				currentLines = 1;
			}
			tmpOut.write(bytes);
			currentBytes += bytes.length;
			currentLines++;
		}
		// Finished processing all rows
		// Create a final batch for any remaining data
		System.out.println("After While");
		if (currentLines > 1) {
			System.out.println("before createBatch");
			createBatch(tmpOut, tmpFile, batchInfos, connection, jobInfo);
			System.out.println("after createBatch");
		}
	} finally {
		rdr.close();
		try
		{
			tmpFile.delete();	
		}
		catch (Exception e)
		{
			System.out.println("Exception in deleting file");
		}
		
	}
	return batchInfos;
}

private void createBatch(FileOutputStream tmpOut, File tmpFile,List<BatchInfo> batchInfos, BulkConnection connection,JobInfo jobInfo) throws IOException, AsyncApiException {
    tmpOut.flush();
    tmpOut.close();
    FileInputStream tmpInputStream = new FileInputStream(tmpFile);
    try {
        BatchInfo batchInfo = connection.createBatchFromStream(jobInfo,
                tmpInputStream);
        System.out.println(batchInfo);
        batchInfos.add(batchInfo);
    }
    catch(Exception ex){
        System.out.println("##### Exception in createbatch " + ex);
    } finally {
        tmpInputStream.close();
    }
}
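The splitting rules inside createBatchesFromCSVFile can be tried in isolation. The sketch below applies the same logic (repeat the header in every batch; cut a new batch when the byte or row limit would be exceeded) to an in-memory list of lines, with tiny limits so the split is visible. It is a simplification for illustration, not the upload code itself:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class CsvSplitDemo {
    // Split CSV data rows into batches, prefixing each batch with the header
    // and honouring both a byte limit and a row limit per batch.
    static List<List<String>> split(String header, List<String> rows,
                                    int maxBytesPerBatch, int maxRowsPerBatch) {
        List<List<String>> batches = new ArrayList<>();
        int headerBytes = (header + "\n").getBytes(StandardCharsets.UTF_8).length;
        List<String> current = null;
        int currentBytes = 0;
        for (String row : rows) {
            int rowBytes = (row + "\n").getBytes(StandardCharsets.UTF_8).length;
            boolean limitHit = current != null
                    && (currentBytes + rowBytes > maxBytesPerBatch
                        || current.size() - 1 >= maxRowsPerBatch); // -1 excludes header
            if (current == null || limitHit) {
                current = new ArrayList<>();
                current.add(header); // every batch starts with the header row
                currentBytes = headerBytes;
                batches.add(current);
            }
            current.add(row);
            currentBytes += rowBytes;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<List<String>> batches = split("Id,Name",
                List.of("1,a", "2,b", "3,c"), 1000, 2); // at most 2 data rows per batch
        System.out.println(batches.size()); // prints 2
    }
}
```

With a row limit of 2, three data rows split into two batches: the first with the header plus two rows, the second with the header plus the remaining row.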

 I hope this will be helpful to you.

All Answers

souvik9086

What do you want to do with the Bulk API? That is, what is the actual requirement here?

 

Thanks

Dhaval Panchal
Refer below link.

http://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_code_walkthrough.htm

It was useful to me when developing Bulk API code in Java.
URVASHI

Currently I am using the SOAP API to transfer records, but it is hitting many of the governor limits, like:

Maximum size of callout request or response (HTTP request or Web services call): 3 MB
Total number of callouts: 10

 

So I am looking for a workaround: trying to use the SOAP API together with the Bulk API so that I can transfer data in bulk without hitting the governor limits.

 

Can you please provide me with a starting example for learning the Bulk API, or some practical implementation of it?

 

 

Thanks a lot :)

URVASHI

Thanks dapanchal.

But can you please tell me what the Java code at the link you provided is doing?

Dhaval Panchal


This was selected as the best answer
URVASHI

Hey, thanks dapanchal for the code. But I had a few doubts regarding it:

 

1. How does the above code utilize the Bulk API of Salesforce?

 

2. How will these methods be called from Salesforce? I mean, how will they be consumed in Salesforce (using a web service, or a WSDL, or what)?

 

3. When I tried creating a project in Eclipse, I found that this code is incomplete; it references many classes and methods that are not shown. If you could please provide the entire skeleton of the code, including its classes and interfaces, I would be very grateful, as I am new to Java and it is becoming difficult to understand.

 

 

Thanks a lot once again.

URVASHI

I got answers to the questions I asked above. But now my problem is that my proxy settings are not allowing me to log in to Salesforce using the following URL:

Failed to send request to https://login.salesforce.com/services/Soap/u/26.0

 

which is specified in the code. Please help me with this; I would be really grateful.

 

Thanks a lot.

Dhaval Panchal
Try one thing:

Go to Setup -> Develop -> API -> click any of the links (e.g. Partner WSDL) -> search for the text "login.salesforce.com". You will find a URL with a version in it; try using that URL. I am not sure, but that might work.
URVASHI

Yes, I will try that.

But can you please give me the jar files for the source code you have just uploaded?

Please share them, as your code is quite different from the client application in the Bulk API Developer Guide.

 

Thanks a lot. :)

 

Dhaval Panchal
Ok, I will share them with you soon.
Dhaval Panchal
Click on below link to download jar files.

https://drive.google.com/folderview?id=0B6cPMtpRoi9FUVNUR09UNjRoS2c&usp=sharing

partner.jar and wsc-22.jar are for the Salesforce API; opencsv.jar is not related to Salesforce, but I have used it for creating and reading CSV files.
URVASHI

Hey, thanks a lot for the jar files.

But one more issue. I am getting an error here:

 CommonPropertyManager cpm = PropertyManagerFactory.getCommonPropertyManager();

Can you tell me why that is?

It happens even after adding all the jars.

 

 

Thanks a lot :) :)

Dhaval Panchal

Don't use that; it is our internal custom property class and is not from the jar files. It just retrieves a directory path, so you don't need to worry about it. Use your own code to get the directory.

Sidharth

How can we get the bulk query results into a CSV file from within the Java code?
My requirement is to write a Java batch which will do a bulk query and then bulk delete those records.
I am assuming that for bulk delete you need to have a CSV file. Please correct me if I am wrong.

-Sid
Sidharth
Hi Dhaval Panchal, can you please provide a complete Bulk Query Java example which can read and print the query results to the Java output console, or write the query results to a CSV file? Thanks.
john yungk
@Sidharth I believe that few, if any, changes would be needed: pass a SOQL statement in the input data file given to the createBatchesFromCSVFile method, then retrieve the query results in the getData method. In fact, if you look at the code, it is already configured to run a query in createJob: job.setOperation(OperationEnum.query).
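For the delete half of Sidharth's requirement: a Bulk API delete job takes a CSV whose only column is Id, one record id per row, with the job created using OperationEnum.delete instead of query. A hedged sketch of building that CSV body from a list of queried ids (plain string handling, no Salesforce classes; the ids below are made up):

```java
import java.util.List;

public class DeleteCsvDemo {
    // Build the CSV body for a Bulk API delete job: a single "Id" header
    // followed by one record id per line.
    static String buildDeleteCsv(List<String> ids) {
        StringBuilder sb = new StringBuilder("Id\n");
        for (String id : ids) {
            sb.append(id).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(buildDeleteCsv(List.of("0012800000a1AAA", "0012800000a2AAA")));
        // prints the header "Id" followed by the two ids, one per line
    }
}
```

The resulting string, written to a file, could then be fed through createBatchesFromCSVFile against a job whose operation is delete.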

 
Dipak Sonawane 20
Dhaval - thanks for sharing a very good example of the Bulk API.

It's really helpful for newbies.

 
vilas kadudhuram
Hi Dhaval Panchal,

we have a requirement to send bulk messages to SFDC through the Java API.

Java code to connect to SFDC:

static final String USERNAME = "userId";
static final String PASSWORD = "password+securitykey";
static final String authEndpoint = "https://test.salesforce.com/services/Soap/c/37.0/0DF630000008OVd";
static EnterpriseConnection connection;

public static void main(String[] args) {
    ConnectorConfig config = new ConnectorConfig();
    config.setUsername(USERNAME);
    config.setPassword(PASSWORD);
    config.setProxy("proxy-url", 8080);
    config.setAuthEndpoint(authEndpoint);
    try {
        connection = Connector.newConnection(config);
    }

but while establishing the connection to SFDC we get the below error:
com.sforce.ws.ConnectionException: Failed to send request to https://test.salesforce.com/services/Soap/c/37.0/0DF630000008OVd
    at com.sforce.ws.transport.SoapConnection.send(SoapConnection.java:120)
    at com.sforce.soap.enterprise.EnterpriseConnection.login(EnterpriseConnection.java:1)
    at com.sforce.soap.enterprise.EnterpriseConnection.<init>(EnterpriseConnection.java:1)
    at com.sforce.soap.enterprise.Connector.newConnection(Connector.java:1)
    at SFDCBulkAPI.main(SFDCBulkAPI.java:26)
Caused by: java.io.IOException: Unable to tunnel through proxy. Proxy returns "HTTP/1.1 503 Service Unavailable"

Note: 1. We are able to get the session id when we execute it through SAP PI with the same user id, password and proxy URL.
      2. Valid values were supplied for USERNAME, PASSWORD and "proxy-url" in the above Java code.

Could you please help us understand why we are unable to send the request through the above Java code?

Thanks in advance,
Vilas
sandeep gupta 47
Getting the below error while creating a job using BulkConnection.

I am using the below code:
final ConnectorConfig config = new ConnectorConfig();
config.setRestEndpoint(restEndpoint);
config.setSessionId(accessToken);

final BulkConnection connection = new BulkConnection(config);

JobInfo job = new JobInfo();
job.setObject(sobjectType);
job.setOperation(OperationEnum.insert);
job.setContentType(ContentType.CSV);
job = connection.createJob(job);

[AsyncApiException  exceptionCode='ClientInputError'
 exceptionMessage='Failed to parse exception '
]

    at com.sforce.async.BulkConnection.parseAndThrowException(BulkConnection.java:201)
    at com.sforce.async.BulkConnection.createOrUpdateJob(BulkConnection.java:166)
    at com.sforce.async.BulkConnection.createOrUpdateJob(BulkConnection.java:134)
    at com.sforce.async.BulkConnection.createJob(BulkConnection.java:124)
    at com.intuit.platform.crms.CrmsDataPlatformApplicationTests.createJob(CrmsDataPlatformApplicationTests.java:117)
    at com.intuit.platform.crms.CrmsDataPlatformApplicationTests.getConnection(CrmsDataPlatformApplicationTests.java:61)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)..