Jeffrey Zhang (NEWBIE, 30 Points, Member since 2014)

JobInfo job = new JobInfo();
job.setObject("Account");
job.setOperation(OperationEnum.query);
job.setConcurrencyMode(ConcurrencyMode.Parallel);
job.setContentType(ContentType.CSV);
job = sfc.getBulkConnection().createJob(job);
assert job.getId() != null;
job = sfc.getBulkConnection().getJobStatus(job.getId());

String query = "SELECT Name, Id FROM Account";
long start = System.currentTimeMillis();
BatchInfo info = null;
ByteArrayInputStream bout = new ByteArrayInputStream(query.getBytes());
info = sfc.getBulkConnection().createBatchFromStream(job, bout);

String[] queryResults = null;

for (int i = 0; i < 10000; i++) {
    Thread.sleep(30000); // 30 sec
    info = sfc.getBulkConnection().getBatchInfo(job.getId(), info.getId());

    if (info.getState() == BatchStateEnum.Completed) {
        QueryResultList list =
            sfc.getBulkConnection().getQueryResultList(job.getId(), info.getId());
        queryResults = list.getResult();
        break;
    } else if (info.getState() == BatchStateEnum.Failed) {
        System.out.println("-------------- failed ----------" + info);
        break;
    } else {
        System.out.println("-------------- waiting ----------" + info);
    }
}
System.out.println("Account!");

if (queryResults != null) {
    for (String resultId : queryResults) {
        inputStream = (ByteArrayInputStream) sfc.getBulkConnection()
            .getQueryResultStream(job.getId(), info.getId(), resultId);
    }

    // ArrayList<Account> beans = (ArrayList<Account>) rowProcessor.getBeans();
    // System.out.println("array of accounts:" + beans.toString());
    System.out.println("Account2!");

    int c = inputStream.available();
    System.out.println("Account3!");

    byte[] bytes = new byte[c];
    inputStream.read(bytes, 0, c);
    System.out.println("Account4!");

    String s = new String(bytes, StandardCharsets.UTF_8); // Or any encoding.
    s = s.replaceAll("\"", "");
    System.out.println("Account5!");

    System.out.println("results!:" + s);
    System.out.println("size:" + s.length());

    String[] accounts = s.split("\n");
    ArrayList<Account> accs = new ArrayList<Account>(1000000);

    for (int i = 1; i < accounts.length; i++) {
        String[] accountparts = accounts[i].split(",");
        Account a = new Account();
        a.setName(accountparts[0]);
        a.setId(accountparts[1]);
        accs.add(a);
    }
}
I haven't really worked with data this large in Java before, so clearly I'm doing something wrong. The requirements are to pull down all the accounts (~450k), and I'm assigning them to an ArrayList.

From my investigation,
 
if (queryResults != null) {
    for (String resultId : queryResults) {
        inputStream = (ByteArrayInputStream) sfc.getBulkConnection()
            .getQueryResultStream(job.getId(), info.getId(), resultId);
    }

is where it's hanging (it takes around 5 minutes). I wonder if there is anything I can do to speed this up, or if there are any other solutions? Thanks!
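One thing that stands out in the read path: InputStream.available() only reports the bytes readable without blocking, not the full response size, and building one giant String before splitting keeps several copies of ~450k rows in memory at once. A sketch of streaming the CSV row by row instead (CsvStreamParser is a made-up name; it keeps the original code's assumption that values contain no embedded commas):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class CsvStreamParser {
    // Parse a Bulk API CSV result stream one row at a time instead of
    // calling available() and loading the whole response into one String.
    public static List<String[]> parse(InputStream in) {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            reader.readLine(); // skip the "Name","Id" header row
            String line;
            while ((line = reader.readLine()) != null) {
                // strip quotes and split, as the original code did
                rows.add(line.replace("\"", "").split(","));
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return rows;
    }
}
```

Each row can be turned into an Account as it streams, so no intermediate 450k-row String is ever built.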
 
I was wondering if there is any good Java sample code for splitting up queries with a giant WHERE clause, because I ended up hitting the 20,000-character limit error:

 stateMessage='ClientInputError : Failed to read query. Exceeded max size limit of 20000 with response size 20001'


Thanks! 
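One approach to the 20,000-character limit is to chunk the id list and submit one query (or one batch) per chunk. A minimal sketch, with QueryChunker as an illustrative name:

```java
import java.util.ArrayList;
import java.util.List;

public class QueryChunker {
    // Split one huge "WHERE Id IN (...)" query into several queries that
    // each stay under a size budget (the Bulk API rejects queries over
    // 20,000 characters).
    public static List<String> chunk(String selectAndWhere, List<String> ids, int maxLen) {
        List<String> queries = new ArrayList<>();
        StringBuilder in = new StringBuilder();
        for (String id : ids) {
            String quoted = "'" + id + "'";
            // +1 for the comma separator, +2 for the surrounding parentheses
            if (in.length() > 0
                    && selectAndWhere.length() + in.length() + 1 + quoted.length() + 2 > maxLen) {
                queries.add(selectAndWhere + "(" + in + ")");
                in.setLength(0);
            }
            if (in.length() > 0) in.append(",");
            in.append(quoted);
        }
        if (in.length() > 0) queries.add(selectAndWhere + "(" + in + ")");
        return queries;
    }
}
```

Usage would look like `chunk("SELECT Id FROM Account WHERE Id IN ", ids, 20000)`, then one createBatchFromStream per returned query.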
I'm getting this error when trying to run the Bulk API example from developer.salesforce.com:
 
2017-02-14 15:03:03.381 ERROR 19228 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet]    : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is [AsyncApiException  exceptionCode='InvalidUrl'
 exceptionMessage='Destination URL not reset. The URL returned from login must be set'
]
] with root cause

com.sforce.async.AsyncApiException: InvalidUrl : Destination URL not reset. The URL returned from login must be set
	at com.sforce.async.BulkConnection.parseAndThrowException(BulkConnection.java:125) ~[wsc-23-min.jar:23-min]
	at com.sforce.async.BulkConnection.createOrUpdateJob(BulkConnection.java:109) ~[wsc-23-min.jar:23-min]
	at com.sforce.async.BulkConnection.createJob(BulkConnection.java:89) ~[wsc-23-min.jar:23-min]
	at ves.sfdc.service.AccountService.doBulkQuery(AccountService.java:183) ~[classes/:na]

It's failing at the createJob call here:
 
public boolean login() {
    boolean success = false;

    String userId = "awefawf";
    String passwd = "awfa";
    String soapAuthEndPoint = "https://test.salesforce.com/services/Soap/c/28.0";
    // String bulkAuthEndPoint = "https://test.salesforce.com/services/Soap/async/28.0";
    String bulkAuthEndPoint = soapAuthEndPoint.substring(0, soapAuthEndPoint.indexOf("Soap/")) + "async/28.0";

    String proxy = "wafaefa";

    try {
        ConnectorConfig config = new ConnectorConfig();
        config.setUsername(userId);
        config.setPassword(passwd);
        config.setAuthEndpoint(soapAuthEndPoint);
        config.setCompression(true);
        config.setTraceFile("traceLogs.txt");
        config.setTraceMessage(true);
        config.setPrettyPrintXml(true);
        if ((proxy != null) && (!proxy.equals(""))) {
            config.setProxy(proxy, 80);
        }
        config.setRestEndpoint(bulkAuthEndPoint);

        System.out.println("AuthEndpoint: " + config.getRestEndpoint());
        EnterpriseConnection connection = new EnterpriseConnection(config);
        System.out.println("SessionID: " + config.getSessionId());
        bulkConnection = new BulkConnection(config);
        success = true;
    } catch (AsyncApiException aae) {
        aae.printStackTrace();
    } catch (ConnectionException ce) {
        ce.printStackTrace();
    } catch (FileNotFoundException fnfe) {
        fnfe.printStackTrace();
    }
    return success;
}
public void doBulkQuery() throws AsyncApiException, InterruptedException {
    // SForceConnector sfc = new SForceConnector();
    // EnterpriseConnection connection = sfc.getConnection();

    if (!login()) {
        return;
    }

    JobInfo job = new JobInfo();
    job.setObject("Account");
    job.setOperation(OperationEnum.query);
    job.setConcurrencyMode(ConcurrencyMode.Parallel);
    job.setContentType(ContentType.CSV);
    job = bulkConnection.createJob(job);
    assert job.getId() != null;
I'm not able to find anyone else who got this error from googling, which is surprising. I wonder if there is something obvious that I missed. The EnterpriseConnection is fine; something is just off at createJob.

Thanks!
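The "Destination URL not reset. The URL returned from login must be set" message generally means the Bulk API rest endpoint was built from the login URL (test.salesforce.com) instead of from the instance URL that the login call returns. A commonly used fix is to create the EnterpriseConnection first (the SOAP login updates the config's service endpoint to the instance URL) and only then derive and set the rest endpoint. A sketch of the derivation (BulkEndpoints is a hypothetical helper name):

```java
public class BulkEndpoints {
    // Derive the Bulk API endpoint from the SOAP service endpoint that
    // login() returns, instead of hard-coding the test.salesforce.com
    // login host. Mirrors the substring trick already used in login().
    public static String fromServiceEndpoint(String serviceEndpoint, String apiVersion) {
        return serviceEndpoint.substring(0, serviceEndpoint.indexOf("Soap/"))
                + "async/" + apiVersion;
    }
}
```

In the login() above, that would mean moving the setRestEndpoint call to after `new EnterpriseConnection(config)`, e.g. `config.setRestEndpoint(BulkEndpoints.fromServiceEndpoint(config.getServiceEndpoint(), "28.0"));`.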
I've successfully done a POST login to Salesforce and retrieved the access token. I keep getting a 401 Unauthorized error when I try to do a GET, though.
 
public Custom[] QueryAllRules(SFDCResult sr) {
    RestTemplate restTemplate = new RestTemplate();
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_FORM_URLENCODED);
    headers.setAccept(Arrays.asList(MediaType.APPLICATION_JSON));
    headers.set("Authorization", "OAuth " + sr.getaccess_token());
    log.info("access token " + sr.getaccess_token());
    HttpEntity<String> entity = new HttpEntity<String>(headers);
    log.info("entity header " + entity.getHeaders());

    Custom[] gr = restTemplate.getForObject(
        "https://vz1--REL170217.cs50.my.salesforce.com/services/data/v38.0/query?q=Select+Id+,+Name+,+Account_Access_Level__c+,+Active__c+From+Custom__c",
        Custom[].class, entity);

    return gr;
}

I'm still learning Maven/Spring at the moment, so I'm thinking I might have a formatting issue somewhere. Thanks for the help!
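One likely culprit: RestTemplate.getForObject treats its trailing arguments as URI template variables, not as a request entity, so the HttpEntity (and the Authorization header inside it) is never actually sent; `restTemplate.exchange(url, HttpMethod.GET, entity, Custom[].class)` is the variant that sends the headers. As a dependency-free sketch of the request shape (SoqlRequest, the host, and the token are placeholders; Salesforce accepts "Bearer <token>", and the legacy "OAuth <token>" prefix also works):

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SoqlRequest {
    // Build (but do not send) a REST query request with the
    // Authorization header set; nothing goes over the wire until the
    // caller reads from the connection.
    public static HttpURLConnection build(String instanceUrl, String soql, String token) {
        try {
            String url = instanceUrl + "/services/data/v38.0/query?q="
                    + URLEncoder.encode(soql, "UTF-8");
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Authorization", "Bearer " + token);
            conn.setRequestProperty("Accept", "application/json");
            return conn;
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

The same two details (header actually attached, SOQL URL-encoded) are what the Spring exchange() call needs to get right.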
Not sure if this is the right section to ask, but I've been tasked with rebuilding an old Java application that used SOAP calls for upserting data. The problem is that the size of our org often leads to timeout issues, which is a pain with this app being run on a Windows scheduler.

I was thinking the Bulk API would be a good solution for the upsert. The question is, though: I haven't really used Java in a decade or so, and I'm not familiar with the options out there for CSV generation off of something like a map or other list structures.

Does anyone have experience developing apps that download data from Salesforce, repackage it, and upsert it back via the Bulk API?

thanks!
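For the CSV side, the JDK alone is enough for a simple map-to-CSV conversion; a minimal sketch (CsvBuilder is a made-up name; real projects often reach for a library such as Apache Commons CSV or opencsv to handle escaping edge cases):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CsvBuilder {
    // Turn a list of field-name -> value maps into Bulk API style CSV:
    // one header row of field names, then one quoted row per record.
    public static String toCsv(List<String> columns, List<Map<String, String>> rows) {
        StringBuilder sb = new StringBuilder();
        appendRow(sb, columns);
        for (Map<String, String> row : rows) {
            List<String> values = new ArrayList<>();
            for (String col : columns) values.add(row.getOrDefault(col, ""));
            appendRow(sb, values);
        }
        return sb.toString();
    }

    private static void appendRow(StringBuilder sb, List<String> cells) {
        for (int i = 0; i < cells.size(); i++) {
            if (i > 0) sb.append(',');
            // quote every cell and double any embedded quotes
            sb.append('"').append(cells.get(i).replace("\"", "\"\"")).append('"');
        }
        sb.append('\n');
    }
}
```

The resulting String (or a ByteArrayInputStream over it) can be handed straight to createBatchFromStream on an upsert job.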
I'm not too familiar with maps and AggregateResult in Salesforce, so I'm not sure if this is a formatting issue or something else, but I keep getting that error in my trigger when the following is called:
Map<Id, AggregateResult> maxDates = new Map<Id, AggregateResult>(
    [SELECT WhoId id, MAX(Sales_Activity_Date__c) myMax, MIN(Sales_Activity_Date__c) myMin
     FROM Task
     WHERE WhoId = :who AND Status = 'Completed' AND WhoId != null
     GROUP BY WhoId]);

Is it a formatting issue, or something else?

Thanks.
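If the constructor form is what's failing, a common workaround is to build the map by hand from the aliased group-by field rather than passing the query into the Map constructor. A hedged sketch, assuming whoIds is a Set<Id> covering the leads of interest (the original binds a single :who, which also sits oddly next to GROUP BY WhoId):

```apex
Map<Id, AggregateResult> maxDates = new Map<Id, AggregateResult>();
for (AggregateResult ar : [SELECT WhoId id,
                                  MAX(Sales_Activity_Date__c) myMax,
                                  MIN(Sales_Activity_Date__c) myMin
                           FROM Task
                           WHERE WhoId IN :whoIds AND Status = 'Completed' AND WhoId != null
                           GROUP BY WhoId]) {
    // the grouped field was aliased to "id", so pull it back out by that name
    maxDates.put((Id) ar.get('id'), ar);
}
```

This keeps one query for the whole set of WhoIds and makes the map key explicit instead of relying on the constructor's Id handling.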
trigger ba on Task (after update, after insert) {

    GetMaxActivityDate ad = new GetMaxActivityDate();

    Set<ID> ids = Trigger.newMap.keySet();

    List<Task> tk = [SELECT Id, WhoId, OwnerId FROM Task WHERE Id = :ids LIMIT 1000];
    System.debug('b tasks:' + tk + ' size:' + tk.size());

    List<ID> idzz = new List<ID>{};
    for (Task a : tk) {
        idzz.add(a.WhoId);
    }

    List<Lead> ldz = [SELECT Name, Id, Sales_Last_Activity_Date__c, Sales_First_Activity_Date__c,
                             OwnerId, IsConverted
                      FROM Lead WHERE Id = :idzz LIMIT 1000];
    System.debug('b ids:' + idzz + '. leads:' + ldz + ' size:' + ldz.size());

    List<Lead> updateld = new List<Lead>{};

    for (Lead a : ldz) {
        String ldd = a.Id;
        if (ldd != null) {
            // only process Lead ids (key prefix 00Q)
            String lds = ldd.subString(0, 3);
            System.debug('lds: ' + lds);

            if (lds.equals('00Q')) {
                List<AggregateResult> dates = new List<AggregateResult>();

                // Lead ld = [SELECT Name, Id, Sales_Last_Activity_Date__c,
                //            Sales_First_Activity_Date__c, IsConverted
                //            FROM Lead WHERE Id = :a.WhoId LIMIT 1];

                if (a != null) {
                    System.debug('b ' + lds);

                    // de-duplicate the WhoIds collected above
                    Set<ID> taskset = new Set<ID>();
                    List<ID> result = new List<ID>();
                    taskset.addAll(idzz);
                    result.addAll(taskset);
                    System.debug('taskset:' + taskset.size() + ' result:' + result.size());

                    for (ID b : result) {
                        if (b == a.Id) {
                            dates = ad.getMaxActivity(b);
                            System.debug('b triggered:  max:' + (Date) dates[0].get('myMax')
                                + ' min: ' + (Date) dates[0].get('myMin')
                                + ' last activity date:' + a.Sales_Last_Activity_Date__c
                                + ' first activity date ' + a.Sales_First_Activity_Date__c
                                + ' isconverted:' + a.IsConverted);
                            System.debug('b triggered: lead:' + a);

                            if (a.IsConverted == false
                                    && dates[0].get('myMax') != null
                                    && dates[0].get('myMin') != null
                                    && (a.Sales_Last_Activity_Date__c != (Date) dates[0].get('myMax')
                                        || a.Sales_First_Activity_Date__c != (Date) dates[0].get('myMin'))) {
                                System.debug('Activity_date updated');
                                a.Sales_Last_Activity_Date__c = (Date) dates[0].get('myMax');
                                a.Sales_First_Activity_Date__c = (Date) dates[0].get('myMin');
                                updateld.add(a);
                            }
                        }
                    }
                }
            }
        }
    }

    update updateld;
}

public List<AggregateResult> getMaxActivity(ID who) {
    List<AggregateResult> maxDates = [SELECT MAX(Sales_Activity_Date__c) myMax,
                                             MIN(Sales_Activity_Date__c) myMin
                                      FROM Task
                                      WHERE WhoId = :who AND Status = 'Completed'];
    System.debug('Max is:  ' + maxDates[0].get('myMax') + ' min ' + maxDates[0].get('myMin'));
    return maxDates;
}
I was getting this governor limit error in the past, and I rewrote most of the trigger to hopefully fix it, yet I still occasionally see this error. From personally inserting a bunch of data I can't seem to reproduce it, so I'm having a hard time debugging.

I'm thinking it's due to getMaxActivity, though, because technically it's a SELECT in a for loop? I'm not sure how else I can logically address this, though.

thanks!
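Since getMaxActivity runs one SOQL query per matched lead inside the loop, a bulk insert or update of enough tasks will exceed the per-transaction SOQL query governor limit, which would explain why the error only shows up occasionally. The usual fix is a single grouped aggregate query before the loop; a sketch reusing idzz, ldz, and updateld from the trigger above (the alias name w is made up):

```apex
// One aggregate query for every WhoId in this trigger batch,
// instead of one query per lead inside the loop.
Map<Id, AggregateResult> datesByWho = new Map<Id, AggregateResult>();
for (AggregateResult ar : [SELECT WhoId w,
                                  MAX(Sales_Activity_Date__c) myMax,
                                  MIN(Sales_Activity_Date__c) myMin
                           FROM Task
                           WHERE WhoId IN :idzz AND Status = 'Completed'
                           GROUP BY WhoId]) {
    datesByWho.put((Id) ar.get('w'), ar);
}

for (Lead a : ldz) {
    AggregateResult ar = datesByWho.get(a.Id);
    if (ar == null || a.IsConverted) continue;
    Date myMax = (Date) ar.get('myMax');
    Date myMin = (Date) ar.get('myMin');
    if (myMax != null && myMin != null
            && (a.Sales_Last_Activity_Date__c != myMax
                || a.Sales_First_Activity_Date__c != myMin)) {
        a.Sales_Last_Activity_Date__c = myMax;
        a.Sales_First_Activity_Date__c = myMin;
        updateld.add(a);
    }
}
```

The query count then stays constant no matter how many tasks arrive in one batch, and the inner "for each id, compare to this lead" loop disappears as well.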
 
We have a Visualforce 'edit' page for a custom object. The value is stored in a String (e.g. " a;b;c"). I want to be able to pull the saved values and preselect them in a multiselect selectList so users don't have to reselect the old values when editing.

value = [select ....];

<apex:selectList value="{!value}" size="3" multiselect="true" label="blah" id="blah">
    <apex:selectOptions value="{!options}" />
</apex:selectList>

I've tried a plain String and a String[] list format for 'value', and neither gets me any default selected values for this select list.
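For the preselection to work, a multiselect apex:selectList generally needs to bind to a String[] (or List<String>) whose elements exactly match the option values. A hedged sketch of a controller property that splits the stored " a;b;c" string (MyController, myRecord, and My_Field__c are placeholder names); note the trim(), since an element " a" with a leading space will not match an option whose value is "a":

```apex
public String[] selectedValues { get; set; }

public MyController() {
    // Split the stored multiselect string into the array the
    // <apex:selectList> binds to, trimming stray whitespace.
    String stored = myRecord.My_Field__c; // e.g. " a;b;c"
    selectedValues = new List<String>();
    if (stored != null) {
        for (String s : stored.split(';')) {
            selectedValues.add(s.trim());
        }
    }
}
```

The page side then becomes `<apex:selectList value="{!selectedValues}" multiselect="true" ...>`; on save, the array can be re-joined with String.join(selectedValues, ';') before writing back to the field.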