• deepak_naik
  • NEWBIE
  • 19 Points
  • Member since 2012

  • Chatter Feed
  • 0 Best Answers
  • 0 Likes Received
  • 1 Likes Given
  • 8 Questions
  • 14 Replies
Our application uses the Partner WSDL to integrate with the Salesforce server. The standard extract and load operations work fine. Now that we plan to move this application to production, we have a security concern: is there a minimum set of permissions that can be assigned to a user (or a profile that can be set up) so that the user can perform extract and load operations through the Partner WSDL, while the security of the org to which the user connects is not compromised?
We have a Java program which uses the Bulk API (with PK chunking) to extract records from an object. The SELECT query has a WHERE clause that should extract only a subset of records, but we see that a huge number of batches are created on the Salesforce server side, and many times the Bulk API limits are hit.

For example :
Number of records in the Account object: 20,000. We enable the connection and set the PK chunking header, with a chunk size of 2,000:
 
bulkConnection.addHeader("Sforce-Enable-PKChunking","chunkSize=2000");

Select Query :
SELECT Id, Name FROM Account WHERE CreatedDate >= 2018-07-20T05:00:00.000Z


Given the WHERE condition, the query should return only 10 records (at most 1 batch), but we see that 10 batches are created on the Salesforce server.
Why are 10 batches created when the number of records to be extracted is only 10, which can fit in 1 batch?
Is there any setting that can be applied, or code tuning that can be done, so that only 1 batch is created on the Salesforce server?
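The batch count follows from how PK chunking works: Salesforce partitions the object's full Id range into chunks first and applies the WHERE clause inside each chunk afterwards, so the number of batches depends on the total record count in the object, not on how many rows match. A minimal sketch of the arithmetic (the 20,000 / 2,000 figures are the ones from the example above):

```java
public class PkChunkMath {
    // PK chunking partitions the object's full Id range, not the filtered
    // result, so the batch count is ceil(totalRecordsInObject / chunkSize).
    static int expectedBatches(long totalRecordsInObject, long chunkSize) {
        return (int) ((totalRecordsInObject + chunkSize - 1) / chunkSize);
    }

    public static void main(String[] args) {
        // 20,000 Accounts with chunkSize=2000 -> 10 batches, even if the
        // WHERE clause only matches 10 rows overall.
        System.out.println(expectedBatches(20_000, 2_000)); // prints 10
    }
}
```

Chunks whose Id range matches no rows simply complete with zero records. To reduce the batch count you can raise chunkSize (up to the documented maximum of 250,000), or drop the PK chunking header entirely for a selective query like this one.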
For bulk query jobs, if the job has PK chunking enabled, will the original batch (the one that contains the query from which the subsequent batches are created) always appear at index 0 in the BatchInfo list?

For example if we have the following
BatchInfo[] bListInfo = bulkConnection.getBatchInfoList(job.getId()).getBatchInfo();

Will the original query batch always be at bListInfo[0]? If not, how can this be ensured?
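As far as I know, the API does not guarantee the original batch's position in the list; what is documented is that, once the chunked batches are created, the original query batch is moved to the NotProcessed state. A safer approach is therefore to select by state rather than by index. A sketch, using a minimal stand-in type instead of com.sforce.async.BatchInfo so it can run offline:

```java
import java.util.List;
import java.util.Optional;

public class OriginalBatchFinder {
    // Minimal stand-in for com.sforce.async.BatchInfo (id + state only),
    // so the selection logic can be shown without a live connection.
    static final class Batch {
        final String id; final String state;
        Batch(String id, String state) { this.id = id; this.state = state; }
        String id() { return id; }
        String state() { return state; }
    }

    // With PK chunking, Salesforce marks the original query batch
    // NotProcessed once the chunked batches are spawned; select it by
    // state rather than assuming it is element 0 of getBatchInfoList().
    static Optional<Batch> findOriginalBatch(List<Batch> batches) {
        return batches.stream()
                .filter(b -> "NotProcessed".equals(b.state()))
                .findFirst();
    }

    public static void main(String[] args) {
        List<Batch> batches = List.of(
                new Batch("batchA", "Completed"),
                new Batch("batchB", "NotProcessed"),  // the original query batch
                new Batch("batchC", "InProgress"));
        System.out.println(findOriginalBatch(batches).get().id()); // prints batchB
    }
}
```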
I am using a bulk query, reading one batch at a time and writing to standard output. When the record count is small, say 100,000 records, the job works fine, but when I try to extract something like 7 million records (7,000,000), the job aborts with the following exception:
Job id is 750A0000004YXlsIAG
[AsyncApiException exceptionCode='ClientInputError' exceptionMessage='Server error returned in unknown format' ]
at com.sforce.async.BulkConnection.parseAndThrowException(BulkConnection.java:190)
at com.sforce.async.BulkConnection.doHttpGet(BulkConnection.java:747)
at com.sforce.async.BulkConnection.getBatchInfo(BulkConnection.java:557)
at com.sforce.async.BulkConnection.getBatchInfo(BulkConnection.java:550)
Attached is the source code; the Salesforce jar files I use are force-partner-api-39.0.0.jar, force-wsc-39.0.0.jar, jackson-core-asl-1.9.13.jar, and jackson-mapper-asl-1.9.13.jar.

Am I missing anything here?
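One pattern that helps with this symptom: "Server error returned in unknown format" is raised when the server momentarily answers with something other than the expected XML (an HTML error page, a proxy timeout), which becomes more likely over the hours a 7-million-record job runs. Wrapping each getBatchInfo poll in a bounded retry with backoff is a common mitigation; a generic sketch (the retry parameters are arbitrary):

```java
import java.util.concurrent.Callable;

public class RetryingPoll {
    // Generic bounded retry with linear backoff; intended to wrap calls like
    // bulkConnection.getBatchInfo(jobId, batchId), which can transiently fail
    // with AsyncApiException when the server returns a non-XML error page.
    static <T> T withRetries(Callable<T> call, int maxAttempts, long sleepMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                Thread.sleep(sleepMillis * attempt);  // back off before retrying
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Demo with a call that fails twice before succeeding.
        int[] failures = {2};
        String result = withRetries(() -> {
            if (failures[0]-- > 0) throw new IllegalStateException("transient");
            return "ok";
        }, 5, 10);
        System.out.println(result); // prints ok
    }
}
```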
Hi

I have a fairly basic question about the number of batches created when using the Bulk API for the QUERY operation.
The Bulk API limits say that the maximum number of records that can be returned from a batch in a bulk query request is 10,000. If I send a query in a batch request to the Bulk API and the query result is 50,000 records, will the createJob() method take care of creating the 5 batches that will each return 10,000 records?

When I do something like "job = connection.createJob(job); batchInfo = connection.createBatchFromStream(jobInfo, is);", where my input stream is, say, "Select Id, Name from Account", which should return 50,000 records, how do I get the information for each batch that is created?

Or should I create the batches myself so that the number of records does not exceed 10,000 per batch? If so, I can get the result for the first 10,000 records, but how do I get the result sets for the subsequent records?

Regds,
Deepak
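For a bulk query you create exactly one batch containing the query; the server, not createJob(), splits the output, and the pieces are retrieved as separate result sets via getQueryResultList() and getQueryResultStream(). The retrieval loop can be sketched offline with a hypothetical seam standing in for BulkConnection (ResultSource below is illustrative, not part of the API):

```java
import java.util.ArrayList;
import java.util.List;

public class QueryResultPager {
    // Hypothetical seam standing in for BulkConnection so the pattern runs
    // offline: listResultIds models getQueryResultList(jobId, batchId) and
    // fetchResult models reading getQueryResultStream(jobId, batchId, resultId).
    interface ResultSource {
        List<String> listResultIds(String jobId, String batchId);
        String fetchResult(String jobId, String batchId, String resultId);
    }

    // A bulk query is submitted as ONE batch; the server may split its
    // OUTPUT into several result sets, each drained by its result id.
    static List<String> drainAllResults(ResultSource src, String jobId, String batchId) {
        List<String> chunks = new ArrayList<>();
        for (String resultId : src.listResultIds(jobId, batchId)) {
            chunks.add(src.fetchResult(jobId, batchId, resultId));
        }
        return chunks;
    }

    public static void main(String[] args) {
        // Fake source returning two result sets, as the server might for a
        // 50,000-record query.
        ResultSource fake = new ResultSource() {
            public List<String> listResultIds(String j, String b) {
                return List.of("r1", "r2");
            }
            public String fetchResult(String j, String b, String r) {
                return r + "-csv";
            }
        };
        System.out.println(drainAllResults(fake, "job", "batch")); // prints [r1-csv, r2-csv]
    }
}
```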
Hi

While generating code using Axis2 1.7.2 and partner.wsdl, the job hangs and cannot proceed.

The command I run is "%AXIS2_HOME%\bin\WSDL2Java -uri partner.wsdl -p com.sforce.soap.partner -Dlog4j.configuration=file:C:\workAXIS2BIN\log4j\log4j.properties"

The output I get is the following; the processing just hangs:

Using AXIS2_HOME:   C:\workAXIS2BIN\axis172\axis2-1.7.2
Using JAVA_HOME:    C:\JDK\tools
Retrieving document at 'partner.wsdl'.
2016-05-27 02:06:40,496 WARN  - No schemaLocation for import of urn:partner.soap.sforce.com; compilation may fail
2016-05-27 02:06:40,497 WARN  - No schemaLocation for import of urn:sobject.partner.soap.sforce.com; compilation may fail



Am I missing anything? I am using Sun Java 1.7.

Regds,
Deepak
Regarding the topic "Salesforce disabling TLS 1.0", as described in the link "https://help.salesforce.com/apex/HTViewSolution?id=000221207&language=en_US":
Which is the forum where I can post questions pertaining to the above link?

I am running the API compatibility test "How do I test the compatibility of an API (inbound) integration to Salesforce?"
When I run the test, I see the following handshake failure:

main, READ: TLSv1 Alert, length = 2
main, RECV TLSv1 ALERT:  fatal, handshake_failure
main, called closeSocket()
main, handling exception: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
AxisFault
 faultCode: {http://schemas.xmlsoap.org/soap/envelope/}Server.userException
 faultSubcode:
 faultString: javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
 faultActor:
 faultNode:
 faultDetail:
        {http://xml.apache.org/axis/}stackTrace:javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
        at com.ibm.jsse2.j.a(j.java:4)
        at com.ibm.jsse2.j.a(j.java:31)
        at com.ibm.jsse2.qc.b(qc.java:624)

        
Does this mean my application is not compatible? I see that we are using TLSv1, which is supposed to be supported.

Regds,
Deepak
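One thing worth checking on the client side: once TLS 1.0 is disabled, the handshake only succeeds if the JVM offers TLS 1.1 or 1.2, and older Axis/IBM JSSE stacks often default to TLSv1, which produces exactly this handshake_failure alert. A small probe for whether the runtime can negotiate TLSv1.2 (the system properties in the comment are the commonly cited switches; verify them against your JDK's documentation):

```java
import javax.net.ssl.SSLContext;

public class TlsCheck {
    // After Salesforce disables TLS 1.0, the client must offer TLS 1.1+.
    // This checks whether the running JVM can build a TLSv1.2 context.
    static boolean supportsTls12() {
        try {
            SSLContext ctx = SSLContext.getInstance("TLSv1.2");
            ctx.init(null, null, null);
            for (String p : ctx.getSupportedSSLParameters().getProtocols()) {
                if ("TLSv1.2".equals(p)) return true;
            }
            return false;
        } catch (Exception e) {
            return false;  // no TLSv1.2 provider available in this JVM
        }
    }

    public static void main(String[] args) {
        // For Axis 1.x clients, a commonly cited fix is starting the JVM with
        // -Dhttps.protocols=TLSv1.2 and, on IBM JDKs,
        // -Dcom.ibm.jsse2.overrideDefaultTLS=true.
        System.out.println(supportsTls12());
    }
}
```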

 

Hi,
  For load operations using the Web Service API, is the value "" (empty string) a valid input to be stored as a null value in Salesforce?
I have a Java program that uses the Web Service API, version 22, and in my code I compare the value with null ("if (value == null)"). This seems to be fine, but a customer of ours complained that an empty string ("") does not get stored as null; the value is simply ignored.
We need to understand whether an input value of "" (empty string) is a valid value to be stored as null in Salesforce.

Regds,
Deepak
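For what it's worth, the Partner API behavior is that an empty string sent as a field value on update is ignored by the server; to explicitly null a field you must list its name in the SObject's fieldsToNull array. The helper below (illustrative, not part of the API) shows one way to split an input row accordingly before building the SObject:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class NullFieldMapper {
    // In the Partner API, "" sent as a field value on update is ignored;
    // clearing a field requires naming it in fieldsToNull. This splits an
    // input row into values to set and fields to null.
    static void mapRow(Map<String, String> input,
                       Map<String, String> fieldsToSet,
                       List<String> fieldsToNull) {
        for (Map.Entry<String, String> e : input.entrySet()) {
            String v = e.getValue();
            if (v == null || v.isEmpty()) {
                fieldsToNull.add(e.getKey());   // "" means: clear this field
            } else {
                fieldsToSet.put(e.getKey(), v);
            }
        }
    }

    public static void main(String[] args) {
        Map<String, String> in = new LinkedHashMap<>();
        in.put("Name", "Acme");
        in.put("Phone", "");  // customer wants this cleared, not ignored
        Map<String, String> set = new LinkedHashMap<>();
        List<String> toNull = new ArrayList<>();
        mapRow(in, set, toNull);
        System.out.println(set + " fieldsToNull=" + toNull); // prints {Name=Acme} fieldsToNull=[Phone]
    }
}
```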

Hi Everyone,
We have an integration environment between DataStage and Salesforce. Currently we are using a Data Loader script (process XML) file, with a reference to a mapping file, for the upsert operation. My requirement is to establish a relationship between a Bond (the parent object) and its Coverages (the child object). There is a Bond field on the child object defined as "Lookup(Bond)". In the mapping file, to update the lookup (i.e. Bond) field on the child object (i.e. Coverage), I have defined the following:
Bond__r_ExternalID__c=Bond__r\:ExternalID__c
With the above mapping, the Bond field (the child object field) is updated with the ExternalID of the Bond record (the parent's ExternalID).
Example: initially the parent object Bond is created with ExternalID = 1234. After that, the child object Coverages' lookup field Bond is updated with the value Bond_ExternalID = 1234. This way, Coverages are linked to the Bond (parent object).
For the same scenario:
We would like to use the Salesforce Pack for DataStage. How can I define a mapping for the child object (Coverages) so that the Bond lookup field is updated with the parent Bond record (parent ExternalID) during the upsert operation?

Please let me know how I can achieve the above requirement.

Thanks,
Dev

 
Hi All, 

 Could I please get some assistance in figuring out where the issue is? What could be the possible causes in my sandbox when its outbound messages are getting errors while talking to the client's web server?

endurl: https://cramappstage.edgewebhosting.net/EdgeSalesForce/EdgeSalesforce.asmx
error: org.apache.commons.httpclient.NoHttpResponseException: The server cramappstage.edgewebhosting.net failed to respond

I have no clue where to look, or whether the issue is on my end. Any feedback is appreciated.
Hello all,

I am trying to do the following, but I have no clue where to start. Is there an article I can read or a video to watch on this topic?

Write a SOAPUI script to do the following:
1.) Login to your salesforce application
2.) Retrieve the ids of some contacts
3.) Retrieve the names of the contacts whose ids were retrieved earlier

There is no need to worry about parsing the variables from the responses; they can be hard-coded into the next request.

Thanks
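For step 3, the usual shape is a SOQL query with an IN clause built from the Ids returned in step 2; since the exercise allows hard-coding, you can paste the generated string straight into the next SoapUI query request. A small helper showing the string to build (the Contact Ids here are made up):

```java
import java.util.List;
import java.util.stream.Collectors;

public class SoqlInClause {
    // Builds the follow-up query for step 3: fetch Names for the Contact
    // Ids retrieved in step 2, quoting each Id into a SOQL IN clause.
    static String namesByIds(List<String> contactIds) {
        String in = contactIds.stream()
                .map(id -> "'" + id + "'")
                .collect(Collectors.joining(","));
        return "SELECT Name FROM Contact WHERE Id IN (" + in + ")";
    }

    public static void main(String[] args) {
        // prints SELECT Name FROM Contact WHERE Id IN ('003A1','003A2')
        System.out.println(namesByIds(List.of("003A1", "003A2")));
    }
}
```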
I am failing to log in to Salesforce through the API.
I am getting the error invalid_grant, authentication failure.
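invalid_grant in the username-password OAuth flow usually comes down to one of: the wrong login host (login.salesforce.com vs test.salesforce.com for sandboxes), the user's IP not being trusted, or the security token not being appended to the password. A sketch of the documented token-request parameters for grant_type=password (all values here are placeholders):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PasswordGrant {
    // Parameter names for the OAuth 2.0 username-password flow
    // (POST to /services/oauth2/token). A frequent cause of invalid_grant
    // is omitting the security token from the password value.
    static Map<String, String> tokenParams(String clientId, String clientSecret,
                                           String username, String password,
                                           String securityToken) {
        Map<String, String> p = new LinkedHashMap<>();
        p.put("grant_type", "password");
        p.put("client_id", clientId);
        p.put("client_secret", clientSecret);
        p.put("username", username);
        // From an untrusted IP, the security token must be concatenated
        // directly onto the password.
        p.put("password", password + securityToken);
        return p;
    }

    public static void main(String[] args) {
        Map<String, String> p = tokenParams("cid", "secret", "u@example.com", "pw", "TOKEN");
        System.out.println(p.get("password")); // prints pwTOKEN
    }
}
```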
Hi

I am working on a Java client application which has to use Bulk API v2. Is there a Java library from Salesforce that can simplify using Bulk API v2? Using Bulk API v1 was possible with force-wsc-xxx.jar, so I am checking whether anything similar is available from Salesforce. In earlier discussions on the forum, https://github.com/endolabs/salesforce-bulkv2-java was suggested; I am checking whether there is anything from Salesforce, or whether Bulk API v2 is only provided as REST endpoints.

Thank you,
Venu
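As far as I know, Salesforce ships no dedicated Java jar for Bulk API v2 the way force-wsc covered v1; v2 is REST-only, so any HTTP client plus a small amount of JSON is enough. A sketch of the request path and body for creating a v2 query job (the instance URL and API version are placeholders; check the field names against the Bulk API 2.0 reference):

```java
public class BulkV2JobRequest {
    // Bulk API v2 is exposed only as REST endpoints; creating a query job
    // is a POST to /services/data/vXX.X/jobs/query with a small JSON body.
    static String queryJobPath(String instanceUrl, String apiVersion) {
        return instanceUrl + "/services/data/v" + apiVersion + "/jobs/query";
    }

    // Minimal body for the query-job POST; the response carries the job id
    // to poll and, once complete, results are paged from .../jobs/query/{id}/results.
    // (Naive JSON: a real query containing quotes would need escaping.)
    static String queryJobBody(String soql) {
        return "{\"operation\":\"query\",\"query\":\"" + soql + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(queryJobPath("https://example.my.salesforce.com", "52.0"));
        System.out.println(queryJobBody("SELECT Id, Name FROM Account"));
    }
}
```

Any HTTP client (java.net.http.HttpClient, Apache HttpClient) can send this with an Authorization: Bearer header; no Salesforce-specific jar is required.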