  • AlexWSFDC
I'm getting the following two errors when trying to create a new record from a managed Salesforce Site, even though I have all the field-level security (FLS) set up correctly:

1.  System.TypeException: DML not allowed on CloudConversion__Activity__c (this one goes away if I put everything in the constructor)

2.  System.SObjectException: Field is not writeable: CloudConversion__Activity__c.CloudConversion__Lead__c

Any hints, suggestions or workarounds?
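For context, here is a minimal Apex sketch of the controller pattern that usually produces these two errors on a Visualforce page. The object and field names come from the error messages above; the class, property, and method names are my own placeholders, not code from the managed package:

```apex
public with sharing class ActivityController {
    public CloudConversion__Activity__c act { get; set; }
    public Id leadId { get; set; }

    public ActivityController() {
        // Building the record here is fine; performing DML in a
        // constructor or getter is what normally raises "DML not allowed".
        act = new CloudConversion__Activity__c();
    }

    // Hypothetical action method: doing the insert here, bound to a
    // commandButton, is the usual workaround for the DML restriction.
    public PageReference save() {
        // "Field is not writeable" typically means the running user's
        // profile (for a Site, the guest user) lacks edit access to this
        // field, or the field is a master-detail being set on a record
        // that was already inserted.
        act.CloudConversion__Lead__c = leadId;
        insert act;
        return null;
    }
}
```

If the guest user's profile has the field visible but not editable, the second error can persist even when FLS looks correct at the object level.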

Thanks,
Jon
I am brand new at this, so I'm sure it's something obvious to those who have used it before.
I've x'd out the user name and password. There aren't any updates to the log or the trace, the
CSV file doesn't get created, and no error messages come up either. Can anyone see if I am missing a key piece?
 
I execute the job from the RUN command line, using the configuration listed below, and I get the following results:
 
 
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="serialExportProcess"
          class="com.salesforce.lexiloader.process.ProcessRunner"
          singleton="false">
        <description>Serial Number Export job gets serial number info from salesforce and saves info into a CSV file.</description>
        <property name="name" value="SerialExportProcess"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing\serialSoapTrace.log"/>
                <entry key="sfdc.endpoint" value="https://www.salesforce.com"/>
                <entry key="sfdc.username" value="xxxxxxx"/>
                <entry key="sfdc.password" value="xxxxxxxx"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.entity" value="Serial_Number__c"/>
                <entry key="sfdc.extractionRequestSize" value="500"/>
                <entry key="sfdc.extractionSOQL" value="Select Id, Serial_Number_Key__c, Item_Number__c, Name FROM Serial_Number__c"/>
                <entry key="process.operation" value="extract"/>
                <entry key="dataAccess.type" value="csvWrite"/>
                <entry key="dataAccess.writeUTF8" value="true"/>
                <entry key="dataAccess.name" value="z:\SFDC SERIAL TEXT\sernextr.csv"/>
                <entry key="process.statusOutputDirectory" value="C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing\serialSoaplog.log"/>
            </map>
        </property>
    </bean>
</beans>
 
----------------------------------------

C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing>call process "C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing" serialExportProcess

C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing>if not ["C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing"] == [] goto run

C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing>set PROCESS_OPTION=

C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing>if not [serialExportProcess] == [] set PROCESS_OPTION=process.name=serialExportProcess

C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing>..\_jvm\bin\java.exe -cp ..\DataLoader.jar -Dsalesforce.config.dir="C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing" com.salesforce.lexiloader.process.ProcessRunner process.name=serialExportProcess

0 [main] INFO com.salesforce.lexiloader.process.ProcessConfig  - Loading process configuration from config file: C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing\process-conf.xml
5496 [main] INFO org.springframework.beans.factory.xml.XmlBeanDefinitionReader  - Loading XML bean definitions from file [C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing\process-conf.xml]
5496 [main] DEBUG org.springframework.beans.factory.xml.XmlBeanDefinitionReader  - Using JAXP implementation [com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl@1256ea2]
5559 [main] DEBUG org.springframework.beans.factory.xml.ResourceEntityResolver  - Trying to resolve XML entity with public ID [-//SPRING//DTD BEAN//EN] and system ID [http://www.springframework.org/dtd/spring-beans.dtd]
5559 [main] DEBUG org.springframework.beans.factory.xml.ResourceEntityResolver  - Trying to locate [spring-beans.dtd] in Spring jar
5575 [main] DEBUG org.springframework.beans.factory.xml.ResourceEntityResolver  - Found beans DTD [http://www.springframework.org/dtd/spring-beans.dtd] in classpath
5622 [main] DEBUG org.springframework.beans.factory.xml.DefaultXmlBeanDefinitionParser  - Loading bean definitions
5637 [main] DEBUG org.springframework.beans.factory.xml.DefaultXmlBeanDefinitionParser  - Default lazy init 'false'
5637 [main] DEBUG org.springframework.beans.factory.xml.DefaultXmlBeanDefinitionParser  - Default autowire 'no'
5637 [main] DEBUG org.springframework.beans.factory.xml.DefaultXmlBeanDefinitionParser  - Default dependency check 'none'
5653 [main] INFO org.springframework.core.CollectionFactory  - JDK 1.4+ collections available
5669 [main] INFO org.springframework.core.CollectionFactory  - Commons Collections 3.x available
5669 [main] DEBUG org.springframework.core.CollectionFactory  - Creating [java.util.LinkedHashMap]
5669 [main] DEBUG org.springframework.beans.factory.xml.DefaultXmlBeanDefinitionParser  - Found 1 <bean> elements in file [C:\Program Files\salesforce.com\Apex Data Loader 11.0\Batch Processing\process-conf.xml]
 
 
  • June 26, 2008
I'm trying to get to the API documentation at

https://wiki.apexdevnet.com/index.php/API

It doesn't appear to be working today. Anyone else having a problem?

I'm using the AppExchange Data Loader 8.0 to upload from my DB2 database to sforce.
If I use a sfdc.loadBatchSize value that is anything other than an exact factor of my
result set size, I end up with an error:

2007-02-07 16:11:49,155 ERROR [testObMasterProcess] progress.NihilistProgressAdapter doneError (NihilistProgressAdapter.java:51) - Error encounted trying to get value for column:  for row #75 (database execute query). Database configuration: queryJoborder.  Error: Invalid operation: result set closed.

My expected result set in this case is 74 rows.

Everything works fine with the following values:
sfdc.loadBatchSize=1
sfdc.loadBatchSize=2
sfdc.loadBatchSize=37
sfdc.loadBatchSize=74

It fails with the following values:
sfdc.loadBatchSize=10
sfdc.loadBatchSize=20
sfdc.loadBatchSize=100

Is this a bug in the Data Loader or am I missing something?
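For what it's worth, the pass/fail pattern above lines up exactly with whether the batch size divides the 74-row result set with no remainder, i.e. it only seems to fail when the final batch would be a partial one. A quick check of that observation (the class and method names are mine, purely for illustration):

```java
public class BatchSizeCheck {
    // True when batchSize splits totalRows into whole batches,
    // i.e. the last batch is not a partial one.
    static boolean dividesEvenly(int totalRows, int batchSize) {
        return totalRows % batchSize == 0;
    }

    public static void main(String[] args) {
        int totalRows = 74; // expected result set size from the post above
        int[] sizes = {1, 2, 37, 74, 10, 20, 100};
        for (int size : sizes) {
            // The sizes reported as working (1, 2, 37, 74) are exactly
            // the divisors of 74; 10, 20, and 100 leave a partial batch.
            System.out.println("sfdc.loadBatchSize=" + size + " -> "
                + (dividesEvenly(totalRows, size) ? "works" : "fails"));
        }
    }
}
```

That points at the loader closing the database result set before the last, partial batch is read, rather than anything wrong with the configuration.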

-Matt
I am using the Data Loader to upload and download data to and from Salesforce.com. However, we get the error "(407) Proxy Authentication Required".
 
Looking for solutions, I have gone through the details in this forum and found that we need to change the source to include "NTCredentials" in order to let the Data Loader support NTLM authentication. However, we find that the "NTCredentials" class is already included in the source.
 
Could anyone tell me how to solve this? Also, according to the Apache HttpClient documentation, its authentication does not work with some kinds of ISA server. Is that correct? Kindly correct me if I have made any mistakes.
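I can't speak to the Data Loader's internals, but as a point of comparison: at the JVM level, proxy credentials can be supplied through java.net.Authenticator, which is how Java's built-in HTTP stack asks for a user name and password when a proxy challenges. This is a generic sketch, not the Data Loader's actual mechanism; the host name, domain, and credentials are placeholders:

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

// Generic JVM-wide proxy credentials; all values below are placeholders.
public class ProxyAuth extends Authenticator {
    private final String user;
    private final char[] pass;

    public ProxyAuth(String user, String pass) {
        this.user = user;
        this.pass = pass.toCharArray();
    }

    // Called by the JVM whenever a proxy (or server) demands credentials.
    // Widened from protected to public so it can be exercised directly.
    @Override
    public PasswordAuthentication getPasswordAuthentication() {
        return new PasswordAuthentication(user, pass);
    }

    public static void main(String[] args) {
        // For NTLM, the JVM's built-in support takes the DOMAIN\user form.
        Authenticator.setDefault(new ProxyAuth("DOMAIN\\benny", "secret"));
        System.setProperty("http.proxyHost", "proxy.example.com");
        System.setProperty("http.proxyPort", "8080");
    }
}
```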
 
Regards,
Benny