I am integrating with an external partner and need to GZip-compress an XML string. This is NOT part of a web service or API call, so GZip support for HttpRequests is not relevant here, nor is this about retrieving a ZIP file.

I don't see any Apex classes that support GZip compression of a String or Blob. EncodingUtil supports Base64 encoding, hex encoding, etc., but not compression.

Is there a preferred third party API or product that supports this?
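For comparison, the operation being asked about is small on platforms that do ship a GZip library. A minimal Python sketch (standard-library `gzip` and `base64` modules; this is not Apex, just an illustration of the compress-then-Base64 pattern typically used to move binary data through text-based integrations):

```python
import base64
import gzip


def gzip_b64(text: str) -> str:
    """GZip-compress a string, then Base64-encode it for text-safe transport."""
    compressed = gzip.compress(text.encode("utf-8"))
    return base64.b64encode(compressed).decode("ascii")


def gunzip_b64(payload: str) -> str:
    """Reverse the above: Base64-decode, then decompress back to the string."""
    return gzip.decompress(base64.b64decode(payload)).decode("utf-8")
```

The Base64 step here corresponds to what EncodingUtil already covers; it is only the compression step that has no native Apex equivalent.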
I have a scenario where I may need to return plain XML from a community web site. Do custom community web sites, based on the Napili template, have the capability to return plain XML? It's necessary to support an SSO integration.
Here is my scenario:
  • Will receive several X12 834 files per day via SFTP to an on-premises server. There is potential for dozens, and long term possibly hundreds, of files each day that need to be processed/imported.
  • Each file needs to be imported into a custom salesforce object
  • Some files may contain only 5000 records and some may contain 100,000+ records
To me it seems the Bulk API is the best means to import such data using a custom automated process.

The process would be roughly:
  1. Transform an 834 file into a CSV file with upsert records for the custom object.
  2. Create a Bulk API job for importing the CSV file.
  3. Split the CSV upsert file into chunks (10 MB or 10,000 rows, whichever limit is reached first), submitting each chunk as a batch to the job.
  4. Check the state of the job/result of each batch.
Is the Bulk API a good solution for this, or is there another API which might be better suited?
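Step 3 above is the only part with real logic. A minimal Python sketch of the chunking, assuming the 10 MB / 10,000-row batch limits stated in the process (the function name and constants are illustrative, not part of any Salesforce SDK; each chunk repeats the CSV header so it can be submitted as a standalone batch):

```python
MAX_BATCH_BYTES = 10 * 1024 * 1024  # assumed 10 MB per-batch size limit
MAX_BATCH_ROWS = 10_000             # assumed per-batch row limit


def chunk_csv(csv_text: str,
              max_bytes: int = MAX_BATCH_BYTES,
              max_rows: int = MAX_BATCH_ROWS) -> list[str]:
    """Split one CSV string into batch-sized CSV strings.

    The header line is copied into every chunk, and a chunk is closed as
    soon as it would exceed either the row limit or the byte limit.
    """
    lines = csv_text.splitlines(keepends=True)
    header, data = lines[0], lines[1:]
    header_size = len(header.encode("utf-8"))

    chunks: list[str] = []
    current, size, count = [header], header_size, 0
    for row in data:
        row_size = len(row.encode("utf-8"))
        if count >= max_rows or size + row_size > max_bytes:
            chunks.append("".join(current))       # close the full chunk
            current, size, count = [header], header_size, 0
        current.append(row)
        size += row_size
        count += 1
    if count:
        chunks.append("".join(current))           # flush the final partial chunk
    return chunks
```

Each returned string can then be posted as one batch to the Bulk API job, and the batch IDs collected for the status checks in step 4.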