• Sonam Meshram
  • NEWBIE
  • -8 Points
  • Member since 2019
  • Salesforce CPQ Specialist
  • Satrang Technologies

  • Chatter Feed
  • 0 Best Answers
  • 0 Likes Received
  • 0 Likes Given
  • 0 Questions
  • 18 Replies
What is the best managed package for an Amazon S3 connector?
I am trying to upload files from a Lightning component directly to AWS, without saving them in Salesforce, but I get a 400 Bad Request error.
Can anyone help?

 
public with sharing class TestingAmazon {
    @AuraEnabled
    public static String testIntegration(String filename, String base64Data, String contentType) {
        System.debug('filename---->>' + filename);
        System.debug('contentType---->>' + contentType);
        // The component sends the file body base64-encoded; decode it back to raw bytes
        Blob fileBody = EncodingUtil.base64Decode(base64Data);

        String formattedDateString = Datetime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');
        String key = '-------------------------------';
        String secret = '---------------------------------------------';
        String bucketname = 'testfileupoad';
        String host = 's3.ap-south-1.amazonaws.com';
        String method = 'PUT';

        HttpRequest req = new HttpRequest();
        req.setMethod(method);
        // Virtual-hosted-style URL: the bucket is already in the host,
        // so it must NOT be repeated in the path
        req.setEndpoint('https://' + bucketname + '.' + host + '/' + filename);
        req.setHeader('Host', bucketname + '.' + host);
        req.setHeader('Content-Type', contentType);
        req.setHeader('Date', formattedDateString);
        // The ACL must go in an x-amz-* header so it can be included in the signature
        req.setHeader('x-amz-acl', 'public-read');
        req.setBodyAsBlob(fileBody);

        // Signature V2 string-to-sign:
        //   VERB \n Content-MD5 \n Content-Type \n Date \n
        //   canonicalized x-amz headers + canonicalized resource
        // Content-MD5 is left empty, and the resource is /bucket/key (bucket appears once)
        String stringToSign = 'PUT\n\n' +
                contentType + '\n' +
                formattedDateString + '\n' +
                'x-amz-acl:public-read\n' +
                '/' + bucketname + '/' + filename;
        System.debug('stringToSign---->>' + stringToSign);

        Blob mac = Crypto.generateMac('hmacSHA1', Blob.valueOf(stringToSign), Blob.valueOf(secret));
        String signed = EncodingUtil.base64Encode(mac);
        req.setHeader('Authorization', 'AWS ' + key + ':' + signed);

        Http http = new Http();
        HttpResponse res = http.send(req);
        System.debug('RESPONSE STATUS: ' + res.getStatus());
        System.debug('STATUS_CODE: ' + res.getStatusCode());
        if (res.getStatusCode() == 200) {
            String returnUrl = 'https://' + bucketname + '.' + host + '/' + filename;
            System.debug('returnUrl---->>' + returnUrl);
            return returnUrl;
        }
        return 'ERROR';
    }
}

 
We have an object with 200,000 records, and each record has an attached file.
We need to download the attachments of 500 identified records.
What is the best approach for this?
  • September 26, 2016
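One scriptable route is the REST API: query the 500 Attachment IDs with SOQL, then fetch each file's binary from its Body endpoint. A minimal JavaScript sketch of the URL construction (the instance URL, API version, and record ID are placeholder assumptions):

```javascript
// Build the REST endpoint that streams an Attachment's raw bytes.
// instanceUrl and apiVersion are assumptions; substitute your org's values.
function attachmentBodyUrl(instanceUrl, apiVersion, attachmentId) {
  return `${instanceUrl}/services/data/v${apiVersion}/sobjects/Attachment/${attachmentId}/Body`;
}

console.log(attachmentBodyUrl('https://myorg.my.salesforce.com', '52.0', '00P000000000001'));
// → https://myorg.my.salesforce.com/services/data/v52.0/sobjects/Attachment/00P000000000001/Body
```

A script can loop over the 500 IDs, GET each URL with an OAuth bearer token, and write the responses to disk.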
Is there any method to export the attachments of a custom object?
Hi, I have 1000+ data export zip files and am looking for an automated way to download them.
Hi,

I need to upload a few images to Amazon S3 storage. Can anyone help with how I can achieve this in Apex code?

Thanks in advance.

Hi all,

I have a requirement to integrate Salesforce with Amazon S3. I need to upload files and, at the same time, download the files or the file URLs from Amazon to Salesforce.

I have uploaded a file using the PUT method. The response I am receiving is just a successful [Status=OK and StatusCode=200], but I am not getting anything else, like an upload ID or the URL of the file.

I need the URL of the uploaded file in Salesforce. Please help me out with this.

Thanks,

Reddy
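For context, a successful S3 PUT returns no URL in the response body; the object's URL is fully determined by the bucket, region, and key, so it can be built client-side. A small JavaScript sketch (the bucket, region, and key are placeholders):

```javascript
// An S3 object's URL is deterministic: https://<bucket>.s3.<region>.amazonaws.com/<key>
// Each path segment of the key is URL-encoded; the separating slashes are kept.
function s3ObjectUrl(bucket, region, key) {
  const encodedKey = key.split('/').map(encodeURIComponent).join('/');
  return `https://${bucket}.s3.${region}.amazonaws.com/${encodedKey}`;
}

console.log(s3ObjectUrl('my-bucket', 'ap-south-1', 'reports/q1.pdf'));
// → https://my-bucket.s3.ap-south-1.amazonaws.com/reports/q1.pdf
```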

Hello everyone!

I am using Salesforce Community to develop a mobile application. I want to store files (file feeds) into Amazon S3 instead of Salesforce. Does anyone know how to do that? Any help would be much appreciated.
Hi,

I'm new to integration. I have referred to the Salesforce documents but didn't understand much, and I do not know where to start. I have a requirement to integrate Salesforce with Amazon S3, so if anyone can help me with sample code it would be very useful.

Also, any good blogs or useful links on integration would be great.


Thanks,
Marc.
I want to upload large files to the S3 server from a Visualforce page. Currently I can upload only small files. How can we achieve this? Any help?
Hi Experts,

I am trying to upload a file to S3. I have verified everything a couple of times and it all looks correct to me, but whenever I try to upload a file, I get an error message saying:

responseText =
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidArgument</Code><Message>POST requires exactly one file upload per request.</Message><ArgumentValue>0</ArgumentValue><ArgumentName>file</ArgumentName><RequestId>3670E4EE52B3BCD5</RequestId><HostId>b3rOF/9WJHymo1ZENIOlrct/ZusAJ50AnSIP0df3K3+DdEcAFolJDx8qU6DH2N1l</HostId></Error>

Can someone please help me find out what I am doing wrong here?

<body>

        <div id="s3-fileuploader" class="dropArea"></div>

        <script type="text/javascript">

            var j$ = jQuery.noConflict();

            // Unblock the drop area once the endpoint URL is set
            function setUI(){
                j$('div.dropArea').unblock();
            }

            j$(document).ready(function () {

                j$('#s3-fileuploader').fineUploader({

                    request: {
                        endpoint: "https://{!bucketname}.s3.amazonaws.com",
                        accessKey: "{!key}"
                    },
                    signature: {

                        // always included
                        "expiration": "{!expireStr}",

                        "signature": "{!signedPolicy}",
                        "policy": "{!policy}",

                        "conditions":
                        [
                            // always included
                            {"acl": "public-read"},

                            // always included
                            {"bucket": "{!bucketname}"},

                            // not included in IE9 and older or Android 2.3.x and older
                            {"Content-Type": "{!ContentType}"},

                            // always included
                            {"key": "{!key}"},

                            // always included; no trailing comma after the last
                            // condition -- older IE treats it as a syntax error
                            {"x-amz-meta-qqfilename": "{!URLENCODE('test.jpg')}"}
                        ]
                    },
                    cors: {
                        expected: true,         // all requests are cross-domain requests
                        sendCredentials: false, // set true to send cookies with the request
                        allowXdr: true
                    },

                    autoUpload: true,
                    multiple: false,
                    debug: true,

                    text: {
                        uploadButton: '<i class="icon-plus icon-white">Select Files</i>'
                    },
                    uploadSuccess: {
                        endpoint: "{!redirectURL}"
                    }
                }).on('submit', function (event, id, name) {

                    // set endpoint
                    console.log('https://{!bucketname}.s3.amazonaws.com');
                    j$(this).fineUploader('setEndpoint', 'https://{!bucketname}.s3.amazonaws.com');

                });
                setUI();
            });

        </script>
    </body>
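A detail worth noting about the error above: S3's POST API requires the file part to be the last field of the multipart body, and it reports "exactly one file upload per request" with ArgumentValue 0 when it finds no file part at all. A small JavaScript sketch of the required ordering (the field names are illustrative, not taken from the page above):

```javascript
// S3 POST uploads need every policy field before the file, and the file last.
// Given an object of form fields, return entries with "file" moved to the end.
function orderPostFields(fields) {
  const entries = Object.entries(fields).filter(([name]) => name !== 'file');
  if ('file' in fields) entries.push(['file', fields.file]);  // file must come last
  return entries;
}

const ordered = orderPostFields({ key: 'test.jpg', acl: 'public-read', file: '<binary>' });
console.log(ordered.map(([name]) => name));  // "file" comes last
```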

Hi,

 We have around 1000s of uploads accumulated over the course of years and are reaching our storage limit. I can download files one at a time, but that is not a workable solution in production. Could you please recommend how to download/export the PDFs to a local computer in bulk?

 

Thanks.

Hello All,

 

Good Time,

 

I would like to download the attachments of Opportunity records automatically. Could you please suggest how to proceed?

 

Is there any tool available on AppExchange, or do I need to write SOQL? If so, could you please provide a solution for this?

 

Thanks in advance

 

Perfectionist

Hi

Can anyone help me with how to integrate Salesforce with Amazon S3?

 

I want to store some Salesforce data in Amazon. How can I integrate the two, and what is the process?


Thanks,

venkatesh


Our users need to download multiple ContentDocument files, based on queries of related metadata (custom object) and/or file content (ContentVersion). 

 

And there could be 100s of such files that they want to download to their local computer based on one query....

 

We're replacing a  legacy system with SF, and that legacy system allows users to build such queries resulting in hundreds of files.  The legacy system then combines the files (along with a few standard report pages of the associated metadata) into a single (BIG) .PDF file which they then download to their local computer and print (or not) - so clicking hundreds of individual download links, giving a location/name for each file, would be quite tedious.  Users generally burn CDs with the PDF file - they don't normally print this much paper.

 

Any suggestions on how to do this, or something reasonably equivalent, in SF / VF?  

 

By the way, the users have Adobe Professional on their local computer, so if we could just get the files from SF into a single folder (or combine the file(s) into a single file or two) the users could then use Adobe to build their own PDF locally - I'm really interested in automating the download of ContentDocument file(s)  if at all possible...
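One possible building block for automating this: query ContentVersion for the matching files, then fetch each file's binary via the REST VersionData endpoint and assemble them locally. A minimal JavaScript sketch of the URL construction (the instance URL, API version, and record ID are placeholder assumptions):

```javascript
// Build the REST endpoint that streams a ContentVersion's raw bytes.
// instanceUrl and apiVersion are assumptions; substitute your org's values.
function versionDataUrl(instanceUrl, apiVersion, contentVersionId) {
  return `${instanceUrl}/services/data/v${apiVersion}/sobjects/ContentVersion/${contentVersionId}/VersionData`;
}

console.log(versionDataUrl('https://myorg.my.salesforce.com', '52.0', '068000000000001'));
// → https://myorg.my.salesforce.com/services/data/v52.0/sobjects/ContentVersion/068000000000001/VersionData
```

A local script could run the users' query, download each VersionData stream into one folder, and leave the PDF assembly to Adobe Professional as described above.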

Hi,

 

We have a process where customers need to submit/upload scanned documents via a Force.com app.

Is it possible to store the documents externally via Google Docs or Amazon S3 services?

 

There will be hundreds of JPG scanned docs per week uploaded via the application.

 

Appreciate any advice on best practice.

 

Many thanks

Matt

  • March 08, 2011

Hi All,

 

We are using Amazon S3 as a storage server to upload files from Salesforce, and it is working fine.

But some clients are facing issues with file upload/download due to their proxy restrictions, so the user uploading the file is clueless about what went wrong. To handle this, I am trying to connect to https://s3.amazonaws.com using XMLHttpRequest in JavaScript. If the request fails, I display "Unable to connect to https://s3.amazonaws.com. Reason: Proxy Restriction".

The XMLHttpRequest is failing because of a cross-domain problem. Can anyone guide me through this? Does anyone have a better approach?

 

Thanks

Gops

  • July 06, 2009
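On the cross-domain failure: rather than probing from JavaScript, the usual resolution today is to allow the Salesforce origin in the bucket's CORS configuration, so the browser's preflight succeeds and real errors surface to the user. A sketch of an S3 CORS rule (the origin below is a placeholder for your org's domain):

```xml
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <!-- Placeholder: replace with your Salesforce/Visualforce domain -->
    <AllowedOrigin>https://yourInstance.salesforce.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>
```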