I would like to upload a file to Amazon S3 inside a .NET Core project. Is there any reference on how to create and use an AmazonS3 client? All I can find in the Amazon S3 documentation for .NET Core is this (http://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-config-netcore.html), which is not very helpful.
kostas
4 Answers
I did it using IFormFile, like this (you need to install AWSSDK.S3):
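The code block in this answer did not survive the scrape. A minimal sketch of what such a controller action might look like, using `TransferUtility` from the AWSSDK.S3 package; the bucket name, region, and credential setup here are placeholder assumptions, not the original author's values:

```csharp
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

public class UploadController : Controller
{
    // Receives the file from a multipart/form-data POST request.
    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        // Region is illustrative; credentials are resolved from the
        // environment or the shared credentials file.
        using (var client = new AmazonS3Client(RegionEndpoint.EUWest1))
        using (var stream = file.OpenReadStream())
        {
            // TransferUtility handles multipart uploads for larger files.
            var transfer = new TransferUtility(client);
            await transfer.UploadAsync(stream, "my-bucket", file.FileName);
        }
        return Ok();
    }
}
```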
Tiago Ávila
For simple file uploading in a .NET Core project, I followed this link.
After finishing the simple file-upload procedure, I followed the documentation at these links, which were very helpful. The following two links were also helpful for a quick start.
This is my final code snippet in the controller for file upload (I skipped the view part, which is explained in detail in the link shared above).
This is the method to upload files to Amazon S3:
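The method body was lost in extraction. A sketch of what such an upload method typically looks like with the v3 SDK's `PutObjectAsync`; the class name, injected `IAmazonS3` client, and content type are illustrative assumptions:

```csharp
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public class S3Uploader
{
    private readonly IAmazonS3 _client;

    // IAmazonS3 is normally registered via AWSSDK.Extensions.NETCore.Setup
    // and injected by the framework.
    public S3Uploader(IAmazonS3 client) => _client = client;

    // Uploads the given stream to the bucket under the given key.
    public async Task UploadFileAsync(Stream input, string bucketName, string key)
    {
        var request = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = key,
            InputStream = input,
            ContentType = "application/octet-stream"  // placeholder
        };
        await _client.PutObjectAsync(request);
    }
}
```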
That was all for uploading files to an Amazon S3 bucket. I worked on .NET Core 2.0. Also, don't forget to add the necessary dependencies for using the Amazon API. These were:
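The dependency list itself was lost in extraction; for a .NET Core 2.0 project these would typically be the two AWS NuGet packages below (version numbers illustrative):

```xml
<ItemGroup>
  <PackageReference Include="AWSSDK.S3" Version="3.3.*" />
  <PackageReference Include="AWSSDK.Extensions.NETCore.Setup" Version="3.3.*" />
</ItemGroup>
```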
Hope this helps.
Tasnim Fabiha
Per the AWS SDK docs, .NET Core support was added in late 2016, so the instructions for uploading files to S3 should be identical to any other instructions for .NET.
The 'getting started' guide for the AWS SDK for .NET is literally the case you describe: connecting and uploading a file to S3. It's included as a sample project, ready for you to run, if you've installed the AWS Toolkit for Visual Studio (which should be installed with the .NET AWS SDK).
So all you need to do is open Visual Studio and find their sample S3 project, or you can look at it here:
This assumes you have instantiated an Amazon.S3.AmazonS3Client after including the namespace, and configured it with your own credentials.
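The sample itself was not captured in the scrape. A minimal sketch of instantiating and using the client as this answer describes; the region, bucket, and key names are placeholders, and hard-coded credentials are shown only for illustration:

```csharp
using System.Threading.Tasks;
using Amazon;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;

class Program
{
    static async Task Main()
    {
        // BasicAWSCredentials shown for illustration only; in practice,
        // prefer the shared credentials file or environment variables.
        var credentials = new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY");
        var client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);

        // Upload a small text object to the (placeholder) bucket.
        await client.PutObjectAsync(new PutObjectRequest
        {
            BucketName = "my-bucket",
            Key = "hello.txt",
            ContentBody = "Hello, S3!"
        });
    }
}
```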
Ryan Weir
You first need to install the AWS SDK package from the Package Manager Console:
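The package name did not survive the scrape; the S3 client ships in the AWSSDK.S3 NuGet package, so the install command would be:

```
# Package Manager Console
Install-Package AWSSDK.S3

# or, equivalently, via the dotnet CLI
dotnet add package AWSSDK.S3
```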
Then you need to have the credentials file in the directory:
The credentials file should have this format:
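The format block itself was lost in extraction. The AWS shared credentials file, which the SDK reads from `~/.aws/credentials` on Linux/macOS or `%USERPROFILE%\.aws\credentials` on Windows, looks like this (the key values are placeholders):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```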
I uploaded to GitHub an example of a basic CRUD for S3 buckets in ASP.NET Core.
dayanrr91
I have a need to upload binary stream PDF files to Amazon S3. I've seen the sample code available to use the REST API with the POST operation on a Visualforce page; however, I need to upload the file via Apex without user involvement, as I'm retrieving the files from another database via their SOAP API.
I'm trying to do this using the PUT operation, but I don't think I'm doing the authentication correctly as I'm getting a 403 Forbidden response. Any ideas?
Well, I guess I should have waited a bit longer to post this question... haha.
Turns out my signing string didn't need to be URL-encoded (see the old code: String encodedStringToSign = EncodingUtil.urlEncode(stringToSign, 'UTF-8');). Amazon's documentation mentions 'The string to sign (verb, headers, resource) must be UTF-8 encoded'; however, I removed this piece, ran my createSignature class on stringToSign directly (not URL-encoded), and it worked! Also, it turns out that you can decode the binary stream and use the decoded Blob as the body of the request; otherwise, Amazon just displays the binary stream text on screen. Here is my final code. Hope this helps someone else!
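Jason's final code block did not survive the scrape. For reference, the signing scheme discussed here (the S3 REST API with signature version 2) amounts to an HMAC-SHA1 of the string-to-sign with your secret key, base64-encoded; a minimal C# sketch of that step, where the access key, secret, and string-to-sign values are placeholders:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class S3V2Signer
{
    // SigV2 Authorization header value:
    // "AWS " + accessKey + ":" + Base64(HMAC-SHA1(secretKey, stringToSign))
    static string CreateAuthHeader(string stringToSign, string accessKey, string secretKey)
    {
        using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(secretKey)))
        {
            var signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return "AWS " + accessKey + ":" + signature;
        }
    }

    static void Main()
    {
        // Note: the string-to-sign is signed as-is, not URL-encoded first,
        // which is exactly the pitfall described above.
        string stringToSign =
            "PUT\n\napplication/pdf\nWed, 01 Mar 2017 12:00:00 GMT\n/my-bucket/file.pdf";
        Console.WriteLine(CreateAuthHeader(stringToSign, "ACCESS_KEY", "SECRET_KEY"));
    }
}
```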
Jason Flammang
Here is some updated sample code that I'm currently using
You'll get the 403 (Forbidden) status code when there is something wrong with your signature. For me it was the dateTime: I had to use my local time zone. Hope this helps.
Hi Jason,
I have one issue with the 'x-amz-acl' header. Along with the new file, I also want to set the ACL for that new file. I set this extra header for that, but it's not working. Do you have any idea? Here is the code I am trying to use:

```apex
public static void PutFile(String fileContent, String filekey, String bucketName,
                           String contentType, String region, String key, String secret) {
    String formattedDateString = Datetime.now().format('EEE, dd MMM yyyy HH:mm:ss z', 'America/Denver');
    String filename = filekey;
    HttpRequest req = new HttpRequest();
    Http http = new Http();
    req.setHeader('Content-Type', contentType);
    req.setMethod('PUT');
    req.setHeader('x-amz-acl', 'public-read-write');
    req.setHeader('Host', 's3' + region + '.amazonaws.com');
    req.setEndpoint('https://s3' + region + '.amazonaws.com' + '/' + bucketName + '/' + filename);
    req.setHeader('Date', formattedDateString);
    String stringToSign = 'PUT\n\n' + contentType + '\nx-amz-acl:public-read-write\n'
        + formattedDateString + '\n/' + bucketName + '/' + filename;
    req.setHeader('Authorization', createAuthHeader(stringToSign, key, secret));
    if (fileContent != null && fileContent != '') {
        Blob pdfBlob = EncodingUtil.base64Decode(fileContent);
        req.setBodyAsBlob(pdfBlob);
        req.setHeader('Content-Length', String.valueOf(fileContent.length()));
        // Execute web service call
        try {
            HTTPResponse res = http.send(req);
            System.debug('***RESPONSE STRING: ' + res.toString());
            System.debug('***RESPONSE STATUS: ' + res.getStatus());
            System.debug('***STATUS CODE:' + res.getStatusCode());
        } catch (System.CalloutException e) {
            System.debug('***ERROR: ' + e.getMessage());
        }
    }
}
```
Hi Jason,
I have done this successfully. I want to upload multiple attachments to the S3 bucket; in one go I need to transfer 25 to 30 attachments. Currently I trigger my Apex through a scheduler and a SOAP connection with Salesforce, get the attachments from SF, and upload them to S3, but in one run it uploads only 4 or 5 attachments. How can I upload multiple attachments to S3 at a time? Your early response will be appreciated. Thanks, Umer
I don't know your specific example, but my best guess would be for you to check out Batch Apex: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_batch_interface.htm
I know there are limits on the number of callouts that can be made during a batch process, so you'll want to make sure your batch size keeps you within those limits. You'll also want to make sure you specify Database.AllowCallouts when setting up your batch Apex class, otherwise you'll get an error. I currently use a batch Apex class to make callouts to a third-party SOAP API that only allows one record to be retrieved at a time. Using batch processing I can update thousands of records a day. Hope this helps.
This was massively helpful. It works a treat. Thank you!
S3-Link is a FREE Salesforce to Amazon connector app. It's also available on AppExchange.
Attach files related to any Salesforce object on Amazon. 5 GB free storage for one year. Multiple file upload. No file size limit for upload. File access control capability. Track file downloads by users. File explorer capability. https://appexchange.salesforce.com/listingDetail?listingId=a0N3000000CW1OXEA1 Here is our email address: [email protected]. Let us know if you have any query. Thanks.
Anil,
I think you just have to re-arrange the stringToSign so that the 'x-amz-acl' header comes AFTER the formattedDateString, i.e. 'PUT\n\n' + contentType + '\n' + formattedDateString + '\nx-amz-acl:public-read-write\n/' + bucketName + '/' + filename.
And just for anyone who cares: if you want your files stored in the cheaper S3 storage class (Infrequent Access) with public read-only access, you need to change the headers you send and the corresponding lines in the stringToSign and in the createAuthHeader method. (Refactor as desired.)
Hey Jason,
I have to do a GET request. I copied your code and made some changes. I need to get all the files from a bucket, but I am getting Status=Bad Request, StatusCode=400. Please help me with this. I am posting my code below.
Thanks,
Manohar