I’m amazed at how many S3 tutorials leave the permissions/acl as ‘public-read’. I’m using edgee:slingshot
for uploads. I would like to secure downloads from clients to my S3 buckets.
Idea:
On a click event the client calls a method
My server gets the temporary URL from AWS and returns that in the method - http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html#RESTAuthenticationQueryStringAuth
The client redirects to the URL
Am I on the right track? Has anyone else implemented this?
I have not attempted this… I will be soon though, so I am interested in the implementation details.
I believe @ffxsam has. Maybe he can chime in with some thoughts.
looshi
December 24, 2015, 6:35pm
Yes, I believe you are on the right track.
Here is an SO answer describing the flow so the client can download from S3 using a signed URL:
This is the package I’m using to create the signed URL for downloads ( can also be used for uploads )
ffxsam
December 24, 2015, 7:55pm
You’ll need the peerlibrary:aws-sdk package installed. And then this Meteor method should get you pointed in the right direction:
Meteor.methods({
  'aws/getSignedUrl': function (filePath) {
    const s3 = new AWS.S3();
    const url = s3.getSignedUrlSync('getObject', {
      Bucket: Meteor.settings.AWSBucket,
      Key: filePath,
      Expires: 30 // seconds
    });
    return { url };
  }
});
ffxsam
December 25, 2015, 1:14am
One thing to consider with signed URLs is that since the browser (correct me if I’m wrong) sees it as a different URL each time, it can never cache it, and hence it eats up more of your S3 bandwidth.
Another way to secure your S3 files is to restrict access by checking the HTTP referrer. Of course, someone savvy could spoof that via headers in curl, but it's definitely something to consider.
The bucket policy for something like that looks like this: (censoring some things with xxx - I’m not totally sure if policy IDs are private or not)
{
  "Version": "2012-10-17",
  "Id": "xxx",
  "Statement": [
    {
      "Sid": "xxx",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::mybucket/*",
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "https://app.myapp.com/*",
            "http://localhost:3000/*"
          ]
        }
      }
    }
  ]
}
I got this working - thanks @looshi and @ffxsam !
For the security policy: I just set the acl to private (and didn't worry about the referrer) with edgee:slingshot.
Slingshot.createDirective("myFileUploads", Slingshot.S3Storage, {
  acl: "private",
aadams
December 29, 2015, 3:20pm
The only issue I've run across with edgee:slingshot is that, in terms of IE support, it only supports IE 10 or higher.
batist
November 15, 2016, 11:11am
Exactly! Is everyone overlooking this? Or is a ‘secret url’ supposed to be secure enough?
Perhaps the following code can be of help to anyone:
import { Router } from 'meteor/iron:router';
import { Meteor } from 'meteor/meteor';
import { Accounts } from 'meteor/accounts-base';
import S3 from 'aws-sdk/clients/s3';
import S3S from 's3-streams';

Router.route('/files/:type/:id', function () {
  let authenticated = false;
  if (this.request.cookies.meteor_login_token) {
    const user = Meteor.users.findOne({
      'services.resume.loginTokens.hashedToken':
        Accounts._hashLoginToken(this.request.cookies.meteor_login_token)
    });
    if (user) {
      authenticated = true;
    }
  }
  if (!authenticated) {
    this.response.statusCode = 404;
    this.response.end();
    return;
  }
  const s3Client = new S3({
    region: ...,
    accessKeyId: ...,
    secretAccessKey: ...
  });
  const getObjectOptions = {
    Bucket: ...,
    Key: ...folder + '/' + this.params.id
  };
  const src = new S3S.ReadStream(s3Client, getObjectOptions);
  src
    .on('open', (object) => {
      this.response.writeHead(200, {
        'Content-Type': object.ContentType,
        'Content-Length': object.ContentLength
      });
    })
    .pipe(this.response)
    .on('finish', () => {})
    .on('error', (err) => {
      console.error('Unable to download file:', err);
      this.response.statusCode = 404;
      this.response.end();
    });
}, { where: 'server', name: 'files' });
The cool extra is that the images will be cached by the browser.
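Worth noting: the route above doesn't set any caching headers, so whether the browser actually caches is left to its heuristics. If you want caching explicitly, you could build the header object for the 'open' handler along these lines (the max-age value is an arbitrary example, and I'm assuming the `object` from s3-streams carries the usual S3 metadata fields):

```javascript
// Hypothetical header builder for the 'open' handler above. Cache-Control
// max-age is an example value; ETag lets the browser revalidate cheaply.
function downloadHeaders(object) {
  return {
    'Content-Type': object.ContentType,
    'Content-Length': object.ContentLength,
    'Cache-Control': 'private, max-age=3600',
    'ETag': object.ETag
  };
}

const headers = downloadHeaders({
  ContentType: 'image/png',
  ContentLength: 1024,
  ETag: '"abc123"'
});
```

You'd then pass `downloadHeaders(object)` to `writeHead(200, ...)` instead of the inline object.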