Uploading large files to Azure storage

Any working implementations of the azure-blob-upload package out there?

I’m working on a project with a requirement to upload large (video) files to an Azure storage container. I thought it would be easy-peasy using jamesfebin’s package, but I can’t get it to work. Wondering if there’s something obvious I’m missing. Don’t know if it’s more of a Meteor implementation issue or an Azure issue.

I get this vague error in the response from Azure.

Error: One of the request inputs is out of range.
RequestId:a4b21fe1-0001-0032-6677-71c3e6000000
Time:2015-04-07T21:09:07.9330207Z

To reproduce, you of course need an Azure storage container. Clone the azure-blob-upload package into your /packages folder with git so you can modify the package code (git clone https://github.com/jamesfebin/azure-blob-upload.git). Then, on line 329 of azureupload.js, add a console.log(err) to log the Azure error response to your terminal.

Here’s some code to do a test upload:

// client/upload.html

<template name="uploadVideo">
        <div class="panel panel-default">
            <div class="panel-heading">
                <h3>Upload Video</h3>
            </div>
            <div class="panel-body">
                <form class="form-group">
                    <input type="file" id="uploadFile" class="uploadFile" />
                </form>
            </div>
        </div>
</template>

// client/upload.js

Template.uploadVideo.events({
    'change .uploadFile': function (event, template) {
        var files = event.target.files;
        var file = files[0];
        AzureFile.upload(
            file, "uploadFile",
            {},
            function (error, success) {
                if (error) console.log(error);
                else console.log(success);
            }
        );
    }
});

// server/methods.js

Meteor.methods({
    'uploadFile': function (file) {
        if (file === void 0) {
            throw new Meteor.Error(500, "Missing File", "", "");
        }
        var response = file.azureUpload(file.name, "YourAzureStorageAccount", "EnterYourAzureStorageKey", "EnterYourContainerName");
        console.log('response: ' + response);
        return response;
    }
});

// sample values from what I’m passing in

blobService: [object Object]
blockId: QmxvY2tObzE0
container: invidiad
fileName: minions.mp4
stream: [object Object]
stream.size(): 220262
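As an aside, the blockId above (QmxvY2tObzE0) base64-decodes to BlockNo14. A commonly reported cause of “One of the request inputs is out of range” with block blobs is block IDs whose decoded lengths differ within the same blob (e.g. BlockNo9 vs BlockNo10), since Azure requires all block IDs in one blob to be the same length. If the package builds its IDs like that, a zero-padded helper along these lines (makeBlockId is a made-up name, not part of the package) would rule that possibility out:

```javascript
// Hypothetical helper: Azure requires block IDs to be base64 strings,
// and every ID within one blob must decode to the same length.
// Zero-padding the counter keeps that invariant for any block count.
function makeBlockId(index) {
  var padded = ('000000' + index).slice(-6); // fixed 6-digit counter
  return Buffer.from('BlockNo' + padded).toString('base64');
}

console.log(makeBlockId(14));   // QmxvY2tObzAwMDAxNA==
console.log(makeBlockId(1400)); // same encoded length for any index
```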

After Googling the error message, I referenced https://msdn.microsoft.com/en-us/library/dd135715.aspx to see which of these values might be invalid and causing the error, but it mainly addresses the name of the blob and the requirement that each blob in a container have a unique name. I think I’ve ruled out those two possibilities.

For reference, here’s the NodeJS SDK documentation on creating blocks from streams: https://www.omniref.com/js/npm/azure-storage/0.3.2/symbols/BlobService%23createPageBlobFromStream


Alternatively, if someone is aware of another way to send blobs to Azure from Meteor, then I’m all ears. I was initially hoping to skip sending the file through the Meteor server by using the Slingshot package, but looking at the Azure documentation, I can’t figure out how to adapt the NodeJS SDK documentation to Slingshot. I’ve noted that in the past month, there has been some discussion about the scarcity of documentation for implementing Azure in Slingshot: https://gitter.im/CulturalMe/meteor-slingshot.

Hi,

I’m busy creating a resumable, chunked file-uploader Meteor package for Windows Azure that will handle very large files with pause/resume functionality across sessions. It sends the file directly to the blob store (using SAS, like Slingshot) to save server resources and leverage the Azure cloud. A lot of focus is going into making it as secure as possible: keeping the SAS lifetime as short as possible, the ability to add and remove access, and monitoring with guards against potential misuse. It will also integrate Azure Media Services to convert video files as needed.

Unfortunately it’s still a long way from done or production-ready, but maybe I can get an initial release (with just plain uploading and downloading) out in a week or two if you want to test or help out.

I’m not sure what is causing the issue with the other package. One thing I noticed is that it uses old-style Node streams, which don’t provide proper flow control: it just sends data from the server to Azure as fast as possible (not always good), even if the write stream (Azure blob storage) is saturated, so buffers may overflow.

Implementing Azure in Slingshot should not be that difficult. You will have to create a new Azure service in https://github.com/CulturalMe/meteor-slingshot/tree/master/services to generate a SAS using https://www.npmjs.com/package/azure-storage.

Regards,
Riaan

You should not send your secret storage key for Azure to the client in a public app. You’re practically giving the client full write access to your storage container. That is just wrong.

I was looking for signed file upload support for Azure so that an extension can be made for Slingshot, but I couldn’t find any docs for it.

If Azure does not support signed file uploads the way S3 does (https://aws.amazon.com/articles/1434), or single-file upload authorizations, then you should not upload files directly from the client/browser in public apps.

If Azure does support signed file uploads then please send me the docs to that.

Azure supports signed file uploads: it’s called a SAS (shared access signature). See http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/. You can use the npm azure-storage package to generate one on your Meteor server and send it to the client via a Meteor method: http://dl.windowsazure.com/nodestoragedocs/BlobService.html#generateSharedAccessSignature

Excellent, that is exactly what I was looking for, thank you @riaan53.

In regards to your earlier comment:

To implement azure in slingshot should not be that difficult. You will have to create a new azure service in https://github.com/CulturalMe/meteor-slingshot/tree/master/services to generate a SAS using https://www.npmjs.com/package/azure-storage.

You don’t have to put your implementation in that directory. It can be anywhere on the server side. Even a package, say “slingshot-azure”. There is a section in readme on how it can be done.

Thanks for pointing out the SAS. Your pointer led me to this example of client-side upload using the Azure mobile services API and NodeJS: http://blogs.msdn.com/b/brunoterkaly/archive/2014/06/13/how-to-provision-a-shared-access-signatures-that-allows-clients-to-upload-files-to-to-azure-storage-using-node-js-inside-of-azure-mobile-services.aspx

In Figure 20, line 72 generates the signature:

var signature = blobService.generateSharedAccessSignature(containerName, 
    blobname, sharedAccessPolicy);  

Also, just realized that SAS is covered in the NPM package Readme towards the bottom:

var azure = require('azure-storage');
var blobService = azure.createBlobService();

var startDate = new Date();
var expiryDate = new Date(startDate);
expiryDate.setMinutes(startDate.getMinutes() + 100);
startDate.setMinutes(startDate.getMinutes() - 100);

var sharedAccessPolicy = {
  AccessPolicy: {
    Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
    Start: startDate,
    Expiry: expiryDate
  },
};

var token = blobService.generateSharedAccessSignature(containerName, blobName, sharedAccessPolicy);
var sasUrl = blobService.getUrl(containerName, blobName, token);
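Two caveats before wiring that snippet into an uploader: the README example grants READ permission, so an upload policy would presumably need azure.BlobUtilities.SharedAccessPermissions.WRITE instead, and a raw REST PUT to the SAS URL has to carry the x-ms-blob-type header to create a block blob. Here is a hypothetical sketch of assembling that request (buildUploadRequest and the example URL are mine, not from the docs):

```javascript
// Hypothetical client-side helper: given the sasUrl handed back by the
// server, describe the PUT request that creates the blob. The
// x-ms-blob-type header is required by the Blob REST API when creating
// a block blob this way.
function buildUploadRequest(sasUrl, file) {
  return {
    method: 'PUT',
    url: sasUrl,
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': file.type || 'application/octet-stream'
    },
    body: file
  };
}

// In a browser it would be sent with XMLHttpRequest, e.g.:
//   var req = buildUploadRequest(sasUrl, file);
//   var xhr = new XMLHttpRequest();
//   xhr.open(req.method, req.url);
//   Object.keys(req.headers).forEach(function (h) {
//     xhr.setRequestHeader(h, req.headers[h]);
//   });
//   xhr.send(req.body);
var req = buildUploadRequest(
  'https://example.blob.core.windows.net/container/blob?sv=...',
  { type: 'video/mp4' }
);
console.log(req.method, req.headers['x-ms-blob-type']);
```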

@riaan53 Your plan is fantastic. No need to rush it out, but if you feel like you can share it with basic uploading/downloading, I’d love to be a first adopter and help where I can.

@soupala
Hi,

Did you manage to get that Slingshot Azure setup working in the end?
Do you have a working config example?