Streaming files from Azure Blob Storage through Node.js server

Hello,
I am currently trying to implement a download server as a Node.js Express app, so that website visitors can download files streamed straight from Azure Blob Storage - the server should never have to save the files to its own local storage. Here's how I do that:
outStream is the response object (res) received by the Express router; blobSvc is initialized like this:
blobSvc = azure.createBlobService(conf.azureStorageConfig.account, conf.azureStorageConfig.key, conf.azureStorageConfig.host);
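A minimal sketch of the kind of handler described (hedged: it assumes the legacy azure-storage SDK's getBlobProperties and createReadStream; the route and container name are illustrative):

app.get('/download/:blob', function (req, res) {
  var container = 'downloads'; // illustrative container name

  blobSvc.getBlobProperties(container, req.params.blob, function (err, props) {
    if (err) { return res.status(404).end(); }

    // Send headers up front so the browser treats this as a file download.
    res.setHeader('Content-Type', 'application/zip');
    res.setHeader('Content-Length', props.contentLength);
    res.setHeader('Content-Disposition', 'attachment; filename="' + req.params.blob + '"');

    // Pipe the blob straight to the response - nothing touches local disk.
    // Handling 'error' keeps a mid-transfer failure from hanging the client.
    blobSvc.createReadStream(container, req.params.blob)
      .on('error', function () { res.end(); })
      .pipe(res);
  });
});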
There are several issues I face. For example, while small files download fine, bigger ones (even 40 MB) do not finish downloading successfully, and an error is thrown in the console:
It has to be mentioned that the files are zip archives, and when the error is thrown the file has not been completely downloaded to the client's machine - the archive is broken.
There are other issues too, such as the fact that a client can't download more than one file from the server simultaneously - they get a response timeout after trying to start the second download.
What is the right way to use streaming with the Azure storage library in Node for client downloads?
Best Regards,
Petr.
Arction Ltd.

Hi,
Thank you for posting here.
We are checking on this and will get back to you at the earliest.
Regards,
Manu Rekhar

Similar Messages

  • How to import DB from Azure BLOB storage

    There used to be an "Import" option in the Azure portal when opening a SQL Server. Now only "Add" is available, which is not what is needed.
    "Export" still works fine when opening a database.
    Please help - why did this option disappear?

    Hi,
    >>There was option "Import" in Azure portal when open SQL Server.
    From my research, yes, there was an "Import" button in the old version, but it has disappeared now. From my experience, this is by design; we can submit this as a suggestion on User Voice:
    http://feedback.windowsazure.com/forums/34192-general-feedback-
    Best Regards

  • Force file download for IE9 on Azure Blob Storage

    Hello,
    I am trying to use Azure Blob Storage as a location for secure file downloads using a Shared Access Signature. Everything is working very well; the problem is that I am trying to allow the user to save files from the browser, and I have all browsers
    except IE9 working.
    Reviewing this question:
    What content type to force download of text response?
    that approach works well when I can control all of the headers. In Azure Blob Storage, however, I have set the Content-Type to application/octet-stream, and this makes all browsers except IE ask the user to save the file; IE simply opens it. It appears that
    known file types will open (e.g. .jpg, .wmv, etc.).
    In Azure, I have found no way to set
    Content-Disposition: attachment;filename="My Text File.txt"
    Is there a way, using Azure Blob Storage, to use IE to download any file directly from Azure Blob Storage?
    Thanks in advance.

    Hi,
    Actually, we can't set Content-Disposition for blobs, and I can't think of any other workarounds. From my experience, IE's behavior is fine in most cases. May I ask why you have to prompt a download? The user can see the text file, and
    if they wish to save it locally they have more than one way to do that (copy and paste, save the file, etc.). If they simply want to read the text and then forget it, that's also fine - they don't even have to download it and then double-click a local file to read
    the content.
    If you have to modify the behavior, the only workaround I can think of is to use a web role as an intermediate bridge, and add the Content-Disposition header from your web role.
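    That bridge idea, sketched for illustration in Node/Express (the reply means a .NET web role, but the pattern is the same; the route and container name are assumptions, and blobSvc is a blob service client as in the thread above):

    app.get('/files/:name', function (req, res) {
      // The header Azure Blob Storage itself could not set at the time:
      res.setHeader('Content-Disposition', 'attachment; filename="' + req.params.name + '"');
      res.setHeader('Content-Type', 'application/octet-stream');

      // 'secure-files' is an illustrative container name.
      blobSvc.createReadStream('secure-files', req.params.name)
        .on('error', function () { res.end(); })
        .pipe(res);
    });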
    Best Regards,
    Ming Xu.

  • Can I remove backed-up files/folders from Azure Backup storage without unregistering the servers?

    Can I remove backed-up files/folders from Azure Backup storage without unregistering the servers?
    I want to remove some backed-up files/folders to compact Azure Backup storage and to save on the cost of Azure Backup.
    Scenario 1: remove temporary files from Azure Backup storage.
    Scenario 2: remove files that will never need recovery from Azure Backup storage.
    Scenario 3: move files from the backup target folder to a folder on other storage.
    Right now, to remove backed-up files/folders from Azure Backup storage, it seems I need to unregister the server and then register it again.
    Regards,
    Yoshihiro Kawabata

    Thank you, Giri, for the quick reply.
    I want to save on the Azure Backup storage cost now.
    I don't want to pay the Azure Backup storage cost for files that will never need restoring until the backup point ages out,
    and I don't want to remove and re-backup the files that do need restoring via an unregister/register operation.
    I have submitted this issue to feedback.azure.com:
    "Remove files/folders from Backup"
    http://feedback.azure.com/forums/258995-azure-backup-and-scdpm/suggestions/7421659-remove-files-folders-from-backup
    Regards,
    Yoshihiro Kawabata

  • How to secure or mask video streaming on Azure Blob Storage

    I have a problem using my Azure Blob Storage, specifically for streaming MP4 files.
    I have a WordPress site and I use Azure Blob Storage for my storage; I use CloudBerry Explorer for browsing and editing.
    My problem is that the WordPress video option, MediaElement.js, has an exposed video source (meaning visitors can use the video source for their own projects).
    Is there a way to mask or hide the video source? Or can I restrict the video source so that it is only served to my site?

    Hi Franz,
    Based on my experience, there are two approaches to this issue.
    1. Use a SAS in the js request.
    Like this:
    <source src="https://MY-AZURE-STORAGE.blob.core.windows.net/asset-b1ebfc63-4e1f-4c76-b2dd-b040113aa493/MigratingWebSitesAndDatabasesToWindowsAzure_mid.mp4?sv=2012-02-12&st=2013-09-03T17%3A17%3A20Z&se=2015-09-03T17%3A17%3A20Z&sr=c&si=efc57c3c-a025-41cd-a8b1-908231c7aed2&sig=xyiIxmqNdF%2Bo1x7eq89xCUgXkenZLEEpq%2B%2Bo715dT0U%3D" type="video/mp4">
    Please see this blog: http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/
    You can set the start time, expiry time and permissions in your request, and you can use JavaScript to create the request and set the src value. Please see this sample (http://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/).
    2. Create a web handler to handle the request URL.
    You could create an .ashx page and handle the URL in that page. Like this:
    <video src="videos.ashx?source=file.mp4" width="640" height="480" controls></video>
    Please try it.
    Also, you could see this thread for some ideas:
    http://stackoverflow.com/questions/9756837/prevent-html5-video-from-being-downloaded-right-click-saved
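    For the first approach, a hedged sketch of minting such a SAS URL from Node (it assumes the azure-storage package; the container and blob names follow the example above and are illustrative):

    var azure = require('azure-storage');
    var blobSvc = azure.createBlobService(account, key);

    var start = new Date();
    var expiry = new Date(start.getTime() + 60 * 60 * 1000); // valid for 1 hour

    var sasToken = blobSvc.generateSharedAccessSignature('videos', 'file.mp4', {
      AccessPolicy: {
        Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
        Start: start,
        Expiry: expiry
      }
    });

    // Full URL to drop into the <source src="..."> attribute.
    var url = blobSvc.getUrl('videos', 'file.mp4', sasToken);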
    Regards,
    Will 

  • How is data transferred to data nodes from Azure Blob?

    I started an HDInsight cluster that was using Azure blob storage. Looking at the job's counters, I noticed that there are some 'local' and 'rack-local' mappers for the job. I would like to know what these local mappers mean. As far as I understood the architecture
    of HDInsight, the data nodes and blob storage are located in two different clusters, and data is transferred to the data nodes over a high-speed network. So how can there be 'local mappers'? And what about 'rack-local mappers'? Is the data first copied to the local disks
    of the data nodes and then passed to the data nodes to be processed?

    Hi,
    When HDInsight is performing its task, it streams data from the storage nodes to the compute nodes. But many of the map, sort, shuffle, and reduce tasks that Hadoop performs are done on the local disks of the compute nodes themselves.
    The map, reduce, and sort tasks will typically be performed on compute nodes with minimal network load, while the shuffle tasks will use some network bandwidth to move the data from the mapper nodes to the (fewer) reducer nodes. The final step of storing the data back
    to storage typically involves a much smaller dataset (e.g. a query dataset or report). In the end, the network is most heavily utilized during the initial and final streaming phases, while most of the other tasks are performed intra-node
    (i.e. with minimal network utilization).
    Regards.
    Debarchan Sarkar - MSFT ( This posting is provided AS IS with no warranties, and confers no rights.)

  • How to set CORS properties for BLOB Storage using node?

    Hi - I just got started with Azure using a Node-based web site and mobile services.
    I am following various documentation in order to provide an API for users to upload images via a time-restricted SAS for the BLOB Storage.
    In order to upload my image, I need to set the CORS configuration for the BLOB Storage. Unfortunately this cannot be done via the management portal.
    I'm unclear as to how to accomplish this. I'm considering using the startup.js file in my mobile service to make a post request to the BLOB Storage REST API:
    http://msdn.microsoft.com/en-us/library/windowsazure/hh452235.aspx
    Are there appropriate methods in the Node SDK to make this easier, especially the signing part?
    What is the recommended way for setting CORS properties for the BLOB Storage via Node?
    Thanks for your help
    Stefan

    Unfortunately the Node SDK does not support CORS functionality yet. Your option would be to write code that consumes the REST API for setting CORS. Not sure if it helps, but there's a free tool written by my company which you can use to set CORS
    on your storage account. More information about this tool can be found here:
    http://blog.cynapta.com/2013/12/cynapta-azure-cors-helper-free-tool-to-manage-cors-rules-for-windows-azure-blob-storage/
    Hope this helps.
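    For what it's worth, later releases of the azure-storage package did add CORS support through setServiceProperties; a hedged sketch assuming that API (the allowed origin is illustrative):

    var azure = require('azure-storage');
    var blobSvc = azure.createBlobService(account, key);

    // Note: setServiceProperties replaces the existing service properties,
    // so in real code fetch them first with getServiceProperties and merge.
    var serviceProperties = {
      Cors: {
        CorsRule: [{
          AllowedOrigins: ['https://www.contoso.com'], // illustrative origin
          AllowedMethods: ['GET', 'PUT'],
          AllowedHeaders: ['*'],
          ExposedHeaders: ['*'],
          MaxAgeInSeconds: 3600
        }]
      }
    };

    blobSvc.setServiceProperties(serviceProperties, function (err) {
      if (err) { throw err; }
      console.log('CORS rules applied');
    });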

  • Will giving up bundling and minification in favor of serving js and css from an Azure blob be beneficial?

    Hi,
    We have an MVC web site deployed as a service on Microsoft Azure. To boost performance, some of my colleagues suggested that we scrap bundling and minification and instead serve the .js and .css files from an Azure blob. Please note that the solution
    does not use a CDN; it merely serves the files from a blob instead of using the bundling and minification feature.
    My take on this is that just serving the files this way will not bring any major performance benefits. Since we are not using a CDN, the files will always be served from the region in which our storage is deployed. Also, since they are not bundled
    and are kept as individual files, the number of server requests will not be reduced. So we are forfeiting the benefits of bundling and minification. The only benefit I see to this approach is that we can change the js and css files and upload them without needing to re-deploy.
    Can anyone please tell me which of the two options is preferable in terms of performance?

    Hi Nikhil,
    Thanks for posting.
    I agree with you on this one, but again it depends on the scale of your website and the number of requests that you expect. If your page requests are low then you might as well store all the static content on blob storage; blob storage is very
    scalable and will be good enough in most cases. Bear in mind that each page request that includes a link to the storage counts as a storage transaction. Transactions are currently priced at $0.005 per 100,000, so it would take a while for this
    to become costly - for example, five million static-file requests a month would come to about $0.25. The next step would be to expose the container over a CDN to get distributed edge caching.
    All in all there are performance benefits in using either one; it just depends on careful consideration of the expected traffic and the location of your customers and users.
    You can refer to the following threads, and let us know if you have any questions:
    http://social.technet.microsoft.com/Forums/azure/en-US/5fc30aae-8f72-42b5-9202-3778c28033dc/best-performance-for-storing-static-files-images-css-javascript-cdn-blobstorage-or-webrole?forum=windowsazuredata
    http://stackoverflow.com/questions/6968011/storing-css-files-on-windows-azure
    Regards,
    Nithin.Rathnakar

  • E4200 - Can't stream content from direct-attached storage

    I did not have any issues with the Linksys routers except for the following.
    One major issue I had with the Linksys E4200 and E3000 was with regard to content sharing/streaming.
    Linksys allows you to attach external storage to the router and share the content with other computers. However, neither this product nor the earlier E3000 allowed streaming to a TV, Blu-ray player, or other media device with DLNA capability.
    We could attach storage to the router, map it on another computer, and access the content by providing a username/password.
    These TVs and Blu-ray players do not have the ability to provide a username/password to the router to be recognized, which your tech support failed to understand, instead advising me to open up a port on the TV (why?). Your tech support also suggested installing the Samsung AllShare software on the external drive itself, and I had to explain over the phone that a NAS/external drive doesn't have an OS to host the software.
    Therefore, the TV/Blu-ray player can't stream content from the attached storage, although it can see the media attached to the router.
    That is a deal breaker. There should be an option in the router to allow devices on the internal network to stream media from attached storage.
    I chatted with Linksys support twice with regard to the E3000. Both stated that as long as I could map the drive on another computer there was no problem. So what about your statement about UPnP: "The built-in UPnP AV Media Server enables seamless streaming of video and media files on your attached storage to an Xbox 360, PS3, or other UPnP compatible device"? That is why I didn't bother chatting when I got the E4200.
    Just a note that Linksys support tested the streaming via a Seagate 2TB external drive and a 2GB Kingston Traveler USB flash drive (since that USB flash drive was in the list of tested USB storage).
    This is a matter of some thinking, a few hours of developer work, QA, and a firmware update.

    OK,
    I got a new E3000 and updated the firmware to the latest version.
    Now I am going to format a supported USB key/external drive and attach it to see what happens.
    I will do as follows:
    1 - First, today I will reset the device as advised previously, meaning holding the reset button for 30 seconds and then waiting for it to come up. Or should I just unplug it for 30 seconds?
    2 - Enable the Media Server.
    3 - Specify which folders are to be scanned for media. I will leave this as is, because I want the whole USB drive to be scanned.
    When I insert and format the USB drive with the router, it creates a directory called public, with another one called public inside it. I will place my file in that folder, as follows:
    \public\public\mymovie.avi
    Will it scan the USB key/flash drive automatically, or should I specify anything? Please let me know - I am a noob regarding the scanning part.
    As for the scan interval, I will set it to every hour for today, and later to every 24 hours.

  • Why can I not open camera raw files from a Nikon D3200 camera through Adobe Bridge and/or Photoshop?

    Why can I not open raw files (NEF) from a Nikon D3200 camera through Adobe Bridge and/or Photoshop?

    Thanks for the tip. It worked out perfectly.

  • Copying files from a remote machine through the "rcp" command is not working

    Hi All,
    I'm a newcomer to this famous forum. I was working through the PDF "Solaris Advanced User's Guide", and in chapter 9, "Using the network", I came across "Copying Files From a Remote Machine". The syntax was "rcp machinename:source destination". I also found this note:
    "The rcp command enables you to copy files from one machine to another. This command uses the remote machine's /etc/hosts.equiv and /etc/passwd files to determine whether you have unchallenged access privileges. The syntax for rcp is similar to the command syntax for cp."
    I have added the remote machine's IP address to my system's /etc/hosts file, but I am still unable to rcp from the remote system to my system or vice versa.
    I always get the error message "**Connection refused**".
    Please can someone guide me on how to copy files from a remote machine through the rcp command.
    Regards,
    Kartik

    Hi
    The inconvenience of using scp is that you have to type the password every time you establish a connection. You can work around this by adding your public key to the remote user's authorized_keys file, which implies a bit more maintenance.
    From the rcp man page:
    +rcp does not prompt for passwords. It either uses Kerberos authentication, which is enabled through command-line options, or your current local user name must exist on hostname and allow remote command execution by rsh(1).+
    From the rsh man page:
    +If you omit command, instead of executing a single command, rsh logs you in on the remote host using rlogin(1).+
    By default, rlogin is disabled on Solaris 10:
    [SunOS 5.10/bash] root@wgtsinf01:/store/sun/operating-systems
    # svcs -a | grep -i rlog
    disabled       May_11   svc:/network/login:rlogin
    So, to use rcp you have to enable the rlogin service and set up all the configuration files. As already suggested, I too suggest that you use scp instead. :)
    Cheers
    -- Andreas

  • How to select data from Azure table storage without row key and partition key

    Hi,
    I need to select data from Azure table storage without a row key and partition key - the way that, in the Azure storage emulator, clicking Query displays all the data from the table.
    Thanks,
    Rajesh

    Hi Rajesh,
    It seems that you are querying data using the storage emulator. I recommend that you use the Azure Server Explorer in VS to view and query your data instead. Please see this document (http://msdn.microsoft.com/en-us/library/azure/ff683677.aspx).
    Also, based on my experience, you may need to input the command in the Azure storage emulator, as on this page (http://msdn.microsoft.com/en-us/library/azure/gg433005.aspx).
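    If you need to do this in code, a hedged sketch of a full-table scan from Node (no PartitionKey/RowKey filter; it assumes the azure-storage package, and the table name is illustrative):

    var azure = require('azure-storage');
    var tableSvc = azure.createTableService(account, key);

    // Passing null as the query returns every entity; results arrive in
    // pages, so keep following the continuation token until it runs out.
    function scanTable(token, done) {
      tableSvc.queryEntities('mytable', null, token, function (err, result) {
        if (err) { return done(err); }
        result.entries.forEach(function (entity) {
          console.log(entity.PartitionKey._, entity.RowKey._);
        });
        if (result.continuationToken) {
          return scanTable(result.continuationToken, done);
        }
        done(null);
      });
    }

    scanTable(null, function (err) { if (err) { console.error(err); } });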
    Regards,
    Will

  • Connect SAP BO Data Services to Azure Blob Storage

    Hi,
    I want to connect SAP BO Data Services to Azure Blob Storage. Is there a way to achieve this?
    Thanks,

    Hi Anton,
    As of now, it is not possible for SAP BO Data Services to connect to Azure Blob Storage. However, you may put down your requirement on Azure Feedback, so that the relevant development team can get it done in the near future.
    http://feedback.azure.com/forums/34192--general-feedback
    Regards,
    Manu Rekhar

  • Access blob storage files by specific domain. (Prevent hotlinking in Azure Blob Storage)

    Hi,
    My application is deployed on Azure, and I keep all my files in blob storage.
    When I created a container with public permission, it became accessible to all anonymous users. When I hit the URL of a file (blob) from a different browser, I get that file.
    In our application we have some important files and images that we don't want to expose. When we render an HTML page we set src="{blob file url}" in the <img> tag; with that, public files are accessible - but if I copy the same URL
    and hit it in another browser, the file is still visible. My requirement is that only my application's domain should be able to access those public files in blob storage.
    Amazon S3 provides a bucket policy where we can define that files are only accessible to a specific domain; see http://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html
    Restricting Access to a Specific HTTP Referrer

    Hi Prasad,
    Thanks for your post back.
    Both SAS and CORS could work, but neither is comprehensive.
    For your requirement - "only my application domain should be able to access those public files in blob storage" - if you want to stop sites on other domains from accessing your blobs, you may need to set CORS for your blob. The
    origin domain of the request is checked against the domains listed in the
    AllowedOrigins element. If the origin domain is included in the list, or all domains are allowed with the wildcard character '*', then rules evaluation proceeds. If the origin domain is not included, the request fails, so the other domain cannot access
    your resource. You can also try Gaurav's blog:
    http://gauravmantri.com/2013/12/01/windows-azure-storage-and-cors-lets-have-some-fun/
    To access a CORS resource, you also need to authenticate with SAS.
    However, SAS means that you can grant a client limited permissions to your blobs, queues, or tables for a specified period of time and with a specified set of permissions, without having to share your account access keys. The SAS is a URI that encompasses
    in its query parameters all of the information necessary for authenticated access to a storage resource (http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/
    ). So while your SAS URI is valid and not expired, that URI can also be used from another domain's site. I think you should test this.
    If I misunderstood, please let me know.
    Regards,
    Will

  • Azure blob storage cannot load file

    I am trying to follow the MSDN tutorial on creating a container in blob storage and am getting the following error:
    Additional information: Could not load file or assembly 'Microsoft.Data.Services.Client, Version=5.6.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
    I have tried numerous Stack Overflow suggestions, such as this
    one, to no avail. I have also tried uninstalling Azure Storage and re-installing it. Just in case it is helpful, here is my test case for Azure Storage:
    public void TestCloudStorage()
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("pictures");
        container.CreateIfNotExists();
    }
    Thank you in advance for your help!

    Hi,
    I know what happened - that's a NuGet package error.
    The Azure Storage NuGet package 3.0+ has this error, so my solution is to use the 2.1 package instead:
    1. Delete all the DLL references for Azure Storage (or recreate the project).
    2. Enter the command below in the Package Manager console:
    Install-Package WindowsAzure.Storage -Version 2.1.0.4
    That should resolve your problem.
    You can also refer to other people's solutions:
    http://stackoverflow.com/questions/20457846/missing-microsoft-data-services-client-version-5-6-on-azure-websites
