Slow Uploads to Azure Storage

We've just started using Azure Storage for backups. Using both Veeam Cloud Edition to connect to Azure and the Azure Throughput Analyzer, we're only seeing 250-500 KB/sec upload. Our Internet connection is a 20 megabit connection with
less than half in use. We've also ruled out saturated switch ports on the internal side of the network. Downloading a file from Azure that we've uploaded also runs at only a few hundred KB/sec.
The Throughput Analyzer reports an upload of 1.6 megabit. It reports the same thing off our business network, from my home cable connection. I'm thinking I'm being throttled on the Azure side of the connection? Is there any way
to check or confirm that? We just moved from the free trial to Pay-As-You-Go, thinking that would remove any bandwidth limits that may be imposed on the free trial, but so far that doesn't seem to have made any difference. Also found out that paid
support is separate from paying for Azure... so here I am in the forums. Any ideas? :-)

Hi,
Thanks for your post.
Based on my experience, there are many possible reasons for slow uploads or downloads to Azure Storage, such as the network, the network provider, and so on, so Microsoft does not provide an SLA for upload or download speed. I recommend checking your speed with this website (http://azurespeedtest.azurewebsites.net/), which shows the blob storage latency to every data center. You can also review the documented
Azure Storage Scalability and Performance Targets.
Regards,
Will
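
One thing worth checking before assuming throttling: a single upload stream over a high-latency WAN link is often limited by round trips rather than line rate, and uploading blocks in parallel usually shows what the connection can really do. Below is a rough C# sketch of such a test, assuming the WindowsAzure.Storage .NET SDK; the connection string and container name are placeholders.

    using System;
    using System.Diagnostics;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class UploadSpeedTest
    {
        static void Main()
        {
            var container = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("speedtest"); // placeholder container
            container.CreateIfNotExists();

            var options = new BlobRequestOptions
            {
                // Split the blob into blocks and upload them in parallel;
                // one HTTP stream is often the real bottleneck on a WAN.
                ParallelOperationThreadCount = 8,
                SingleBlobUploadThresholdInBytes = 1024 * 1024 // 1 MB
            };

            var data = new byte[64 * 1024 * 1024]; // 64 MB of zeroes
            var sw = Stopwatch.StartNew();
            container.GetBlockBlobReference("speedtest.bin")
                     .UploadFromByteArray(data, 0, data.Length, options: options);
            sw.Stop();

            Console.WriteLine("{0:F2} Mbit/s",
                data.Length * 8 / sw.Elapsed.TotalSeconds / 1e6);
        }
    }

If the parallel upload runs close to the line rate while single-stream tools stay at a few hundred KB/sec, latency rather than throttling is the likely culprit.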

Similar Messages

  • How to connect a web upload form (for video uploads) to Azure Storage / Media services?

    Hi, I have a form on my Joomla site (see image below) for viewers to upload and send me their videos, images and info.
    Question:
    1. Instead of receiving this form content via email, I just need HTML and/or PHP code so I can send the content viewers give me (videos of up to 500MB) directly into my existing Azure blob.
    It seems I can also direct the form to any database... They gave me this code sample, but it was for AWS:
    https://www.chronoengine.com/faqs/63-cfv4/cfv4-working-with-form-data/5196-can-i-save-files-to-aws.html
    They therefore suggested I reach out to Azure devs.
    I don't code, so please show me the exact HTML and/or PHP code to connect my "pre-built forms" with Azure Storage/Media services.
    kd9000

    You may refer to the following links, which may help you build forms that write to Azure table/blob storage:
    http://www.joshholmes.com/blog/2010/04/15/creating-a-simple-php-blog-in-azure/
    http://stackoverflow.com/questions/27464839/use-form-post-php-to-upload-a-file-to-azure-storage-and-return-the-link
    Regards,
    Manu

  • Large Files from FTP to Azure Storage Account.

    We are required to transfer files from FTP to an Azure Storage account.
    Our FTP server currently holds almost 4 TB of data, and it will keep growing.
    We need an automated process that transfers the data from FTP to Windows Azure Storage.
    Currently implemented solution: we have created a Windows service that downloads files from FTP, converts them to a file stream, and uploads them to the storage account through a blob storage object.
    With this approach, bandwidth is consumed in two places:
    - downloading the file from FTP
    - uploading it to Azure Storage
    Can you please suggest a solution that reduces bandwidth consumption and transfers this large data set in less time?

    Hi,
    Please have a look at the article below; it describes the Blob Transfer Utility, a GUI tool for uploading and downloading thousands of small/large files to/from Windows Azure Blob Storage.
    https://blobtransferutility.codeplex.com/
    Disclaimer: this is not an official Microsoft tool and it is not supported by Microsoft. It's just a (very useful) sample.
    Best Regards,
    Jambor
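
    If a third-party tool is not an option, here is a hedged sketch of one way to reduce the double bandwidth hit with the existing Windows service approach: run the service on an Azure VM in the same region as the storage account (so the upload leg never crosses your own link) and stream each FTP download straight into a block blob instead of staging it on disk. Host name, credentials, and container name below are placeholders.

    using System.Net;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class FtpToBlob
    {
        static void Main()
        {
            var request = (FtpWebRequest)WebRequest.Create(
                "ftp://ftp.example.com/data/file1.bin");    // placeholder
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = new NetworkCredential("user", "password");

            var container = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("backups");          // placeholder
            container.CreateIfNotExists();

            using (var response = (FtpWebResponse)request.GetResponse())
            using (var ftpStream = response.GetResponseStream())
            {
                // UploadFromStream reads the FTP stream block by block,
                // so the file never lands on the local disk.
                container.GetBlockBlobReference("file1.bin")
                         .UploadFromStream(ftpStream);
            }
        }
    }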

  • How to upload a 100 MB+ video to Azure storage using PHP

    Dear Sirs,
    I have uploaded a 1.2 MB video to Azure storage through PHP code with no problem. When I tried to upload a 90 MB video, I got the following error message:
    Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 90363852 bytes) in C:\xampp\htdocs\projetos\ajao\azure-sdk-for-php-master\WindowsAzure\Blob\BlobRestProxy.php on line 1319
    Can somebody please instruct me how to upload files of 100 MB+ to Azure storage using PHP code?
    Thanks in advance for your help.
    Best regards,
    Luiz Doi

    Hi Manu, I'm having the same problem as Luiz, getting the same type of errors. I'm not a dev, so I thought I'd ask for more details to clear things up.
    I have a Joomla site that was configured through Azure hosting, but I can't upload anything into Joomla larger than 8MB. So having Azure storage blobs and Media Services for Joomla is useless if I can't move larger files into the site, or out of the site into my Azure storage/Media Services account.
    I even manually changed the upload_max_filesize and post_max_size parameters in the admin, then directly in the site code, and even added a php.ini file at the root to override everything, as I read here, but still nothing worked.
    The code you posted above therefore seems like the solution, but I wanted to be sure it will solve my issue:
    1. I need my Azure-hosted site to allow larger files, up to 500MB.
    2. I need the files people upload on my site's front end through a form (see here) to be sent to my Azure blob. What code do I need to paste into Joomla's forms admin? So basically files first go into the site and then to storage.
    3. After I install the PHP SDK and the HTTP_Request2 PEAR package and paste in your code, how does the code recognize my Azure site and my Azure blob/containers? Where in the code do I plug in my info?
    Kindly advise.
    Karin
    kd9000
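
    Neither post above got a direct fix, so here is a hedged sketch of the usual cure for the "memory size exhausted" error: upload the file in fixed-size blocks instead of loading it into memory in one piece. It is shown in C# for illustration; the PHP SDK exposes the same block operations (createBlobBlock / commitBlobBlocks). Connection string, container, and paths are placeholders.

    using System;
    using System.Collections.Generic;
    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class BlockUpload
    {
        static void Main()
        {
            var blob = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("videos")        // placeholder
                .GetBlockBlobReference("bigvideo.mp4"); // placeholder

            var blockIds = new List<string>();
            var buffer = new byte[4 * 1024 * 1024]; // 4 MB per block
            int read, n = 0;

            using (var file = File.OpenRead(@"C:\videos\bigvideo.mp4"))
            {
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Block IDs must be base64 and equal length within a blob.
                    string id = Convert.ToBase64String(BitConverter.GetBytes(n++));
                    blob.PutBlock(id, new MemoryStream(buffer, 0, read), null);
                    blockIds.Add(id);
                }
            }
            blob.PutBlockList(blockIds); // commit all blocks as one blob
        }
    }

    Because only one 4 MB buffer is ever in memory, the upload size is no longer bound by the PHP (or .NET) memory limit.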

  • Question about uploading file to Azure Storage

    Can we upload an ISO to Azure storage and deploy it to an Azure virtual machine?

    Hi Abdul,
    Yes. Please see these tutorials:
    windows server:
    http://azure.microsoft.com/en-us/documentation/articles/virtual-machines-create-upload-vhd-windows-server/
    Linux:
    http://azure.microsoft.com/en-us/documentation/articles/virtual-machines-linux-create-upload-vhd/
    Regards,
    Will

  • Uploading Page Blob to Azure - Storage Exception: Make sure the value of Authorization header is formed correctly

    I am using the Microsoft Azure Java SDK to upload page blobs (size: 20G/40G). In the middle of the upload, the SDK throws a StorageException:
    java.io.IOException: null
    at com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:584) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:414) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:398) ~[classes/:na]
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) ~[na:1.7.0_25]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.7.0_25]
    at java.lang.Thread.run(Thread.java:724) ~[na:1.7.0_25]
    Caused by: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:162) ~[azure-storage-1.2.0.jar:na]
    at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:306) ~[azure-storage-1.2.0.jar:na]
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:178) ~[classes/:na]
    at com.microsoft.azure.storage.blob.CloudPageBlob.putPagesInternal(CloudPageBlob.java:642) ~[classes/:na]
    at com.microsoft.azure.storage.blob.CloudPageBlob.uploadPages(CloudPageBlob.java:971) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:402) ~[classes/:na]
    ... 9 common frames omitted
    SDK Version:
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-storage</artifactId>
    <version>1.2.0</version>
    Please note that this error shows up at a random point mid-upload. For example, 27,325,890,560 of 42,949,673,472 bytes transferred successfully, and then it failed with the exception pasted above.
    Is there a chance that time drift on a Linux box is causing this issue? Any other pointers would help.
    Thanks.

    Hi Kriti,
    We believe it's a timing issue. Could you refer to the following link and check whether it helps?
    http://blogs.msdn.com/b/kwill/archive/2013/08/28/http-403-server-failed-to-authenticate-the-request-when-using-shared-access-signatures.aspx
    If not, could you share a Fiddler trace of the failure?
    Regards,
    Malar.
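
    If clock drift is the suspect, it is quick to rule out: Storage rejects a signed request whose timestamp is more than roughly 15 minutes off the service clock, and the failure surfaces as exactly this 403. A hedged check, assuming ntpdate is installed on the Linux box:

    # Query (without setting) the offset between the local clock and NTP time.
    ntpdate -q pool.ntp.org
    # If the offset is large, sync the clock and keep it synced (ntpd/chrony).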

  • Uploading Stream (Images/Audio/Video files/Doc files) to Windows Azure Storage Blob by using SharePoint Online 2013 Library

    Dear All,
    How can I store image/audio/video/document files in blob storage with the help of a SharePoint document library, keeping a reference to them in SharePoint by putting metadata in a SharePoint library?
    I have searched a lot but have not found a suitable source.
    Thanks,
    Sharad
    sharadpatil

    Hi,
    Based on my experience, you could use Azure storage from a SharePoint app. I suggest you look at this blog post (http://sachintana.blogspot.com/2012/08/azure-blob-storage-for-sharepoint.html).
    Regards,
    Will

  • How do I improve performance while doing pull, push and delete from Azure Storage Queue

           
    Hi,
    I am working on a distributed application that uses an Azure Storage queue for message queuing. The queue will be used by multiple clients around the clock, so it is expected to be heavily loaded most of the time. The business case is typical: pull a message from the queue, process it, then delete it from the queue. The module also sends a notification back to the user indicating processing is complete. The functions/modules work fine, in that they meet the logical requirements; it is a pretty typical queue scenario.
    Now, the problem statement. Since the queue is expected to be heavily loaded most of the time, I am pushing to speed up the overall message lifetime: the faster I can clear messages, the better the overall experience for everyone, system and users.
    To improve performance I ran multiple cycles of profiling and then optimized the identified "hot" paths/functions. It came down to the point where the Azure queue pull and delete calls were the two most time-consuming operations. I improved the pull side by batch-pulling 32 messages at a time (the maximum message count that can be pulled from an Azure queue in one call at the time of writing), which reduced processing time by a big margin. All good up to this point.
    I process these messages in parallel to improve overall performance.
    Pseudo code:
    // AzureQueue class encapsulates calls to Azure Storage Queue;
    // assume nothing fancy inside, vanilla calls to pull/push/delete.
    var batchMessages = AzureQueue.Pull(32); // 32 = max messages per pull
    Parallel.ForEach(batchMessages, bMessage =>
    {
        // DoSomething does some background processing.
        try { DoSomething(bMessage); }
        catch { /* Log exception */ }
        AzureQueue.Delete(bMessage); // delete as soon as processing is done
    });
    With this change, profiling shows that up to 90% of the time is taken by the Azure message delete calls alone. Since it is good to delete a message as soon as processing is done, I remove it right after "DoSomething" finishes.
    What I need now are suggestions on how to improve this further when 90% of the time is eaten by the Azure queue delete call itself. Is there a better, faster way to delete, a bulk delete, etc.?
    With the implementation described here I get close to 25 messages/sec, and the queue delete calls are choking application performance, so is there any hope of pushing it further?
    Does the choice of delete overload make a performance difference? The queue has an overloaded delete method: one accepts a message object, the other a message identifier and pop receipt. I am using the latter, deleting by message identifier and pop receipt.
    Let me know if you need any additional information or clarification.
    Inputs/suggestions are welcome.
    Many thanks.

    The first thing that came to mind was to run the delete in parallel with the work in DoSomething, and if DoSomething fails, add the message back into the queue. This won't work for every application, and work that was near the head of the queue could be pushed back to the tail, so you'd have to think about how that may affect your workload.
    Or, queue the delete to the thread pool once the work has succeeded: fire and forget. However, if you're processing at 25/sec and 90% of the time sits in the delete, you'd accumulate delete calls faster than the pool could drain them and never catch up. At a 70-80% duty cycle this may work, but the closer you get to always being busy, the more dangerous it becomes.
    I wonder if calling the delete REST API yourself may offer any improvement. If you find that the delete sets up a TCP connection each time, this may be all you need: try to keep the connection open, or see whether the REST API can delete more at a time than the SDK API can.
    Or, if you have the funds, just run more VM instances doing the work in parallel, so the first machine handles 25/sec, the second another 25/sec, and you live with the slow delete. If that's still not good enough, add more instances.
    Darin R.
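
    To make the fire-and-forget idea concrete, here is a hedged sketch using the WindowsAzure.Storage queue client (connection string and queue name are placeholders): the deletes are issued as overlapping async calls and awaited together, so a batch of 32 costs roughly one delete round trip instead of 32 serial ones.

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Queue;

    class QueueDrain
    {
        static void Main() { RunAsync().GetAwaiter().GetResult(); }

        static async Task RunAsync()
        {
            var queue = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudQueueClient()
                .GetQueueReference("work"); // placeholder queue name

            var batch = await queue.GetMessagesAsync(32); // max batch size
            var deletes = new List<Task>();
            foreach (var msg in batch)
            {
                DoSomething(msg);                           // process first
                deletes.Add(queue.DeleteMessageAsync(msg)); // overlap deletes
            }
            await Task.WhenAll(deletes); // ~one round trip, not 32 serial ones
        }

        static void DoSomething(CloudQueueMessage msg) { /* work */ }
    }

    The trade-off is the same one Darin notes: a message whose delete fails after successful processing will reappear and must be handled idempotently.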

  • Import files and subfolders into Azure Storage

    Hi,
    Since Azure storage has containers and blobs within those containers, I'd like to understand how Microsoft will upload files and folders to Azure Storage if I start an import job. The situation is: I have a main folder, say "c-amazingelephants-wotws", that will become a container in Azure Storage. In that folder I have some subfolders and files that I need to upload; see the snapshot below.
    I have written a utility that uploads "c-amazingelephants-wotws" as a container, uploads "flipbook" under the blob name "files/assets/basic-html/flipbook", uploads "cart.css" in the style subfolder as "files/assets/basic-html/style/cart.css",
    etc. Is that how Microsoft would upload the files? If yes, that will make my life easier and I can have Microsoft take care of 300 GB worth of my data.
    Appreciate your help,
    Thanks,
    Ahmar

    Hi Ahmar,
    Thanks for your posting!
    Yes, you can upload your files to Azure blobs that way, but you will need to set the blob names yourself, such as "files/assets/basic-html/style/cart.css". You could write a small project to do this job. For how to use Azure blobs, I suggest you refer to this document:
    http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs/
    Also, for this common question, I suggest you refer to Gaurav's answer in this post: https://social.technet.microsoft.com/forums/azure/en-US/0a184e05-1d20-4b37-9ce5-bee8a5b6d09c/how-to-create-sub-folder-inside-container-in-azure-storage
    If you have any questions about this issue, please feel free to let me know.
    Regards,
    Will
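
    For what it's worth, a "/" in the blob name is all there is to a folder in blob storage; there is no separate directory object, and the naming scheme in the question is the usual way folder trees are represented. A minimal sketch, assuming a recent WindowsAzure.Storage .NET SDK; the connection string and local path are placeholders.

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class VirtualFolders
    {
        static void Main()
        {
            var container = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("c-amazingelephants-wotws");
            container.CreateIfNotExists();

            // Listing tools (portal, Storage Explorer) render the slashes
            // as virtual directories: files/assets/basic-html/style/.
            container.GetBlockBlobReference("files/assets/basic-html/style/cart.css")
                     .UploadFromFile(@"C:\site\files\assets\basic-html\style\cart.css");
        }
    }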

  • URGENT: Azure Storage Table Outage

    UPDATE: The problem appears to have been fixed. Are affected accounts eligible for a refund due to the downtime?
    Hi. I'm having trouble querying an Azure Storage table that is essential to my business operations. The queries simply don't seem to be going through, and when using Azure Storage Explorer (a 3rd-party program) I encounter the same issue. For some reason, Azure does not seem to be responding to my storage requests. I also cannot open a support ticket, as our small business does not have a support agreement with Microsoft. The storage account name is chapteradviser, and it is not even possible to query the table service for a list of tables (it keeps timing out). This looks like a problem in the Azure datacenter, not on my end. I am also an IT consultant for the company and do not have the authority to enter into a support agreement with Azure.
    Thanks for any assistance,
    - Brian Bosak
    - Consulting with ChapterAdviser Llc.

    Yup, I see it too; tables look really slow. :(
    Adding curl tracing in case someone from MS is looking (I redacted the account name):
    $ curl -v --trace-time -X GET https://xxxxxxxxxx.table.core.windows.net
    17:14:22.296000 * Adding handle: conn: 0x1e67e80
    17:14:22.296000 * Adding handle: send: 0
    17:14:22.296000 * Adding handle: recv: 0
    17:14:22.296000 * Curl_addHandleToPipeline: length: 1
    17:14:22.312000 * - Conn 0 (0x1e67e80) send_pipe: 1, recv_pipe: 0
    17:14:22.312000 * About to connect() to xxxxxxxxxx.table.core.windows.net port 443 (#0)
    17:14:22.312000 *   Trying 23.99.32.80...
    17:14:25.375000 * Connected to xxxxxxxxxx.table.core.windows.net (23.99.32.80) port 443 (#0)
    17:14:25.640000 * successfully set certificate verify locations:
    17:14:25.656000 *   CAfile: C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt
      CApath: none
    17:14:25.656000 * SSLv3, TLS handshake, Client hello (1):
    17:14:30.859000 * SSLv3, TLS handshake, Server hello (2):
    17:14:30.875000 * SSLv3, TLS handshake, CERT (11):
    17:14:30.890000 * SSLv3, TLS handshake, Server finished (14):
    17:14:30.921000 * SSLv3, TLS handshake, Client key exchange (16):
    17:14:30.921000 * SSLv3, TLS change cipher, Client hello (1):
    17:14:30.937000 * SSLv3, TLS handshake, Finished (20):
    17:14:41.937000 * SSLv3, TLS change cipher, Client hello (1):
    17:14:41.953000 * SSLv3, TLS handshake, Finished (20):
    17:14:41.953000 * SSL connection using AES128-SHA
    17:14:41.968000 * Server certificate:
    17:14:41.984000 *        subject: CN=*.table.core.windows.net
    17:14:42.000000 *        start date: 2014-02-20 12:59:18 GMT
    17:14:42.000000 *        expire date: 2016-02-20 12:59:18 GMT
    17:14:42.031000 *        subjectAltName: xxxxxxxxxx.table.core.windows.net matched
    17:14:42.046000 *        issuer: DC=com; DC=microsoft; DC=corp; DC=redmond; CN=MSIT Machine Auth CA 2
    17:14:42.062000 *        SSL certificate verify ok.
    17:14:42.078000 > GET / HTTP/1.1
    17:14:42.078000 > User-Agent: curl/7.30.0
    17:14:42.078000 > Host: xxxxxxxxxx.table.core.windows.net
    17:14:42.078000 > Accept: */*
    17:14:42.078000 >
    17:15:35.078000 < HTTP/1.1 400 The requested URI does not represent any resource on the server.
    17:15:35.093000 < Content-Length: 360
    17:15:35.093000 < Content-Type: application/xml
    17:15:35.093000 * Server Microsoft-HTTPAPI/2.0 is not blacklisted
    17:15:35.109000 < Server: Microsoft-HTTPAPI/2.0
    17:15:35.109000 < x-ms-request-id: f2e0b20e-5888-43ce-bbf0-68589e7ad972
    17:15:35.109000 < Date: Sat, 09 Aug 2014 00:15:30 GMT
    17:15:35.125000 <
    <?xml version="1.0" encoding="utf-8" standalone="yes"?>
    <error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
      <code>InvalidUri</code>
      <message xml:lang="en-US">The requested URI does not represent any resource on the server.
    RequestId:f2e0b20e-5888-43ce-bbf0-68589e7ad972
    Time:2014-08-09T00:15:31.4896331Z</message>
    </error>17:15:35.125000 * Connection #0 to host xxxxxxxxxx.table.core.windows.net left intact

  • AzCopy error "The destination azure storage location type cannot be inferred"

    Hi All,
    I'd like to try out Microsoft Azure File Service.
    I created a storage account with File Service support. I created a File Share. I was able to mount this File Share using "net use" from my Azure VM. I can manage files from this VM.
    Now I'd like to copy files from my home computer.
    C:\Program Files (x86)\Microsoft SDKs\Azure>AzCopy /Source:c:\mydir /Dest:https://myendpoint.file.core.windows.net/myshare1/ /DestKey:%key% /S
    I get an error:
    [2015.02.21 11:08:21][ERROR] The syntax of the command is incorrect. The destination azure storage location type can not be inferred. Use /destType:blob to specify the location type explicitly.

    Hi,
    AzCopy version 2.4 now supports moving your local files to Azure Files and back. Please check that your AzCopy version is 2.4 or higher.
    In addition, empty folders will not be uploaded, so please make sure the source you defined is not an empty folder.
    Best regards,
    Susie
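
    In other words, once AzCopy is at version 2.4 or later, the original command should be accepted as-is; the /DestType hint from the error message only matters on older builds that cannot infer a file-share endpoint:

    AzCopy /Source:c:\mydir /Dest:https://myendpoint.file.core.windows.net/myshare1/ /DestKey:%key% /S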

  • How to send XML files from a local folder to Azure storage

    Hi,
    My plan: I have XML files under folders on my local machine, and I want to use a mobile service to send those XML files to Azure storage. How should I do that, in C# or in the mobile service itself?
    If the Internet connection drops, I will use my mobile service to transfer all the XML files to Azure storage later and run a web job that updates Azure SQL from the XML files.
    Please advise.
    Superman

    Hi,
    You could refer to the following link for assistance with uploading image files to Azure Blob Storage using Mobile Services:
    http://azure.microsoft.com/en-us/documentation/articles/mobile-services-windows-phone-upload-data-blob-storage/
    And for text files you could refer to the following link:
    http://stackoverflow.com/questions/25977406/upload-a-text-file-with-azure-mobile-services
    Regards,
    Malar.
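
    If plain C# is acceptable (with or without a mobile service in front), here is a minimal hedged sketch of the upload half, assuming the WindowsAzure.Storage .NET SDK; connection string, container, and folder path are placeholders.

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class XmlFolderUpload
    {
        static void Main()
        {
            var container = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("xmlfiles"); // placeholder
            container.CreateIfNotExists();

            foreach (var path in Directory.EnumerateFiles(@"C:\data\xml", "*.xml"))
            {
                // Use the file name as the blob name; the web job can then
                // pick the blobs up and apply them to Azure SQL.
                container.GetBlockBlobReference(Path.GetFileName(path))
                         .UploadFromFile(path);
            }
        }
    }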

  • Meteor -- Azure Storage blobs

    I have a requirement where users need to be able to upload really large files (videos) from a Meteor app into an Azure container.
    For those of you unfamiliar with Meteor, it's a young but important full-stack JavaScript platform. BTW, Meteor just released official support for Windows. That may be one of the reasons there are tons of S3-friendly file upload packages in the Meteor ecosystem, but hardly anything that works with Azure storage yet.
    In any case, there is one new package, https://github.com/jamesfebin/azure-blob-upload, that seemed promising. However, I'm running into the issues documented here: https://forums.meteor.com/t/uploading-large-files-to-azure-storage/2741
    There's another possible path. One thing the Azure team and community could do to onboard Meteor developers is add an example of Azure storage support to the Slingshot package: https://github.com/CulturalMe/meteor-slingshot The reason this is important is that for gigantic files we'd rather not have to stream from the client through the Meteor server (NodeJS) and then over to Azure. Slingshot supports a secure method for doing that with S3, Google Cloud Storage, and even Rackspace Storage. It seems like Azure could be part of the mix if someone could read the NodeJS SDK and translate it into what Slingshot is doing. I'm just not savvy enough to do it myself at the moment.

    Hi Soupala,
    Looks like you have got a solution for this issue from the Meteor forum.
    However, I will try to find out if any other solutions are available for this scenario.
    Regards,
    Manu
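
    For anyone landing here later: the Slingshot-style pattern described above boils down to Shared Access Signatures. The server signs a short-lived, write-only URL and the browser PUTs the file directly to the blob, so large videos never pass through the app server. A hedged illustration in C# (the Node SDK that Meteor would use exposes the same SAS feature); names are placeholders.

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class SasForDirectUpload
    {
        static void Main()
        {
            var blob = CloudStorageAccount
                .Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...")
                .CreateCloudBlobClient()
                .GetContainerReference("videos") // placeholder
                .GetBlockBlobReference("upload-" + Guid.NewGuid() + ".mp4");

            // Write-only token valid for 15 minutes.
            string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
            {
                Permissions = SharedAccessBlobPermissions.Write,
                SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
            });

            // Hand this URL to the client; it can PUT the blob directly.
            Console.WriteLine(blob.Uri + sas);
        }
    }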

  • Performance is too slow on SQL Azure box

    Hi,
    Performance is too slow on our SQL Azure box (located in Europe).
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS from India):
    SELECT * FROM TABLE_1
    On a local server, it returns the same 500,000 rows in about 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level : Premium/P1
    DTU       : 100
    MAX DB Size : 500GB     
    Max Worker Threads : 200          
    Max Sessions     : 2400
    Benchmark Transaction Rate      : 105 transactions per second
    Predictability : Best
    Any suggestion would be highly appreciated.
    Thanks,

    Hello,
    Can you please explain the scenario you are testing in a little more detail? Are you comparing a SQL Database in Europe against a SQL Database in India, or a SQL Database against a local, on-premises SQL Server installation?
    In the first scenario, the round-trip latency of the connection to the datacenter might play a role.
    If you are comparing against a local installation, please note that you might be running on completely different hardware specifications and without network delay, resulting in very different results.
    In both cases you can use the blog post below to assess the resource utilization of the SQL Database during the operation:
    http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
    If the DB utilization reaches 100%, you might have to consider upgrading to a higher performance level to achieve the throughput you are looking for.
    Thanks,
    Jan 

  • How to set up a web job, like SSIS, that takes XML files from Azure storage to update Azure SQL

    Hi,
    I have XML files in Azure storage, and I would like to set up a web job that runs on a schedule and loads the XML to update/insert/delete rows in an Azure SQL database.
    I have done this locally with SSIS, but I can't use SSIS in Azure. According to several pieces of advice and forum pages, I need to create an Azure web job.
    Please advise a link or an approach.
    Thanks.
    Superman

    Hi,
    To get started with Azure WebJobs you could refer to the following link:
    http://azure.microsoft.com/en-in/documentation/articles/websites-dotnet-webjobs-sdk-get-started/
    The article shows how to create a multi-tier ASP.NET MVC application that uses the WebJobs SDK to work with Azure blobs in an Azure Website.
    Meanwhile, we will research your specific requirement and revert with our findings.
    Regards,
    Malar
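
    As a concrete starting point, here is a hedged sketch of a WebJobs SDK (1.x-era) function that fires whenever an XML blob lands in storage, roughly the role SSIS played on-premises. The container name is a placeholder and the SQL work is left as a comment.

    using System.IO;
    using Microsoft.Azure.WebJobs;

    class Program
    {
        static void Main()
        {
            // Requires the AzureWebJobsStorage connection string in config.
            new JobHost().RunAndBlock();
        }
    }

    public class Functions
    {
        // Triggered for each new or updated blob named xml/{name}.
        public static void ProcessXml(
            [BlobTrigger("xml/{name}")] Stream input,
            string name,
            TextWriter log)
        {
            log.WriteLine("Processing " + name);
            // Parse the XML here and apply the insert/update/delete
            // statements to Azure SQL (e.g. via System.Data.SqlClient).
        }
    }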
