Learning resource for Azure Storage
Dive Deep into Networking Storage and Disaster Recovery Scenarios
http://www.microsoftvirtualacademy.com/training-courses/dive-deep-into-networking-storage-and-disaster-recovery-scenarios?WT.mc_id=13361-ITP-ming-man-chan-mvp-content
chanmm
There are some other great resources available.
1) Azure Storage documentation: it's excellent, and you will get all the latest bits here.
http://azure.microsoft.com/en-us/documentation/services/storage/
2) Check out the MVA course for Azure Storage:
http://www.microsoftvirtualacademy.com/training-courses/windows-azure-storage-design-and-implementation-jump-start
There are also some useful Azure Storage examples at the following link.
https://azurestoragesamples.codeplex.com/
Jalpesh Vadgama, http://www.dotnetjalps.com
Similar Messages
-
How to set same domain name for Azure Storage and Hosted Service
I have a web application running on Azure and using Azure Storage with blobs. My application allows editing HTML files that are in Azure Storage, saving them under another name, and publishing them later. I am using JavaScript to edit the HTML content that
I display in an iframe, but since the domain of my web application and the HTML I try to edit are not the same, I get this error: "Unsafe JavaScript
attempt to access frame with URL "URL1" from frame with URL "URL2". Domains, protocols and ports must match".
I have been doing some research about it and the only way to make it work is to have the web application and the html that I want to access using javascript under the same domain.
So my question is: is it possible in Azure to have the same domain name for the hosted service and the storage?
Like this:
hosted service: example.com
storage: example.com
By the way, I have already customized the domain names so they look like this:
hosted service <mydomainname>.com
storage <blob.mydomainname>.com
Actually, I have my application running on another host and I have no problem there, since I have a folder where I store the files I am editing, so they are in the same domain as the application. I have been thinking of doing the same with Azure,
at least having a folder where I can store the HTML file while I am editing it, but I am not sure how much space I have in the hosted service to store temporary files.
Let me know if you have a good tip or idea about how to solve this issue.
Hi Rodrigo,
Though both Azure Blob and Azure applications support custom domains, one domain can have only one DNS record (in this case a CNAME record) at a time. In Steve's case, he has 3 domains: blog.smarx.com, files.blog.smarx.com and cdn.blog.smarx.com.
> I would like to find a way to storage my html page that I need to edit under the same domain.
For this case, a workaround is to add an HTTP handler in your Azure application to serve file requests. That means we do not use the actual blob URL to access blob content; instead, we send the request to an HTTP handler, and the handler gets the content
from blob storage and returns it.
Please check
Accessing blobs in private container without Shared Access Secret key for a code sample.
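As a rough illustration of that workaround (Python here purely for brevity; fetch_blob is a hypothetical stand-in for your blob-storage client call), the handler maps a same-origin URL onto a private blob:

```python
# Minimal sketch of a same-origin proxy handler (illustrative only).
# fetch_blob is a hypothetical stand-in for your blob-storage client call;
# it takes a blob name and returns its bytes, or None if missing.

def serve_blob(path, fetch_blob):
    """Serve /files/<name> from blob storage so the page stays same-origin."""
    prefix = "/files/"
    if not path.startswith(prefix):
        return 404, {}, b"not found"
    blob_name = path[len(prefix):]
    content = fetch_blob(blob_name)  # e.g. a GET against the private container
    if content is None:
        return 404, {}, b"not found"
    return 200, {"Content-Type": "text/html"}, content
```

Because the browser only ever sees your application's domain, the iframe and the editor script share an origin and the JavaScript restriction goes away.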
Thanks.
Wengchao Zeng
Please mark the replies as answers if they help or unmark if not.
If you have any feedback about my replies, please contact
[email protected]
Microsoft One Code Framework -
Hi,
I am currently trying to find a book or video mentor series to familiarize myself with the ACE module a bit more. I know there is an exam that revolves around the product (642-975 DCASI), but I can't find any Cisco Press books, and it seems the only teaching modules for this are through a channel partner. I also found an administration guide online, but I was wondering if anyone knew of any additional books or resources I could use to get a better understanding of Cisco ACE appliances/modules.
This was recently discussed here. You should find plenty of reading material in the ACE Config Guides.
-
Can anyone recommend a good learning resource for Photoshop scripting? Something other than the Adobe scripting JS reference guides or JavaScript manuals. Ideally, a course focused on Photoshop scripting.
As no one else seems to have had anything particular to recommend maybe this book by Jeffrey Tranberry has a chapter of interest to offer.
http://www.amazon.com/Power-Speed-Automation-Adobe-Photoshop/dp/0240820835/ref=sr_1_1?ie=UTF8&qid=1390916925&sr=8-1&keywords=tranberry+photoshop -
Sales material for Cisco Storage
Does anyone know where I can get access to some online resources for Cisco Storage products/technology?
Would appreciate any help.
Storage portfolio pages
http://www.cisco.com/en/US/products/hw/ps4159/index.html
Technical Documentation
http://www.cisco.com/univercd/cc/td/doc/product/sn5000/index.htm
If you found this helpful, please rate it.
--Colin -
I have a requirement where I need users to be able to upload really large files (videos) from a Meteor app into an Azure container.
For those of you unfamiliar with Meteor, it's a young but important full-stack JavaScript platform. BTW, Meteor just released official support for Windows. That may be one of the reasons why there are tons of S3-friendly file-upload packages in
the Meteor ecosystem, but hardly anything that works for Azure Storage yet.
In any case, there is this one new package, https://github.com/jamesfebin/azure-blob-upload that seemed promising. However, I'm running into issues as documented here: https://forums.meteor.com/t/uploading-large-files-to-azure-storage/2741
There's another possible path. One thing the Azure team + community could do to onboard Meteor developers is add an example of Azure Storage support to the Slingshot package: https://github.com/CulturalMe/meteor-slingshot The reason this
is important is that for gigantic files, we'd rather not have to stream from the client through the Meteor server (Node.js) and then over to Azure. Slingshot supports a secure method for doing that with S3, Google Cloud Storage, and even Rackspace Storage.
It seems like Azure could be part of the mix if there were someone who could interpret the Node.js SDK and translate it to what Slingshot is doing. I'm just not savvy enough to do it myself at the moment.
Hi Soupala,
Looks like you have got a solution for this issue from the
Meteor forum.
However, I will try to find out if any other solutions are available for this scenario.
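For what it's worth, a Slingshot-style direct upload hinges on the server issuing a short-lived signed URL (in Azure terms, a shared access signature) so the client can PUT straight to blob storage. A rough sketch of the signing mechanics only, in Python for illustration; the exact string-to-sign fields and their order are defined by the Azure Storage SAS documentation, not here:

```python
# Sketch of the signing step behind a shared access signature (SAS).
# The string-to-sign format (permissions, expiry, canonicalized resource,
# service version, ...) is defined by the Azure Storage docs; this only
# illustrates the HMAC-SHA256 mechanics applied to it.
import base64
import hashlib
import hmac

def sign(string_to_sign: str, account_key_b64: str) -> str:
    """HMAC-SHA256 the string-to-sign with the base64-decoded account key."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")
```

The resulting signature goes into the `sig` query parameter of the upload URL handed to the client; the account key itself never leaves the server.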
Regards,
Manu -
Hi, I have a form on my Joomla site (see the image below) for viewers to upload and send me their videos, images and info.
Question:
1. Instead of receiving this form content via email, I just need HTML and/or PHP code so I can send the content viewers give me (videos of up to 500 MB) directly into my existing Azure blob container.
It seems I can also direct the form to any database... They gave me this code sample, but it was for AWS:
https://www.chronoengine.com/faqs/63-cfv4/cfv4-working-with-form-data/5196-can-i-save-files-to-aws.html
Therefore they suggested I reach out to Azure devs.
I don't code, so please show me the exact HTML and/or PHP code to connect my "pre-built forms" with Azure Storage/Media Services.
kd9000
You may refer to the following links, which may help you build forms that post to Azure table/blob storage:
http://www.joshholmes.com/blog/2010/04/15/creating-a-simple-php-blog-in-azure/
http://stackoverflow.com/questions/27464839/use-form-post-php-to-upload-a-file-to-azure-storage-and-return-the-link
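Whatever server-side language you end up with, the upload itself boils down to one REST call: an HTTP PUT of the file bytes against the blob URL carrying a SAS token. A sketch in Python (the account, container, and SAS values below are placeholders, not working credentials):

```python
# Sketch: building a "Put Blob" REST request for an uploaded file.
# The URL shape and required x-ms-blob-type header follow the Azure
# Storage "Put Blob" REST operation; names here are placeholders.

def build_put_blob_request(account, container, blob_name, sas_token):
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob_name}?{sas_token}")
    headers = {
        "x-ms-blob-type": "BlockBlob",  # required for Put Blob
        "Content-Type": "application/octet-stream",
    }
    return url, headers
```

You would then send this URL and these headers plus the uploaded file's bytes with your HTTP client of choice (in PHP, for example, cURL with CURLOPT_CUSTOMREQUEST => 'PUT').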
Regards,
Manu -
Azure Storage Maximum Threshold for alert
Hi Azure team,
Azure Storage:
What is the maximum threshold after which Azure will send a notification that maximum capacity is about to be reached?
Thanks,
You can create a rule based upon capacity. The units have to be in bytes, so if you wanted to get an alert when the storage account is at 80% of 1 TB, you would set the value to 879609302221 (0.8 × 2^40 bytes; note that 858993459 is 80% of 1 GB, not 1 TB). Do not forget to enable the storage account for Capacity Alerts.
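The byte arithmetic here is easy to get wrong: 80% of 1 GiB is 858,993,459 bytes, while 80% of 1 TiB is 879,609,302,221 bytes. A quick sanity check (illustrative only):

```python
# Convert a "percent of capacity" alert threshold into bytes,
# since Azure capacity alert rules take the value in bytes.
GIB = 2 ** 30
TIB = 2 ** 40

def threshold_bytes(capacity_bytes, percent):
    return round(capacity_bytes * percent / 100)

print(threshold_bytes(GIB, 80))  # 858993459
print(threshold_bytes(TIB, 80))  # 879609302221
```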
Johnny Coleman [MSFT] Any code posted to this Forum is [As-Is] With No Warranties -
Websites, books, and other resources for learning to use Premiere Pro
What websites, books, tutorials, and other resources have you found useful for learning to use Premiere Pro?
I want to make sure that we have a comprehensive list of good resources for Premiere Pro. See this page for a list of high-quality After Effects resources that I put together; I want something like this for Premiere Pro.
I'll start. In no particular order, here are some resources that I've found valuable:
this forum (of course)
Premiere Pro Help documents and Help & Support pages in several languages
Community Help search for Premiere Pro
Getting started with Premiere Pro by video2brain
Premiere Pro CS5: Learn By Video by video2brain
video tutorials on Adobe TV and Video Workshop
Premiere Pro CS5 Classroom in a Book
Premiere Pro team blog
Premiere Pro team Facebook page and Twitter feed
PPBM5 (Premiere Pro CS5 benchmark) for performance information
Best of Premiere Pro by Eran Stern
> I see mention above of Lynda.com, but either missed Total Training, or perhaps they have merged?
I didn't mention Lynda.com.
And, no, Total Training and Lynda.com certainly haven't merged.
Total Training has some good Premiere Pro stuff, but it's not the first place that I'd point someone for Premiere Pro. -
Some good resources for learning Unity scripting in C#.
Title says everything. Recommend me some good resources for learning scripting in Unity3D with C#.
Because I have been writing C# for a year and a half, I really want the lessons to focus on scripting in Unity rather than introducing basic C# features.
The Unity webpage provides the best tutorials for learning Unity, including C# scripting.
http://unity3d.com/learn/tutorials/modules/beginner/scripting -
What are the best resources for learning JSP?
In your experience, what have been the best resources for learning JSP? I am new to this, and so far have been plugging away at a "Dummies" book. Hopefully there are other inexpensive ways to learn it (I checked the book out at the library), and possibly even less boring ones (though I doubt it). I can deal with the boring, but I was wondering if I was going about learning it the best way.
Hi,
Go to www.apache.org, download the Apache Tomcat server, and install it.
Go to the Sun site and download the JSP tutorials.
Go to www.coreservlets.com and download the book Core Servlets; you can find lots of good books supported by Sun on the Sun website.
Now you are equipped: start experimenting (R&D) on the server, and join this JSP forum to ask about anything you didn't get, didn't learn, or found confusing.
I think once you go through this, it will do all that you want.
Hope it helps. Best of luck :)
Regards, and take care.
URGENT: Azure Storage Table Outage
UPDATE: The problem appears to have been fixed. Are affected accounts eligible for a refund due to the downtime?
Hi. I'm having trouble querying an Azure Storage table that is essential to my business operations. The queries simply seem not to be going through, and when using the Azure Storage Explorer (a 3rd-party program) I encounter the same issue. For some reason,
Azure does not seem to be responding to my storage requests. I also cannot open a support ticket with Microsoft, as our small business does not have a support agreement. The storage account name is chapteradviser, and it is not even possible
to query the table service for a list of tables (it keeps timing out). This seems to be a problem at the Azure datacenter, not my end. I am also an IT consultant for the company and do not have the binding authority to enter into a support agreement with Azure.
Thanks for any assistance,
- Brian Bosak
- Consulting with ChapterAdviser LLC.
Yup, I see it too... it looks like tables is really slow :(
Adding curl tracing in case someone from MS is looking (I redacted the account name):
$ curl -v --trace-time -X GET https://xxxxxxxxxx.table.core.windows.net
17:14:22.296000 * Adding handle: conn: 0x1e67e80
17:14:22.296000 * Adding handle: send: 0
17:14:22.296000 * Adding handle: recv: 0
17:14:22.296000 * Curl_addHandleToPipeline: length: 1
17:14:22.312000 * - Conn 0 (0x1e67e80) send_pipe: 1, recv_pipe: 0
17:14:22.312000 * About to connect() to xxxxxxxxxx.table.core.windows.net port 443 (#0)
17:14:22.312000 * Trying 23.99.32.80...
17:14:25.375000 * Connected to xxxxxxxxxx.table.core.windows.net (23.99.32.80) port 443 (#0)
17:14:25.640000 * successfully set certificate verify locations:
17:14:25.656000 * CAfile: C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt
CApath: none
17:14:25.656000 * SSLv3, TLS handshake, Client hello (1):
17:14:30.859000 * SSLv3, TLS handshake, Server hello (2):
17:14:30.875000 * SSLv3, TLS handshake, CERT (11):
17:14:30.890000 * SSLv3, TLS handshake, Server finished (14):
17:14:30.921000 * SSLv3, TLS handshake, Client key exchange (16):
17:14:30.921000 * SSLv3, TLS change cipher, Client hello (1):
17:14:30.937000 * SSLv3, TLS handshake, Finished (20):
17:14:41.937000 * SSLv3, TLS change cipher, Client hello (1):
17:14:41.953000 * SSLv3, TLS handshake, Finished (20):
17:14:41.953000 * SSL connection using AES128-SHA
17:14:41.968000 * Server certificate:
17:14:41.984000 * subject: CN=*.table.core.windows.net
17:14:42.000000 * start date: 2014-02-20 12:59:18 GMT
17:14:42.000000 * expire date: 2016-02-20 12:59:18 GMT
17:14:42.031000 * subjectAltName: xxxxxxxxxx.table.core.windows.net matched
17:14:42.046000 * issuer: DC=com; DC=microsoft; DC=corp; DC=redmond; CN=MSIT Machine Auth CA 2
17:14:42.062000 * SSL certificate verify ok.
17:14:42.078000 > GET / HTTP/1.1
17:14:42.078000 > User-Agent: curl/7.30.0
17:14:42.078000 > Host: xxxxxxxxxx.table.core.windows.net
17:14:42.078000 > Accept: */*
17:14:42.078000 >
17:15:35.078000 < HTTP/1.1 400 The requested URI does not represent any resource on the server.
17:15:35.093000 < Content-Length: 360
17:15:35.093000 < Content-Type: application/xml
17:15:35.093000 * Server Microsoft-HTTPAPI/2.0 is not blacklisted
17:15:35.109000 < Server: Microsoft-HTTPAPI/2.0
17:15:35.109000 < x-ms-request-id: f2e0b20e-5888-43ce-bbf0-68589e7ad972
17:15:35.109000 < Date: Sat, 09 Aug 2014 00:15:30 GMT
17:15:35.125000 <
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>InvalidUri</code>
<message xml:lang="en-US">The requested URI does not represent any resource on the server.
RequestId:f2e0b20e-5888-43ce-bbf0-68589e7ad972
Time:2014-08-09T00:15:31.4896331Z</message>
</error>
17:15:35.125000 * Connection #0 to host xxxxxxxxxx.table.core.windows.net left intact
Windows Azure Storage Account Location change
Hi Experts,
I have deployed my web site on Windows Azure and created a Windows Azure Storage account. By mistake, I set the wrong location. How can I change the location of the storage account?
Thanks!Hello CRM Inn,
I am sorry to know that you have chosen an incorrect location and would like to change the location for your storage account. Unfortunately, there is no option available to change the storage location for an existing storage account.
Your only option would be to create a new storage account, migrate data to the new storage account and delete the existing one. You can refer to this article that will give you information on migrating data from one account to another:
http://blogs.msdn.com/b/windowsazurestorage/archive/2012/06/12/introducing-asynchronous-cross-account-copy-blob.aspx
I also suggest that you create affinity groups. By creating affinity groups you can be sure that all your related services are under the same data center. You can refer to this article to learn about Affinity groups:
http://msdn.microsoft.com/en-us/library/azure/jj156209.aspx
Syed irfan Hussain -
Diagnostics not working in web role for Azure SDK 2.5.1
I am working with Azure SDK 2.5.1, mainly with the newly designed diagnostics features. However, I cannot get it to work for my web role.
So, I created a cloud service project and added a web role. Then I appended one trace message at the end of Application_Start in Global.asax.cs:
Trace.TraceInformation("Application_Start end.");
After that, I right clicked the WebRole and opened the Properties tab.
In the diagnostics config window:
General: I chose 'Custom plan', specified the storage account, and kept the 'Disk Quota in MB' at the default '4096'.
Application Logs: 'Log level' switched to 'All', others kept as default.
Other tabs are left at their default config settings.
After I deployed the project to the cloud, I found some unexpected things:
No WADLogsTable exists in Table storage. That's very strange; if I use a worker role, it works as expected. So in the web role, I just cannot find the trace logging.
For the performance counters, since I am using the default config with 8 counters, I can see only 8 entries in the WADPerformanceCountersTable. My assumption was that, over time, more and more values of these 8 counters would be transferred to this table. But that just did not happen; after several hours, it still had those 8 counter values.
I thought the diagnostics for the web role had crashed or were not working at all. Meanwhile, I checked the log located at "C:\Logs\Plugins\Microsoft.Azure.Diagnostics.PaaSDiagnostics\1.4.0.0\DiagnosticsPlugin.log" on the
server; at the end of this file there is an exception saying the diagnostics process exited:
DiagnosticsPlugin.exe Error: 0 : [4/25/2015 5:38:21 AM] System.ArgumentException: An item with the same key has already been added.
at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at Microsoft.Azure.Plugins.Diagnostics.dll.PluginConfigurationSettingsProvider.LoadWadXMLConfig(String fullConfig)
DiagnosticsPlugin.exe Error: 0 : [4/25/2015 5:38:21 AM] Failed to load configuration file
DiagnosticsPlugin.exe Information: 0 : [4/25/2015 5:38:21 AM] DiagnosticPlugin.exe exit with code -105
From MSDN,
the code -105 means:
The Diagnostics plugin cannot open the Diagnostics configuration file.
This is an internal error that should only happen if the Diagnostics plugin is manually invoked, incorrectly, on the VM.
It doesn't make sense; I did nothing to the VM.
As I said above, I made only very tiny changes to the scaffold code generated by Visual Studio 2013. Did I do something wrong, or is this a bug in Azure SDK 2.5?
By the way, it seems everything is OK for the worker role.
Hi,
This could be an internal issue; I do not have any update on it as of now. There was a similar issue last month due to a regression that has been resolved; however, I don't think this issue is related.
Please follow the article below on how to enable Diagnostics and let us know if it works.
http://azure.microsoft.com/en-in/documentation/articles/cloud-services-dotnet-diagnostics/
I will let you know if I have any update on this issue from my side.
Regards,
Nithin Rathnakar. -
How do I improve performance while doing pull, push and delete from Azure Storage Queue
Hi,
I am working on a distributed application with Azure Storage Queue for message queuing. The queue will be used by multiple clients around the clock, so it is expected to be heavily loaded most of the time. The business case is typical:
it pulls a message from the queue, processes the message, then deletes the message from the queue. This module also sends back a notification to the user indicating the process is complete. The functions/modules work fine, as in they meet the logical requirement. A pretty typical queue
scenario.
Now, coming to the problem statement: since the queue is expected to be heavily loaded most of the time, I am pushing to speed up the overall message lifetime. The faster I can clear messages, the better the overall experience
will be for everyone, system and users.
To improve performance I did multiple cycles of profiling and then improved the identified "HOT" paths/functions.
It all came down to the point where the Azure Queue pull and delete are the two most time-consuming calls. I further improved the pull by batch-pulling 32 messages at a time (the maximum message count that can be pulled from an Azure
queue at once, at the time of writing this question). This paid off by reducing processing time by a big margin. All good up to this point as well.
I am processing these messages in parallel so as to improve overall performance.
pseudo code:
// AzureQueue class encapsulates calls to Azure Storage Queue.
// Assume nothing fancy inside: plain calls to the queue for pull/push/delete.
var batchMessages = AzureQueue.Pull(32);
Parallel.ForEach(batchMessages, bMessage =>
{
    try
    {
        // DoSomething does some background processing.
        DoSomething(bMessage);
    }
    catch (Exception ex)
    {
        // Log the exception.
    }
    AzureQueue.Delete(bMessage);
});
With this change, profiling results show that up to 90% of the time is taken by the Azure message delete calls alone. As it is good to delete a message as soon as processing is done, I remove it just after "DoSomething" finishes.
What I need now are suggestions on how to further improve the performance of this function when 90% of the time is being eaten up by the Azure Queue delete call itself. Is there a better, faster way to perform delete/bulk delete, etc.?
With the implementation mentioned here, I get speeds close to 25 messages/sec. Right now the Azure queue delete calls are choking application performance, so is there any hope of pushing it further?
Does it also make a difference to performance which queue delete overload I am calling? As of now the queue has overloaded methods for deleting a message: one that accepts a message object and another that accepts a message identifier and pop receipt. I am using the latter
here, with message identifier and pop receipt, to delete the message from the queue.
Let me know if you need any additional information or any clarification in question.
Inputs/suggestions are welcome.
Many thanks.
The first thing that came to mind was to issue a parallel delete at the same time you run the work in DoSomething. If DoSomething fails, add the message back into the queue. This won't work for every application, and work that was
near the head of the queue could be pushed back to the tail, so you'd have to think about how that may affect your workload.
Or, queue the delete on a thread pool after the work has succeeded: fire and forget. However, if you're processing at 25/sec and 90% of the time sits in the delete, you'd quickly accumulate delete calls in the thread pool and
never catch up. At a 70-80% duty cycle this may work, but the closer you get to always being busy, the more dangerous this becomes.
I wonder if calling the delete REST API yourself may offer any improvements. If you find the delete sets up a TCP connection each time, this may be all you need. Try to keep the connection open, or see if the REST API can delete more at a time
than the SDK API can.
Or, if you have the funds, just have more VM instances doing the work in parallel, so the first machine handles 25/sec, the second at 25/sec also - and you just live with the slow delete. If that's still not good enough, add more instances.
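The thread-pool idea above can be sketched as follows (Python for illustration; delete_fn is a hypothetical stand-in for the real queue client's delete call):

```python
# Sketch of the "fire-and-forget delete" idea: hand completed messages to a
# small pool of deleter threads so the (slow) delete no longer blocks the
# processing loop. The queue client itself is a hypothetical stand-in.
import queue
import threading

def start_deleters(delete_fn, workers=4):
    """Return a queue of pending deletes drained by background threads."""
    pending = queue.Queue()

    def deleter():
        while True:
            msg = pending.get()
            if msg is None:          # shutdown sentinel
                break
            delete_fn(msg)           # e.g. AzureQueue.Delete(msg)
            pending.task_done()

    threads = [threading.Thread(target=deleter, daemon=True)
               for _ in range(workers)]
    for t in threads:
        t.start()
    return pending, threads
```

In the pseudo code above, the processing loop would call pending.put(bMessage) instead of deleting inline. Keep in mind that each message must still be deleted before its visibility timeout expires, or it will reappear on the queue.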
Darin R.