Encode shared key signature in Azure Storage Queue
I'd like to use the Azure Storage REST API for queues, and in order to add the authentication information to the HTTP header, I have to compute the shared key signature.
From http://msdn.microsoft.com/en-us/library/azure/dd179428.aspx, I saw the string to sign described like this:
I found the format confusing and would prefer it as C# code; in some places the "\n" separators seem to be missing, and the "+" as well. After I put everything together, authentication still failed and I do not know why. Here is what I did:
using (var httpClient = new HttpClient())
{
    var verb = "PUT";
    var contentEncoding = "";
    var contentLanguage = "";
    var contentLength = "0";
    var contentMd5 = "";
    var contentType = "text/plain; charset=utf-8";
    var date = DateTime.UtcNow.ToString("R", CultureInfo.InvariantCulture);
    var ifModifiedSince = "";
    var ifMatch = "";
    var ifNoneMatch = "";
    var ifUnmodifiedSince = "";
    var range = "";
    var canonicalizedHeaders = "";
    var canonicalizedResource = "";
    var stringToSign = verb + "\n" + contentEncoding + "\n" + contentLanguage + "\n" + contentLength + "\n" +
                       contentMd5 + "\n" + contentType + "\n" + date + "\n" + ifModifiedSince + "\n" +
                       ifMatch + "\n" + ifNoneMatch + "\n" + ifUnmodifiedSince + "\n" + range + "\n" +
                       canonicalizedHeaders + "\n" + canonicalizedResource;
    httpClient.BaseAddress = new Uri(string.Format("https://{0}.queue.core.windows.net", accountName));
    httpClient.DefaultRequestHeaders.Clear();
    httpClient.DefaultRequestHeaders.Add("Authorization", string.Format("SharedKey {0}:{1}", accountName, Hash(accountKey, stringToSign)));
    httpClient.DefaultRequestHeaders.Add("x-ms-date", date);
    var result = httpClient.PutAsync("myqueue", new StringContent(""));
    result.Wait();
}

static string Hash(string message, string secret)
{
    var messageBytes = Encoding.UTF8.GetBytes(message);
    var secretBytes = Encoding.UTF8.GetBytes(secret);
    using (var hash = new HMACSHA256(secretBytes))
    {
        var hashMessage = hash.ComputeHash(messageBytes);
        return Convert.ToBase64String(hashMessage);
    }
}
Can anyone help me?
Hi,
Please try this code; I tested it on my side and it works fine.
String accountName = "***";
String accountKey = "***+**";
String signature = "";
String requestMethod = "POST";
String urlPath = String.Format("{0}/messages", "inputtext");
String storageServiceVersion = "2012-02-12";
String dateInRfc1123Format = DateTime.UtcNow.ToString("R", CultureInfo.InvariantCulture);
String messageText = String.Format(
    "<QueueMessage><MessageText>{0}</MessageText></QueueMessage>", "test2");
UTF8Encoding utf8Encoding = new UTF8Encoding();
Byte[] messageContent = utf8Encoding.GetBytes(messageText);
Int32 messageLength = messageContent.Length;
String canonicalizedHeaders = String.Format(
    "x-ms-date:{0}\nx-ms-version:{1}",
    dateInRfc1123Format,
    storageServiceVersion);
String canonicalizedResource = String.Format("/{0}/{1}", accountName, urlPath);
String stringToSign = String.Format(
    "{0}\n\n\n{1}\n\n\n\n\n\n\n\n\n{2}\n{3}",
    requestMethod,
    messageLength,
    canonicalizedHeaders,
    canonicalizedResource);
using (HMACSHA256 hmacSha256 = new HMACSHA256(Convert.FromBase64String(accountKey)))
{
    Byte[] dataToHmac = System.Text.Encoding.UTF8.GetBytes(stringToSign);
    signature = Convert.ToBase64String(hmacSha256.ComputeHash(dataToHmac));
}
String authorizationHeader = String.Format(
    CultureInfo.InvariantCulture,
    "{0} {1}:{2}",
    "SharedKey",
    accountName,
    signature);
Uri uri = new Uri(string.Format("https://{0}.queue.core.windows.net/", accountName) + urlPath);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = requestMethod;
request.Headers.Add("x-ms-date", dateInRfc1123Format);
request.Headers.Add("x-ms-version", storageServiceVersion);
request.Headers.Add("Authorization", authorizationHeader);
request.ContentLength = messageLength;
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(messageContent, 0, messageLength);
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    String requestId = response.Headers["x-ms-request-id"];
}
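For comparison with the original attempt, two details in this code are easy to miss. First, the account key is base64-decoded before being used as the HMAC key (Convert.FromBase64String(accountKey)), whereas the original Hash method UTF-8-encoded the key string. Second, given the signature Hash(string message, string secret), the original call Hash(accountKey, stringToSign) passes the key as the message and the string-to-sign as the key. A corrected helper, as a sketch:

```csharp
// Sketch: sign the string-to-sign with the base64-decoded account key,
// matching what the working code above does inline.
static string Sign(string stringToSign, string base64AccountKey)
{
    using (var hmac = new HMACSHA256(Convert.FromBase64String(base64AccountKey)))
    {
        var data = Encoding.UTF8.GetBytes(stringToSign);
        return Convert.ToBase64String(hmac.ComputeHash(data));
    }
}
```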
Also, you could see this page: http://convective.wordpress.com/2010/08/18/examples-of-the-windows-azure-storage-services-rest-api/
Regards,
Will
We are trying to better understand customer views on the social support experience, so your participation in this interview project would be greatly appreciated if you have time. Thanks for helping make community forums a great place. Click HERE to participate in the survey.
Similar Messages
-
How do I improve performance while doing pull, push and delete from Azure Storage Queue
Hi,
I am working on a distributed application that uses Azure Storage Queue for message queuing. The queue will be used by multiple clients around the clock, so it is expected to be heavily loaded most of the time. The business case is typical: a module pulls a message from the queue, processes it, then deletes it from the queue. The module also sends a notification back to the user indicating that processing is complete. The functions/modules work fine in that they meet the logical requirements; it is a pretty typical queue scenario.
Now, coming to the problem statement: since the queue is expected to be heavily loaded most of the time, I am pushing to speed up processing over the overall message lifetime. The faster I can clear messages, the better the overall experience for everyone, system and users.
To improve performance I ran multiple profiling cycles and then improved the identified "hot" paths/functions.
It all came down to the point where the Azure Queue pull and delete are the two most time-consuming calls. I improved the pull by batch-pulling 32 messages at a time (the maximum message count that can be pulled from an Azure queue at once, at the time of writing), which reduced processing time by a big margin. All good up to this point as well.
I am processing these messages in parallel so as to improve overall performance.
pseudo code:
//AzureQueue class encapsulates calls to Azure Storage Queue.
//Assume nothing fancy inside; vanilla calls to the queue for pull/push/delete.
var batchMessages = AzureQueue.Pull(32);
Parallel.ForEach(batchMessages, bMessage =>
{
    //DoSomething does some background processing.
    try { DoSomething(bMessage); }
    catch
    {
        //Log exception
    }
    AzureQueue.Delete(bMessage);
});
With this change, profiling now shows that up to 90% of the time is taken by the Azure message delete calls alone. Since it is good practice to delete a message as soon as processing is done, I remove it right after DoSomething finishes.
What I need now are suggestions on how to further improve the performance of this function when 90% of the time is eaten up by the Azure Queue delete call itself. Is there a better, faster way to perform deletes, bulk deletes, etc.?
With the implementation described here, I get close to 25 messages/sec. Right now the Azure queue delete calls are choking application performance, so is there any hope of pushing it further?
Does it also make a difference to performance which queue delete overload I call? The queue has an overloaded method for deleting a message: one accepts a message object and another accepts a message identifier and pop receipt. I am using the latter, with message identifier and pop receipt, to delete the message from the queue.
Let me know if you need any additional information or clarification.
Inputs/suggestions are welcome.
Many thanks.
The first thing that came to mind was to issue the delete in parallel with the work in DoSomething. If DoSomething fails, add the message back into the queue. This won't work for every application, and work that was near the head of the queue could be pushed back to the tail, so you'd have to think about how that may affect your workload.
Or, queue the delete on the thread pool after the work succeeds: fire and forget. However, if you're processing at 25/sec and 90% of the time sits in the delete, you'd quickly accumulate delete calls for the thread pool and never catch up. At a 70-80% duty cycle this may work, but the closer you get to always being busy, the more dangerous this becomes.
I wonder whether calling the delete REST API yourself may offer any improvement. If you find that the delete sets up a TCP connection each time, that may be all you need: try to keep the connection open, or see whether the REST API can delete more at a time than the SDK API can.
Or, if you have the funds, just run more VM instances doing the work in parallel, so the first machine handles 25/sec, the second another 25/sec, and you live with the slow delete. If that's still not good enough, add more instances.
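The fire-and-forget idea can be sketched with a bound so that queued deletes cannot pile up without limit. Note the AzureQueue.Pull/Delete calls and DoSomething are the poster's own wrappers and are assumed here; this is a sketch, not a tested implementation.

```csharp
// A SemaphoreSlim caps the number of in-flight deletes; if the gate is
// saturated, Wait() blocks, which naturally throttles processing to the
// rate the delete path can sustain.
private static readonly SemaphoreSlim DeleteGate = new SemaphoreSlim(16);

static void ProcessBatch()
{
    var batchMessages = AzureQueue.Pull(32);
    Parallel.ForEach(batchMessages, bMessage =>
    {
        try
        {
            DoSomething(bMessage);

            // Queue the delete instead of waiting for it inline.
            DeleteGate.Wait();
            Task.Run(() =>
            {
                try { AzureQueue.Delete(bMessage); }
                finally { DeleteGate.Release(); }
            });
        }
        catch
        {
            // Log and skip the delete; the message becomes visible again
            // after its visibility timeout and can be retried.
        }
    });
}
```

The gate size (16 here) is an assumption to tune against your observed delete latency.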
Darin R. -
Is there any way to configure storage queue message encoding for Web Job's QueueTrigger?
We have a web job that listens to an Azure storage queue via QueueTrigger. The queue messages are not encoded when they are added to the queue:
CloudStorageAccount account = CloudStorageAccount.Parse("...");
CloudQueueClient client = account.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("test1");
queue.EncodeMessage = false;
queue.AddMessage(new CloudQueueMessage("hello world"));
And in the web job, we use the below method to listen to the queue.
public static Task ProcessQueueMessageAsync([QueueTrigger("test1")]string message)
And the web job crashes when it gets a message, with the output below. Unfortunately we cannot control the encoding setting of the incoming messages. So our question is: is there any way to configure queue message encoding for the Web Job's QueueTrigger?
Thank you for any help and/or suggestion.
Found the following functions:
WebJobTest1.Functions.ProcessQueueMessageAsync
Job host started
Unhandled Exception: System.FormatException: Invalid length for a Base-64 char array or string.
at System.Convert.FromBase64_Decode(Char* startInputPtr, Int32 inputLength, Byte* startDestPtr, Int32 destLength)
at System.Convert.FromBase64CharPtr(Char* inputPtr, Int32 inputLength)
at System.Convert.FromBase64String(String s)
at Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage.get_AsString()
at Microsoft.Azure.WebJobs.Host.Storage.Queue.StorageQueueMessage.get_AsString()
at Microsoft.Azure.WebJobs.Host.Queues.StorageQueueMessageExtensions.TryGetAsString(IStorageQueueMessage message)
at Microsoft.Azure.WebJobs.Host.Queues.QueueCausalityManager.GetOwner(IStorageQueueMessage msg)
at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueTriggerExecutor.<ExecuteAsync>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener.<ProcessMessageAsync>d__11.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at Microsoft.Azure.WebJobs.Host.Timers.BackgroundExceptionDispatcher.<>c__DisplayClass1.<Throw>b__0()
at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.ThreadHelper.ThreadStart()
Press any key to continue . . .
Hi,
Thanks for posting here.
I am not sure whether the storage queue messages can be encoded before the WebJob listens to them.
I am currently researching to gather more information regarding your request.
I will get back to you with an update as soon as possible.
I sincerely appreciate your patience.
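Meanwhile, one detail worth noting from the stack trace above: the failure is in Convert.FromBase64String inside CloudQueueMessage.get_AsString, i.e. the QueueTrigger assumes base64-encoded message bodies. If you do control the producer, the simplest fix is to leave EncodeMessage at its default of true rather than setting it to false. This is a sketch of the producer side, not a QueueTrigger configuration option:

```csharp
CloudStorageAccount account = CloudStorageAccount.Parse("...");
CloudQueueClient client = account.CreateCloudQueueClient();
CloudQueue queue = client.GetQueueReference("test1");

// EncodeMessage defaults to true; base64-encoding the payload is what
// the WebJobs QueueTrigger expects when it calls FromBase64String.
queue.EncodeMessage = true;
queue.AddMessage(new CloudQueueMessage("hello world"));
```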
Regards,
Shirisha Paderu -
Sharing a container in windows azure storage
Hi,
How can I share a container with specific users only? What I want is to have a container shared with, for instance, 2 specific users, where each one can upload data to that container and each one can see the other's uploads. How can I do that? Is it possible?
Thanks in advance,
Regards,
Tiago Oliveira
Hi,
You can use shared access signatures for your blobs and provide them to the 2 specific users.
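A minimal sketch of generating a container-level shared access signature with the .NET storage client, assuming the classic WindowsAzure.Storage SDK; the container name, permissions and lifetime are illustrative:

```csharp
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("shared");

// Grant read, write and list access for 24 hours.
var policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read
                | SharedAccessBlobPermissions.Write
                | SharedAccessBlobPermissions.List,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
};

// Append the SAS token to the container URI and hand that URL to each user;
// anyone with the URL has the granted access until the token expires.
string sasToken = container.GetSharedAccessSignature(policy);
string sharedUrl = container.Uri + sasToken;
```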
Hope this helps.
Edward -
How to set same domain name for Azure Storage and Hosted Service
I have a web application running on Azure and using Azure Blob storage. My application allows editing HTML files that are in Azure storage, saving them under another name, and publishing them later. I am using JavaScript to edit the HTML content that I display in an iframe, but since my web application and the HTML I am editing are not on the same domain, I get this error: "Unsafe JavaScript attempt to access frame with URL "URL1" from frame with URL "URL2". Domains, protocols and ports must match".
I have been doing some research about it and the only way to make it work is to have the web application and the html that I want to access using javascript under the same domain.
So my question is: is it possible to have the same domain name in azure for the hosted service and the storage.
Like this:
hosted service: example.com
storage: example.com
By the way, I have already customized the domain names so they look like this:
hosted service <mydomainname>.com
storage <blob.mydomainname>.com
Actually, I have my application running on another host and have no problem there, since I have a folder where I store the files that I am editing, so they are in the same domain as the application. I have been thinking of doing the same with Azure, at least having a folder where I can store the HTML file while I am editing it, but I am not sure how much space I have in the hosted service for temporary files.
Let me know if you have a good tip or idea about how to solve this issue.
Hi Rodrigo,
Though both Azure Blob storage and Azure applications support custom domains, one domain can have only one DNS record (in this case a CNAME record) at a time. In Steve's case, he has 3 domains: blog.smarx.com, files.blog.smarx.com and cdn.blog.smarx.com.
> I would like to find a way to store my html page that I need to edit under the same domain.
For this case, a workaround is to add an HTTP handler in your Azure application to serve file requests. That is, instead of using the actual blob URL to access blob content, you send the request to an HTTP handler; the handler gets the content from blob storage and returns it.
Please check
Accessing blobs in private container without Shared Access Secret key for a code sample.
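A rough sketch of such a pass-through handler is below. The container name, query-string routing and app setting key are assumptions for illustration; the real sample is in the linked article.

```csharp
public class BlobProxyHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // e.g. /blobproxy.ashx?name=page1.html -> blob "page1.html"
        string blobName = context.Request.QueryString["name"];

        CloudStorageAccount account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnection"]);
        CloudBlobContainer container =
            account.CreateCloudBlobClient().GetContainerReference("pages");
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

        // Stream the private blob back under the application's own domain,
        // so the iframe content is same-origin with the editor page.
        context.Response.ContentType = "text/html";
        blob.DownloadToStream(context.Response.OutputStream);
    }
}
```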
Thanks.
Wengchao Zeng
Please mark the replies as answers if they help or unmark if not.
If you have any feedback about my replies, please contact
[email protected]
Microsoft One Code Framework -
URGENT: Azure Storage Table Outage
UPDATE: The problem appears to have been fixed. Are affected accounts eligible for a refund due to the downtime?
Hi. I'm having trouble querying an Azure Storage table that is essential to my business operations. The queries seem to simply not be going through, and when using Azure Storage Explorer (a 3rd-party program) I encounter the same issue. For some reason Azure does not seem to be responding to my storage requests. I also cannot open a support ticket with Microsoft, as our small business does not have a support agreement. The storage account name is chapteradviser, and it is not even possible to query the table service for a list of tables (it keeps timing out). This seems to be a problem at the Azure datacenter, not on my end. I am also an IT consultant for the company and do not have the authority to enter into a support agreement with Azure.
Thanks for any assistance,
- Brian Bosak
- Consulting with ChapterAdviser Llc.
Yup, I see it too... it looks like tables is really slow :(
Adding curl tracing in case someone from MS is looking (I redacted the account name):
$ curl -v --trace-time -X GET https://xxxxxxxxxx.table.core.windows.net
17:14:22.296000 * Adding handle: conn: 0x1e67e80
17:14:22.296000 * Adding handle: send: 0
17:14:22.296000 * Adding handle: recv: 0
17:14:22.296000 * Curl_addHandleToPipeline: length: 1
17:14:22.312000 * - Conn 0 (0x1e67e80) send_pipe: 1, recv_pipe: 0
17:14:22.312000 * About to connect() to xxxxxxxxxx.table.core.windows.net port 443 (#0)
17:14:22.312000 * Trying 23.99.32.80...
17:14:25.375000 * Connected to xxxxxxxxxx.table.core.windows.net (23.99.32.80) port 443 (#0)
17:14:25.640000 * successfully set certificate verify locations:
17:14:25.656000 * CAfile: C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt
CApath: none
17:14:25.656000 * SSLv3, TLS handshake, Client hello (1):
17:14:30.859000 * SSLv3, TLS handshake, Server hello (2):
17:14:30.875000 * SSLv3, TLS handshake, CERT (11):
17:14:30.890000 * SSLv3, TLS handshake, Server finished (14):
17:14:30.921000 * SSLv3, TLS handshake, Client key exchange (16):
17:14:30.921000 * SSLv3, TLS change cipher, Client hello (1):
17:14:30.937000 * SSLv3, TLS handshake, Finished (20):
17:14:41.937000 * SSLv3, TLS change cipher, Client hello (1):
17:14:41.953000 * SSLv3, TLS handshake, Finished (20):
17:14:41.953000 * SSL connection using AES128-SHA
17:14:41.968000 * Server certificate:
17:14:41.984000 * subject: CN=*.table.core.windows.net
17:14:42.000000 * start date: 2014-02-20 12:59:18 GMT
17:14:42.000000 * expire date: 2016-02-20 12:59:18 GMT
17:14:42.031000 * subjectAltName: xxxxxxxxxx.table.core.windows.net matched
17:14:42.046000 * issuer: DC=com; DC=microsoft; DC=corp; DC=redmond; CN=MSIT Machine Auth CA 2
17:14:42.062000 * SSL certificate verify ok.
17:14:42.078000 > GET / HTTP/1.1
17:14:42.078000 > User-Agent: curl/7.30.0
17:14:42.078000 > Host: xxxxxxxxxx.table.core.windows.net
17:14:42.078000 > Accept: */*
17:14:42.078000 >
17:15:35.078000 < HTTP/1.1 400 The requested URI does not represent any resource on the server.
17:15:35.093000 < Content-Length: 360
17:15:35.093000 < Content-Type: application/xml
17:15:35.093000 * Server Microsoft-HTTPAPI/2.0 is not blacklisted
17:15:35.109000 < Server: Microsoft-HTTPAPI/2.0
17:15:35.109000 < x-ms-request-id: f2e0b20e-5888-43ce-bbf0-68589e7ad972
17:15:35.109000 < Date: Sat, 09 Aug 2014 00:15:30 GMT
17:15:35.125000 <
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>InvalidUri</code>
<message xml:lang="en-US">The requested URI does not represent any resource on the server.
RequestId:f2e0b20e-5888-43ce-bbf0-68589e7ad972
Time:2014-08-09T00:15:31.4896331Z</message>
</error>
17:15:35.125000 * Connection #0 to host xxxxxxxxxx.table.core.windows.net left intact -
I am using the Microsoft Azure Java SDK for uploading page blobs (size: 20G/40G). In the middle of uploading, the SDK throws a StorageException:
java.io.IOException: null
at com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:584) ~[classes/:na]
at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:414) ~[classes/:na]
at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:398) ~[classes/:na]
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) ~[na:1.7.0_25]
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) ~[na:1.7.0_25]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.7.0_25]
at java.lang.Thread.run(Thread.java:724) ~[na:1.7.0_25]
Caused by: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:162) ~[azure-storage-1.2.0.jar:na]
at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:306) ~[azure-storage-1.2.0.jar:na]
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:178) ~[classes/:na]
at com.microsoft.azure.storage.blob.CloudPageBlob.putPagesInternal(CloudPageBlob.java:642) ~[classes/:na]
at com.microsoft.azure.storage.blob.CloudPageBlob.uploadPages(CloudPageBlob.java:971) ~[classes/:na]
at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:402) ~[classes/:na]
... 9 common frames omitted
SDK Version:
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-storage</artifactId>
<version>1.2.0</version>
Please note this error shows up in the middle of the upload, at random. For example: 27325890560 out of 42949673472 bytes transferred successfully, and then it fails with the exception pasted above.
Is there a chance that time drift on a Linux box is causing this issue? Any other pointers would help.
Thanks.
Hi Kriti,
We believe it's a timing issue. Could you refer to the following link and check whether it helps?
http://blogs.msdn.com/b/kwill/archive/2013/08/28/http-403-server-failed-to-authenticate-the-request-when-using-shared-access-signatures.aspx
If not, could you share a Fiddler trace for this?
Regards,
Malar. -
Is "sp_purge_data" available only to data warehouses, or can it be used with both a normal database and Azure storage as well?
Thank you for the reply, Qiuyun; the article was really helpful!
I have a couple of other questions for you:
How do we execute SQL queries on Windows Azure tables and create horizontal partitions? (I know we have SQL Server Management Studio to execute normal queries on a SQL database; do we have a similar platform for Azure, or do we have to get a local copy of the database, execute our queries, and then publish everything back to Azure?) I am looking to partition data in one of our databases and would like to know whether it can be done directly in Azure, or whether we have to bring a local copy down, write the partition function and partition scheme (or create a partition key and a row key), do the needful, and publish it back to Azure.
Also, how do I create a partition key and row key in Windows Azure?
I am in the process of designing data archiving strategy for my team and would like to know more about the questions I just mentioned.
Hoping to hear back from you soon.
Thanks in advance for all the help!
-Lalitha. -
Nodejs query azure storage table by timestamp
I can query the table with partition key by using the query:
var query=new azure.TableQuery().where('PartitionKey ne ?','test')
But I tried to query by timestamp and failed.
var query=new azure.TableQuery().where('Timestamp ge ?',azure.TableUtilities.entityGenerator.DateTime(new Date(Date.UTC(2014,11,11,00,00,00))));
I also tried
var query=new azure.TableQuery().where('Timestamp ge datetime?','2014-11-11T00:00:00Z')
But it still didn't work
The Azure SDK documentation says that I should add the datetime keyword before the UTC time format, and in Node.js I think DateTime is the keyword for datetime. But the query is wrong. Can anyone help me with that?
Thanks.
Please see my comment on your question on Stack Overflow:
http://stackoverflow.com/questions/28198972/nodejs-query-azure-storage-table-according-to-timestamp.
Essentially, can you try the following code for the where clause:
where("Timestamp ge datetime?", '2014-11-11T00:00:00Z')
I ran this code and didn't get any error.
Hope this helps. -
Access SAS URI through tool like Azure Storage Explorer
Is there any tool available that lets us access a storage account container using a SAS URI instead of access keys? I have verified that Azure Storage Explorer and CloudBerry Explorer don't support SAS URIs. The client wants to download files using a SAS URI, and we don't want to share the storage account keys.
Hi,
As my reply in this thread:
https://social.msdn.microsoft.com/Forums/en-US/6219ae6c-bb2c-4e8f-a49c-4a590a450faa/storage-account-access-using-roles?forum=windowsazuredata, please give some feedback to the tool author or submit feedback on the Azure feedback forum.
Best Regards,
Jambor
Having issues with VS Server Explorer viewing Azure Storage
Hi,
I've been using VS Server Explorer pretty successfully for a while. Recently (I'm not sure of the exact point), the Azure storage portion started giving me trouble. It no longer auto-populates with my storage account, and I now have to input the account name and key manually each time to get my storage to show up.
Also, I can no longer access any blobs via Server Explorer; it says the key is invalid, even though I can access them by other means with the same account name and key. I have already made sure my computer's clock is correctly synced. Are there any other steps I can try to fix this issue? For now I am using a third-party client to perform blob admin operations, but it would be really convenient if I could keep doing this with VS Server Explorer.
Thanks,
Thomas
Hi Thomas,
Thanks for your post.
I'd like to suggest you try signing out and signing back in to the Azure Server Explorer:
1. Right-click the Azure node in Server Explorer.
2. Select "Connect to Microsoft Azure" and log in with your account.
3. Make sure you select the right subscription.
4. If the storage list doesn't include your account, right-click the storage node and select "Attach External Storage".
Another approach is to re-install the Azure tools.
Regards,
Will
AzCopy error "The destination azure storage location type cannot be inferred"
Hi All,
I'd like to try out Microsoft Azure File Service.
I created a storage account with File Service support and created a file share. I was able to mount this file share using "net use" from my Azure VM, and I can manage files from this VM.
Now I'd like to copy files from my home computer.
C:\Program Files (x86)\Microsoft SDKs\Azure>AzCopy /Source:c:\mydir /Dest:https://myendpoint.file.core.windows.net/myshare1/ /DestKey:%key% /S
I get an error:
[2015.02.21 11:08:21][ERROR] The syntax of the command is incorrect. The destination azure storage location type can not be inferred. Use /destType:blob to specify the location type explicitly.
Hi,
AzCopy version 2.4 and later supports moving your local files to Azure Files and back. Please check that your AzCopy version is 2.4 or higher.
In addition, empty folders are not uploaded. Please make sure the source you defined is not an empty folder.
Best regards,
Susie
Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected] -
How to set up a web job, like SSIS, which can take an XML file from Azure storage to update Azure SQL
Hi,
I have XML files in Azure storage, and I would like to set up a web job, on a schedule, that loads the XML to update/insert/delete data in an Azure database.
I have done this with SSIS locally, but can't use it in Azure.
According to several pieces of advice and forum pages, I need to create an Azure web job.
Please advise a link or an approach.
Thanks.
Superman
Hi,
To get started with Azure webjobs you could refer the following link:
http://azure.microsoft.com/en-in/documentation/articles/websites-dotnet-webjobs-sdk-get-started/
The article shows how to create a multi-tier ASP.NET MVC application that uses the WebJobs SDK to work with Azure blobs in an Azure Website.
Meanwhile, we will research regarding your specific requirement and revert with our finds.
Regards,
Malar -
Can't install windows azure storage sdk on windows phone 8
From Visual Studio 2013 Ultimate, I opened the NuGet Package Manager Console and typed the following command for my Windows Phone 8 solution:
install-package windowsazure.storage
And I got the following error:
install-package : Could not install package 'Microsoft.WindowsAzure.ConfigurationManager 1.8.0.0'. You are trying to install this package into a project that targets 'WindowsPhone,Version=v8.0', but the package does not contain any assembly references or content files that are compatible with that framework. For more information, contact the package author.
At line:1 char:1
I get the same problem using the UI version of the package manager as well.
Did something change with the newer version of Windows Azure storage?
The package you are trying to install does not include the Windows Phone library. Please use the following command instead:
Install-Package WindowsAzure.Storage-Preview -Pre
Thanks for this, it worked for me. Can I just add, for those that don't know how to run this command: go to Tools > Library Package Manager > Package Manager Console, then paste in the command and hit Enter.
Thanks again! -
When trying to install Windows Azure SDK for .NET (VS 2013) - 2.3 from Web Platform Installer 4.6, the install fails because Windows Azure Storage Emulator - 3.0 (Dependency) does not install successfully.
Possibly relevant lines from the install log are:
CAQuietExec: Entering CAQuietExec in C:\WINDOWS\Installer\MSI1223.tmp, version 3.6.3303.0
CAQuietExec: "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator\WAStorageEmulator.exe" init -forcecreate -autodetect
CAQuietExec: Windows Azure Storage Emulator 3.0.0.0 command line tool
CAQuietExec: Error: No available SQL Instance was found.
CAQuietExec: Error 0xfffffff6: Command line returned an error.
CAQuietExec: Error 0xfffffff6: CAQuietExec Failed
CustomAction RunInitialize returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)
Action ended 11:50:13: InstallFinalize. Return value 3.
Action ended 11:50:13: INSTALL. Return value 3.
In terms of SQL instances, SQL LocalDB is installed and working properly. SQL Server 2012 is also installed and working properly.
Hi,
It is an SDK version issue. I suggest you remove all Azure SDKs from your PC and use WPI to install the latest version again.
If you have any questions, please let me know.
Regards,
Will