Latest Azure Storage SDK 4.2.1 fails under the dev emulator when calling CloudTable.CreateIfNotExists()

I am using Azure SDK 2.2, Azure Storage SDK 4.2.1, and storage configuration 2.0 in my cloud solution. I am trying to initialize the storage as follows:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionKey);
// Create the table client.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
// Create the table if it doesn't exist.
table = tableClient.GetTableReference(tableName);
table.CreateIfNotExists();
table.CreateIfNotExists() throws an exception in my dev emulator. I have been seeing this issue for quite some time, and it looks like Microsoft has not fixed it. The same code works fine when I connect to real Azure storage in the cloud; it just does not work under the emulator. How can I debug this? If I use Storage SDK 2.1, it works fine in the dev emulator; after 3.0, nothing seems to work. Am I the only one, or does anybody else have the same issue? Is there a workaround for this version, or should I stick to the old one? Any help is appreciated.
Here is the detailed exception:
Microsoft.WindowsAzure.Storage.StorageException: "The remote server returned an error: (400) Bad Request."
InnerException: System.Net.WebException: "The remote server returned an error: (400) Bad Request."
HResult: -2146233088
IsTransient: false
Source: "Microsoft.WindowsAzure.Storage"
StackTrace:
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   at Microsoft.WindowsAzure.Storage.Table.TableOperation.Execute(CloudTableClient client, CloudTable table, TableRequestOptions requestOptions, OperationContext operationContext)
   at Microsoft.WindowsAzure.Storage.Table.CloudTable.Exists(Boolean primaryOnly, TableRequestOptions requestOptions, OperationContext operationContext)
   at Microsoft.WindowsAzure.Storage.Table.CloudTable.CreateIfNotExists(TableRequestOptions requestOptions, OperationContext operationContext)
   at PSI.Applications.RemoteManagement.TraceLogManagement.AzureTableStore.InitializeStorageTable(String tableName, String connectionKey) in c:\AzureTraceCloudService\AzureTableStoreLib\AzureTableStore.cs:line 27
TargetSite: T ExecuteSync[T](RESTCommand`1[T], IRetryPolicy, OperationContext)
Thanks,
Ravi

Hi Ravi,
This issue may be caused by the storage client library still not being compatible with the storage emulator, which is why your code is failing. You could downgrade your storage client library to the previous version and your code should work just fine, but the best option, I think, is to upgrade the Azure SDK (and with it the emulator) to the latest version; refer to
http://azure.microsoft.com/blog/2014/08/04/announcing-release-of-visual-studio-2013-update-3-and-azure-sdk-2-4/ for more details.
Best Regards,
Jambor
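On the debugging question: one way to see exactly which request the emulator rejects is to pass an OperationContext into CreateIfNotExists and dump the raw HTTP exchange plus the server's extended error. A minimal sketch against Storage Client 4.x (the table name and the console logging are illustrative, not from the original code):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class EmulatorDebug
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudTableClient tableClient = account.CreateCloudTableClient();
        CloudTable table = tableClient.GetTableReference("TraceLog");

        // Capture every request/response pair so the failing call is visible.
        OperationContext context = new OperationContext();
        context.ResponseReceived += (sender, args) =>
            Console.WriteLine("{0} {1} -> {2} {3}",
                args.Request.Method,
                args.Request.RequestUri,
                (int)args.Response.StatusCode,
                args.Response.StatusDescription);

        try
        {
            table.CreateIfNotExists(requestOptions: null, operationContext: context);
        }
        catch (StorageException ex)
        {
            // RequestInformation carries the service's extended error details.
            Console.WriteLine(ex.RequestInformation.HttpStatusMessage);
            if (ex.RequestInformation.ExtendedErrorInformation != null)
                Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation.ErrorMessage);
        }
    }
}
```

If the 400 comes back with no extended error information, that would match the usual cause of this symptom: the emulator not understanding the newer REST version the 4.x client sends, which is why the same code works against the cloud service.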

Similar Messages

  • Export Table Data to JSON via Azure Storage SDK

    Hi,
    I'm looking to export the contents of a table in JSON format and then save that JSON to one or more files.
    I see that we can have the table return JSON using: tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.Json;
    I also see how to deserialize the JSON into an array of objects using something like this to get an array of CustomerEntity:
    IQueryable<CustomerEntity> query = from customer in table.CreateQuery<CustomerEntity>()
    where string.Compare(customer.PartitionKey, "I") >= 0 &&
    string.Compare(customer.PartitionKey, "X") <= 0 &&
    customer.Rating >= 2
    select customer;
    CustomerEntity[] customers = query.ToArray();
    But what if I don't want the results as CustomerEntity objects, I just want the raw JSON?
    The
    CloudTable.CreateQuery method requires a type that inherits from ITableEntity...
    I guess I could switch from using the Azure Storage SDK client to an HTTP client and query via OData, but I'd prefer a solution within the Azure Storage SDK...
    Thanks,
    Aron

    Thanks Will,
    Here is a more complete code snippet. As you can see, I have the payload set to JSON.
    const string customersTableName = "Customers";
    string connectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", accountName, accountKey);
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
    // Values supported can be AtomPub, Json, JsonFullMetadata or JsonNoMetadata with Json being the default value
    tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.Json;
    // Create the Customers table
    CloudTable table = tableClient.GetTableReference(customersTableName);
    table.CreateIfNotExists();
    // Insert a couple of customers into the Customers table
    foreach (CustomerEntity customer in CustomerEntity.GetCustomersToInsert())
    table.Execute(TableOperation.Insert(customer, echoContent: false));
    // The response has a payload format of JSON no metadata, and the
    // client library will map the properties returned back to the CustomerEntity object
    IQueryable<CustomerEntity> query = from customer in table.CreateQuery<CustomerEntity>()
    where string.Compare(customer.PartitionKey, "I") >= 0 &&
    string.Compare(customer.PartitionKey, "X") <= 0 &&
    customer.Rating >= 2
    select customer;
    CustomerEntity[] customers = query.ToArray();
    However, the query is set up so that it automatically materializes the results as CustomerEntity. That's the challenge: how to get the JSON payload before it gets cast to CustomerEntity...
    Thanks,
    Aron
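    One way to stay inside the storage SDK and avoid the CustomerEntity cast is to run an untyped TableQuery with an EntityResolver and rebuild the JSON yourself from each entity's raw property dictionary. A sketch (the method name is illustrative, and Json.NET's JsonConvert is assumed to be available for the serialization):

```csharp
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json;

static class RawJsonExport
{
    public static IEnumerable<string> ExportAsJson(CloudTable table)
    {
        TableQuery query = new TableQuery(); // untyped query: no ITableEntity needed

        // The resolver receives each entity's raw properties and can return
        // any type; here each entity becomes one JSON string.
        EntityResolver<string> resolver = (partitionKey, rowKey, timestamp, properties, etag) =>
        {
            var map = new Dictionary<string, object>
            {
                ["PartitionKey"] = partitionKey,
                ["RowKey"] = rowKey,
                ["Timestamp"] = timestamp
            };
            foreach (KeyValuePair<string, EntityProperty> property in properties)
                map[property.Key] = property.Value.PropertyAsObject;
            return JsonConvert.SerializeObject(map);
        };

        return table.ExecuteQuery(query, resolver);
    }
}
```

    Note that this rebuilds JSON client-side rather than handing back the exact bytes the service returned; if you need the literal wire payload, the OData/HTTP route Aron mentions remains the alternative.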

  • Azure PHP SDK createBlobService() not working - fails at $httpClient = $this->httpClient() in ServicesBuilder.php

    We are writing a PHP application to run in Azure but the Azure SDK does not seem to work when trying to use Azure Storage.  Here is the relevant code snippet:
    $connectionString = "DefaultEndpointsProtocol=http;AccountName=<accountname>;AccountKey=<accountkey>";
    echo $connectionString;
    $blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);
    We are of course doing this at the top of the php file that has the above snippet in it:
    require_once 'WindowsAzure\WindowsAzure.php';
    use WindowsAzure\Common\ServicesBuilder;
    Everything after calling 'createBlobService' fails. We dug into it a bit more and found that the $httpClient = $this->httpClient(); call in the snippet below just kills everything (the call is in WindowsAzure\Common\ServicesBuilder.php). If we comment out the $httpClient line and everything below it in createQueueService(), the rest of the code on the page runs (we have some echos and such, so we know it ran the rest of the page):
    public function createQueueService($connectionString)
    {
        $settings = StorageServiceSettings::createFromConnectionString(
            $connectionString
        );
        $httpClient = $this->httpClient();
        $serializer = $this->serializer();
        $uri = Utilities::tryAddUrlScheme(
            $settings->getQueueEndpointUri()
        );
        // ...
    }
    PS.  We also found a StackOverflow question asking this exact thing but there's no answer to it:
    http://stackoverflow.com/questions/13930880/connect-to-azure-blob-through-azure-website-with-php
    We are at a loss and could really use a bit of help here.
    Thanks
    ArcDatum

    Hi ArcDatum,
    I understand that you are writing a PHP application and that createBlobService() doesn't work.
    Firstly, do you receive any error messages?
    If you are receiving error codes, please check them against the
    Blob Service Error Codes.
    I recommend you follow the steps from this
    document,
    and also check the 3rd party doc to see if it is of any help.
    Please feel free to write back to us in case of any queries.
    Regards,
    Shirisha Paderu.

  • Can't install windows azure storage sdk on windows phone 8

    From Visual studio 2013 ultimate, I opened Nuget package manager console and typed following command for my windows phone 8 solution:
    install-package windowsazure.storage
    And I got the following error:
    install-package : Could not install package 'Microsoft.WindowsAzure.ConfigurationManager 1.8.0.0'. You are trying to install this package into a project that targets 'WindowsPhone,Version=v8.0', but the package does not contain any assembly references or
    content files that
    are compatible with that framework. For more information, contact the package author.
    At line:1 char:1
    I get the same problem using UI version of package manager as well.
    Did something change with the newer version of windows azure storage?

    The package you are trying to install does not include the Windows Phone library. Please use the following command instead:
    Install-Package WindowsAzure.Storage-Preview -Pre
    Thanks for this, it worked for me. Can I just add, for those who don't know how to run this command: go to Tools > Library Package Manager > Package Manager Console, then paste in the command and hit Enter.
    Thanks again!

  • Azure Dedicated Role Cache does not fire notification to client when call DataCache.Clear method

    Hi,
    I'm using Azure Dedicated Role cache and enabled the local cache and notifications.
    <localCache isEnabled="true" sync="NotificationBased" objectCount="20000" ttlValue="3600" />
    <clientNotification pollInterval="1" maxQueueLength="10000" />
    But when I call the DataCache.Clear method in one client, the notification is not sent to all clients, and the clients' locally cached content is still used.
    Is this a bug in Azure SDK 2.5?

    I assume that you have correctly configured and enabled cache notifications. Instead of using DataCache.Clear, have you tried clearing all regions in your cache? I am not sure whether DataCache.Clear triggers the cache notification or not, so your code should look something like this:
    foreach (string regionName in cache.GetSystemRegions())
    {
        cache.ClearRegion(regionName);
    }
    You can refer to the link which explains Triggering Cache Notifications: https://msdn.microsoft.com/en-us/library/ee808091(v=azure.10).aspx
    Bhushan | Blog |
    LinkedIn | Twitter

  • Failed to load TemplateStream when called from API, but OK in Workbench

    I am getting a "Failed to load TemplateStream for FormQuery" error when I invoke a simple process using the Java API.  When I invoke the same process from WorkBench it's fine.  I am using the FormsService in ES 8.2 turnkey installation.  The operation is "renderPDFForm".  Content Root URI is set to repository:// and Form to Render is set to /CANS_ASP/CANS_ASP.pdf, which is in the repository.  Why would this work when invoked by way of right clicking in WorkBench, but not work when the process is invoked by the Java API?
    Here's the whole error:
    2009-08-25 20:31:23,109 ERROR [com.adobe.workflow.AWS] An exception was thrown with name com.adobe.livecycle.formsservice.exception.RenderFormException message:Failed to load TemplateStream for FormQuery=/CANS_ASP/CANS_ASP.pdf from location URI =repository://. while invoking service FormsService and operation renderPDFForm and no fault routes were found to be configured.
    Thanks.
    Jared

    This problem is not solved, but I have a workaround.  I have gotten it working by avoiding the repository and specifying the form location from the file system.  This works:
    Content Root URI: file:///C:\\
    Form to Render: CANS_ASP.pdf
    In case it wasn't clear, I'm not using a FormServiceClient object.  I'm invoking an orchestration using a ServiceClient.  The Content Root URI and Form To Render are specified in the properties of my renderPDFForm component in Workbench.  I'm able to move ahead with development using this workaround for now, but the question remains about why repository:// does not work when my orchestration is invoked using the Java API but it works fine when tested from WorkBench.
    Jared Langdon

  • Windows Azure Tools: Failed to initialize Windows Azure storage emulator. Unable to start the storage emulator.

    Hi,
    I created a cloud service using Visual Studio 2013. When I tried to run the service, it fails with
    Windows Azure Tools: Failed to initialize Windows Azure storage emulator. Unable to start the storage emulator.
    Is there a way to debug this error? At a bare minimum, it would be great to get an error message on why it could not start.
    Thanks

    I ran Windows Azure Storage Emulator -v3.1 to launch the command line tool. I wanted to see if I could get some debug information, but that failed with the following exception.
    C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator>WAStorageEmulator.exe start
    Windows Azure Storage Emulator 3.1.0.0 command line tool
    Unhandled Exception: System.TimeoutException: Unable to open wait handle.
       at Microsoft.WindowsAzure.Storage.Emulator.Controller.EmulatorProcessController.InternalWaitForStorageEmulator(Int32 timeoutInMilliseconds)
       at Microsoft.WindowsAzure.Storage.Emulator.Controller.EmulatorProcessController.EnsureRunning(Int32 timeoutInMilliseconds)
       at Microsoft.WindowsAzure.Storage.Emulator.StartCommand.RunCommand()
       at Microsoft.WindowsAzure.Storage.Emulator.Program.Main(String[] args)

  • Cannot install Windows Azure Storage Emulator - 3.0 Error: No available SQL Instance was found

    When trying to install Windows Azure SDK for .NET (VS 2013) - 2.3 from Web Platform Installer 4.6, the install fails because  Windows Azure Storage Emulator - 3.0 (Dependency) does not install successfully.  
    Possibly relevant lines from the install log are:
    CAQuietExec:  Entering CAQuietExec in C:\WINDOWS\Installer\MSI1223.tmp, version 3.6.3303.0
    CAQuietExec:  "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\Storage Emulator\WAStorageEmulator.exe" init -forcecreate -autodetect
    CAQuietExec:  Windows Azure Storage Emulator 3.0.0.0 command line tool
    CAQuietExec:  Error: No available SQL Instance was found.
    CAQuietExec:  Error 0xfffffff6: Command line returned an error.
    CAQuietExec:  Error 0xfffffff6: CAQuietExec Failed
    CustomAction RunInitialize returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)
    Action ended 11:50:13: InstallFinalize. Return value 3.
    Action ended 11:50:13: INSTALL. Return value 3.
    In terms of SQL Instance, SQL LocalDB is installed and working properly.  SQL Server 2012 is also installed and working properly.

    Hi,
    It is an SDK version issue. I suggest you remove all Azure SDK versions from your PC and use WPI to install the latest version again.
    If you have any questions, please let me know.
    Regards,
    Will 

  • How do I improve performance while doing pull, push and delete from Azure Storage Queue

    Hi,
    I am working on a distributed application that uses an Azure Storage Queue for message queuing. The queue will be used by multiple clients around the clock, so it is expected to be heavily loaded most of the time. The business case is typical: pull a message from the queue, process the message, then delete the message from the queue. This module also sends back a notification to the user indicating that processing is complete. The functions/modules work fine in that they meet the logical requirement; it's a pretty typical queue scenario.
    Now, coming to the problem statement. Since the queue is expected to be heavily loaded most of the time, I am pushing to speed up the overall message lifetime. The faster I can clear messages, the better the overall experience for everyone, system and users.
    To improve performance, I ran multiple cycles of performance profiling and then improved the identified "HOT" paths/functions.
    It all came down to a point where the Azure queue pull and delete are the two most time-consuming calls left. I further improved the pull by batch-pulling 32 messages at a time (the maximum message count that can be pulled from an Azure queue in one call at the time of writing), which reduced processing time by a big margin. All good up to this point as well.
    I am processing these messages in parallel to improve overall performance.
    Pseudo code:
    // AzureQueue class encapsulates calls to the Azure Storage Queue.
    // Assume nothing fancy inside; vanilla calls to the queue for pull/push/delete.
    var batchMessages = AzureQueue.Pull(32);
    Parallel.ForEach(batchMessages, bMessage =>
    {
        try
        {
            DoSomething(bMessage); // does some background processing
        }
        catch
        {
            // Log exception
        }
        AzureQueue.Delete(bMessage);
    });
    With this change, profiling results show that up to 90% of the time is taken by the Azure message delete calls alone. Since it is good to delete a message as soon as processing is done, I remove it right after DoSomething finishes.
    What I need now are suggestions on how to further improve the performance of this function when 90% of the time is eaten up by the Azure queue delete call itself. Is there a better, faster way to perform delete/bulk delete, etc.?
    With the implementation described here, I get close to 25 messages/sec. Right now the Azure queue delete calls are choking application performance, so is there any hope of pushing it further?
    Does it also make a difference to performance which queue delete overload I am calling? As of now the queue has overloaded methods for deleting a message: one accepts a message object, and another accepts a message identifier and pop receipt. I am using the latter, with message identifier and pop receipt, to delete the message from the queue.
    Let me know if you need any additional information or clarification.
    Inputs/suggestions are welcome.
    Many thanks.

    The first thing that came to mind was to issue a parallel delete at the same time you run the work in DoSomething.  If DoSomething fails, add the message back into the queue.  This won't work for every application, and work that was near the head of the queue could be pushed back to the tail, so you'd have to think about how that may affect your workload.
    Or, make a threadpool queued delete after the work was successful.  Fire and forget.  However, if you're loading the processing at 25/sec, and 90% of time sits on the delete, you'd quickly accumulate delete calls for the threadpool until you'd
    never catch up.  At 70-80% duty cycle this may work, but the closer you get to always being busy could make this dangerous.
    I wonder if calling the delete REST API yourself may offer any improvements.  If you find the delete sets up a TCP connection each time, this may be all you need.  Try to keep the connection open, or see if the REST API can delete more at a time
    than the SDK API can.
    Or, if you have the funds, just have more VM instances doing the work in parallel, so the first machine handles 25/sec, the second at 25/sec also - and you just live with the slow delete.  If that's still not good enough, add more instances.
    Darin R.
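    Darin's first two suggestions can be combined by overlapping the deletes with the processing rather than blocking on each one: keep the per-message delete, but issue it asynchronously and await the whole batch once, so the delete latency is paid in parallel instead of serially. A sketch against the classic SDK's Task-based queue methods (ProcessBatchAsync and the doSomething delegate stand in for the poster's own wrappers):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Queue;

static class BatchProcessor
{
    public static async Task ProcessBatchAsync(CloudQueue queue, Action<CloudQueueMessage> doSomething)
    {
        // Pull up to 32 messages, the per-request maximum.
        IEnumerable<CloudQueueMessage> batch = await queue.GetMessagesAsync(32);

        var deletes = new List<Task>();
        Parallel.ForEach(batch, message =>
        {
            try
            {
                doSomething(message);
                // Start the delete but don't block this worker on it.
                lock (deletes) { deletes.Add(queue.DeleteMessageAsync(message)); }
            }
            catch
            {
                // Log and skip the delete; the message reappears after its
                // visibility timeout and gets retried.
            }
        });

        // Await the delete latency once for the whole batch, not per message.
        await Task.WhenAll(deletes);
    }
}
```

    A message whose delete ultimately fails simply reappears after its visibility timeout, so this trades an occasional duplicate delivery for throughput; DoSomething therefore needs to stay idempotent.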

  • Nodejs query azure storage table by timestamp

    I can query the table with partition key by using the query:
    var query=new azure.TableQuery().where('PartitionKey ne ?','test')
    But I tried to query with the timestamp but failed.
    var query=new azure.TableQuery().where('Timestamp ge ?',azure.TableUtilities.entityGenerator.DateTime(new Date(Date.UTC(2014,11,11,00,00,00))));
    I also tried 
    var query=new azure.TableQuery().where('Timestamp ge datetime?','2014-11-11T00:00:00Z')
    But it still didn't work.
    The Azure SDK docs say that I should add the datetime keyword before the UTC time format, and in Node.js I thought DateTime was the keyword for datetime, but the query is wrong. Can anyone help me with that?
    Thanks

    Please see my comment on your question on Stack Overflow:
    http://stackoverflow.com/questions/28198972/nodejs-query-azure-storage-table-according-to-timestamp.
    Essentially, can you try the following code for specifying where clause:
    where("Timestamp ge datetime?", '2014-11-11T00:00:00Z')
    I ran this code and didn't get any error.
    Hope this helps.

  • Unable to start azure storage emulator

    Hi,
    I am using Visual Studio 2013 with Azure Tools 2.3, where I am not able to open the command prompt as an administrator, so I am trying as a non-admin. I am able to initialize the storage emulator but unable to start it; it leaves the error: unable to start storage emulator. Please could you help me out?
    C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator>WAStorageEmulator.exe init
    Windows Azure Storage Emulator 3.3.0.0 command line tool
    The storage emulator was successfully initialized and is ready to use.
    C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator>WAStorageEmulator.exe start
    Windows Azure Storage Emulator 3.3.0.0 command line tool
    Error: Unable to start the storage emulator.

    Hi,
    Firstly, I recommend you try "Run as Administrator" to open the emulator.
    Secondly, I suggest you refer to Gaurav Mantri's blog about managing the Storage Emulator in Azure SDK version 2.3:
    http://gauravmantri.com/2014/04/04/managing-storage-emulator-in-azure-sdk-version-2-3/
    Failing that, the fallback approach is to uninstall all versions of the Azure SDK from your PC and install the latest version again.
    Please try it, and if you have any questions, let me know.
    Regards,
    Will

  • Uploading Page Blob to Azure - Storage Exception: Make sure the value of Authorization header is formed correctly

    I am using the Microsoft Azure Java SDK for uploading page blobs (size: 20 GB/40 GB). In the middle of uploading, the SDK throws a StorageException:
    java.io.IOException: null
    at com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:584) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:414) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:398) ~[classes/:na]
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.7.0_25]
    at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.7.0_25]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) ~[na:1.7.0_25]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.7.0_25]
    at java.lang.Thread.run(Thread.java:724) ~[na:1.7.0_25]
    Caused by: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:162) ~[azure-storage-1.2.0.jar:na]
    at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:306) ~[azure-storage-1.2.0.jar:na]
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:178) ~[classes/:na]
    at com.microsoft.azure.storage.blob.CloudPageBlob.putPagesInternal(CloudPageBlob.java:642) ~[classes/:na]
    at com.microsoft.azure.storage.blob.CloudPageBlob.uploadPages(CloudPageBlob.java:971) ~[classes/:na]
    at com.microsoft.azure.storage.blob.BlobOutputStream$2.call(BlobOutputStream.java:402) ~[classes/:na]
    ... 9 common frames omitted
    SDK Version:
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-storage</artifactId>
    <version>1.2.0</version>
    Please note this error shows up randomly in the middle of the upload. For example: 27325890560 of 42949673472 total bytes transferred successfully, and then it fails with the exception pasted above.
    Is there a chance that time drift on a Linux box is causing this issue? Any other pointers would help.
    Thanks.

    Hi Kriti,
    We believe it's a timing issue. Would you be able to refer to the following link and check if it helps?
    http://blogs.msdn.com/b/kwill/archive/2013/08/28/http-403-server-failed-to-authenticate-the-request-when-using-shared-access-signatures.aspx
    If not, would you be able to share a Fiddler trace for this?
    Regards,
    Malar.

  • How to set up a web job, like SSIS, that can take XML files from Azure Storage to update Azure SQL

    Hi,
    I have XML files in Azure Storage, and I would like to set up a web job that runs on a schedule and loads the XML to update/insert/delete in an Azure database.
    I have done this with SSIS locally, but I can't use SSIS in Azure.
    According to several pieces of advice and forum pages, I need to create an Azure web job.
    Please advise a link or an approach.
    Thanks.
    Superman

    Hi,
    To get started with Azure WebJobs, you could refer to the following link:
    http://azure.microsoft.com/en-in/documentation/articles/websites-dotnet-webjobs-sdk-get-started/
    The article shows how to create a multi-tier ASP.NET MVC application that uses the WebJobs SDK to work with Azure blobs in an Azure Website.
    Meanwhile, we will research your specific requirement and get back to you with our findings.
    Regards,
    Malar

  • How to upload 100 Mb+ video on Azure storage using PHP

    Dear Sirs,
    I have uploaded 1.2 Mb video to Azure storage through PHP code with no problem. When I tried to upload a 90 Mb video I got the following error message:
    Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 90363852 bytes) in C:\xampp\htdocs\projetos\ajao\azure-sdk-for-php-master\WindowsAzure\Blob\BlobRestProxy.php on line 1319
    Can somebody please instruct me how to upload 100 Mb+ file size into Azure storage using PHP code?
    In advance thanks for your help.
    Best regards,
    Luiz Doi

    Hi Manu, I'm having the same problem as Luiz, getting the same type of errors. I'm not a dev, so I thought I'd ask for more details to clear things up.
    I have a Joomla site that was configured through Azure hosting, but I can't upload anything into Joomla larger than 8 MB. So basically, having Azure storage blobs and media services for Joomla is useless if I can't move the larger files into the site, or out of the site into my Azure storage/Media Services account.
    I even manually changed the upload_max_filesize and post_max_size parameters in the admin and then directly in the site code, and even added a php.ini file at the root to override everything, as I read
    here, but still nothing worked.
    Therefore the code you posted above seems to be the solution, but I just wanted to be sure it will solve my issue:
    1. I need my Azure-hosted site to allow larger files, up to 500 MB.
    2. I need the files people upload on my site's front end through a form (see
    here) to be sent to my Azure blob. What code do I need to paste into Joomla's forms admin? So basically, files first go into the site and then to storage.
    3. After I upload the PHP SDK and the HTTP_Request2 PEAR package
    and paste your code into the SDK, how does the code recognize my Azure site and my Azure blob/containers? Where in the code do I plug in my info?
    Kindly advise.
    Karin
    kd9000

  • Azure Storage Blob error - AnonymousClientOtherError and AnonymousNetworkError (why cannot see images)

    I have a mobile app, and I put its images in Azure Storage blobs. When it was tested by several of our own people (on test and beta), all was good.
    But when we released it to beta and had hundreds (maybe over a thousand) of users, many of them reported that they cannot see images. It happens on iPhones and also on many different brands of Android phones. Sometimes the same image is fine on one phone but doesn't show on another.
    When I check blob log, I saw a lot of errors, mainly these two:
    AnonymousClientOtherError; 304
    "Anonymous request that failed as expected, most commonly a request that failed a specified precondition. The total number of anonymous requests that failed precondition checks (like If- Modified etc.) for GET requests. Examples: Conditional GET requests
    that fail the check."  (From Microsoft)
    AnonymousNetworkError; 200
    "The most common cause of this error is a client disconnecting before a timeout expires in the storage service. You should investigate the code in your client to understand why and when the client disconnects from the storage service. You can also use Wireshark, Microsoft Message Analyzer, or Tcping to investigate network connectivity issues from the client." (From Microsoft) - a question here: this is an error, but why is it 200?
    I wonder if these are the reasons that caused my problem?
    For the first one, from my understanding, it is not actually an error; it just says that the version the client cached is the same as the server version. But when my client side sees this response, does it think it is an error and throw an exception, and thus no image is shown? (I actually outsourced my client side, so I can only guess.) I later tested accessing these images from my browser and found that if I refresh the browser with the same image URL, I see error 304 in the log but I still see the image. So I guess this is not actually the reason for my problem.
    For the second one, is it because my client side's timeout is shorter than the server side's timeout? But is that a connection timeout or a socket timeout, and what are the default values on the client side and on Azure Blob storage? Or is it because the network is not good? My Azure server is located in East Asia (Hong Kong), but my users are in mainland China; I wonder if that could cause a problem? But when a few users tested in China, all was good.
    Many of the images are actually very small, just one to two hundred KB. Some are just 11 KB.
    I cannot figure out what is the reason...

    Hi,
    Do any people encounter this error when they access a small picture? If this issue occurs only with large pictures, please try increasing the timeout. You can also monitor your storage account from the Azure Management Portal; this may help us find the detailed error information. See more at:
    http://azure.microsoft.com/en-gb/documentation/articles/storage-monitor-storage-account/. Hope it helps.
    Regards
