Node.js: query Azure Storage table by Timestamp
I can query the table by PartitionKey using:
var query = new azure.TableQuery().where('PartitionKey ne ?', 'test');
But when I try to query by Timestamp it fails:
var query = new azure.TableQuery().where('Timestamp ge ?', azure.TableUtilities.entityGenerator.DateTime(new Date(Date.UTC(2014, 11, 11, 00, 00, 00))));
I also tried
var query = new azure.TableQuery().where('Timestamp ge datetime?', '2014-11-11T00:00:00Z');
but it still didn't work.
The Azure SDK documentation says I should add a datetime keyword before the UTC time format, and in Node.js I thought DateTime was the keyword for datetime, but the query is still wrong. Can anyone help me with that?
Thanks
Please see my comment on your question on Stack Overflow:
http://stackoverflow.com/questions/28198972/nodejs-query-azure-storage-table-according-to-timestamp.
Essentially, can you try the following code for specifying the where clause:
where('Timestamp ge datetime?', '2014-11-11T00:00:00Z')
I ran this code and didn't get any error.
Hope this helps.
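For reference, the datetime? placeholder tells the SDK to serialize the value as an OData datetime'...' literal inside the final $filter string. Also note that Date.UTC(2014, 11, 11) in the original question is actually December 11, because JavaScript months are 0-based. A minimal sketch of the expansion (the helper name below is hypothetical, not an SDK function):

```javascript
// Sketch: build the OData $filter string that TableQuery produces for
// where('Timestamp ge datetime?', value). `buildDateTimeFilter` is a
// hypothetical helper for illustration, not part of the azure-storage SDK.
function buildDateTimeFilter(property, operator, date) {
  // OData datetime literals wrap an ISO 8601 UTC timestamp
  return property + ' ' + operator + " datetime'" + date.toISOString() + "'";
}

// JavaScript Date.UTC months are 0-based, so November is month 10.
var filter = buildDateTimeFilter('Timestamp', 'ge', new Date(Date.UTC(2014, 10, 11)));
console.log(filter);
// Timestamp ge datetime'2014-11-11T00:00:00.000Z'
```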
Similar Messages
-
Azure Storage Table Skip Query?
Currently the Azure Storage Table API supports only limited query features; I was wondering what other queries are planned for the future.
I am mostly interested in Skip, as I am trying to show an audit log of my application through a table with paging, sorting, and filtering capability. For jumping between pages I need Skip support; otherwise I need some workaround.
If the Azure team is planning to provide this, it will be very helpful.
Thanks.
Hi,
Skip is not supported currently; I would suggest you vote for the feedback items below:
http://feedback.azure.com/forums/217298-storage/suggestions/1097257-support-skip-in-azure-table
http://feedback.azure.com/forums/217298-storage/suggestions/2914587-please-give-us-count-takelast-int-i-skip-int
All of the feedback in feedback forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
Best Regards,
Jambor
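Until Skip is supported, the usual workaround is takeCount plus the continuation token each query returns: jumping to page N means walking the tokens page by page. A rough sketch of the pattern, simulated against an in-memory array so it runs standalone (with the real SDK, the token comes back on the query result):

```javascript
// Simulate Table-storage-style paging: each call returns one page plus a
// continuation token (here simply the next start index) for the next call.
function queryPage(entities, takeCount, continuationToken) {
  const start = continuationToken || 0;
  const page = entities.slice(start, start + takeCount);
  const next = start + takeCount < entities.length ? start + takeCount : null;
  return { entries: page, continuationToken: next };
}

// "Skip to page N" then has to walk the tokens page by page:
function fetchPageN(entities, takeCount, n) {
  let token = null;
  let result;
  for (let i = 0; i <= n; i++) {
    result = queryPage(entities, takeCount, token);
    token = result.continuationToken;
  }
  return result.entries;
}

const rows = Array.from({ length: 10 }, (_, i) => 'row' + i);
console.log(fetchPageN(rows, 3, 2));
// [ 'row6', 'row7', 'row8' ]
```

For an audit-log UI this usually means caching the continuation token of each page already visited, so "next page" is cheap even though random jumps are not.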
We are trying to better understand customer views on social support experience, so your participation in this interview project would be greatly appreciated if you have time. Thanks for helping make community forums a great place. Click HERE to participate in the survey.
-
URGENT: Azure Storage Table Outage
UPDATE: The problem appears to have been fixed. Are affected accounts eligible for a refund due to the downtime?
Hi. I'm having trouble querying an Azure Storage table that is essential to my business operations. The queries seem to simply not be going through, and when using the Azure Storage Explorer (a 3rd-party program) I encounter the same issue. For some reason, Azure does not seem to be responding to my storage requests. I also cannot open a support ticket with Microsoft, as our small business does not have a support agreement. The storage account name is chapteradviser, and it is not even possible to query the table service for a list of tables (it keeps timing out). This seems to be a problem at the Azure datacenter, not my end. I am also an IT consultant for the company and do not have the binding authority to enter into a support agreement with Azure.
Thanks for any assistance,
- Brian Bosak
- Consulting with ChapterAdviser LLC.
Yup, I see it too. It looks like Tables is really slow :(
Adding curl tracing in case someone from MS is looking (I redacted the account name):
$ curl -v --trace-time -X GET https://xxxxxxxxxx.table.core.windows.net
17:14:22.296000 * Adding handle: conn: 0x1e67e80
17:14:22.296000 * Adding handle: send: 0
17:14:22.296000 * Adding handle: recv: 0
17:14:22.296000 * Curl_addHandleToPipeline: length: 1
17:14:22.312000 * - Conn 0 (0x1e67e80) send_pipe: 1, recv_pipe: 0
17:14:22.312000 * About to connect() to xxxxxxxxxx.table.core.windows.net port 443 (#0)
17:14:22.312000 * Trying 23.99.32.80...
17:14:25.375000 * Connected to xxxxxxxxxx.table.core.windows.net (23.99.32.80) port 443 (#0)
17:14:25.640000 * successfully set certificate verify locations:
17:14:25.656000 * CAfile: C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt
CApath: none
17:14:25.656000 * SSLv3, TLS handshake, Client hello (1):
17:14:30.859000 * SSLv3, TLS handshake, Server hello (2):
17:14:30.875000 * SSLv3, TLS handshake, CERT (11):
17:14:30.890000 * SSLv3, TLS handshake, Server finished (14):
17:14:30.921000 * SSLv3, TLS handshake, Client key exchange (16):
17:14:30.921000 * SSLv3, TLS change cipher, Client hello (1):
17:14:30.937000 * SSLv3, TLS handshake, Finished (20):
17:14:41.937000 * SSLv3, TLS change cipher, Client hello (1):
17:14:41.953000 * SSLv3, TLS handshake, Finished (20):
17:14:41.953000 * SSL connection using AES128-SHA
17:14:41.968000 * Server certificate:
17:14:41.984000 * subject: CN=*.table.core.windows.net
17:14:42.000000 * start date: 2014-02-20 12:59:18 GMT
17:14:42.000000 * expire date: 2016-02-20 12:59:18 GMT
17:14:42.031000 * subjectAltName: xxxxxxxxxx.table.core.windows.net matched
17:14:42.046000 * issuer: DC=com; DC=microsoft; DC=corp; DC=redmond; CN=MSIT Machine Auth CA 2
17:14:42.062000 * SSL certificate verify ok.
17:14:42.078000 > GET / HTTP/1.1
17:14:42.078000 > User-Agent: curl/7.30.0
17:14:42.078000 > Host: xxxxxxxxxx.table.core.windows.net
17:14:42.078000 > Accept: */*
17:14:42.078000 >
17:15:35.078000 < HTTP/1.1 400 The requested URI does not represent any resource on the server.
17:15:35.093000 < Content-Length: 360
17:15:35.093000 < Content-Type: application/xml
17:15:35.093000 * Server Microsoft-HTTPAPI/2.0 is not blacklisted
17:15:35.109000 < Server: Microsoft-HTTPAPI/2.0
17:15:35.109000 < x-ms-request-id: f2e0b20e-5888-43ce-bbf0-68589e7ad972
17:15:35.109000 < Date: Sat, 09 Aug 2014 00:15:30 GMT
17:15:35.125000 <
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>InvalidUri</code>
<message xml:lang="en-US">The requested URI does not represent any resource on the server.
RequestId:f2e0b20e-5888-43ce-bbf0-68589e7ad972
Time:2014-08-09T00:15:31.4896331Z</message>
</error>
17:15:35.125000 * Connection #0 to host xxxxxxxxxx.table.core.windows.net left intact -
Export Table Data to JSON via Azure Storage SDK
Hi,
I'm looking to export the contents of a table in JSON format and then save that JSON to one or more files.
I see that we can have the table return JSON using: tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.Json;
I also see how to deserialize the JSON into an array of objects using something like this to get an array of CustomerEntity:
IQueryable<CustomerEntity> query = from customer in table.CreateQuery<CustomerEntity>()
where string.Compare(customer.PartitionKey, "I") >= 0 &&
string.Compare(customer.PartitionKey, "X") <= 0 &&
customer.Rating >= 2
select customer;
CustomerEntity[] customers = query.ToArray();
But what if I don't want the results as CustomerEntity objects, I just want the raw JSON?
The CloudTable.CreateQuery method requires a type that inherits from ITableEntity...
I guess I could switch from using the Azure Storage SDK client to an HTTP client and query via OData, but I'd prefer a solution within the Azure Storage SDK...
Thanks,
Aron
Thanks Will,
Here is a more complete code snippet. As you can see, I have the payload set to JSON.
const string customersTableName = "Customers";
string connectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}", accountName, accountKey);
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
// Values supported can be AtomPub, Json, JsonFullMetadata or JsonNoMetadata with Json being the default value
tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.Json;
// Create the Customers table
CloudTable table = tableClient.GetTableReference(customersTableName);
table.CreateIfNotExists();
// Insert a couple of customers into the Customers table
foreach (CustomerEntity customer in CustomerEntity.GetCustomersToInsert())
table.Execute(TableOperation.Insert(customer, echoContent: false));
// The response has a payload format of JSON no metadata, and the
// client library will map the properties returned back to the CustomerEntity object
IQueryable<CustomerEntity> query = from customer in table.CreateQuery<CustomerEntity>()
where string.Compare(customer.PartitionKey, "I") >= 0 &&
string.Compare(customer.PartitionKey, "X") <= 0 &&
customer.Rating >= 2
select customer;
CustomerEntity[] customers = query.ToArray();
However, the way the query is set up, it automatically casts the results as CustomerEntity. That's the challenge: how to get the JSON payload before it is cast to CustomerEntity...
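As a sketch of the OData fallback mentioned above: a direct REST query returns the JSON body as-is, so nothing forces a cast to CustomerEntity. Shown in JavaScript for brevity (account, table, and filter values are illustrative, and the request still needs Shared Key or SAS authentication before it will run):

```javascript
// Sketch: build a Table service OData query URL whose response body is the
// raw JSON payload. Request it with header
//   Accept: application/json;odata=nometadata
// and keep the body as a string instead of deserializing it.
function buildTableQueryUrl(account, table, filter) {
  return 'https://' + account + '.table.core.windows.net/' + table +
    '()?$filter=' + encodeURIComponent(filter);
}

var url = buildTableQueryUrl('myaccount', 'Customers',
  "PartitionKey ge 'I' and PartitionKey le 'X' and Rating ge 2");
console.log(url);
```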
Thanks,
Aron -
Is there any way to automate purging Application Logging in azure storage account tables
Rohit Pasrija
Hi Rohit,
If you want to delete old data automatically, I think you need to develop this feature yourself. You could code the logic in your project and set a timer to execute methods that delete old data. Please refer to this thread (http://stackoverflow.com/questions/13602629/is-it-possible-to-acces-a-table-storage-in-azure-from-an-azure-web-site); you could operate on table storage data using the Azure SDK or the REST API.
Please try it.
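If the logs in question are the Azure Diagnostics tables (WADLogsTable and friends), their PartitionKey is commonly '0' followed by the .NET tick count of the event time, so a purge job can delete every entity whose PartitionKey falls below a cutoff key. A sketch of computing that cutoff in Node.js, assuming that tick-key scheme (verify it against your own table before relying on it):

```javascript
// WAD* diagnostics tables commonly use PartitionKey = '0' + .NET ticks
// (100 ns intervals since 0001-01-01 UTC). BigInt avoids precision loss,
// since tick counts exceed Number.MAX_SAFE_INTEGER.
const TICKS_AT_UNIX_EPOCH = 621355968000000000n;

function wadPartitionKeyForDate(date) {
  const ticks = TICKS_AT_UNIX_EPOCH + BigInt(date.getTime()) * 10000n;
  return '0' + ticks.toString();
}

// Hypothetical retention policy: a timer job would then delete entities
// matching a filter like: PartitionKey lt '<cutoffKey>'
const cutoffKey = wadPartitionKeyForDate(new Date(Date.UTC(2015, 0, 1)));
console.log(cutoffKey);
```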
Regards,
Will
-
How to read azure storage data using JAVA with REST API
Hi,
We have a cloud service in our lab and have enabled diagnostics in the cloud service, so the WADPerformanceCountersTable was created in the storage account. Now we want to read that table using Java with the REST API. Is there any way to collect these details? Please give me sample code to connect to Azure Storage using the Table service REST API.
Thanks & Regards,
Rathidevi
Hi,
Please have a look at this article:
https://convective.wordpress.com/2010/08/18/examples-of-the-windows-azure-storage-services-rest-api/; it demonstrates how to use the Table service REST API and includes sample code. Hope this helps. The MSDN documentation could also help with the coding.
Best Regards,
Jambor
-
Hi, I have a form on my Joomla site (see image below) for viewers to upload and send me their videos, images, and info.
Question:
1. Instead of receiving this form content via email, I just need HTML and/or PHP code so I can send the content viewers give me (videos of up to 500MB) directly into my existing Azure blob container.
It seems I can also direct the form to any database... They gave me this code sample, but it was for AWS:
https://www.chronoengine.com/faqs/63-cfv4/cfv4-working-with-form-data/5196-can-i-save-files-to-aws.html
Therefore they suggested I reach out to Azure devs.
I don't code, so please show me the exact HTML and/or PHP code to connect my "pre-built forms" with Azure Storage/Media Services.
kd9000
You may refer to the following links, which may help you build forms that write to Azure table/blob storage:
http://www.joshholmes.com/blog/2010/04/15/creating-a-simple-php-blog-in-azure/
http://stackoverflow.com/questions/27464839/use-form-post-php-to-upload-a-file-to-azure-storage-and-return-the-link
Regards,
Manu -
Flashback query to view table transactions
I am on an 11g DB and have a requirement to view a set of tables' transactional events that occurred every 5-10 minutes. The events for each table must be viewed in the order they occurred. From there I may need to write that information to another table. I was looking to use the flashback query feature (as I see it is enabled on my DB) to do this. What is the best way to do this with flashback query, and are there any performance considerations when querying the flashback data on a regular interval?
Thanks
Edited by: bobmagan on Feb 6, 2013 4:43 AM
Hi,
refer to
http://www.oracle-developer.net/display.php?id=320
Simply, you can use the query below to get records as of the desired timestamp from any table in the database. However, the oldest timestamp that you can query depends on your UNDO retention:
SELECT count(1) FROM <table_name> AS OF TIMESTAMP to_timestamp('13-MAY-11 19:00:00','DD-MON-YY HH24:MI:SS');
Thanks,
Ajay More
http://www.moreajays.com -
Is "sp_purge_data" available only to data warehouses, or can it be used with both a normal database and Azure storage as well?
Thank you for the reply, Qiuyun; the article was really helpful!
I do have a couple of other questions for you:
How do we execute SQL queries on Windows Azure tables and create horizontal partitions? (I know we have SQL Server Management Studio to execute normal queries on a SQL database; do we have a similar platform for Azure, or do we have to get a local copy of the database, execute our queries, and then publish everything back to Azure?) I am looking to partition data on one of our databases and would like to know if it can be done in Azure directly, or if we have to bring a local copy down, write the partition function and partition scheme (or create a partition key and row key), do the needful, and publish it back to Azure.
Also, how do I create a partition key and row key in Windows Azure?
I am in the process of designing data archiving strategy for my team and would like to know more about the questions I just mentioned.
Hoping to hear back from you soon.
Thanks in advance for all the help!
-Lalitha. -
I am using Azure SDK 2.2, Azure Storage SDK 4.2.1, and storage configuration 2.0 in my cloud solution. I am trying to initialize the storage as follows.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionKey);
// Create the table client.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
// Create the table if it doesn't exist.
table = tableClient.GetTableReference(tableName);
table.CreateIfNotExists();
table.CreateIfNotExists() throws an exception in my dev emulator. I have been seeing this issue for quite some time; it looks like MSFT has not fixed it. The same code works fine when I connect to the real Azure storage in the cloud; it just does not work under the emulator. How can I debug this? If I use Storage SDK 2.1, it works fine in the dev emulator; after 3.0, nothing seems to work. Am I the only one, or does anybody else have the same issue? Is there any workaround for this version, or should I stick to the old version? Any help is appreciated.
Here is the detailed exception:
Microsoft.WindowsAzure.Storage.StorageException: "The remote server returned an error: (400) Bad Request."
InnerException: System.Net.WebException: "The remote server returned an error: (400) Bad Request."
HResult: -2146233088
IsTransient: false
Source: "Microsoft.WindowsAzure.Storage"
StackTrace:
at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
at Microsoft.WindowsAzure.Storage.Table.TableOperation.Execute(CloudTableClient client, CloudTable table, TableRequestOptions requestOptions, OperationContext operationContext)
at Microsoft.WindowsAzure.Storage.Table.CloudTable.Exists(Boolean primaryOnly, TableRequestOptions requestOptions, OperationContext operationContext)
at Microsoft.WindowsAzure.Storage.Table.CloudTable.CreateIfNotExists(TableRequestOptions requestOptions, OperationContext operationContext)
at PSI.Applications.RemoteManagement.TraceLogManagement.AzureTableStore.InitializeStorageTable(String tableName, String connectionKey) in c:\AzureTraceCloudService\AzureTableStoreLib\AzureTableStore.cs:line 27
TargetSite: T ExecuteSync[T](RESTCommand`1[T], IRetryPolicy, OperationContext)
Thanks,
Ravi
Hi Ravi,
This issue may be caused by the storage client library not being compatible with the storage emulator, which is why your code is failing. I would recommend downgrading your storage client library to the previous version, and your code should work just fine. The best option, I think, is to upgrade the SDK to the latest version; refer to
http://azure.microsoft.com/blog/2014/08/04/announcing-release-of-visual-studio-2013-update-3-and-azure-sdk-2-4/ for more details.
Best Regards,
Jambor
-
Azure Storage Backup and Restore
Hi,
We have Azure Tables deployed in our Azure Storage Account. Is there a Backup & Recovery strategy we can use for Production Business Continuity plan.
Thanks.
Regards,
Subhash Konduru
Please remember to mark the replies as answers if they help and unmark them if they provide no help.
Hi Subhash,
Thanks for posting here.
I suggest you check these links for BCP on storage backup and recovery:
https://technet.microsoft.com/en-us/library/dn621063.aspx
http://blogs.technet.com/b/uspartner_ts2team/archive/2015/02/17/new-solution-accelerator-azure-business-continuity-and-disaster-recovery.aspx
Hope this helps you.
Girish Prajwal -
environment:
windows 8.1, Visual Studio 2013, Windows Azure SDK2.2 , Microsoft.WindowsAzure.Storage V3.0.0.0, Windows Azure Storage Emulator 2.2
Code snippet:
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudTableClient tableClient = account.CreateCloudTableClient();
CloudTable table = tableClient.GetTableReference(tableName);
table.CreateIfNotExists(); // this line causes the error
Error Message:
System.Net.HttpWebRequest.GetResponse() +6594148
Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync(RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext) +1948
[StorageException: The remote server returned an error: (400) Bad Request.]
Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync(RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext) +7389
Microsoft.WindowsAzure.Storage.Table.TableOperation.Execute(CloudTableClient client, CloudTable table, TableRequestOptions requestOptions, OperationContext operationContext) +206
Microsoft.WindowsAzure.Storage.Table.CloudTable.Exists(Boolean primaryOnly, TableRequestOptions requestOptions, OperationContext operationContext) +394
I have searched many threads on the internet and cannot find a solution. Who can help? Thanks a lot.
Hi,
I didn't find any issue in the code that you provided. I also tried it, and on my PC it created a new table in my storage; my system is Windows 8, Visual Studio 2012, Windows Azure SDK 2.2, Microsoft.WindowsAzure.Storage V2.0.0.0. From my experience, there is something wrong with your Windows Azure SDK; I recommend you reinstall it and then try the code again. If you have any further issues, please feel free to let me know.
Best Regards
-
I have a requirement where I need users to be able to upload really large files (videos) from a Meteor app into an Azure container.
For those of you unfamiliar with Meteor, it's a young but important full-stack JavaScript platform. BTW, Meteor just released official support for Windows. That may be one of the reasons there are tons of S3-friendly file upload packages in the Meteor ecosystem, but hardly anything that works for Azure storage yet.
In any case, there is one new package, https://github.com/jamesfebin/azure-blob-upload, that seemed promising. However, I'm running into issues as documented here: https://forums.meteor.com/t/uploading-large-files-to-azure-storage/2741
There's another possible path. One thing the Azure team + community could do to onboard Meteor developers is add an example of Azure storage support to the Slingshot package: https://github.com/CulturalMe/meteor-slingshot The reason this is important is that for gigantic files, we'd rather not have to stream from the client through the Meteor server (Node.js) and then over to Azure. Slingshot supports a secure method for doing that with S3, Google Cloud Storage, and even Rackspace Storage. It seems like Azure could be part of the mix if there were someone who could interpret the Node.js SDK and translate it to what Slingshot is doing. I'm just not savvy enough to do it myself at the moment.
Hi Soupala,
Looks like you have got a solution for this issue from the
Meteor forum.
However, I will try to find out if any other solutions are available for this scenario.
Regards,
Manu -
When I call this method with a large EntityProperty (around 17KB of text), it truncates the string.
I know that there is a limit of 64KB for a property and 1MB for an entire row when it comes to Azure Tables.
Any insights?
Hi,
That sounds very strange. If you save the string value to another entity property, will it still be truncated? If you use Azure Storage Explorer to save that string value, will it be truncated?
If the string is not very confidential, could you share it here? And if you try the latest version of the Azure storage client library, is the issue fixed? Please give us further information for better support.
Best Regards,
Jambor
-
How to use one query against multiple tables and receive one report?
I have duplicate tables (except for their names, of course) with commodities prices. They have the same column headings, but the data is different. I have a query that gives me a certain piece of information I am looking for, but now I need to run this query against every table. I will do this every day as well, to see if the buying criteria are met. There are a lot of tables, though (256). Is there a way to run the query against all tables and return the results in one place? Thanks for your help.
Hey,
a. All 256 tables should really be one big partitioned table.
b. You can use all_tables to write a SELECT that generates the report for you:
SQL> set head off
SQL> select 'select * from (' from dual
2 union all
3 select 'select count(*) from ' || table_name || ' union all ' from a
4 where table_name like 'DB%' AND ROWNUM <= 3
5 union all
6 select ')' from dual;
select * from (
select count(*) from DBMS_LOCK_ALLOCATED union all
select count(*) from DBMS_ALERT_INFO union all
select count(*) from DBMS_UPG_LOG$ union all
Remove the last 'union all', and run the generated query:
SQL> set head on
SQL> select * from (
2 select count(*) from DBMS_LOCK_ALLOCATED union all
3 select count(*) from DBMS_ALERT_INFO union all
4 select count(*) from DBMS_UPG_LOG$
5 );
COUNT(*)
----------
0
0
0
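The same generation trick can of course be scripted outside SQL*Plus; for instance, a small Node.js sketch that stitches per-table counts into one UNION ALL query (table names are illustrative):

```javascript
// Generate one UNION ALL count query over many identically-shaped tables,
// mirroring the SQL*Plus generation trick above.
function buildUnionCountQuery(tableNames) {
  const parts = tableNames.map(function (t) {
    return 'select count(*) from ' + t;
  });
  return 'select * from (\n' + parts.join(' union all\n') + '\n)';
}

console.log(buildUnionCountQuery(['COMMODITY_01', 'COMMODITY_02']));
```

With 256 tables, feeding the full name list through this (or the all_tables query above) beats maintaining the statement by hand.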
Amiel