Azure table storage design for simple social networking

What is the best table design if I want to use Azure Table Service for a simple social networking website?
The website could have millions of users.
Users need to be able to view a list of all other users in the system sorted by the number of mutual connections.
Users must be able to view a list of their connections.
Users must be able to view content posted by themselves and by their connections.
One major design constraint is that Azure Table service queries are effectively limited to the partition key and row key once there are a large number of records; filtering on anything else becomes a scan and gets really slow. Another constraint is that query results are only sorted by the
partition key and then the row key.
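
To make that constraint concrete, here is a minimal sketch using the classic .NET storage client (Microsoft.WindowsAzure.Storage); the table name, entity type and property names are illustrative assumptions, not part of the question:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Illustrative sketch only: "connections", ConnectionEntity and SomeOtherProperty are assumed names.
public class ConnectionEntity : TableEntity
{
    public ConnectionEntity() { }
    // PartitionKey = owning user's id, RowKey = connected user's id.
    public ConnectionEntity(string userId, string connectedUserId) : base(userId, connectedUserId) { }
    public string SomeOtherProperty { get; set; }
}

class QueryDemo
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var table = account.CreateCloudTableClient().GetTableReference("connections");
        table.CreateIfNotExists();

        // Fast: served directly from the partition (PartitionKey, optionally a RowKey range).
        var byUser = new TableQuery<ConnectionEntity>().Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "user123"));

        // Slow at scale: filtering on any other property scans the whole table,
        // and results always come back ordered by PartitionKey then RowKey.
        var byProperty = new TableQuery<ConnectionEntity>().Where(
            TableQuery.GenerateFilterCondition("SomeOtherProperty", QueryComparisons.Equal, "value"));

        foreach (var c in table.ExecuteQuery(byUser)) { /* ... */ }
    }
}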

For your scenario, I think SQL Azure makes more sense than the Azure Table storage service: the data is relational in nature in this particular context, which does not fit the Table storage design model well.
You can get started with SQL Azure at -
http://azure.microsoft.com/en-us/services/sql-database/
Bhushan

Similar Messages

  • Using one azure table storage account for many customers with their own data

    I'm developing an app that will allow customers to store their data in Azure. However, I currently have no idea how to separate the customers' accounts in Azure. Yes, I've just started reading the documentation, but maybe someone can point me to the
    right topic?

    It seems like it might be worth starting from the general guidance on developing multitenant cloud applications - this resource might help: http://msdn.microsoft.com/en-us/library/ff966499.aspx
    The patterns covered in this guidance apply to whichever data storage mechanism is chosen for the application - whether that's Azure Storage, Azure SQL Database, or something else.
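
    If the app does end up on Azure Table storage, one common approach (just a sketch, not from the linked guidance; the type and property names are illustrative) is to keep every customer's rows behind a tenant-scoped PartitionKey so one storage account can serve many customers:

    using Microsoft.WindowsAzure.Storage.Table;

    // Sketch: CustomerRecord and Payload are assumed names.
    public class CustomerRecord : TableEntity
    {
        public CustomerRecord() { }
        // PartitionKey = tenant (customer) id, RowKey = record id within that tenant.
        public CustomerRecord(string tenantId, string recordId) : base(tenantId, recordId) { }
        public string Payload { get; set; }
    }

    Every query then filters on the tenant's PartitionKey, so tenants never see each other's data.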

  • How to improve performance for Azure Table Storage bulk loads

    Hello all,
    Would appreciate your help as we are facing a challenge.
    We are trying to bulk load Azure Table storage. We have a file that contains nearly 2 million rows.
    We need to reach a point where we can load 100,000-150,000 entries per minute. Currently it takes more than 10 hours to process the file.
    We have tried Parallel.ForEach, but it doesn't help. Today I discovered partitioning in PLINQ. Would that be the way to go?
    Any ideas? I have spent nearly two days trying to optimize it using PLINQ, but I am still not sure what the best approach is.
    Kindly note that we shouldn't be using SQL/Azure SQL for this.
    I would really appreciate your help.
    Thanks

    I'd think you're just pooling the parallel connections to Azure if you do it on one system. You'd also have the bottleneck of round-trip time from you, through the internet, to Azure and back again.
    You could speed it up by moving the data file to the cloud and processing it with a cloud worker role. That way you'd be inside the datacenter (which is a much faster, more optimized network).
    Or, if that's not fast enough and you can split the data so that multiple worker roles can each process part of the file, you can scale out to enough VMs that it gets done quickly.
    Darin R.
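
    One more concrete option (a sketch assuming the classic .NET storage client; BulkLoader and the caller-supplied rows are illustrative): group the rows by PartitionKey and write them as entity-group batches of up to 100 entities, parallelising across partitions rather than across single inserts.

    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage.Table;

    static class BulkLoader
    {
        public static void Load(CloudTable table, IEnumerable<ITableEntity> rows)
        {
            // All entities in one batch must share a PartitionKey, and a batch holds at most 100 entities.
            var groups = rows.GroupBy(r => r.PartitionKey);
            Parallel.ForEach(groups, new ParallelOptions { MaxDegreeOfParallelism = 8 }, group =>
            {
                var batch = new TableBatchOperation();
                foreach (var entity in group)
                {
                    batch.InsertOrReplace(entity);
                    if (batch.Count == 100)
                    {
                        table.ExecuteBatch(batch);
                        batch = new TableBatchOperation();
                    }
                }
                if (batch.Count > 0)
                    table.ExecuteBatch(batch);
            });
        }
    }

    Each successful batch writes up to 100 entities in a single round trip, which is usually the biggest single win for this kind of load.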

  • How to select data from Azure table storage without row key and partition key

    Hi,
    I need to select data from Azure Table storage without a RowKey and PartitionKey, the way clicking Query in the Azure storage emulator displays all the data from a table.
    Thanks,
    Rajesh

    Hi Rajesh,
    It seems that you couldn't query the data using the storage emulator. I recommend using the Azure Server Explorer in Visual Studio to view and query your data; please see this document (http://msdn.microsoft.com/en-us/library/azure/ff683677.aspx).
    Based on my experience, you may also need to run commands against the Azure storage emulator, as described on this page (http://msdn.microsoft.com/en-us/library/azure/gg433005.aspx).
    Regards,
    Will
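
    For completeness, a query with no PartitionKey/RowKey filter is also possible from code; it simply scans the whole table, much like the emulator's Query button. A minimal sketch with the classic .NET storage client (the table name is an assumption):

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    class ScanDemo
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true"); // storage emulator
            var table = account.CreateCloudTableClient().GetTableReference("mytable");

            // An empty TableQuery returns every entity; the service scans all partitions.
            foreach (DynamicTableEntity entity in table.ExecuteQuery(new TableQuery()))
            {
                Console.WriteLine(entity.PartitionKey + " / " + entity.RowKey);
            }
        }
    }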

  • Migrating Azure table storage content to an on-prem SQL database

    Hi,
    is it possible to import Azure Table storage data into an on-prem SQL database?

    Hi,
    You cannot do it directly from SQL or Azure Storage Explorer.
    But you can write a small application that extracts your Table storage data into a CSV file, like this:
    http://blogs.msdn.com/b/jmstall/archive/2012/08/03/converting-between-azure-tables-and-csv.aspx
    Then import the CSV file you generate into your on-prem SQL database.
    Regards
    Aram
    Aram Koukia
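
    A rough sketch of the export step (assuming the classic .NET storage client and a plain DynamicTableEntity scan; the table name and output path are illustrative, no CSV escaping is done, and all entities are assumed to share the same properties):

    using System.IO;
    using System.Linq;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    class ExportToCsv
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<storage connection string>");
            var table = account.CreateCloudTableClient().GetTableReference("mytable");

            using (var writer = new StreamWriter("export.csv"))
            {
                foreach (var entity in table.ExecuteQuery(new TableQuery()))
                {
                    // Flatten PartitionKey, RowKey and each property into one CSV line.
                    var values = new[] { entity.PartitionKey, entity.RowKey }
                        .Concat(entity.Properties.Values.Select(p => p.PropertyAsObject == null ? "" : p.PropertyAsObject.ToString()));
                    writer.WriteLine(string.Join(",", values));
                }
            }
            // The generated CSV can then be imported into the on-prem SQL database,
            // for example with BULK INSERT or the SQL Server Import and Export Wizard.
        }
    }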

  • Azure table storage rest API including

    How do I access my Table storage using the REST API?
    Any example would be appreciated, including how to enable the REST API.

    Hi,
    Please have a look at this article:
    http://blogs.msdn.com/b/tconte/archive/2011/08/10/accessing-windows-azure-blob-storage-using-jquery.aspx - hope it helps. You could also consider using
    jQuery to call code-behind to perform operations against Azure Table storage; in that case you could use either the Azure SDK or the Azure storage REST API.
    Best Regards,
    Jambor
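
    If you prefer to call the Table service REST API directly instead of going through the SDK, here is a minimal sketch (assuming a table-level SAS token so you don't have to hand-roll the SharedKey signature; the account, table and token are placeholders):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class TableRestDemo
    {
        static async Task Main()
        {
            // Query all entities in a table over the REST endpoint, authenticated with a SAS token.
            var uri = "https://<account>.table.core.windows.net/<table>()?<sas-token>";
            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Add("Accept", "application/json;odata=nometadata");
                var json = await client.GetStringAsync(uri);
                Console.WriteLine(json); // entities come back as a JSON "value" array
            }
        }
    }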

  • Project related tables and link for services in Network Activity

    Dear Experts,
    My requirement is that I need a table for the services in a network activity before release of the network, and a link from the services to another table (project-related, before the PR).
    Example: Network XXX, Activity XX, Service 12345
    The Service field is updated in table ESLL, but I was unable to link it with another table to get the project number. So please help me.
    Regards,
    Srikanth.

    Dear Sushrut Sheth,
    Thanks for the reply.
    I looked in AFVC before posting the issue, but in that table only the activity is updated, not the individual service items.
    I need the table in which the individual service line items are updated.

  • Azure Table Storage Unable to connect - Error on increasing load

    I am using a Windows Azure Website (reserved instance) and doing inserts to an Azure table via HTTP (yes, HTTP). When I was doing load testing, I found the error below. It was at about 2 hits per second... Not sure what the reason is?
    Any thoughts? SDK issue?
    HTTP_Request2_ConnectionException: Unable to connect to tcp://innovativetxtstorage.table.core.windows.net:80. Error: php_network_getaddresses: getaddrinfo failed: No such host is known. in
    C:\DWASFiles\Sites\innovativetxt\VirtualDirectory0\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php on line 324
    #0 HTTP_Request2_SocketWrapper->__construct('tcp://innovative…', 10, Array) in C:\DWASFiles\Sites\innovativetxt\VirtualDirectory0\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\SocketWrapper.php
    on line 116

    Here is further information...
    Under normal circumstances, with 30 hits per minute to our API, it works fine. Beyond that the latency is so high that it times out. See the attached report.
    All I am doing is reading the Azure table to verify a user's login and password (reading the same entity again and again).
    See the performance test report here: http://www.innovativetxt.com/BlitzPerformanceTest.pdf
    My customer cannot send us more than 30 requests per second; we feel we are in trouble because of complaints every day.
    We are on a reserved instance. The website URL is http://www.innovativetxt.com and we feel like latency is killing us.
    1. Is it because of Azure Storage? Can it not handle that much load?
    Is it bec
    (I have already turned off Nagling, Expect-100, etc.; all the tips are implemented.)
    See the error message from the log file below:
    [03-Aug-2013 09:37:01 UTC] PHP Fatal error: Uncaught HTTP_Request2_ConnectionException: Unable to connect to tcp://innovativetxtstorage.table.core.windows.net:80. Error: php_network_getaddresses: getaddrinfo failed:
    No such host is known. in C:\DWASFiles\Sites\innovativetxt\VirtualDirectory0\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php on line 324
    Exception trace:
    #0 HTTP_Request2_SocketWrapper->__construct('tcp://innovative…', 10, Array) in C:\DWASFiles\Sites\innovativetxt\VirtualDirectory0\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\SocketWrapper.php
    on line 116

  • Push notifications for the social networking are not working. How to fix it?

    Trying to solve the problem of push notifications from social networking apps, which are not working on my iPhone 4. Please fix it!

    If you go to Settings --> General --> Accessibility and then scroll down, you'll see "Increase Contrast." Turn that off and your Control Center and Notification Center should become more transparent. I had the same problem and that's how I fixed it! Hope it works :)

  • Does this sound like a good design for simple db lookup?

    I want to separate the GUI from the behind-the-scenes stuff, so two classes: one for the GUI (gui.java) and one for the database stuff (dbQuery.java). I know the syntax is not accurate, but it's sort of pseudo-code. Here's how it looks:
    public class gui extends JFrame {
       String name, address, city, state, id;
       public static void main(String[] args) {
          // create gui          - all the Swing stuff goes here
          // actionListener      - waiting for the button to be clicked
          // when the button is clicked:
          dbQuery dbq = new dbQuery(id);
          name.setText(dbq.getName());
          address.setText(dbq.getAddress());
          // ...                 get the rest of the information needed
       }
    }
    class dbQuery {
       dbQuery(String id) {
          // do all the stuff to get a connection, create a statement, get a result set...
          rs.next();
          String name = rs.getString("Name");
          // ... assuming all db fields are appropriately named
       }
    }
    This is my first attempt at designing more than "Hello World" and your input truly is appreciated!
    Merry Christmas!

    I am partial to the database layer and the GUI layer using a "primitive", and collections of primitives, to share data.
    A primitive is implemented as a class in Java, but it is not really an object; rather, it just serves as a convenient way to group certain attributes together. Like this:
         class CustomerPrimitive {
              public String name;
              public String account;
              public Boolean preferredCustomer;
              public Double accountBalance;
         }
    When I use primitives I tend to use wrapper objects rather than Java primitives (Integer rather than int). This allows me to use the primitives for queries as well - a null means "don't query on that". In addition, it allows me to return null values as nulls.
    It is important that a primitive not have any substantial behaviour. For example, there could be a validation method that verifies that the account number is not null, but there must not be a validation method that verifies that the account number is unique. The first can be implemented using simple Java (perhaps with java.lang), while the second requires the interaction of numerous classes.

  • Best design for an Airport network in an odd house?

    I live in a large 3-storey house (wooden). My 9,000 kb/s cable internet comes in at level 3 into my AirPort Extreme (n/b/g/a) base station. My new iMac sits right next to the base station and works fine, clocking pretty much full speed when I run those internet speed tests. I'm running a combined n/b/g network at 2.4 GHz because I have older wireless clients elsewhere in the house (a G5 iMac on level 1, and a G4 iMac and G4 PowerBook on level 2). This is where the fun starts. The two G4s on level 2 initially clocked about 4,000 or 5,000 kb/s when accessing the main base station on level 3. However, the signal won't reach level 1, so I added an older AirPort Extreme base station (a/b/g model, not n) on level 2. I added this base station by 'extending my wireless network'. Now the two G4s on level 2 have slowed a bit, clocking 3,000 or 4,000 kb/s, but the signal now reaches level 1, where my G5 clocks about 1,000 or 2,000 kb/s. I'm frankly surprised by the level of performance drop-off. Is this normal? Is there a better configuration, noting I have a couple of spare AirPort Expresses that are currently not part of the network? Also, I'm currently locked into having the main base station on level 3, where my cable port resides. PS: I've looked at S:R ratios throughout the house using MacStumbler and the ratios all look OK.

    Indeed, there is a performance penalty to pay by setting up a wireless distribution system to increase range. I'd say that the performance you are managing to get is actually pretty good everything considered.
    The only way to improve upon this is to cable additional access points to the main base station, or switch to an all-N network by using external wireless-N adapters on the older computers.

  • Importing Azure table Data to Hive

    Hi All,
    I have created a table in Azure and can see it in Azure Storage Explorer under the Tables section.
    I want to import this table's data into Hive.
    For the blob/csv/text files we will use the syntax
    CREATE EXTERNAL TABLE <tablename>(col1  string , col2 string)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
    LOCATION '/xxxx/xxxx/filename';
    What's the syntax for importing the Azure table into Hive?
    Please help.
    Sriman N Vangala

    Hi Sriman,
    Unfortunately, Microsoft does not have an Azure Table storage handler for HDInsight. It is on the roadmap, but there are no specific dates for when it will be publicly available. One of our product team members has a blog on how to achieve
    this using a connector available from GitHub. Please refer to that blog here:
    http://blogs.msdn.com/b/mostlytrue/archive/2014/04/04/analyzing-azure-table-storage-data-with-hdinsight.aspx
    The setup process in the blog is not very straightforward and requires some extra configuration steps, as there is no out-of-the-box solution at present. Also, the ATS Hive connector is a community-driven project and is not part of HDInsight right now, which means
    it is not officially supported if you run into any problems with it later on.
    An easier alternative would be to get the data out of Azure Tables into CSV files in your storage account and process those normally. A lot of folks who have built service telemetry pipelines are doing exactly that.
    Hope this helps.
    DebarchanS - MSFT

  • Is this the best design for asynchronous notifications (such as email)? Current design uses Web Site, Azure Service Bus Queue, Table Storage and Cloud Service Worker Role.

    I am asking for feedback on this design. Here is an example user story:
    As a group admin on the website I want to be notified when a user in my group uploads a file to the group.
    The easiest solution would be that, in the code handling the upload, we directly create an email message and send it. However, this doesn't seem like the appropriate level of separation of concerns, so instead we are thinking of having a separate
    worker process which does nothing but send notifications. So the website's upload code handles receiving the file, extracting some metadata from it (like the filename), and writing this to the database. As soon as it is done handling the file upload it then
    does two things: it writes the details of the notification to be sent (such as subject, filename, etc.) to a dedicated "notification" table, and it also creates a message in a queue which the notification-sending worker process monitors. The entire sequence
    is shown in the diagram below.
    My questions are: Do you see any drawbacks in this design? Is there a better design? The team wants to use Azure Worker Roles, Queues and Table storage. Is it the right call to use these components, or is this design unnecessarily complex? The quality attribute
    requirements are that it is easy to code, easy to maintain, easy to debug at runtime, auditable (a history is available of when notifications were sent, etc.), and monitorable. Are there any other quality attributes you think we should be designing for?
    More info:
    We are creating a cloud application (in Azure) in which there are at least 2 components. The first is the "source" component (for example a UI / website) in which some action happens or some condition is met that triggers a second component or "worker"
    to perform some job. These jobs have details or metadata associated with them which we plan to store in Azure Table Storage. Here is the pattern we are considering:
    Steps:
    Condition for job met.
    Source writes job details to table.
    Source puts job in queue.
    Asynchronously:
    Worker accepts job from queue.
    Worker Records DateTimeStarted in table.
    Queue marks the job as "in progress".
    Worker performs job.
    Worker updates table with details (including DateTimeCompleted).
    Worker reports completion to queue.
    Job deleted from queue.
    Please comment and let me know if I have this right, or if there is some better pattern. For example sake, consider the work to be "sending a notification" such as an email whose template fields are filled from the "details" mentioned in
    the pattern.
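
    To make the producer half of those steps concrete, here is a rough sketch using the classic .NET storage client (a storage queue is used here for brevity even though the title mentions Service Bus; the entity, table and queue names are illustrative, not from the original design):

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Queue;
    using Microsoft.WindowsAzure.Storage.Table;

    public class NotificationEntity : TableEntity
    {
        public NotificationEntity() { }
        public NotificationEntity(string groupId, string notificationId) : base(groupId, notificationId) { }
        public string Subject { get; set; }
        public string FileName { get; set; }
        public DateTime? DateTimeStarted { get; set; }
        public DateTime? DateTimeCompleted { get; set; }
    }

    class NotificationProducer
    {
        static void Main()
        {
            var account = CloudStorageAccount.Parse("<storage connection string>");
            var table = account.CreateCloudTableClient().GetTableReference("notifications");
            var queue = account.CreateCloudQueueClient().GetQueueReference("notification-jobs");
            table.CreateIfNotExists();
            queue.CreateIfNotExists();

            // Step 2: write the job details to the table.
            var job = new NotificationEntity("group42", Guid.NewGuid().ToString())
            {
                Subject = "New file uploaded",
                FileName = "report.xlsx"
            };
            table.Execute(TableOperation.Insert(job));

            // Step 3: put a pointer to the job on the queue. The worker later reads the message,
            // stamps DateTimeStarted, sends the email, stamps DateTimeCompleted, and deletes the message.
            queue.AddMessage(new CloudQueueMessage(job.PartitionKey + "|" + job.RowKey));
        }
    }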

    Hi,
    Thanks for your posting.
    This development model can avoid some errors, such as when multiple file uploads complete at the same time... From my experience, this is a good choice for achieving the goal.
    Best Regards,
    Jambor  

  • Performing a case insensitive table storage query: Storage client 3.0

    I'm using Storage Client 3.0 and performing queries against entity properties such as a string value for email address.
    I'm using the TableQuery class and my query looks something like this:
    CloudTable accountsTable = tableClient.GetTableReference(Settings.AccountTable);
    TableQuery<Account> rangeQuery = new TableQuery<Account>().Where(
        TableQuery.CombineFilters(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "account"),
            TableOperators.And,
            TableQuery.GenerateFilterCondition("Email", QueryComparisons.Equal, email)));
    var results = accountsTable.ExecuteQuery(rangeQuery).ToList();
    This works fine when the entity I'm searching for matches the case of my search term. But how can I perform a case-insensitive search? I think this was possible when using LINQ queries, but how do I accomplish it with the new storage client?
    thanks

    Hi,
    As far as I know, Azure Table storage does not support some LINQ operators, as shown on this page (http://msdn.microsoft.com/en-us/library/dd135725.aspx). You could use some common
    LINQ methods, as in these samples (http://blogs.msdn.com/b/kylemc/archive/2010/11/22/windows-azure-table-storage-linq-support.aspx).
    I also suggest you refer to this thread (http://stackoverflow.com/questions/8805759/azure-table-storage-query-in-net-with-property-names-unknown-at-design-time).
    You could change your code to the LINQ format and try it.
    If you have any questions, please let me know.
    Thanks.
    Will
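
    One common workaround, offered as a suggestion rather than something from the reply above: the Table service compares strings case-sensitively, so store a normalized (lower-cased) copy of the value when writing the entity and query against that instead. A sketch that continues the code from the question and assumes an extra EmailLower property on the Account entity:

    // EmailLower is an assumed extra property, populated as Email.ToLowerInvariant() whenever the entity is written.
    string emailLower = email.ToLowerInvariant();
    TableQuery<Account> lowerQuery = new TableQuery<Account>().Where(
        TableQuery.CombineFilters(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "account"),
            TableOperators.And,
            TableQuery.GenerateFilterCondition("EmailLower", QueryComparisons.Equal, emailLower)));
    var results = accountsTable.ExecuteQuery(lowerQuery).ToList();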

  • Table storage got 4 times slower from day to day

    Hi, I've run tasks that add data to the Table storage in my cloud app each night for the last few months, but since Friday the 13th (yeah, the 13th, that's what I thought too) the tasks have been executing 4 times slower.
    The exact work being done is around 250,000 calls to ExecuteBatch, split across three workers - on Thursday I got around 20 ops, and since Friday I've been getting around 5 ops. See the storage graph below. It clearly shows that the request rate has dropped
    significantly...
    Does Azure Table storage limit bandwidth in some cases, or what could have happened?
    Has anyone experienced this or something similar before?

    Hi APMadsen,
    I guess the Azure Table storage may be affected by the growing volume of data. If so, I suggest you pay attention to your query code and spend some time optimizing it (refer to this thread:
    http://social.msdn.microsoft.com/Forums/windowsazure/en-US/5326d280-513f-47a3-826d-2db97ebd9ace/why-is-this-azure-table-storage-query-so-slow). Also, I suggest you refer to this big-data sample (http://www.troyhunt.com/2013/12/working-with-154-million-records-on.html)
    and this blog (http://robertgreiner.com/2012/06/why-is-azure-table-storage-so-slow/).
    Hope it helps.
    Will
