Using Data-Dependent Routing in combination with Membership provider

Hi there!
Currently we have two web applications running on Azure using a single SQL database. We are experiencing performance problems because we are using only one database, so we are investigating solutions such as DocumentDB and Elastic Scale.
DocumentDB seems like a really good option for us because it looks easy to implement and we know just what to do.
With Elastic Scale, on the other hand, we still have to figure out what our options are. We use a membership provider for users to log in to one of the two applications. The other application does not use a membership provider, or any login system at all.
In the backend, both applications determine a subscription_id to retrieve tenant data.
I think we have a few options here:
1. Keep using the membership provider, but create a separate database just for the login mechanism. From there we can determine the subscription_id, and with that subscription_id we can use Data-Dependent Routing to retrieve the correct data from the correct shard.
2. Add extra columns, such as username and password, to the Shard Map Manager database (although I think I've read that it's not supposed to be used for user data) so we can log in through that database.
Does anyone have better options than the ones mentioned above, and can anyone give me advice on how we should deal with these issues?
Thanks!

Elmar --
The best approach would be to keep the Membership DB in a separate Azure SQL database. You could then shard your transaction details across an Elastic Scale set of databases based on the subscription ID. (The Shard Map Manager database is not designed to be extended with additional columns, and Elastic Scale works by caching the shard map data in the client application anyway; those in-memory structures can't be changed.)
You would use an ordinary ADO.NET connection to query the membership database for a subscription ID, and then get an appropriate shard-specific connection using the Elastic Scale GetOpenConnectionForKey method, passing in the subscription ID retrieved from the membership query.
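A minimal sketch of that two-step flow in C#. All table, column, and variable names here are illustrative, not from the original post, and the shard map types come from the Elastic Scale client library; treat this as a sketch under those assumptions, not a definitive implementation:

```csharp
// Step 1: ordinary ADO.NET query against the central membership database
// to resolve the logged-in user to a subscription ID.
int subscriptionId;
using (var conn = new SqlConnection(membershipConnectionString))
{
    conn.Open();
    using (var cmd = new SqlCommand(
        "SELECT subscription_id FROM Memberships WHERE username = @user", conn))
    {
        cmd.Parameters.AddWithValue("@user", userName);
        subscriptionId = (int)cmd.ExecuteScalar();
    }
}

// Step 2: data-dependent routing -- let the shard map resolve the key
// to the right shard and open a connection there.
ListShardMap<int> shardMap =
    shardMapManager.GetListShardMap<int>("SubscriptionShardMap");
using (SqlConnection shardConn = shardMap.OpenConnectionForKey(
    subscriptionId, shardConnectionString))
{
    // Query tenant data for this subscription on the correct shard...
}
```

(In the .NET client the method is exposed as OpenConnectionForKey on the shard map object; check the exact name and signature against the version of the Elastic Scale client you are using.)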

Similar Messages

  • Tpdequeue and data depending routing

    Hi all,
    I would like to use data-dependent routing. I tried to use the example from the TMQUEUE server's reference.
    *GROUPS
    TMQUEUEGRP1 GRPNO=1 TMSNAME=TMS_QM
    OPENINFO="TUXEDO/QM:/dev/device1:myqueue"
    TMQUEUEGRP2 GRPNO=2 TMSNAME=TMS_QM
    OPENINFO="TUXEDO/QM:/dev/device2:myqueue"
    *SERVERS
    TMQUEUE SRVGRP="TMQUEUEGRP1" SRVID=1000 RESTART=Y GRACE=0
    CLOPT="-s ACCOUNTING:TMQUEUE"
    TMQUEUE SRVGRP="TMQUEUEGRP2" SRVID=1000 RESTART=Y GRACE=0
    CLOPT="-s ACCOUNTING:TMQUEUE"
    TMQFORWARD SRVGRP="TMQUEUEGRP1" SRVID=1001 RESTART=Y GRACE=0 REPLYQ=N
    CLOPT=" -- -qservice1"
    TMQFORWARD SRVGRP="TMQUEUEGRP2" SRVID=1001 RESTART=Y GRACE=0 REPLYQ=N
    CLOPT=" -- -qservice1"
    *SERVICES
    ACCOUNTING ROUTING="MYROUTING"
    *ROUTING
    MYROUTING FIELD=ACCOUNT BUFTYPE="FML"
    RANGES="MIN - 60000:TMQUEUEGRP1,60001-MAX:TMQUEUEGRP2"
    When I do tpdequeue("ACCOUNTING", "service1", ...), I get an error in the ULOG: "4052 ERROR: NULL input buffer not allowed for service 'ACCOUNTING', which uses routing".
    Is it possible to use tpdequeue() with data-dependent routing, or can only TMQFORWARD dequeue messages from the queue?
    Thanks.

    Hi,
    I'm not exactly sure what it is you are trying to do. Although the TMQUEUE servers advertise their service as the name of the queuespace, you cannot apply data dependent routing to that service as the buffer type that service uses is an internal Tuxedo buffer type and not one of the standard buffer types. This is because the service has to pass additional information (such as the information from the TPQCTL structure) besides the buffer being enqueued or dequeued.
    Where you can use data dependent routing is on the delivery of the message to a service performed by TMQFORWARD since TMQFORWARD simply uses tpcall() to invoke the service with the buffer retrieved from the queue.
    I'm not sure I answered your question, but if you could perhaps explain what it is you want to accomplish, I might be able to suggest something.
    Regards,
    Todd Little
    Oracle Tuxedo Chief Architect

  • Data dependent routing with WebLogic

    How do you do data dependent routing with WebLogic Platform? Does it require additional
    products or can it be done just using WebLogic Platform components?
    Thanks

    WLI, or "WebLogic Integration", may be what you are looking for; it is a full-fledged message-flow/routing product built on top of JMS.
    If your needs are simple, you can roll your own data-dependent routing using the WebLogic server built-ins of JMS, which provides reliable messaging, in combination with MDBs and/or Messaging Bridges - both of which allow the specification of JMS selectors (filters). Furthermore, your MDB application can enqueue to a specific local destination (or a remote one, which is harder) based on the contents of the message it receives. Voila.
    For example, WorkQ (local or remote) --> MDB --> JobQ1 or JobQ2 or JobQ3. To simplify the MDB logic, the JobQs could be on the same server, and therefore always available. (This is one of the advantages of having a JMS that can run in the same JVM as the app server; the other is very fast performance.)
    Tom

  • Using own SQLite DB in combination with Data Management

    Hi,
    Currently on a huge project we are using LCDS 3.1 in combination with an AIR 2.5 client.
    I've been reading "Using Adobe LiveCycle Data Services ES2 version 3.1" and I have a question. In the chapter "Building an offline-enabled application" it says on the very first line:
    "You can use an offline adapter with an AIR SQLite database to perform offline fills when a desktop client is disconnected from the LiveCycle Data Services server. An offline adapter contains the SQL queries for AIR SQLite for retrieving cached items like an assembler on the server retrieves items from the data source."
    However, in my experience that AIR SQLite database is not just any DB but one that Datamanagment designs and generates itself, based on the Dto the DataManagement destination is managing. The offline adapter doesn't work like an assembler at all, because the documentation says you can only override the methods pertaining to constructing the WHERE, and ORDER BY parts of the queries, not the SELECT, CREATE, FROM,... parts.
    In our case, we have a database on the server, constructed according to a very specific ERD, and we have a SQLite database on the client, also constructed according to a very specific ERD. What we want to do is execute every fill, create, update, and delete against the offline cache and only synchronize with the backend when we want it to synchronize (technically possible by playing with the autoMerge, autoSaveCache, autoConnect,... properties). So what part of Data Management can we customize to use our DB instead of a generated one?
    Thx in advance!

    You are correct in noting that Data Management does not allow you to use your own database to store offline data.  This data is exclusively managed by the LCDS library for the developer.  The intent is that the local cache is a reflection of the server data, not an independent copy.
    If you have an existing database in AIR, then you will have much more direct control over the querying and updating of that data by using the SQLite APIs directly.
    That being said, you can in essence replicate the data stored on the server, managed by Data Management, in the offline cache.  In an upcoming release (winter 2011) we will have a few features ('briefcases' and a 'changes-only' fill) that will make this story even more compelling for your use cases.  But even with the 3.1 functionality, you can do something like the following:
    1. Perform a fill() to collect the data you want to have available on the client, and save this in the offline cache.
    2. Construct an offline adapter ActionScript class that implements the fills you want to perform on the local data.
    3. Use the DataService.localFill() API to perform all of the client application fills, and turn off autoCommit.
    4. When the client is online, call commit() to store client changes and call fill() to refresh the cached data.
    This should give you some ideas on how you could go about constructing your app to leverage the offline features of Data Services.
    Tom

  • BUFTYPECONV vs Data Dependent Routing

    Hello everyone,
    when using BUFTYPECONV for a service in conjunction with data-dependent routing, which buffer type is used for the DDR? The original buffer type or the converted one?
    To re-phrase the question: which operation is performed first - buffer conversion or data-dependent routing?
    Any input on this welcome,
    /Per
    Per Lindstrom R2Meton AB, SWEDEN

    Per,
    The datatype conversion occurs after routing in the server that has been
    chosen to process the service.
    When the server receives the message and sees that it has received an XML
    buffer for a service with BUFTYPECONV specified, the postrecv function for
    the XML buffer type converts the buffer to FML or FML32. When the service
    calls tpreturn(), if the buffer passed to tpreturn() is an FML or FML32
    buffer, it is converted back to XML before the reply message is sent back to
    the caller.

  • Use of action links in combination with hierarchy object in analysis

    Hi All,
    Since the OBI 11g release, hierarchy objects are available in the presentation layer of OBI. We often make use of these hierarchy objects. We also make use of action links to drill down to an analysis with more detailed information. This more detailed analysis should inherit the values of the higher-level analysis. In combination with hierarchy objects this can cause problems.
    An example:
    Let's say we have the following attributes:
    a hierarchy object date (year, quarter, month), a product column, and a # of units shipped measure. On the column # of units shipped we have defined an action link to a more detailed analysis.
    Now, let's say the user clicks on the year 2011 in the hierarchy object. The quarters for 2011 are shown. The 'normal' product column contains 'Product A'. The user uses the action link that is available by clicking on the measure value of # of units shipped. He clicks on the value that is based on quarter 2 of 2011 and 'Product A'. The detailed analysis is set to filter on the product and date dimensions. In this example the analysis is filtered on 'Product A' but not on the value Quarter 2, 2011. Hierarchy object values are not passed through when clicking on an action link.
    Is there any workaround for this issue? We want to make use of the hierarchy object, but users expect the hierarchy object value to also be passed to the underlying detailed analysis.

    I am facing the same issue...
    First check it out:
    http://prasadmadhasi.com/2011/12/15/hierarchicalnavigationinobiee11g/
    http://www.rittmanmead.com/2010/10/oracle-bi-ee-11g-navigation-passing-parameters-using-hierarchical-columns/
    I solved it by using a Go URL link. I pass the value of the hierarchy level (whichever level it is). For Year it is 2012; for month, 2012-03.
    Now the ugly part: in the URL I pass two parameters that refer to the same value: "Time"."Year"="2012", "Time"."Month"="2012".
    In the target report I apply the filter: "Time"."Year" "Is Prompted" OR "Time"."Month" "Is Prompted".
    This way only one of the filters in the target report will take effect, depending on which level you navigated from in the source report.
    If you decide to use this approach, be careful with the URL syntax; remember about double quotes etc. In my case it was:
    Parameter : value
    PortalGo : [LEAVE EMPTY]
    path : %2Fusers%2Fweblogic%2FMIS%20EVAL%2FT_Target
    options : dr
    Action : Navigate
    col1 : "Time"."Year"
    val1 : [SELECT TIME HIERARCHY COLUMN]
    col2 : "Time"."Month"
    val2 : [SELECT TIME HIERARCHY COLUMN]
    Remember to:
    Remove "=" after PortalGo
    Surround the value attribute with double quotes - e.g. @val1="@{6}"
    After you "adjust" your URL text manually (unfortunately it won't be done automatically) it should look like:
    http://10.10.10.100:7001/analytics/saw.dll?PortalGo@{1}&path=@{2}&options=@{3}&Action=@{4}&col1=@{5}&val1="@{6}"&col2=@{7}&val2="@{8}"

  • Kernel debugging using git bisection method in combination with abs.

    Is there any way to use git bisection method for kernel debugging in combination with ABS for building the kernel via makepkg?

    Yes, but the ABS package for the kernel uses some patches in order for the kernel to build successfully.
    ftp://ftp.archlinux.org/other/kernel26/
    What I wanted to know is whether there is any way to determine which particular patch to use at each bisection point.

  • Can I use a WRT546 V8 Wireless Router in combination with Vonage?

    Please bear with me if this turns out to be a simple question with a simple answer; I'm challenged when it comes to these things. I use Vonage, with a Motorola VT2442-VD router. That's connected to my desktop, but I also have a Dell laptop (with Windows Vista) that will rarely if ever be in the same room as the desktop. So I need a wireless signal, but when I've tried to connect the Vonage/Motorola router to my Linksys WRT546 V8 wireless router it doesn't work properly. Either the wireless works and the Vonage doesn't, or the wireless doesn't work and the Vonage does. I've only tried a few times, but I've gotten frustrated with my lack of technical knowledge. When I get frustrated, I can barely plug in a light bulb, much less deal with anything technical. If anyone has some advice/help for me, I would very much appreciate it. And if you need more details in order to help me out, I'll be happy to give them to you (or at least try). Anything is better than what I'm doing, which is nothing, given my current level of frustration. Thanks!

    Well, try upgrading the firmware of the router, and hold the reset button until the power light is blinking on the router. Then do a complete network power cycle: unplug the power cables from the modem and from the router, then plug the power cable into the modem first; once all the lights are solid green you can plug the power cable into the router, and check it out - it will definitely work!!
    Also, in the advanced wireless settings set Beacon Interval=50, Fragmentation Threshold=2306, and RTS Threshold=2307, then uncheck Block Anonymous Internet Requests, and it will definitely work!!!

  • Data dependent routing

    Hi all,
    We would like to override the routing function for the FML32 buffer in Tuxedo 9.1. It seems that the only way is to define a new buffer type and set the new function there. But we also can't find information about how to call the right server beforehand, in the same transaction.
    Also, we've heard that in Tuxedo 10g you can define your own routing function. Is that true? Is there any place where we could find more information about this?
    Regards.

    Hi!!!
    first of all thanks for your responses!!
    In fact we have our database partitioned that way. We have Oracle RAC, and when different instances access the same data we get a lot of traffic that we need to avoid.
    So we have several instances of our servers in different groups, each one connected to a different instance of Oracle RAC.
    Each instance will always access the same partitions. For example:
    INSTANCE0 -> PARTITION 0 AND 4 -> in these partitions we have users where mod(user, 8) = 0 or 4
    INSTANCE1 -> PARTITION 1 AND 5 -> in these partitions we have users where mod(user, 8) = 1 or 5
    INSTANCE2 -> PARTITION 2 AND 6 -> in these partitions we have users where mod(user, 8) = 2 or 6
    INSTANCE3 -> PARTITION 3 AND 7 -> in these partitions we have users where mod(user, 8) = 3 or 7
    We need to route all requests for users where mod(user, 8) is 0 or 4 to the Tuxedo group that connects to instance 0, and so on.
    We thought about the solution proposed by Malcom Freeman, but in our case we can't force our clients to change their systems and perform the mod operation. We also considered the option of moving our code to a new server, such as:
    SERVER_CALLED_BY_OUR_CLIENTS
    { do nothing but calculate mod()
    call NEW_SERVER_WITH_NEW_FIELD_FOR_ROUTING }
    But we already have many servers and many services, and we are trying to avoid this option.
    Now we are trying the override option. For this, we changed the routing function in the FML32 buffer type switch entry:
    "FML32", /* type */
    "*", /* subtype */
    1024, /* dfltsize */
    _finit32, /* initbuf */
    _freinit32, /* reinitbuf */
    _funinit32, /* uninitbuf */
    _fpresend32, /* presend */
    _fpostsend32, /* postsend */
    _fpostrecv32, /* postrecv */
    _fencdec32, /* encdec */
    _newrouting, /* route */
    and we do:
    int _newrouting(char *routing_name, char *service, char *data, long len, char *group)
    {
        userlog("calling new routing");
        return _froute32(routing_name, service, data, len, group);
    }
    It works (we also get our trace in the ULOG). So now we are trying to get the routing field from data, convert it, and after that compose the buffer with the new data to call _froute32. We hope that should work, but we are having some problems trying to get the routing config from the UBB (we're using the MIB) to know which field we have to extract from the buffer. The fact is that at first we thought that in data we would get the user, but it seems to be a bit more complicated.
    Regards

  • Using dynamic VI referencing in combination with Application Builder

    Hi.
    In this thread (http://forums.ni.com/t5/LabVIEW/How-to-access-a-known-control-in-a-VI-reference/td-p/1255244) I attempted to modify our under development test system, so that we didn't have to change the same main VI for every test that we are adding, since we are several persons who'll be adding tests (i.e. conflicts would arise, that we would have to iron out manually every time).  The solution in the thread above works very well, by opening a reference to the test sub VI and calling the reference.  I then can pass data to the sub VI, and can get the test results from the sub VI.  Great!
    The problem came when running the program through the Application Builder.  The test system starts up fine, but the above method fails miserably, since the sub VI files are no longer there.  It seems that the sub VI's are not a part of the exe binary file coming from the Application Builder.
    So I'm drawing a blank here.  Does anyone have a suggestion on how to do this?  I'm only seeing the two options.  Either adding the sub VIs in the same main VI file, but then we have to be very careful not to get conflicts (or fix the conflicts manually when they arise).  Or use dynamic VI referencing, but then the system doesn't work with the Application Builder (which means that this method is out).  Is there a third method of doing this?

    Hi Mike.
    I use LabVIEW 2009.
    Well, I actually *can* use the dynamic call solution now.  The problem was that (1) the files weren't included in the exe file (fixed by adjusting the Application Builder settings), and (2) the Check if File or Folder Exists VI doesn't work if I try to check whether the sub VI is present inside the exe file.  The solution was basically to omit the whole Check if File or Folder Exists VI and just do an error check on the Open VI Reference VI.  The Open VI Reference VI will fail if you try to open a reference to a file that doesn't exist, so I just wired that to a case structure around the Call By Reference VI.

  • Error/fault when using Reader and digital signatures combined with a dropdown list

    I'm working on a time card form for my company, and they want the form to calculate the dates for the two-week pay period when the ending day of the pay period is selected; that part I have down through code.  The problem I am running into, though, is that when the user is using Adobe Reader to digitally sign and send the file to their supervisor, the file changes the selection date field back to the first selection in the drop-down list after the first digital signature is applied and the file is saved.  This error does not occur when the user is using Adobe Pro.  Does anyone have any ideas?  I'm wondering if it is something to do with the drop-down list settings or some kind of locking feature not working properly in the Reader version.  I am editing/creating the form through LiveCycle Designer ES4.

    Hi,
    The drop-down list box, by default, will have a blank value. If you declare it as mandatory, the form will force the user to enter a value in it.
    Regards,
    Ravi
    Note : Please mark the helpful answers

  • Using Sony's IDC in combination with Lightroom

    I would like to use Sony's IDC raw converter that comes with the Sony A700 camera, to be able to process through IDC the RAWs with DRO (Dynamic Range Optimization) applied (as shot with the A700).
    However I would like to keep LightRoom as my archiving base.
    Question:
    Can I bring RAWs after importing in LightRoom, to IDC?
    Can I after converting the RAWs in IDC, bring the resulting PSDs (or JPEGs) from IDC back to Lightroom (for archiving)?
    I appreciate your help!

    Yes, and No!
    Lightroom won't allow you to use IDC on the raw file; Lightroom only allows Adobe Camera Raw to be used to "develop" a raw file. However, you can use Lightroom's ranking and review functions to help you decide which files you want to work with. Then right-click and select Show in Explorer. Then select the file you want to develop and tell Explorer to open it in IDC. You can then develop, save as a TIFF, and import.
    Pain in the Arse isn't it?
    Frankly if you have access to Bridge then this is faster and more convenient. Import the ARW and use IDC to make Tiffs.
    But why do you want to do this?
    Frankly, DRO only becomes special when you use DRO Advanced; anything else can be easily duplicated in Lightroom. DRO Advanced cannot be done by IDC (it's like LightZone on steroids); it's done by the camera: the processor looks at each pixel, compares it with adjacent pixels, and makes adjustments in the JPEG. So the only way to utilise DRO Advanced (and what it achieves is astonishing) is to shoot in JPEG (and probably in cRAW + JPEG). I shoot everything in cRAW only, but I have an MR setting to utilise DRO Advanced - cRAW + JPEG, ISO 200 (remember that DRO Advanced effectively amplifies pixels locally, so if you start at, say, ISO 800, certain pixels could be amplified to ISO 6400!).
    I can then import both the cRaw and the Jpeg.
    Hope this makes sense.
    Peter

  • Using Mac Terminal window in combination with X11

    I'm trying to log into a remote computer and bring up X11 applications
    to display on my laptop, from within the Mac Terminal window. I
    was able to get it to work on MacOSX10.3 by typing in
    "export DISPLAY=:0.0" on the remote computer after logging in, but
    the same trick isn't working on my MacOSX10.4 machine. I'm using
    the "-Y" option to ssh on the 10.4 machine, following an earlier query
    on that topic. If I do the ssh -Y from an X11 xterm, everything works fine.
    Thanks,
    Catherine

    Set DISPLAY=:0.0 in the local shell (and verify that 'xterm' brings up a local-machine window) before ssh'ing to the remote machine. When you use a local xterm, DISPLAY is already set to the local screen by default (duh... xterm is an X11 thing :), but Terminal is not an X11 thing, so you have to set DISPLAY manually. X11 forwarding works by having the remote machine's X11 come back to wherever the local shell had its X11 directed.
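    In other words, the sequence from a Terminal window looks roughly like this (the hostname is a placeholder, and the xterm and ssh lines are commented out so the snippet stands alone):

```shell
# Point DISPLAY at the local X11 server before connecting:
export DISPLAY=:0.0

# Sanity check -- this should open a window on the local screen:
#   xterm &

# Then connect with X11 forwarding; ssh sets DISPLAY on the remote end:
#   ssh -Y user@remote.example.com

echo "$DISPLAY"
```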

  • Using a iPOD nano in combination with a iPOD video

    I have been using an iPod video for a while that contains a copy of all my CDs. As I would like to run without wearing out the hard disk, I have bought an iPod nano 3rd gen. When I connect it to the PC, it has the same name as the iPod video.
    How can I differentiate them so I can store only a few CDs, podcasts, etc. on the nano?
    Thanks in advance

    The solution is to rename the iPod by double-clicking its name in iTunes.

  • How to use "Data Conflicts Window while Publishing " with TFS API

    This dialog appears in the MS Excel add-in for TFS. Is there any possibility of using this control in another custom add-in application?

    Hi Shiraz,
    I don't know exactly what this error window is, but it may be a standard API located in the TFS SDK. I'm not sure if you can show this error window control from the TFS API, but I think it's possible to create a new one from scratch, using WinForms/WPF controls in your own add-in application.
    For how to extend the TFS client/server, you need to know the basics of TFS object model, refer to this MSDN documentation for more information:
    Extending Team Foundation
    For example, this code snippet can show the "Connect to TFS server" dialog:
    TfsTeamProjectCollection teamCollection;
    var picker = new TeamProjectPicker(TeamProjectPickerMode.SingleProject, false);
    picker.ShowDialog();
    if (picker.SelectedTeamProjectCollection != null && picker.SelectedProjects != null)
    {
        teamCollection = picker.SelectedTeamProjectCollection;
    }
    else
    {
        return;
    }
    You need to add reference to the TFS assemblies into your project.
