TABLE(VARRAY) applied to multiple VARRAY instances - possible ???

Question: How can I make a VARRAY column look like a table, if I want the result
being based on multiple rows.
Example: I have got a table with runs of users. For each run I have one row in RESULTS
where the VARRAY v contains one entry per user. I want to join the contents of
the VARRAY to the corresponding USERS rows for all RUNS I find in the RESULTS table.
CREATE TYPE VECTOR AS VARRAY(1000) OF NUMBER
CREATE TABLE RUNS
RUN_ID NUMBER(10) PRIMARY KEY
INSERT INTO RUNS VALUES(1)
INSERT INTO RUNS VALUES(2)
COMMIT
CREATE TABLE USERS
USER_ID NUMBER(10) PRIMARY KEY
,USER_NAME VARCHAR2(4)
INSERT INTO USERS VALUES(1,'Jim')
INSERT INTO USERS VALUES(2,'John')
COMMIT
CREATE TABLE RESULTS
RUN_ID NUMBER(10) PRIMARY KEY
,V VECTOR -- one entry per user here
INSERT INTO RESULTS VALUES(1, VECTOR(3,4))
INSERT INTO RESULTS VALUES(2, VECTOR(5,6))
COMMIT
-- Now I select the results for one run
-- and make VECTOR look like a table using the TABLE function.
SELECT USERS.USER_NAME, RES.RESULT
FROM
(SELECT ROWNUM AS ROW_NUM, COLUMN_VALUE AS RESULT FROM TABLE(SELECT V FROM RESULTS WHERE RUN_ID=1)) RES
, USERS
WHERE RES.ROW_NUM = USERS.USER_ID
USER RESULT
Jim 3
John 4
-- OK, so far.
-- Now generalize that to all runs
SELECT RUN.RUN_ID, USERS.USER_NAME, RES.RESULT
FROM
(SELECT ROWNUM AS ROW_NUM, COLUMN_VALUE AS RESULT FROM TABLE(SELECT V FROM RESULTS WHERE RUN_ID=RUN.RUN_ID)) RES
, USERS
, RUNS
WHERE RES.ROW_NUM = USERS_USER_ID
-- expected:
RUN_ID USER RESULT
1 Jim 3
1 John 4
2 Jim 5
2 John 6
-- got:
(SELECT ROWNUM AS ROW_NUM, COLUMN_VALUE AS RESULT FROM TABLE(SELECT V FROM RESULTS WHERE RUN_ID=RUN
ORA-00904: "RUN"."RUN_ID": invalid identifier
-- Is there a general approach to solve this if the number of runs is variable
-- given the tables layout cannot be changed ????
Thanks for your thoughts
Nils
DROP TABLE RESULTS
DROP TABLE USERS
DROP TABLE RUNS
DROP TYPE VECTOR
/

Agreed but in case of any updation for user_id column
  1  SELECT r.run_id, u.user_name,
  2           r.column_value result
  3    FROM   users u,
  4          (SELECT r.run_id,
  5                  ROW_NUMBER () OVER (
  6                     PARTITION BY r.run_id
  7                     ORDER BY ROWNUM) user_id,
  8                   column_value
  9            FROM   results r, TABLE (r.v)) r
10*   WHERE  r.user_id = u.user_id
11  /
    RUN_ID USER     RESULT
         1 Jim           3
         1 John          4
         2 Jim           5
         2 John          6
SQL>
SQL> SELECT * FROM users;
   USER_ID USER
         1 Jim
         2 John
SQL> UPDATE users SET user_id=user_id+1;
2 rows updated.
SQL> SELECT r.run_id, u.user_name,
  2           r.column_value result
  3    FROM   users u,
  4          (SELECT r.run_id,
  5                  ROW_NUMBER () OVER (
  6                     PARTITION BY r.run_id
  7                     ORDER BY ROWNUM) user_id,
  8                   column_value
  9            FROM   results r, TABLE (r.v)) r
10    WHERE  r.user_id = u.user_id;
    RUN_ID USER     RESULT
         1 Jim           4
         2 Jim           6
SQL> Khurram

Similar Messages

  • Using Eloqua tracking scripts on a single website but from multiple Eloqua instances - possible?

    OK, I have a possibly odd question.
    Is it possible to have a single site (www.xyz.com) be able to report web traffic to two different Eloqua installs (Company A install & Company B install).  This way Company B possibly in a partnership can monitor traffic on Partner pages of Company A's website and if it is a known visitor to Company B's instance - get that page view info.
    If so, is it just as simple as dropping the standard Eloqua tracking scripts?  If not, how would one go about that?
    Gratz in advance.

    This is possible but you have to add in the appropriate pieces of information to the tracking scripts to indicate which instances you want to point at.
    Eloqua's asynchronous tracking scripts allow you to track to more than one instance of Eloqua if you have a business need for doing so. Configuring that is as simple as calling elqTrackPageView twice after setting the different siteIds. Here's an example:
    <script type="text/javascript">
    var _elqQ = _elqQ || [];
    _elqQ.push(['elqSetSiteId', '123']);
    _elqQ.push(['elqTrackPageView']);
    _elqQ.push(['elqSetSiteId', '456']);
    _elqQ.push(['elqTrackPageView']);
    (function () {
    function async_load() {
    var s = document.createElement('script'); s.type = 'text/javascript'; s.async = true;
    s.src = '//img.en25.com/i/elqCfg.min.js';
    var x = document.getElementsByTagName('script')[0]; x.parentNode.insertBefore(s, x);
    if (window.addEventListener) window.addEventListener('DOMContentLoaded', async_load, false);
    else if (window.attachEvent) window.attachEvent('onload', async_load);
    </script>
    The bolded lines show the two extra commands needed to make the extra call to the Eloqua servers in order to track to a second instance.  This is just sample script and you'd obviously need the SiteID's that are relevant to what you are pointing at but this is just adding two lines into the normal tracking scripts to indicate the extra instance and the fact that you'd like the pages to be tracked.

  • Is it possible to deploy SharePoint or its Service Applications on: multiple DB-Servers and multiple SQL Instances?

    Hello Forum,
    We have a SharePoint 2013 farm (Enterprise edition) that uses one single SQL Server 2012 (Standard edition). That statement means: All my SharePoint DBs e.g. (Config, Admin, Content, and Service Apps) DBs are hosted and running onto one single instance e.g.
    Server1\SQLInstance1.
    We have some new requirements to install and configure BI tools such as: PerformancePoint services and PowerPivot. BI tools require either SQL Server 2012 Enterprise or BI editions, and we do NOT want to upgrade our current SQL Server1\SQLInstance1
    Instead, We have other separate SQL Server instance which is enterprise edition let's name it (ServerX\InstanceX) that is running standalone, and we are thinking or using it, and my 2 questions are:
    1) Can we use this other separate
    SQL Server instance which is enterprise edition to host the create and hosts the DBs of PerformancePoint services and PowerPivot ?
    2) My second question is the same: Can I create PerformancePoint services application in my SharePoint farm, But in the Database Server field, I fill up
    the details of the other DB server ServerX\InstanceX  which is the one that is SQL
    enterprise edition ? Will this work ?
    Any official Microsoft resources/links tell that it is possible to deploy SharePoint or its service applications on multiple DB-Servers and multiple SQL Instances?

    Thank you Alex and Anil,
    What are the ramifications of that?
    I mean, Assuming that I have created such a farm where most of SarePoint DBs in Standard SQL instance while the PerformancePoint service application and others e.g. PowerPivot and reporting service are deployed and configured onto other Enterprise SQL instance.
    Are there any recommendations or concerns that you would like to draw my attention to ?

  • Is there a way to create a table then apply it to other sheets and have it mirror changes across all the sheets? Like a sync'ed table that will be the same on multiple sheets

    I have a budget spreadsheet with a couple recurring tables. As months go by, I sometimes make changes to these tables and would like those changes to be synced across the different sheets in the spreadsheet. Is there a way to sync content across multiple pages, essentially having one true table that appears on multiple sheets?

    Abdur,
    In the sheet :: table :: cell that is going to receive the data, type an equals sign, then click on the cell that the data will come from and press Return.
    This will require that you navigate to the origin cell by choosing the proper sheet.
    Jerry

  • Multiple AIA instances on same server?

    Hi,
    Does anyone know how to install multiple AIA instances on the same SOA server? The installer hints that this might be possible, although I have not seen how this is done because the installer expects an empty oracle home path. The only information I have found so far is for an earlier 2.x release of AIA and that suggested:
    "Each SOA installation must be in his own ORACLE_HOME directory.
    Each AIA installation must be in his own ORACLE_HOME directory.
    You must point to different databases. You can not have mulitple SOA schema's in one sinlge database."
    Is this true?
    Thanks,
    Kev.

    957354 wrote:
    Hi Laxman,
    If its a testing purpose its ok to have a one oracle_home and single listener for multiple DB. One problem when applying oracle patches for one of the instances running in a multiple DB running then you need to bring all the DB running in that host since the oracle_home is sharing between all db instances.
    Thanks
    Sunil Rajesh K.C.Starting with 11g, patchset updates are full installs and done into a new home. This makes it possible to migrate one instance at a time. However, your point would be a consideration for one-off patches like CPUs.
    However, please note ... OP --- PLEASE NOTE even with multiple oracle homes, you still only need a single listener.

  • Multiple ASM Instances on single node

    Hi All,
    After going through some threads it seems to me that multiple ASM instances on a node is not supported and recommended by ORACLE but i coudn't find any ORACLE document or support matrix mentioning this. Can any one give me a pointer to this. Please correct me if i am wrong and Multiple ASM instances are fully supported by ORACLE.
    Thanks,

    Multiple ASM instances on a single node are supported but not recommended due to several issues you could encounter during that kind of configuration.
    Possible interraction between those instances for identification of each disk area usage,each disk area permissions , database instance to asm instance mapping and so on, could result in unwanted behaviour, as ASM is in some way just oracle's representation of LVM.
    Intention is that any kind of distinction/separation of oracle related data under the ASM should be done through the disk groups.
    So it seems it would be better to apply this kind of logic rather than doing suspiciously magical and rare configurations that could bring you similiary strange and unexpected problems.

  • Pros and Cons of Application Isolation/Multiple server instances?

    Hi. I'm setting a new server using ColdFusion Enterprise with Apache to migrate several web application from and old server with ColdFusion 7 server. I'm currently doing research regarding multiple server instances in order to have a separate server for production apps and another for development apps (see http://help.adobe.com/en_US/ColdFusion/10.0/Admin/WSc3ff6d0ea77859461172e0811cbf363c31-7ff 5.html and https://wikidocs.adobe.com/wiki/display/coldfusionen/Using+Multiple+Server+Instanceshttp:/ /). In addition, I'm also doing research regarding application isolation to have separate production application in separate servers. I'm trying to identify all pros and cons for both "Application Isolation" and "Multiple Server Instances" to make a decision on whether I will proceed in applying these techniques. I have found several links that talk about some of the advantages but have not been able to find anything regarding possible disadvantages. Have anyone in this forum has used any of the techniques, and can provide more information/experiences regarding the pros and cons?

    Hi Ricardo_Lorenzo,
    Whether to go for Multiserver instances or Single server, is totally a user requirement based decison. If a user has Single website, or multiple websites (of the same nature, in terms of functionality), usually the part of same domain, then they would go for Single sever installation. One single instance will handle the requests from all the websites (if there are multiple). There would not be a clustering/failover setup within ColdFusion and can use the ColdFusion Standard or Enterprise version.
    On the other hand, if a user has multiple websites, all with different functionality and have multiple applications (may or may not) running, then they can go for Multiserver installation. Each website can be configured with individual instances. Clustering can be done within ColdFusion if needed. One would need an Enterprise license of ColdFusion for the same.
    Hope this helps.
    Regards,
    Anit Kumar

  • Central Monitoring of Multiple XI Instances

    Is it possible to have central monitoring (message and end-to-end) for multiple XI instances in a landscape?
    We have a landscape with one Central XI instance and multiple local XI instances.
    Is it possible for me to ...
    1. Add the local XI integration servers to the component monitoring?
    2. Monitor the messages flowing through the local XI instances in the central XI RWB (even for messages that do not flow through the central XI)?
    3. Have end-to-end monitoring even for messagesthat do not flow through the central XI?

    Try removing the "CentralMonitoringServer-XIAlerts" RFC connection from both systems (proxy and PI). Then, ensure all com.sap.aii.rwb.server.centralmonitoring.* parameters are configured correctly on PI (restart system to apply changes on exchange profile) and try executing the scenarios again.
    Check the target CCMS system the RSALERTDISP report and see if the alerts are being received there.

  • One Oracle Application Server and multiple OC4J Instances

    While we are getting new server (for development) we have to set-up a Development and Production enviroment for our project (dont have previous OAS installation) but on the same machine.
    So, is a good option to have One OAS INstallation (Ora Home) and then create multiple OC4J instances, one for each environment (Development, QA and Production)?, is it possible? How to manage deployments between OC4J instances, simulating different contexts if they are in the same server and under the same HTTP Server? Will be there conflicts?

    Hi,
    you can do this and OracleAs will manage the different instances. The deployment can be done with Enterprise Manager in which case you select the OC4J instance first before deploying. If deploying from JDeveloper, you can specify the OC4J instance when creating the named connection
    Frank

  • Need to get data from multiple database instances in a single query

    Hi,
    I need a small favour from you guys. The prob is as follows:
    I need to get name, row_id from table A from schema 1 and gbu_name, name from table B from schema 2 where a.name = b.name. I wrote the query in the following manner:
    SELECT a.name, a.row_id, b.gbu_name
    FROM Schema1.A as a, Schema2.B as b
    Where a.name = b.name
    But this query is not working and the error is like " The table does not exists".
    Please update me how to avoid the error and get the right sort of result.
    Thanks & Regards,
    Debabrata

    Ah, youre actually asking different things.
    In your topic title, you say youre running separate instances
    In your body text, you say you are under different user/schema
    So tell me, do you have more than one database or not? How many entries in your TNS file?
    I would say, for "multiple database instances"
    SELECT
      a.id, b.id
    FROM
      tableA a
      INNER JOIN
      tableB@OTHER_DATABASE_LINK_NAME b  <--NOTE!
      USING(id)And of course you will have to look up CREATE PUBLIC DATABASE LINK sql..
    Message was edited by:
    charred

  • Multiple DMVPN Instances on Same WAN Interface

    Hi Folks,
    Is it possible to run Multiple DMVPN Instances on a single WAN Interface ? Can we for example configure 3 Tunnels on a Router using one same WAN Interface but running separate EIGRP Instances for each Tunnel ? Kindly let me know , Alioune

    Hi Alioune,
    Yes you can create DMVPN as you said with one WAN interface that is possible..... you can have multiple tunnel interfaces pointed to a WAN interface as the source interface which resides in public zone..... with different public ip's as the destination tunnel...
    interface Tunnel1
    description ** A-VPN Tunnel **
    bandwidth 100000
    ip vrf forwarding red
    ip address 10.0.252.2 255.255.255.252
    no ip redirects
    no ip unreachables
    no ip proxy-arp
    ip mtu 1500
    load-interval 60
    tunnel source GigabitEthernet0/0 (WAN Interface)
    tunnel destination  1.1.1.1
    tunnel protection ipsec profile dmvpn
    interface Tunnel1
    description ** B-VPN Tunnel **
    bandwidth 100000
    ip vrf forwarding red
    ip address 10.0.252.5 255.255.255.252
    no ip redirects
    no ip unreachables
    no ip proxy-arp
    ip mtu 1500
    load-interval 60
    tunnel source GigabitEthernet0/0 (WAN Interface)
    tunnel destination  2.1.1.1
    tunnel protection ipsec profile dmvpn
    like the above..... shown sample...
    Please rate if the given information helps!!!

  • Multiple BW Instance question

    Getting lost wading through the plethora of information available, so any direct pointers to information that directly addresses the following would be appreciated, in addition to any direct answers of course!
    Consider a multiple BW instance, with a staging layer, and integration layer, and multiple analytic layers (one large one, and a couple smaller ones that operate at different service levels).
    If we upgrade staging to 3.5, do all the rest have to tag along (staging feeds integration and one other, integration gets data from staging and feeds all the rest).
    If we upgrade to 7.0, or whatever they are going to call it, will every system have to upgrade, or are the BW to BW connections backwardly compatable.
    Major reason for considering this architecture is the ability to have a couple target BW's at higher or lower releases/service pack levels as warranted by the specific applications using them (e.g. SEM wanting SP's faster than the uber BW can test and apply them).   However, if we have to keep all systems in synch, then this is NOT a viable option.
    Mark Marty
    EBIS Architect
    Mckesson

    This is based on some work being done by another member of the Terabyte club, and we have similar sized landscapes.
    The key concerns are, with a single instance of BW, and ~ 10M rows of raw transactions migrating into the system each night (SD - BO, DD, SO's, MM, COPA, it all adds up), it is not clear to us that BW could handle all data rationalization, transformation and dissemination tasks. We have significant legacy transactions and master data that needs to be merged with a good portion of the R/3 data.
    In our current environment, 90% of this is done outside of BW, using Datastage and a Data Provisioning Area.  Even with sending only clean, merged, ready to load to target data, our load windows are increasing, and our backup window is scary.
    This new effort is an attempt to separate key data needs so that certain things that need to be more responsive can be made available earlier.  Additionally, this landscape is going to test out and prove or disprove the ability to do this volume of ETL wholly within a BW environment, rather than leveraging on other marketplace tools and techniques.  Needless to say, this is a hot topic, internally we have a slant towards best in breed, and consultatively everyone wants it all to be the SAP suite of tools.   Hence, we will build it and see.
    Splitting Staging from Integration is not necessary (likely) from a purely technical need, however it has proven useful at another company, and we are going to explore it and determine the efficacy ourselves.

  • LDAP connections with multiple proxy instances

    After configuring LDAP connectivity through the Admin application on a machine with multiple proxy instances I end up with:
    number of proxy instances x LDAPConnPool times number of connections to the LDAP server.
    Question: Is it possible to prevent some of the proxy instances from opening LDAP connections?

    Hi
    Increase the IDLE timeout value on the LDAP server. Of course, this just extends the inevitable. Check if there is a way to disable IDLE timeout on LDAP server.
    Regards,
    Nagendra HK

  • Process Chains across multiple BW instances?

    We're considering to implement a "CIF" (corporate information factory) approach for BW, using mutiple BW layers (staging, storage, analysis, across regions).  It seems to make sense from a performance management perspective.
    However one concern is the administration / monitoring.  
    Can standard BW tools (e.g. process chains, admin cockpit) customized to be used across multiple BW instances (plus ideally OLTP reporting, which we also have)? Or would we end up having to log in to each system individually?  My gut / understanding tells me it is the latter one, but would appreciate any thoughts.
    Thanks!

    Hi Ingo,
    Thanks for the information. I too have same scenario to work out.
    Is there any possibility of automatic Source/server name conversion from system A to System B while copying all the Universes and Queries.
    For Example: In BI we have an option to datasource server name change convention while transporting from one system to another system through out the landscape.
    As you said in your reply, in life cycle manager, do we need to edit connection for each & every universe or any option for one and all universes.
    Any documentation on this is appreciable...
    Please provide me information.
    Thanks in Advance.
    Regards,
    Ravi Kanth

  • Capture from Two Tables and Apply in one Table

    Hi All,
    Is it possible to capture from two tables (master and child table) and apply in one table?
    For example,
    1. DEPT and EMP tables in source database with relation
    DEPT table structure is like DEPT_ID, DEPT_NAME, LOC_NAME
    EMP table structure is like EMP_ID, EMP_NAME, EMP_DOJ, EMP_SAL, DEPT_ID
    2. EMP_DEPT_STAGING in destination database
    EMP_DEPT_STAGING table structure is like EMP_ID, EMP_NAME, EMP_DOJ, EMP_SAL, DEPT_ID. DEPT_NAME, LOC_NAME
    if there is any update in DEPT table, EMP_DEPT_STAGING should get populated with Department and its employee details. Similarly, if there is any update in EMP table EMP_DEPT_STAGING table should get populated with Employee details and along with department detail.
    Is it possible to accomplish this? If yes, could you please provide me some examples?
    Thanks & Regards
    Thiyagu
    Edited by: mt**** on Sep 4, 2011 11:22 PM

    like this
    INSERT @PlantNew  (PlantID, PlantName, PlantDirExists, PlantAssistantDirID, PlantDirID) 
    SELECT p.PlantID,
    p.PlantName,
    CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
    PlantAssistantDirID,
    PlantDirID
    FROM @Plant p
    LEFT JOIN (SELECT PlantID,
    MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
    MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
    @PlantDirector
    GROUP BY PlantID)pd
    ON pd.PlantID = p.PlantID
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
    You're missing a FROM
    insert into @PlantNew
    SELECT p.PlantID,
    p.PlantName,
    CASE WHEN pd.PlantID IS NULL THEN 0 ELSE 1 END,
    PlantAssistantDirID,
    PlantDirID
    FROM @Plant p
    LEFT JOIN (SELECT PlantID,
    MAX(CASE WHEN Assistant = 1 THEN PlantDirectorID END) AS PlantAssistantDirID,
    MAX(CASE WHEN Assistant = 0 THEN PlantDirectorID END) AS PlantDirID
    from @PlantDirector
    GROUP BY PlantID)pd
    ON pd.PlantID = p.PlantID

Maybe you are looking for

  • What is the best practice on mailbox database size in exchange 2013

    Hi,  does anybody have any links to good sites that gives some pros/cons when it comes to the mailbox database sizes in exchange 2013? I've tried to google it - but hasn't found any good answers. I would like to know if I really need more than 5 mail

  • Open script play doesnt work in the clustered env

    We have Clustered Webcenter env. Here, we record our flow in the open script. If we play back as the recorded user, it works fine. However if we randomize the users it fails with the following two errors : 1.Failed to solve variable web.input.wcconte

  • Can I use MD5 value for indexing files ?

    I would like to know if MD5 value for each unqiue file is also unique. I am wonder if U can use MD5 value of a file for indexing. Any suggestion ??

  • Why Archive and Reinstall?

    I'm frustrated, i don't know what the point of archiving and reinstalling is, i mean, it's good that you can preserve your user settings, i'm glad about that, but there some things that don't get updated to the new system and even though they exist o

  • Played Podcasts return

    Hello everyone Is anyone experiencing the same issue as me, using Podcasts on a iPod Touch with latest software update. When I listern to a podcast, it shows as being played, then if I turn off and on again (holding the switch for period of time or j