Hierarchical data model for Ontology??? Help?

I am attempting to map an ontology, which is purely hierarchical, onto Oracle's relational database.
This can be done with a single table and a recursive foreign key; however, I am concerned about the retrieval process having to repetitively query the child rows, then the grandchildren, and so on.
This is going to be a very large database and I am concerned about performance.
Certainly this is being done at numerous search sites on the web, and yet I cannot find any information on the subject from Oracle.
Thanks for any help at all
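
For context on the concern above, Oracle can return an entire subtree in a single hierarchical query rather than one query per level, which is usually how this worry is addressed. The sketch below is only illustrative; the ONTOLOGY_NODE table and its ID / PARENT_ID / NAME columns are assumed names, not anything from the original post.

    -- fetch every descendant of one concept in a single round trip
    SELECT id, parent_id, name, LEVEL AS depth
    FROM   ontology_node
    START WITH id = :root_id          -- the subtree root
    CONNECT BY PRIOR id = parent_id   -- walk from parent to child
    ORDER SIBLINGS BY name;

On 11gR2 and later the same result can be written as a recursive WITH clause; either way the database resolves the whole hierarchy internally instead of the application issuing one query per generation.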

Hah! I just worked it out; I'm sharing it with anyone facing the same problem in the hope that it helps:
add text() after the XPath you set to get the element data that you need, for example getLboMappingInfoResponse/result/text(); then it will return the data in XML format for BI.
Edited by: Weibing.Xia on 2013-5-16 5:31 PM

Similar Messages

  • SQL Developer Data Modeler for SQL Server 2008

    I am not able to connect to my SQL Server 2008 from SQL Developer Data Modeler. I do have jtds-1.2.jar on my machine and I can connect to the SQL Server through SQL Developer, but I am still not able to connect through the Data Modeler. I need to reverse-engineer and generate a data model for some existing schemas.
    Here is what I'm following:
    File -> Data Modeler -> Import -> Data Dictionary -> Add new connection -> JDBC ODBC Bridge -> Other Third Party Driver
    Now, when I give the JDBC URL and the driver, it throws an error message stating that the driver could not be found.
    Please let me know what I can do to solve this; any help would be appreciated.
    Regards,
    AVA

    I'd try first to connect to the DB from SQL Developer (through jTDS - no ODBC involved) and see if you can browse your DB and issue SQL statements in a worksheet. Then export that connection in XML format and import it from the Modeler.
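
    For reference, a note on the driver settings themselves (these are the standard values shipped with jTDS, not something stated in the thread): the driver class is net.sourceforge.jtds.jdbc.Driver and the URL usually follows the pattern

        jdbc:jtds:sqlserver://<host>:1433/<database>

    If the Data Modeler still reports that the driver cannot be found, the jar typically also has to be registered in the tool's third-party JDBC driver preferences, not just placed somewhere on the machine.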

  • Data Model for EM....Just need to get data.

    Hi guys,
    Does Oracle provide a data model for EM? What we are trying to do is retrieve some information that is stored in the repository, i.e. CPU usage, disk usage, etc.
    I know we can get this info from shell scripts as well as by querying the DB, but we wanted to get this data directly from the repository, as we feel it covers all the areas we need to capture our capacity and performance planning metrics.
    Any suggestions would be helpful, or if you understand what I am trying to achieve, then hit me up with a way to get this.
    Thanks again guys.
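
    A hedged pointer beyond the thread itself: an Enterprise Manager Grid Control repository exposes SYSMAN MGMT$ views that can be queried directly, assuming you have read access to the repository database. A minimal sketch (the target type and metric names vary by EM release, so treat them as placeholders):

        -- current metric values for host targets
        SELECT target_name, metric_name, metric_column, value, collection_timestamp
        FROM   sysman.mgmt$metric_current
        WHERE  target_type = 'host';

    Historical rollups live in similarly named views such as MGMT$METRIC_HOURLY and MGMT$METRIC_DAILY, which are usually the better source for capacity planning.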

    Start/All Programs/Palm/Pim Conduit sync/Sync with Palm Desktop.

  • Data Modeling for a Small Database Tutorial - understanding "Creating Relations Between Entities" part

    I am trying to understand and make use of Tutorial: Data Modeling for a Small Database
    During this tutorial I'm supposed to create a Transactions entity containing two attributes that will refer to the Patrons (patron_id) and Books (book_id) entities (2.1.4).
    Later, I add two one-to-many relations that duplicate the mentioned attributes in the Transactions entity (patron_id1 and book_id1) (2.1.5).
    So here comes my question: what's the purpose of creating those attributes in point 2.1.4 if they're then duplicated in paragraph 2.1.5?
    In case it's relevant, I'm using Oracle SQL Developer Data Modeler Version 4.0.0.825 Build 825 on jdk1.7.0_25.
    Bonus question: how do I turn on attribute types on the Logical Diagram? I can't find the corresponding option anywhere...
    I would be really grateful for every answer and every tip!

    You are looking at the release 2 documentation. I checked 3.3 and 4.0 EA3 and the tutorial has been corrected. You may want to download the latest version and use that documentation.

  • Physical Data Model For GL

    Hi Experts,
    I am new to Oracle Apps and I need the data model for the GL tables. From where can we get it?
    Please suggest !
    Regards
    S

    Hi,
    As you are aware, the major part of GL is based on the Chart of Accounts, Calendar (GL periods) and Currency, so you can check all tables related to these.
    You can use the query below to get those tables:
    select * from all_tables
    where table_name like 'GL%' and owner = 'GL'
    and table_name not like 'GL_ALLOC%' and table_name not like 'GL_CONS%';
    Please do refer to the doc suggested by the other expert in this thread.
    Regards,
    S.P DASH
    N.B.: We believe you put a GL question in the wrong folder/thread (Procurement) :).. you could have received a better response had it been in the Finance folder.. :)
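
    As an illustrative follow-up, the core journal data sits in the standard GL transaction tables (GL_JE_HEADERS and GL_JE_LINES); the bind variable below is a placeholder, and eTRM remains the authoritative reference for the full physical model:

        -- journals and their lines for one period
        SELECT h.je_header_id, h.name, h.period_name,
               l.je_line_num, l.code_combination_id, l.accounted_dr, l.accounted_cr
        FROM   gl.gl_je_headers h
        JOIN   gl.gl_je_lines   l ON l.je_header_id = h.je_header_id
        WHERE  h.period_name = :period;

    Balances are in GL_BALANCES and account segments in GL_CODE_COMBINATIONS.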

  • Need help with Data Model for Private Messaging

    Sad to say, but it looks like I just really screwed up the design of my Private Messaging (PM) module...  *sigh*
    What looked good on paper doesn't seem to be practical in application.
    I am hoping some of you Oracle gurus can help me come up with a better design!!
    Here is my current design...
    member -||-----0<- private_msg_recipient ->0------||- private_msg
    MEMBER table
    - id
    - email
    - username
    - first_name
    PRIVATE_MSG_RECIPIENT table
    - id
    - member_id_to
    - message_id
    - flag
    - created_on
    - updated_on
    - read_on
    - deleted_on
    - purged_on
    PRIVATE_MSG table
    - id
    - member_id_from
    - subject
    - body
    - flag
    - sent_on
    - updated_on
    - sender_deleted_on
    - sender_purged_on
    ***Short explanation of how the application currently works...
    - Sender creates a PM and sends it to a Recipient.
    - The PM appears in the Sender's "Sent" folder in my website
    - The PM also appears in the Recipient's "Incoming" folder.
    - If the Recipient deletes the PM, I set "deleted_on" and my code moves the PM from Recipient's "Inbox" to the "Trash" folder.  (Record doesn't actually move!)
    - If the Recipient "permanently deletes" the PM from his/her "Trash", I set "purged_on" and my code removes the PM from the Recipient's Message Center.  (Record still in database!)
    - If the Sender deletes the PM, I set "sender_deleted_on" and my code moves the PM from the Sender's "Sent" folder to the "Trash" folder.  (Record doesn't actually move!)
    - If the Recipient "permanently deletes" the PM from his/her "Trash", I set "sender_purged_on" and my code removes the PM from the Sender's Message Center.  (Record still in database!)
    Here are my problems...
    1.) I can't store PM's forever.
    2.) Because of my design, the Sender really owns the PM, and if I add code to REMOVE the PM from the database once it has a "sender_purged_on" value, then that would in essence remove the PM from the Recipient's Inbox as well!!
    In order to remove a PM from the database, I would have to make sure that *both* the Recipient has a "purged_on" value and the Sender has a "sender_purged_on" value.  (Lots of application logic for something which should be simple?!)
    I am wondering if I need to change my Data Model to something that allows me autonomy when it comes to the Sender and/or the Recipient deleting the PM for good...
    On the other hand, I believe I did a good job of normalizing the data.  And my current Data Model is the most efficient when it comes to saving storage space and not having dups.
    Maybe I do indeed just need to write application logic - or a cron job - which checks to make sure that *both* the Sender and Recipient have deleted the PM before it actually flushes it out of my database to free up space?!
    Of course, if one party sits on their PM's forever, then I can never clear things out of my database to free up space...
    What should I do??
    Some expert advice would be welcome!!
    Sincerely,
    Debbie

    rp0428,
    I think I am starting to see my evil ways and where I went wrong... 
    > Unfortunately his design is just as denormalized as yours
    I see that now.  My bad!!
    > the last two columns have NOTHING to do with the message itself so do NOT belong in a normalized table.
    > And his design:
    >
    > Same comment - those last two columns also have NOTHING to do with the message itself.
    Right.
    > The message table should just have columns directly related to the message. It is a list of unique messages: no more, no less.
    Right.
    > Mark gave you hints to the proper normalized design using an INTERSECT table.
    > that table might list: sender, recipient, sender_delete_flag, recipient_delete_flag.
    > As mark suggested you could also have one or two DATEs related to when the delete flags were set. I would just make the columns DATE fields.
    >
    > Once both date columns have a value you can delete the message (or delete all messages older than 30+ days).
    >
    > When both flags are set you can delete the message itself that references the sender and the message sent.
    Okay, how does this revised design look...
    MEMBER --||-----0<-- PM_DISTRIBUTION -->0-------||-- PRIVATE_MSG
    MEMBER table
    - id
    - email
    - username
    - first_name
    and so on...
    PM_DISTRIBUTION table (Maybe you can think of a better name??)
    - id
    - private_msg_id
    - sender_id
    - recipient_id
    - sender_flag
    - sender_deleted_on
    - sender_purged_on
    - recipient_flag
    - recipient_read_on
    - recipient_deleted_on
    - recipient_purged_on
    PRIVATE_MSG
    - id
    - subject
    - body
    - sent_on
    Is that what you were describing to me?
    Quickly reflecting on this new design...
    1.) It should now be in 3rd Normal Form, right?
    2.) It should allow the Sender and Recipient to freely and independently "delete" or "purge" a PM with no impact on the other party, right?
    Here are a few Potential Issues that I see, though...
    a.) What is to stop there from being TWO SENDERS of a PM?
    In retrospect, that is why I originally stuck "member_id_from" in the PRIVATE_MSG table!!  The logic being, that a PM only ever has *one* Sender.
    I guess I would have to add either Application Logic, or Database Logic, or both to ensure that a given PM never has more than one Sender, right?
    b.) If the design above is what you were hinting at, and if it is thus "correct", then is there any conflict with my Business Rule: "Any given User shall only be allowed 100 Messages between his/her Incoming, Sent and Trash folders."
    Because the Sender is no longer "tightly bound" to the PRIVATE_MSG, in my scenario above...
    Debbie could send 100 PM's, hit her quota, then turn around and delete and purge all 100 Sent PM's and that should in no way impact the 100 PM's sitting in other Users' Inboxes, right??
    I think this works like I want...
    Sincerely,
    Debbie
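
    Purely as an illustration of the revised design described above (datatypes and lengths are assumptions, and the two flag columns are omitted for brevity; nothing here is specified in the thread), a minimal DDL sketch could look like this:

        CREATE TABLE private_msg (
          id       NUMBER        PRIMARY KEY,
          subject  VARCHAR2(200),
          body     CLOB,
          sent_on  DATE DEFAULT SYSDATE
        );

        CREATE TABLE pm_distribution (
          id                   NUMBER PRIMARY KEY,
          private_msg_id       NUMBER NOT NULL REFERENCES private_msg (id),
          sender_id            NUMBER NOT NULL REFERENCES member (id),
          recipient_id         NUMBER NOT NULL REFERENCES member (id),
          sender_deleted_on    DATE,
          sender_purged_on     DATE,
          recipient_read_on    DATE,
          recipient_deleted_on DATE,
          recipient_purged_on  DATE
        );

        -- a purge job can then safely remove a message once every distribution row
        -- for it has both sender_purged_on and recipient_purged_on populated

    The "single sender" question in point a.) is exactly the kind of rule that still needs either application logic or a different placement of sender_id, as the post itself notes.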

  • How to make linkage query In Data Model for search

    I want to make a linkage query as a condition for my report in BI Publisher. How do I make the linkage in the Data Model?

    This is the forum for SQL Developer, not for general SQL or PL/SQL questions.
    Please repost this in the SQL and PL/SQL forum.

  • How to make use of a different data model for a line chart?

    I have a data model which is a SQL query. I have a line chart on my template which uses a "Start Time" field as the X-axis. There is a chance that "Start Time" is null; in such cases the plotted point doesn't appear with an x-coordinate. But I can't change the X-axis field to some other not-null attribute due to a business requirement.
    I am thinking of creating another data model which filters out all the "Start Time"s that are null and using that data model for the line chart that has "Start Time" as the X-axis. But I don't know how to do this, as whenever I try to insert a line chart from the BI Publisher menu in MS Word, I am not prompted to select the data model.
    Can anybody tell me whether what I am trying is possible? If so, how? If not, how do I have "Start Time" as the X-axis without hassles?
    Thanks,
    -Vijay-

    Are you able to extract the data based on the new data model in an XML file? If so, in BI publisher desktop, are you able to load the XML data via Add-Ins ->Data -> Load XML Data?
    If you have been able to load the data successfully, then you need to Select Chart from the Insert Menu. When you do that, you should be able to see the Data layout on the left side of the chart dialog box. Then you should be able to drag and drop fields as necessary to create the chart. Are you having issues doing this part?
    You will not be prompted to select the data model. Hope I haven't misunderstood your question.
    Thanks!
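
    On the filtering idea raised in the question: if the data model is a plain SQL query, the simplest variant is a one-line predicate in that query rather than a second data model. Table and column names below are placeholders, not from the thread:

        SELECT start_time, metric_value
        FROM   my_report_source
        WHERE  start_time IS NOT NULL   -- drop rows the chart cannot place on the X-axis
        ORDER  BY start_time;

    That keeps a single data model, so there is nothing extra to select when inserting the chart in Word.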

  • Data Modeling for controls using XML views(SAPUI5)

    Hello ,
    I am trying to create a Table control using an XML view and binding data to it through the controller's onInit method.
    XML View Code is as follows :
    <core:View xmlns="sap.m" xmlns:l="sap.ui.layout" xmlns:core="sap.ui.core">
        <l:VerticalLayout width="100%">
            <l:content>
                <Text id="description" class="marginAll" />
                <Table id="idProductsTable" items="{       
                    path:'/businessData'
                }">
                    <headerToolbar>
                        <Toolbar>
                            <Label text="Products"></Label>
                        </Toolbar>
                    </headerToolbar>
                    <columns>
                        <Column>
                            <Label text="Product" />
                        </Column>
                        <Column>
                            <Label text="Supplier" />
                        </Column>
                        <Column>
                            <Label text="Dimensions" />
                        </Column>
                    </columns>
                    <items>
                        <ColumnListItem>
                            <cells>
                                <ObjectIdentifier title="{COUNTRY}" text="{COUNTRY}" />
                            </cells>
                            <Text text="{REGION}"></Text>
                            <Text text="{CITY}"></Text>
                        </ColumnListItem>
                    </items>
                </Table>
            </l:content>
        </l:VerticalLayout>
    </core:View>
    Controller onInit method Code is as follows :
    var oData = {
        businessData : [ {
            'COUNTRY' : "Canada",
            'CITY' : "Toronto",
            'REGION' : "US",
            'LANGUAGE' : "English"
        }, {
            'COUNTRY' : "China",
            'CITY' : "Bejeing",
            'REGION' : "Ashia",
            'LANGUAGE' : "Chinese"
        } ]
    };
    var demoJSONModel = new sap.ui.model.json.JSONModel();
    demoJSONModel.setData(oData);
    sap.ui.getCore().getElementById("idProductsTable").setModel(demoJSONModel);
    When I tried the same thing with JS views it worked; however, with the XML view I am getting an empty table.
    Is the data modeling correct for XML views?
    Thanks,
    Mahesh.

    I've got it! The reason for that is you bind items as below,
         <Table id="idProductsTable" items="{    
                    path:'/businessData'
                }">
    This pattern is needed if you want to add a formatter/sorter/grouping.
    As you don't do any of those, you can bind items as below, and it doesn't require data-sap-ui-xx-bindingSyntax="complex".
    <Table id="idProductsTable" items="{/businessData}">

  • Network data model for public transport

    Hi,
    I've been playing with GIS for the last 10 years and now I am enrolled in a project to find the best way to go from one point to another in Barcelona (Spain), using the public transport network.
    The goal is to get a route, from one address to another (both given as inputs), formed by:
    - a first piece of path, walking to the bus stop
    - a second piece of path, on the bus used to go from one bus stop to another
    - an optional third piece of path with a second bus
    - the last piece of path, walking from the destination bus stop to the destination address.
    This is a nice problem to solve with a good-looking piece of software like the Oracle NDM. I know the big problem will be to put all the data in the right format.
    But the question I'd like to share with you is whether you would approve of, or improve, the algorithm I am thinking of using to solve this:
    STRUCTURE
    Network data model NDM_1: creation of an SDO spatial network with all the streets and crossroads to walk through.
    Another network, NDM_2, to store the bus stops with the bus-route linking information.
    The reason to keep them separate is easier maintenance (a priori).
    [A second approach would perhaps be to put the bus stops as nodes of NDM_1 as well.]
    ALGORITHM
    1. Look for bus stops near the geolocated origin address (say, list BS_ORIG_list).
    2. Look for bus stops near the geolocated destination address (say, list BS_DEST_list).
    3. Search through NDM_2 for possible correspondences between BS_ORIG_list and BS_DEST_list, via a single bus line or via two different bus lines, applying a network constraint.
    (If no correspondence is found, or if more than 2 bus lines are needed, abort, by app requirement.)
    4. Find the walking paths needed to complete the various routes found in step 3 to get from the origin address to the destination.
    5. Order the results by time spent or by metres to walk.
    Surely there might be improvements to this solution and also other ways to approach such a common problem.
    Thanks in advance,
    David Foix
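
    A hedged sketch of how steps 1 and 2 could be done with Oracle Spatial's nearest-neighbour operator (the BUS_STOP table, its GEOM column and the bind variable are assumed names, not part of the original post):

        -- five candidate stops closest to the geocoded origin address
        SELECT s.stop_id, s.stop_name
        FROM   bus_stop s
        WHERE  SDO_NN(s.geom, :origin_point, 'sdo_num_res=5') = 'TRUE';

    Running the same query with :dest_point yields BS_DEST_list; the correspondence and path searches in steps 3 and 4 would then run against the NDM networks themselves.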

    Hi Andrejus,
    Thanks for answering.
    I read through your thread already...
    I understand and agree it would be a multimodal network. But what would that mean when it comes to storing the data and asking for the route?
    I am still unsure about the way to query for the resulting route. Having two addresses to join, would there be a nice function or procedure to ask for the route, giving preference to walking the minimum number of metres and preferring the bus network?
    Or should I rely on the first algorithm I proposed? I thought there would be a nicer solution.
    In your case, "road transport, railway transport, naval transport and air transport", I understand it would be a case of using different networks, as they don't share spatial geometry, only some nodes, wouldn't it? Did you need to join all of them to find a route solution giving preference to one of them?
    Regards,
    David

  • Why does the Summary function give a datatype error when opening a data model for editing?

    I have two data sets (G1 and G2) within a data model; I've used a field from G2 and placed it into G1 as a summary function of G2. The datatype of this field in G2 is Integer; however, if you set the summary field in G1 to the same datatype, Integer, an error appears.

    Oracle have provided a patch.
    We are running OBIEE at patchset 11.1.1.7.140715; Oracle have provided us with patch p17563277, backported to the 140715 patchset.
    This fix will presumably be incorporated in later releases.

  • Help me in design of data model for flatfiles

    hello Guru's,
    My client is implementing FI in ECC; they are still in realization. My client gave me FI flat files and told me to extract them into BI. Those flat files consist of profit and loss statements with actuals and budget data, and balance sheet statements with actuals and budget; in addition to these there are forecast files.
    Can you tell me what questions I have to ask the user? And please suggest some designs.
    Thanks,
    SAM

    Hi Sam John,
    Thanks for your appreciation.
    In my opinion, you first have to understand what the data is for. Then learn the relationships among the data; the purpose of that is to define the dimensions / master data.
    (You can ask your user about the data itself: What is the relationship among those data? What is the purpose of the data itself? What is the full cycle of that business?)
    Then you have to decide whether to use an ODS or an InfoCube. You can decide that by knowing whether the data is detail data or summary data. If it's detail data you can choose an ODS, and a Cube for the summary data.
    I just gave you the highlights; for the details you can see this reference on how to build a good model:
    http://help.sap.com/bp_biv270/documentation/Multi-dimensional_modeling_EN.doc
    Hopefully it can help you a lot.
    Best regards,
    Niel.

  • Help me in design of data model for flat files

    Hello guru's,
    My client gave me Excel spreadsheets and told me to analyze what I have to look at in them. My manager also introduced me to one of the finance users. What questions do I have to ask the user regarding those files? The Excel spreadsheets contain actual data, budget data and forecast data.
    One more thing: SAP FICO in ECC is in the realization phase. My manager told me that I have to map all G/L accounts and cost centers with R/3 because the Excel spreadsheets have a different format.
    Can you send me the questions I have to ask the user? There are no documents either, so can you please tell me what documents I have to prepare, like a functional spec or a technical spec?
    Thanks,
    Sneha

    Hi,
    The first thing you need to ask the finance person is: what are the business requirements?
    Take down everything he or she says and say that you will get back to them after some time.
    Post the exact requirement here; then someone may be able to help you.
    Thanks
    Mohammad Riaz
    Edited by: Shadow on Nov 11, 2008 6:02 AM
    Edited by: Shadow on Nov 11, 2008 6:03 AM

  • Sharepoint 2013 Reporting Services & OLAP Cubes for Data Modeling.

    I've been using PowerPivot & PowerView in Excel 2013 Pro for some time now so am now eager to get set up with Sharepoint 2013 Reporting Services.
    Before setting up Reporting Services I have just one question to resolve.
    What are the benefits/differences of using a normal flat table set up, compared to an OLAP cube?
    Should I base my Data Model on an OLAP Cube or just Connect to tables in my SQL 2012 database?
    I realize that OLAP Cubes aggregate data making it faster to return results, but am unclear if this is needed with Data Modeling for Sharepoint 2013.
    Many thanks,
    Mike

    So yes, PV is an in-memory cube. When data is loaded from the data source, it's cached in memory and stored (compressed) in the Excel file. (Also, same concept for SSAS Tabular mode... loads from source, cached in memory, but also stored (compressed) in data files, in the event that the server reboots, or something similar.)
    As far as performance, tabular uses memory but has a shorter load process (no ETL, no cube processing)... OLAP/MDX uses less memory, by requiring ETL and cube processing... technically tabular uses column compression, so the memory consumption will be based on the type of data (numeric data is GREAT, text not as much)... but the decision to use OLAP (MDX) / Tabular (DAX) is just dependent on the type of load and your needs... both platforms CAN do realtime queries (ROLAP in multidimensional, or DirectQuery for tabular), or can use their processed/in-memory cache (MOLAP in multidimensional, xVelocity for tabular) to process queries.
    If you have a cube, there's no need to reinvent the wheel (especially since there's no way to convert/import the BIDS/SSDT project from MDX to DAX). If you have SSAS 2012 SP1 CU4 or later, you can connect PV (from Excel OR from within SP) directly to the MDX cube.
    Generally, the benefit of PP is for the power users who can build models quickly and easily (without needing to talk to the BI dept)... SharePoint lets those people share the reports with a team... if it's worthy of including in an enterprise warehouse, it gets handed off to the BI folks who vet the process and calculations... but by that time, the business has received value from the self-service (Excel) and team (SharePoint) analytics... and the BI team has less effort, since the PP model includes data sources and calculations - aside from verifying the sources and calculations, BI can just port the effort into the existing enterprise ETL / warehouse / cubes / reports... shorter dev cycle.
    I'll be speaking on this very topic (done so several times already) this weekend in Chicago at SharePoint Saturday!
    http://www.spschicagosuburbs.com/Pages/Sessions.aspx
    Scott Brickey
    MCTS, MCPD, MCITP
    www.sbrickey.com
    Strategic Data Systems - for all your SharePoint needs

  • Reuse Dimdate across different database for different data models

    Hi,
    I am designing a new data model for a data mart. I need to add a DimDate dimension to this data model. I noticed that DimDate already exists in another database and is being used by another data model. Is it possible to re-use the existing DimDate table for my new data model? If so, what about the foreign key constraints? Normally we link the date columns from the fact table to the DimDate keys. How would we achieve that if we are using the same table across different databases?
    Any opinion on this will be highly appreciated.
    Thanks in Advance.
    Cheers!!

    You can create a copy of the DimDate table in your new data warehouse.
    If both data marts were in a single data warehouse, then you wouldn't need to copy it; but as these are two different databases, you should just copy it.
    Regarding the FK relationship: you can connect any fact table to your date dimension. Even if you want to use more than one instance of your date dimension, it would simply be a matter of adding multiple FK columns to your fact table (a role-playing dimension).
    For the date dimension, be sure that it covers most of the attributes required. Here is an example of a date dimension:
    http://www.rad.pasfu.com/index.php?/archives/156-Script-to-Generate-and-Populate-Date-Dimension-Version-2-Adding-Multiple-Financial-Years.html
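
    A minimal sketch of the role-playing idea mentioned above (the fact table and column names are illustrative placeholders, not from the thread): the copied DimDate is referenced once per date role in the fact table.

        CREATE TABLE FactOrders (
          OrderID      INT NOT NULL PRIMARY KEY,
          OrderDateKey INT NOT NULL REFERENCES DimDate (DateKey),  -- role: order date
          ShipDateKey  INT NULL     REFERENCES DimDate (DateKey),  -- role: ship date
          Amount       DECIMAL(18, 2)
        );

    Each database keeps its own physical copy of DimDate, so the FK constraints stay local; populating both copies from the same generation script keeps the surrogate keys consistent.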
    Regards,
    Reza
    SQL Server MVP
    Blog: http://rad.pasfu.com
    SQL Server Integration Services 2012 Tutorial Videos: http://www.radacad.com/CoursePlan.aspx?course=1
