Logical schema

Hello all,
I am using my local Oracle DB as my target and an Oracle DB on another system as my source.
Please let me know which KMs need to be imported.
Thanks,
Susane

Hi Susane,
I am listing the steps to create a data server in Topology Manager and reverse-engineer in Designer, so you can confirm you have completed all of them.
Source oracle DB:
1. Go to Topology Manager, right-click, and insert a data server.
2. Give the name and the user details.
3. Go to the JDBC tab and enter the appropriate JDBC driver and URL.
4. When you click OK, a Physical Schema box opens. Here, specify the schema and the work schema.
5. Go to the Context tab and create a context. Select the Global context or any context appropriate to your environment. For 'logical schema', you can either enter a new name or use one already created under the logical architecture. (This is very important: if you do not create a context under the physical schema, you will not be able to do reverse engineering.) Click OK to close the physical schema.
6. Go to Designer, create a model, and select the technology and the logical schema you created in Topology Manager. Go to the Reverse tab and select the context under which you defined this logical schema.
7. Do a selective reverse, or just click the Reverse button.
You should be able to do the reverse engineering without any issues. Is this of any help to you?
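For step 3, the usual Oracle thin-driver settings follow a standard pattern. This is only an illustrative sketch: the host, port, and SID below are placeholders, not values from this thread.

```python
# Illustrative only: typical Oracle JDBC settings for an ODI data server.
# The driver class name is the standard Oracle one; the host, port, and
# SID used here are placeholders.
driver = "oracle.jdbc.OracleDriver"

def thin_url(host: str, port: int, sid: str) -> str:
    """Build an Oracle thin-driver JDBC URL: jdbc:oracle:thin:@host:port:sid."""
    return f"jdbc:oracle:thin:@{host}:{port}:{sid}"

url = thin_url("source-host", 1521, "ORCL")
print(url)  # jdbc:oracle:thin:@source-host:1521:ORCL
```

Enter the resulting driver class and URL in the JDBC tab of the data server.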

Similar Messages

  • Logical Schema not being found

    I have an incoming data feed that works fine in our Development environment but when we import the scenario into UAT it gets an error:
    ODI-1222: Session start failure on agent DATA_INTEGRATOR_UAT_11G: logical schema FTP_IMAGE_OUT cannot be found in the master repository.
    Initially the logical schema was mistyped so the error was fair enough but even now that we've fixed it we get the same error. We've even tried completely deleting and recreating everything.
    I've looked in the master repository database and there's definitely a record in snp_lschema with an lschema_name of "FTP_IMAGE_OUT" (i_lschema 67001), and a record in snp_pschema_cont with an i_lschema of 67001 and an i_pschema of 131001, which is "FTP_IMAGE_OUT" in snp_pschema, so it all connects up.
    I've tested the connection of the physical schema and it's successful.
    Can anyone suggest anything else I can look at or try? I've run out of ideas.

    Never mind, I found the problem. On the "Command on Source" tab of the procedure I'd selected a Context of "Development", so in UAT (which only has a UAT context) it couldn't find it.
    No idea why, I've never done it before. Maybe it was a slip of the mouse.
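    The repository check described above amounts to a join across the three tables. This is an illustrative SQLite stand-in only; the real master-repository DDL differs, and the table and column names simply follow the post:

```python
import sqlite3

# Sketch of the lookup described above, using SQLite stand-ins for the
# master-repository tables (real repository DDL differs; names follow
# the post: snp_lschema, snp_pschema_cont, snp_pschema).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE snp_lschema (i_lschema INT, lschema_name TEXT);
CREATE TABLE snp_pschema (i_pschema INT, pschema_name TEXT);
CREATE TABLE snp_pschema_cont (i_lschema INT, i_pschema INT);
INSERT INTO snp_lschema VALUES (67001, 'FTP_IMAGE_OUT');
INSERT INTO snp_pschema VALUES (131001, 'FTP_IMAGE_OUT');
INSERT INTO snp_pschema_cont VALUES (67001, 131001);
""")

# Does the logical schema resolve to a physical schema?
row = con.execute("""
    SELECT l.lschema_name, p.pschema_name
    FROM snp_lschema l
    JOIN snp_pschema_cont c ON c.i_lschema = l.i_lschema
    JOIN snp_pschema p ON p.i_pschema = c.i_pschema
    WHERE l.lschema_name = 'FTP_IMAGE_OUT'
""").fetchone()
print(row)  # ('FTP_IMAGE_OUT', 'FTP_IMAGE_OUT')
```

    As the resolution shows, the records can all connect up and the error can still occur when the session runs under a context that lacks the binding.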

  • The interface you are trying to use is related to a logical schema that no

    "The interface you are trying to use is related to a logical schema that no longer exists"
    I'm facing this error when importing a project in Designer connected to a new work repository.
    I have a TEST Data Integrator environment and now I need to move objects already created to a new DEV environment. I've created a new master and work repository with distinct IDs, following note https://metalink.oracle.com/metalink/plsql/f?p=130:14:4335668333625114484::::p14_database_id,p14_docid,p14_show_header,p14_show_help,p14_black_frame,p14_font:NOT,423815.1,1,1,1,helvetica
    Any ideas?
    Thanks

    Hi,
    Nothing occurs. My steps:
    1) Export the master repository from the 1st environment (Topology -> Export Master Repository)
    2) Create a master repository in the 2nd environment (through repcreate.bat)
    3) Export the topology (1st environment)
    4) Export all projects (1st environment)
    5) Import the topology (2nd environment) ----> com.sunopsis.core.n: This import action has been cancelled because it could damage your repository (problem with the identifier sequences)
    Is this sequence of operations correct?
    Thanks

  • Will deleting a column at logical schema delete the same at physical level by DDL Sync?

    Will deleting a column at logical schema delete the same at physical level by DDL Sync?

    Hi David,
    First of all, thanks for your quick response and for logging the enhancement request.
    I am testing your suggestion, more or less, but I am not sure I understood exactly what you meant:
    1) I imported from the data dictionary into a new model, and in the options menu on the schema-selection screen I unchecked partitions and triggers.
    I expected the import not to fetch partition information from the data dictionary, but the result is that the partitioned tables (by list, in this case) come out partitioned by range, without fields, in the physical model in SDDM.
    2) I selected one of the tables, changed it to the non-partitioned option, and propagated the option to the rest of the tables.
    3) I imported again from the data dictionary, but this time I included partitions in the options menu on the schema-selection screen.
    In the tabular view on the compare-models screen I can select all the tables with a different partitioning option; I can also change them to "list partitions" and select only the partitions that I want to import.
    So I have a solution for my problem; thanks a lot for your suggestion.
    I am not sure the second step is needed, or maybe I can avoid it with some configuration setting in one of the preferences screens.
    If not, I think the options for excluding partitions on the schema-selection screen are not so clear, at least to me.
    Could you please confirm whether there is a way to avoid the second step, or whether I misunderstood this option?
    Thanks in advance.
    thanks in advance

  • Context,Physical schema and Logical schema

    Hi,
    How are the context, physical schema, logical schema, and agent interrelated?
    Please explain
    Thanks
    Jack

    Hi Jack,
    Context:
    A context is a set of resources allowing the operation or simulation of one or more data processing applications. Contexts allow the same jobs (Reverse, Data Quality Control, Package, etc) to be executed on different databases and/or schemas.
    It is used to run the same object (process) against different databases.
    Physical Schema:
    The physical schema is a decomposition of the data server, allowing the Datastores (tables, files, etc) to be classified. Objects stored in data servers with this mode of classification can be accessed by specifying the name of the schema attached to the object name.
    Example: Oracle classifies its tables by "schema" (or user). Each table is linked to a schema; thus SCOTT.EMP represents the table EMP in the schema SCOTT.
    Logical schema:
    A logical schema is an alias that allows a unique name to be given to all the physical schemas containing the same datastore structures.
    ->The aim of the logical schema is to ensure the portability of the procedures and models on the different physical schemas. In this way, all developments in ODI Designer are carried out exclusively on logical schemas.
    Thanks
    Madha

  • Problem in exporting logical schema

    Hi,
    I exported my logical schema (master repository 1) into an XML file.
    But when I imported it into another master repository (2), the logical schema from master repository 1 did not show up.
    What could be the reason for this?
    Thanks.

    Hi,
    Steps (MR1):
    Under Logical Schema: right-click Oracle --> Export Technology --> XML
    MR2:
    Under Logical Schema: right-click Oracle --> Import Technology --> Insert/update mode
    But nothing gets inserted.
    Thanks.

  • How to 1 logical schema to many physical schemas?

    I have a database schema which is instantiated on many different servers. I set up a physical schema pointing to one, and a logical schema pointing to that physical schema. I imported the schema to a model, created interfaces for the tables, and created a package to execute it; and that is all working for that one physical instance.
    1) How can I implement that same model, interfaces, and package for each of the physical instances?
    1a) Can I change the JDBC parameters at package run time to point to a different database? How?
    1b) Can I select a different physical schema for the logical schema at package run time so that I only have to set up a different physical schema for each database? How?
    Thank you.

    "But if you have a lot of contexts (for example 1000 stores), you can define a generic physical schema and a logical one. The physical schema is based on variables (host, port, ...)."
    Using contexts is working for me, but at least one of my schemas has more than 50 server instances, so this approach would be beneficial. Before I posted this question, I had tried to use variables for the host, port, and SID without success. I used a global variable and gave it default values, but it failed. Then I tried setting the value in a package and creating a scenario, but that too failed. What am I missing?
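    A minimal sketch of the variable-based approach the quote describes, simulating how ODI-style #VARIABLE tokens in a JDBC URL would be substituted at session start. The variable names and values here are hypothetical, and note that topology variables are typically resolved only at runtime by the agent, so a design-time connection test against them can fail:

```python
# Hypothetical sketch: substituting ODI-style variable tokens into a JDBC
# URL template, as the agent would at session start. The variable names
# (#GLOBAL.V_HOST, etc.) and values are illustrative, not from the thread.
url_template = "jdbc:oracle:thin:@#GLOBAL.V_HOST:#GLOBAL.V_PORT:#GLOBAL.V_SID"

def resolve(template: str, variables: dict) -> str:
    """Replace each #NAME token in the template with its runtime value."""
    for name, value in variables.items():
        template = template.replace(name, str(value))
    return template

session_vars = {
    "#GLOBAL.V_HOST": "store042-db",
    "#GLOBAL.V_PORT": 1521,
    "#GLOBAL.V_SID": "STORE",
}
print(resolve(url_template, session_vars))  # jdbc:oracle:thin:@store042-db:1521:STORE
```

    For the substitution to work at runtime, the variables must be refreshed or set (e.g. in the package) before the step that uses the connection.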

  • Change Logical Schema name of an existing interface

    How do we change the logical schema of an existing interface so that it points to a different logical schema? Note: both logical schemas point to the same database schema.
    Thanks in advance for your update.
    Kaustuv

    I'm not sure it's a good idea to have two different logical schemas pointing to the same physical schema.
    In any case, what do you want to change?
    The logical schema of a source datastore? Of a target datastore? Of a target temporary interface? Or the staging area?

  • Physical Schema and logical schema

    Hi,
    When creating a data server in the topology for the appropriate technology, we create a physical schema. So why do we also need to create a logical schema? Is it created for the execution of the interface? And can multiple physical schemas be mapped to the same logical schema?

    Hi
    A physical schema represents the actual connection to the data source or target. A logical schema is the logical name associated with that source or target.
    One logical schema can be associated with multiple physical schemas through contexts; that is, the same logical schema maps to a different physical schema in each context.
    It can be understood with following example:
    You have 3 environments: Dev, QA, and Prod, each with a different database server: DB1, DB2, and DB3, respectively. Similarly, we have 3 contexts corresponding to Dev, QA, and Prod. Create a logical schema named DB_source.
    Now you associate physical DB servers to logical schema (DB_source) for each context:
    DEV: DB1
    QA: DB2
    PROD: DB3
    Now, when you develop ODI interfaces, you use one context, DEV, which associates DB_source with DB1. When setting the context for execution, keep it as "Execution". This means that whatever context you choose at execution time, the corresponding physical DB will be used.
    Thus, if you change the execution context, the corresponding physical schema will be used during execution.
    Let me know if you have further questions !!
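    The Dev/QA/Prod example above can be sketched as a simple lookup (names follow this reply; the logic is only an illustration, not actual ODI code):

```python
# Sketch of the mapping described above: one logical schema (DB_source)
# resolved to a different physical server per context.
topology = {
    "DB_source": {"DEV": "DB1", "QA": "DB2", "PROD": "DB3"},
}

def resolve_physical(logical: str, context: str) -> str:
    """Return the physical schema bound to a logical schema in a context."""
    try:
        return topology[logical][context]
    except KeyError:
        # This is the situation behind errors like ODI-1222: the logical
        # schema has no binding in the chosen context.
        raise LookupError(f"logical schema {logical} not declared in context {context}")

print(resolve_physical("DB_source", "QA"))  # DB2
```

    Asking for a context with no binding fails, which mirrors the "logical schema cannot be found" errors discussed elsewhere in this thread.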

  • Circular logical schemas are not supported

    Hi,
    I have created a repository in OBIEE using the data model diagram supplied to me. I dragged the entire physical layer model to form my BMM layer model.
    I am getting the error below when I create my RPD: "Multiple paths exist to table A. Circular logical schemas are not supported."
    Please note that there are no circular connections; I have double-checked this.
    When I delete the BMM layer and keep only the physical layer, I do not get any consistency errors.
    Someone please help.
    Thanks,
    Akshatha

    Hi,
    According to the docs:
    Cause: The repository contains a circular logical schema, which is not supported.
    Response: Correct the repository so it does not contain multiple paths to the named table, and retry your actions.
    Build the star schema properly in the BMM (dimensions, facts) with complex joins.
    Check that you are following a star schema in the BMM, with proper fact and dimension tables and 1:n joins.
    Hope this helps.
    Regards
    MuRam

  • Reg logical schema

    Hi ,
    I had created the interface when the logical schema was X, but the schema has since been changed to Y.
    The correct logical schema has been mapped, and the interface now shows the logical schema as Y.
    But now when I execute the interface, I get the error below:
    java.lang.NullPointerException
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.treatPopSourceSet(SnpGeneratorSQLCIT.java:7740)
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.treatJobPopCollectionBuild(SnpGeneratorSQLCIT.java:7527)
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.preComputePop(SnpGeneratorSQLCIT.java:7469)
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.treatJobPop(SnpGeneratorSQLCIT.java:7379)
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.mainGenPopInternal(SnpGeneratorSQLCIT.java:3168)
         at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.mainGenPop(SnpGeneratorSQLCIT.java:3124)
         at com.sunopsis.graphical.dialog.SnpsDialogExecution.doInterfaceExecuter(SnpsDialogExecution.java:478)
         at oracle.odi.ui.action.SnpsPopupActionExecuteHandler.actionPerformed(SnpsPopupActionExecuteHandler.java:141)
         at oracle.odi.ui.SnpsFcpActionAdapter.handleEvent(SnpsFcpActionAdapter.java:253)
         at oracle.ide.controller.IdeAction.performAction(IdeAction.java:529)
         at oracle.ide.controller.IdeAction.actionPerformedImpl(IdeAction.java:884)
         at oracle.ide.controller.IdeAction.actionPerformed(IdeAction.java:501)
         at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
         at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
         at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
         at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
         at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:236)
         at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:273)
         at java.awt.Component.processMouseEvent(Component.java:6288)
         at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
         at java.awt.Component.processEvent(Component.java:6053)
         at java.awt.Container.processEvent(Container.java:2041)
         at java.awt.Component.dispatchEventImpl(Component.java:4651)
         at java.awt.Container.dispatchEventImpl(Container.java:2099)
         at java.awt.Component.dispatchEvent(Component.java:4481)
         at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4577)
         at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4238)
         at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
         at java.awt.Container.dispatchEventImpl(Container.java:2085)
         at java.awt.Window.dispatchEventImpl(Window.java:2478)
         at java.awt.Component.dispatchEvent(Component.java:4481)
         at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:643)
         at java.awt.EventQueue.access$000(EventQueue.java:84)
         at java.awt.EventQueue$1.run(EventQueue.java:602)
         at java.awt.EventQueue$1.run(EventQueue.java:600)
         at java.security.AccessController.doPrivileged(Native Method)
         at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:87)
         at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:98)
         at java.awt.EventQueue$2.run(EventQueue.java:616)
         at java.awt.EventQueue$2.run(EventQueue.java:614)
         at java.security.AccessController.doPrivileged(Native Method)
         at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:87)
         at java.awt.EventQueue.dispatchEvent(EventQueue.java:613)
         at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
         at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
         at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
         at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
    Is there any way to avoid importing the source again and recreating the mapping?
    Regards
    Bhoopendra

    Hi
    I got the same error. I think it happened because the model was lost (along with the needed code). I replaced the target datastore in the interface and the error went away.

  • Physical schema vs logical schema in odi

    Hi, I am new to ODI. I have successfully loaded data and metadata to Essbase and Planning, but I am still not clear on why ODI uses a physical schema when we only use logical schemas for reversing, executing interfaces, etc. in Designer.

    Hi,
    A logical schema always points to a physical schema.
    The aim of the logical schema is to ensure the portability of the procedures and models on the different physical schemas. In this way, all developments in Designer are carried out exclusively on logical schemas.
    A logical schema can have one or more physical implementations on separate physical schemas, but they must be based on data servers of the same technology. A logical schema is always directly linked to a technology.
    To be usable, a logical schema must be declared in a context. Declaring a logical schema in a context consists of indicating which physical schema corresponds to the alias - logical schema - for this context.
    Thanks,
    Sutirtha

  • Regarding logical schemas

    Hi Experts,
    I have one master repository and three work repositories for dev, test, and prod. When I import from development, do I need the physical schemas and logical schemas?
    And in the other case:
    I have two master repositories. When I import from master_repo1 to master_repo2, do I need the physical schemas and logical schemas?
    please help me
    regards
    ksbabu

    Hi,
    I have one master repository and three work repositories for dev, test, and prod. When I import from development, do I need the physical schemas and logical schemas? And in the other case, with two master repositories, when I import from master_repo1 to master_repo2, do I need the physical schemas and logical schemas?
    Yes, in both cases you need the schemas.
    When you import the master repository, the physical and logical schemas are imported along with it (topology information is stored in the master repository), and the logical schema is essential for running scenarios.
    Hope this helps
    Thanks

  • Logical schema can have one or more physical schemas

    Hi Experts,
    Can a logical schema in ODI have one or more physical schemas, and if so, in what case would we use that?
    Similarly, can one physical schema have multiple logical schemas, and if so, in what case?
    Please give me your brief explanation on it.
    Thx,
    Sahadeva.

    Hi Sahadeva,
    1) Yes. You can map it through different contexts. The goal is to use the same logical schema to point to a different physical schema in each environment (Dev, Test, Prod, ...).
    So let's say you have the schema SH_DEV on database 1 for Dev and the schema SH on database 2 for Prod:
    - Create one SH logical schema
    - Create two data servers: one for database1 and one for database2
    - Create the physical schemas SH_DEV under dataserver1 and SH under dataserver2
    - Create two contexts: Dev and Prod
    - Map logical SH to dataserver1.SH_DEV through context Dev
    - Map logical SH to dataserver2.SH through context Prod
    2) Technically yes, through different contexts, though I don't see a use case for it.
    Hope it helps.
    Regards,
    JeromeFr
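    The mapping steps above can be sketched as a per-context lookup (names taken from this reply; the code is only an illustration of the resolution, not ODI itself):

```python
# Sketch of the steps above: logical schema SH mapped to
# dataserver1.SH_DEV in context Dev and dataserver2.SH in context Prod.
contexts = {
    "Dev":  {"SH": ("dataserver1", "SH_DEV")},
    "Prod": {"SH": ("dataserver2", "SH")},
}

def physical_for(context: str, logical: str) -> str:
    """Return the dataserver.schema bound to a logical schema in a context."""
    server, schema = contexts[context][logical]
    return f"{server}.{schema}"

print(physical_for("Dev", "SH"))   # dataserver1.SH_DEV
print(physical_for("Prod", "SH"))  # dataserver2.SH
```

    Developments reference only the logical name SH; switching the execution context switches the physical target.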

  • ODI 11g Logical Schema Error [Error Code: 1222]

    Hi,
    One of my ODI Jobs is failing with an error "ODI-1222: Session start failure on agent Internal: logical schema BIAPPS_11g_RA cannot be found in the master repository."
    BIAPPS_11g_RA = Logical Schema in our System.
    However, all the other jobs in the same module pass successfully.
    The failure happens ONLY in the QA environment; the same job passes successfully in DEV. I have checked all the topology connections and everything looks fine, since all the other jobs in the same module run fine. Could this be a problem with the migration from Dev to QA?
    I noticed that for the failed job, in the QA environment's Operator, the "Forced Context" is "Development" whereas the "Execution Context" is "QA". For all the other jobs in the same module, both are "QA". I tried re-importing the scenarios from Dev, but it did not help.
    Can anybody help me on this..
    Thnx!

    We don't know the architecture of ODI in your organization. However, as a general rule, you should not force contexts; ideally they should be left as "Execution Context". Then you create the scenario from the interface/packages and deploy it to the next environment.
