A separate work schema for a physical schema?

Hi,
The ODI best-practices white paper tells us (p. 47/48) to create a separate schema and designate it as the work schema for a physical schema. This, however, means that the user we connect to the database with needs powerful system privileges such as CREATE ANY TABLE and DROP ANY TABLE, depending on what you want to do. Many DBAs will object to this. A simpler solution seems to be to set the work schema to be identical to the physical schema.
Has anybody done this and experienced drawbacks or issues with it?
Can anybody give good arguments in favor of the best-practice advice of a separate work schema?
Thanks, Arjan

Arjan,
I would like to offer my opinion. The main reason for keeping the work schema separate from the main schema is to have all the $ tables created in a separate place, so that while debugging we can choose to keep them and drop them later. Also, when the CKM is enabled, E$ tables are created and they are not dropped automatically. There are also situations where dynamic $ tables are created, so each run creates fresh $ tables, and in case of failure (or while debugging) they remain; on the next attempt we get a new dynamic $ table. Think of a scenario where all of these live in one place, with all the unnecessary $ tables sitting next to our DW tables, and imagine that happening in a production box. Nobody wants that scenario, where you then have to drop them manually or through extra PL/SQL code. That is the main reason it is good to have a separate work schema.
We also use the work schema for creating staging tables at run time during ODI processing.
Secondly, regarding permissions and which schema is used: I have seen both kinds of environment, where the user configured in Topology is either the work schema or the data schema. Yes, you need to grant certain privileges such as CREATE and DROP, since the $ tables require them, but some sites do not grant privileges such as DELETE or TRUNCATE on the target tables.
If you help your DBA understand the architecture, he will surely provide the permissions required for the work schema tables to be created and dropped, and I see nothing to worry about, since the $ tables are created and dropped as part of normal processing. The permissions should, however, be set up so that the user configured in the ODI data server cannot drop the data schema's tables; if you want, restrict TRUNCATE and DELETE as well, though in development we usually need those, since we tend to load wrong data and logic. Think about it across your different scenarios and requirements.
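To make the permissions discussion concrete, here is a minimal sketch of one Oracle setup that avoids the ANY privileges: the connection user is the dedicated work schema owner itself, so plain CREATE TABLE covers the $ tables. All names are made up for illustration; the exact grants depend on your KMs and security policy.

    -- Hypothetical names: ODI_WORK is both the connection user and the
    -- dedicated work schema, so no CREATE ANY TABLE / DROP ANY TABLE is needed.
    CREATE USER odi_work IDENTIFIED BY "change_me";
    GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW TO odi_work;
    ALTER USER odi_work QUOTA UNLIMITED ON users;

    -- Object-level grants instead of ANY privileges:
    GRANT SELECT ON src_owner.orders TO odi_work;
    GRANT SELECT, INSERT, UPDATE, DELETE ON dw_owner.fact_orders TO odi_work;

Note that ODI_WORK deliberately gets no privilege to drop or truncate the data schema's tables, which is exactly the restriction described above.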
Cezar has also written on this topic: http://odiexperts.com/?p=672
Hope this helps.

Similar Messages

  • How to map 1 logical schema to many physical schemas?

    I have a database schema which is instantiated on many different servers. I set up a physical schema pointing to one, and a logical schema pointing to that physical schema. I imported the schema to a model, created interfaces for the tables, and created a package to execute it; and that is all working for that one physical instance.
    1) How can I implement that same model, interfaces, and package for each of the physical instances?
    1a) Can I change the JDBC parameters at package run time to point to a different database? How?
    1b) Can I select a different physical schema for the logical schema at package run time so that I only have to set up a different physical schema for each database? How?
    Thank you.

    "But if you have a lot of context (for example 1000 stores), you can define a generic physical schema, a logical one. The physical is based on variables (host, port,..). "
    Using contexts is working for me, but at least one of my schemas has more than 50 server instances, so this approach would be beneficial. Before I posted this question, I had tried to use variables for the host, port, and SID without success. I used a global variable and gave it default values, but it failed. Then I tried setting the value in a package and creating a scenario, but that too failed. What am I missing?

  • Physical Schema and logical schema

    Hi,
    When creating a data server in Topology under the appropriate technology, we also create a physical schema. But then why do we need to create a logical schema? Is it created for execution of the interface? And can multiple physical schemas be mapped to the same logical schema?

    Hi
    Physical schema represents the actual connection to the data source or data target; logical schema represents the logical name associated with that source or target.
    One logical schema can be associated with multiple physical schemas through contexts, i.e. the same logical schema is mapped to a different physical schema in each context.
    It can be understood with the following example:
    You have 3 environments (Dev, QA, Prod), each with a different database server: DB1, DB2, DB3, respectively. Similarly, we have 3 contexts corresponding to Dev, QA and Prod. You create a logical schema named DB_source.
    Now you associate physical DB servers to logical schema (DB_source) for each context:
    DEV: DB1
    QA: DB2
    PROD: DB3
    Now when you develop ODI interfaces, you use one context, DEV, which associates DB_source with DB1. For the context of execution, keep it as "Execution": this means that whatever context you choose during execution, the corresponding physical DBs will be used.
    Thus, if you change the execution context, the corresponding physical schema will be used during execution.
    Let me know if you have further questions!

  • Differentiating pricing schemas for each purchasing document type

    Dear colleagues,
    This topic may have been discussed before, but I need to clarify some points.
    We would like to differentiate pricing schemas per purchasing document type (e.g. AA and ZZ). These purchasing document types are not like domestic and import types.
    There are some conditions that purchasing document type AA should not be able to maintain, whereas ZZ should.
    In this case, should I use one pricing schema for both purchasing document types, and if so, how can I hide some conditions for document type AA? Or, if I use separate pricing schemas, how can I assign different pricing schemas to different purchasing document types, given that both document types have the same purchasing organization, vendor, etc.?
    Regards
    Metin

    Hi,
    Why have two different schema groups? If the procurement process is completely different, then it makes sense to have them!
    (Or think of separate condition types for each calculation schema.)
    Based on the Schema Group of Vendor in the vendor master, the PO for a vendor with the respective Schema Group of Vendor will populate the assigned condition types. Follow the path:
    SPRO->MM->Purchasing->Conditions->Define Price determination process ->Define Schema Determination
    Under segment:
    Determine Calculation Schema for Standard Purchase Orders
    Here you can assign Schema Group Purchasing Organization, Schema Group of Vendor and Calculation Schema
    Under segment:
    Determine Schema for Stock Transport Orders
    Here you can assign Schema Group Purchasing Organization, PO document type and Calculation Schema
    Regards,
    Biju K

  • How to change the parsing schema to use an application in a test environment?

    Hi,
    I set up a test environment, i.e. another Oracle schema with tables, views, etc. identical to the productive environment. I want to duplicate the APEX application so that there is a test version using the test Oracle schema and a productive version using the productive Oracle schema.
    Can someone please give me a tip on how to do this? As far as I can see, I could change the parsing schema in the APEX application install SQL in the following ways:
    1. While exporting the application with APEX
    2. In the resulting install f...sql file, using search-and-replace
    3. While importing the APEX application via the APEX Development GUI.
    I would prefer 1. or 3., but in both cases, when I want to change the schema name, the drop-down list does not contain the test Oracle schema name. Do I have to grant something to the APEX workspace schema for the test schema name to show up in the drop-down list?
    Tamás

    On that screen,
    - click Create
    - select "Existing" Schema, click Next
    - select the Workspace, click Next
    - select the Schema, click Next
    - click Add Schema
    And now you'll end up with two schemas assigned to the same Workspace.
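    If you prefer a scripted route: recent APEX versions also expose an instance-administration API for this. A minimal sketch, assuming your release ships APEX_INSTANCE_ADMIN and you have instance-administrator privileges (workspace and schema names are examples):

        -- Run as an instance administrator; names are hypothetical.
        BEGIN
           apex_instance_admin.add_schema(
              p_workspace => 'MY_WORKSPACE',  -- workspace that needs the test schema
              p_schema    => 'TEST_SCHEMA');  -- parsing schema to make selectable
           COMMIT;
        END;
        /

    After this, TEST_SCHEMA should appear in the parsing-schema drop-down list for that workspace.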

  • BUILD FAILED java.lang.Error: unable to load schema-for-schema for W3C XML

    I am new to JAXB and am trying to run the sample applications on Win 98.
    I have set up the environment variables as per the UserGuide specs.
    I am getting the following error:
    BUILD FAILED java.lang.Error: unable to load schema-for-schema for W3C XML Schema at com.sun.msv.reader.xmlschema.XMLSchemaReader.getXmlSchemaForXmlSchema (XMLSchemaReader.java:190)
    Could someone please suggest a solution?
    Thank you.

    Hi
    I am using Windows 2000 and I am repeatedly getting the same error too (see below). Would appreciate any help.
    parsing a schema...
    org.iso_relax.verifier.VerifierConfigurationException
         at com.sun.msv.verifier.jarv.FactoryImpl.compileSchema(FactoryImpl.java:104)
         at org.iso_relax.verifier.VerifierFactory.compileSchema(Unknown Source)
         at org.iso_relax.verifier.VerifierFactory.compileSchema(Unknown Source)
         at com.sun.msv.reader.xmlschema.XMLSchemaReader.getXmlSchemaForXmlSchema(XMLSchemaReader.java:186)
         at com.sun.tools.xjc.Driver$1.<init>(Driver.java:477)
         at com.sun.tools.xjc.Driver.loadXMLSchemaGrammar(Driver.java:476)
         at com.sun.tools.xjc.Driver.loadGrammar(Driver.java:404)
         at com.sun.tools.xjc.Driver.run(Driver.java:268)
         at com.sun.tools.xjc.Driver.main(Driver.java:88)
         at sample1.Binder.main(Binder.java:18)
    StackTrace of Original Exception:
    java.lang.NullPointerException
         at com.sun.msv.datatype.xsd.TypeIncubator.addFacet(TypeIncubator.java:64)
         at com.sun.msv.reader.datatype.xsd.XSDatatypeExp$1.addFacet(XSDatatypeExp.java:87)
         at com.sun.msv.reader.datatype.xsd.FacetState.startSelf(FacetState.java:56)
         at com.sun.msv.reader.State.init(State.java:154)
         at com.sun.msv.reader.GrammarReader.pushState(GrammarReader.java:579)
         at com.sun.msv.reader.datatype.xsd.TypeState.startElement(TypeState.java:101)
         at org.xml.sax.helpers.XMLFilterImpl.startElement(Unknown Source)
         at org.apache.xerces.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:459)
         at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:221)
         at org.apache.xerces.impl.XMLNamespaceBinder.handleStartElement(XMLNamespaceBinder.java:874)
         at org.apache.xerces.impl.XMLNamespaceBinder.emptyElement(XMLNamespaceBinder.java:591)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:747)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(XMLDocumentFragmentScannerImpl.java:1477)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:329)
         at org.apache.xerces.parsers.DTDConfiguration.parse(DTDConfiguration.java:525)
         at org.apache.xerces.parsers.DTDConfiguration.parse(DTDConfiguration.java:581)
         at org.apache.xerces.parsers.XMLParser.parse(XMLParser.java:152)
         at org.apache.xerces.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1175)
         at com.sun.msv.reader.GrammarReader._parse(GrammarReader.java:459)
         at com.sun.msv.reader.GrammarReader.switchSource(GrammarReader.java:434)
         at com.sun.msv.reader.GrammarReader.switchSource(GrammarReader.java:407)
         at com.sun.msv.reader.xmlschema.XMLSchemaReader.switchSource(XMLSchemaReader.java:683)
         at com.sun.msv.reader.xmlschema.ImportState.startSelf(ImportState.java:41)
         at com.sun.msv.reader.State.init(State.java:154)
         at com.sun.msv.reader.GrammarReader.pushState(GrammarReader.java:579)
         at com.sun.msv.reader.SimpleState.startElement(SimpleState.java:72)
         at org.xml.sax.helpers.XMLFilterImpl.startElement(Unknown Source)
         at org.apache.xerces.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:459)
         at org.apache.xerces.impl.XMLNamespaceBinder.handleStartElement(XMLNamespaceBinder.java:877)
         at org.apache.xerces.impl.XMLNamespaceBinder.startElement(XMLNamespaceBinder.java:569)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:759)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(XMLDocumentFragmentScannerImpl.java:1477)
         at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:329)
         at org.apache.xerces.parsers.DTDConfiguration.parse(DTDConfiguration.java:525)
         at org.apache.xerces.parsers.DTDConfiguration.parse(DTDConfiguration.java:581)
         at org.apache.xerces.parsers.XMLParser.parse(XMLParser.java:152)
         at org.apache.xerces.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1175)
         at com.sun.msv.reader.GrammarReader._parse(GrammarReader.java:459)
         at com.sun.msv.reader.GrammarReader.parse(GrammarReader.java:449)
         at com.sun.msv.reader.xmlschema.XMLSchemaReader.parse(XMLSchemaReader.java:89)
         at com.sun.msv.verifier.jarv.XSFactoryImpl.parse(XSFactoryImpl.java:26)
         at com.sun.msv.verifier.jarv.FactoryImpl.compileSchema(FactoryImpl.java:95)
         at org.iso_relax.verifier.VerifierFactory.compileSchema(Unknown Source)
         at org.iso_relax.verifier.VerifierFactory.compileSchema(Unknown Source)
         at com.sun.msv.reader.xmlschema.XMLSchemaReader.getXmlSchemaForXmlSchema(XMLSchemaReader.java:186)
         at com.sun.tools.xjc.Driver$1.<init>(Driver.java:477)
         at com.sun.tools.xjc.Driver.loadXMLSchemaGrammar(Driver.java:476)
         at com.sun.tools.xjc.Driver.loadGrammar(Driver.java:404)
         at com.sun.tools.xjc.Driver.run(Driver.java:268)
         at com.sun.tools.xjc.Driver.main(Driver.java:88)
         at sample1.Binder.main(Binder.java:18)
    java.lang.Error: unable to load schema-for-schema for W3C XML Schema
         at com.sun.msv.reader.xmlschema.XMLSchemaReader.getXmlSchemaForXmlSchema(XMLSchemaReader.java:190)
         at com.sun.tools.xjc.Driver$1.<init>(Driver.java:477)
         at com.sun.tools.xjc.Driver.loadXMLSchemaGrammar(Driver.java:476)
         at com.sun.tools.xjc.Driver.loadGrammar(Driver.java:404)
         at com.sun.tools.xjc.Driver.run(Driver.java:268)
         at com.sun.tools.xjc.Driver.main(Driver.java:88)
         at sample1.Binder.main(Binder.java:18)
    Exception in thread "main"
    -Thanks
    Guna

  • RPD - Cannot obtain number of columns for the query result: working with MS SQL 2012 schema

    Hi All,
    I have created my warehouse in MS SQL 2012.
    For management purposes, I have created different schemas in the SQL Server database.
    In the RPD Physical layer, when I view data, I get this error:
    [nQSError:16002] Cannot obtain number of columns for the query result.
    [nQSError:16001] ODBC error state : S0002 code : 208 message: [Microsoft][ODBC SQL Server Driver][SQL Server] Invalid object name 'tbl'..
    [nQSError:16001] ODBC error state : S0002 code : 208 message: [Microsoft][ODBC SQL Server Driver][SQL Server] Statements could not be prepared..
    I have already browsed "OBIEE 11g Strange ODBC Driver Error with SQL Server" on Total Business Intelligence ... it did not help me.
    Please help!

    Hi All,
    After all the R&D, it turns out that the Oracle BI Administration Tool (RPD) needs the default dbo schema; it does not accept a custom schema for pulling data.
    If anybody has other views, please share!
    Thank you

  • I need to extend the schema for iPlanet Dir. 5.0 and add custom objectclasses and attributes. I do this by adding entries in the 99user.ldif file. It's not working. Any ideas?

    Hi
    I need to extend the schema for iPlanet Dir. 5.0, and I do not want to do so from the console. As per the documentation, I need to either add entries to the 99user.ldif file or define my own custom [00-99]myname.ldif file. I tried this, but it's not working.
    I have made the assumption that there is no explicit import step for the 'user defined' schema files (as there is for user data ldif files). I assume that on startup (or on opening the console), I would be able to see the new schema after the server has read the schema file.
    I have verified that entering new objectclasses and attributes from the console adds entries to the 99user.ldif file. So why is the reverse process not working? Can anybody throw some light on this? Also, in case my assumptions are faulty, please let me know.
    I did not change the aci entries in the existing ldif file. Is any modification needed there? I was logged in as the Directory Manager during this testing process.
    regards
    Sikka ([email protected])

    Hi Sikka,
    The server reads its schema configuration on startup. If you manually modify the schema files while the server is running, it will not have any effect. You have to restart the server.
    The console adds the new schema elements over LDAP (you could do that as well, you only have to modify the cn=schema entry), so the server is aware of the changes immediately and thus restarting is not needed.
    I hope this helps.
    Bertold

  • Grants for physical schema and data-servers

    Hi,
    I'd like to know
    What grants are needed for the owners of each physical schema?
    For example: DDL grants (e.g. CREATE/DROP TABLE) and DML grants (SELECT/UPDATE/DELETE/INSERT).
    And what grants are needed for the data servers' connection users?
    Bovolini

    It depends on what technology you plan to use.
    If you plan to use Oracle: is the data server connection user different from the owner of the physical schemas?
    In addition to CONNECT and RESOURCE for each of these users, you will also need to give the data server connection user privileges on the objects of the physical schema owners.
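    As a sketch of what that can look like on Oracle (all user, schema and table names here are hypothetical):

        -- Separate connection user working on the schema owners' objects.
        GRANT CONNECT, RESOURCE TO odi_conn;

        -- DML on the data schema's tables, object by object:
        GRANT SELECT, INSERT, UPDATE, DELETE ON data_owner.customers TO odi_conn;

        -- The $ work tables need DDL in the work schema; without ANY
        -- privileges this usually means connecting as the work schema
        -- owner itself, or granting on that owner's objects the same way.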

  • Limiting ODI users to physical schemas

    We have multiple Oracle users with different privileges and have created data servers for each. We want to restrict ODI users to accessing the database using only a specific Oracle user id, i.e. restrict an ODI user's access to a particular physical schema. How can I do this in ODI?

    1> Create different data servers and logical schemas for the application users, e.g. Finance, HR.
    2> Create ODI users for each of your users.
    3> Ideally, create different projects for each application (Finance, HR); if not, then separate folders for each. Then grant the ODI users object-instance-level authorization on the Projects/Folders and Models.
    4> Try giving the ODI users object-instance authorization on the logical schemas. If that doesn't work, create separate Models for Finance and HR yourself and grant authorization on them to the ODI users.
    Let me know if this helps.
    Cheers
    Suds

  • Oracle Physical schema password management

    Hi:
    After setting up an Oracle physical schema and attaching a data store, I did a lot of work reversing tables, views, etc. and using them in my integrations, procedures, and so on. I never had to look back at the physical schema until we went into production, and IT informed me that there is a password-ageing restriction for production systems. I made provisions for the other system passwords (Planning, Essbase, etc.) using variables, but I couldn't get my head around doing the same at the physical schema level.
    Given that this is a classic problem, has anyone run into this sort of issue before, and if so, what's the best practice for dealing with it?
    Any help would be greatly appreciated
    Thanks

    Rajesh,
    I see that you are using variables for Planning and Essbase. Along the same lines, you can use a variable as the password in the physical schema for database servers as well.
    Not that this is the best practice.
    But if your IT has a password-ageing restriction, how hard is it to change the password in Topology Manager? It is the same as changing it in a database table and refreshing a variable from it, but with less overhead.
    Or you can switch your system to service accounts. Using service accounts is the best practice for things like this. Ask them: how do they run automated jobs on Unix? Do they change the passwords for those accounts every now and then? I think the answer you will get is "we use service accounts for that".
    Are the database passwords stored in LDAP somewhere? Perhaps you can use JNDI to authenticate your database user/password from LDAP.
    In the Dataserver connection properties, you will see the "JNDI Connection" checkbox. If you select it, it will change the next tab to let you add LDAP details.
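    If you do go the variable route, the refresh side is just a lookup against a locked-down parameter table, along these lines (table, column and variable names are invented for illustration):

        -- Refreshing query for an ODI variable (e.g. #PROJ.TGT_DB_PASSWORD)
        -- that is then referenced in the data server's password field.
        -- Keep this table tightly secured; it holds a credential.
        SELECT param_value
          FROM odi_runtime_params
         WHERE param_name = 'TARGET_DB_PASSWORD';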
    Hope that helps

  • How to list all physical schemas in ODI procedure

    Dear Experts,
    I am working on a requirement to execute a set of SQL statements in all the schemas configured in ODI.
    For example:
    1) I have four data servers/physical schemas configured in Physical Architecture under the Oracle technology.
    2) Created the 4 corresponding logical schemas for Oracle.
    3) Mapped them using two different contexts.
    4) One context maps two of the schemas and the other context maps the other two.
    Now, I need to execute the following SQL for all the Oracle schemas mapped in the selected context.
    Just for testing: select dummy from dual
    I would like to execute this SQL in an ODI procedure for every schema mapped in the selected context, in a single execution.
    Can any expert help me on this solution?
    - Raja

    Raja,
    I was actually talking about multiple ODI step commands rather than multiple procedures. What you are trying to achieve is difficult unless you specify the logical schema in the ODI procedure, because getInfo will throw a null pointer if we don't specify the logical schema, and getSchema needs a logical schema.
    The other way I can think of is to query the work repository tables, get the schema names, and pass them in.
    Anyway, if you figure it out without passing a logical schema, please share with us; I would be interested in learning the trick.
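    For the repository-query route, a sketch could look like the following. The table and column names are as I recall them from the ODI 10g/11g master repository (SNP_CONTEXT, SNP_LSCHEMA, SNP_PSCHEMA and the context-mapping table); verify them against your own repository before relying on this:

        -- Hypothetical sketch: list the physical schemas mapped in a context.
        -- Repository table/column names vary between ODI releases.
        SELECT c.ctx_name, l.lschema_name, p.schema_name
          FROM snp_context      c
          JOIN snp_pschema_cont pc ON pc.i_context = c.i_context
          JOIN snp_pschema      p  ON p.i_pschema  = pc.i_pschema
          JOIN snp_lschema      l  ON l.i_lschema  = pc.i_lschema
         WHERE c.ctx_code = 'DEV';

    The result set could then drive a loop of procedure steps, binding each schema name in turn.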

  • Physical schema vs logical schema in ODI

    Hi, I am new to ODI. I have successfully loaded data and metadata to Essbase and Planning, but I am still not clear on why ODI uses the physical schema when we only use the logical schema for reversing, executing interfaces, etc. in Designer.

    Hi,
    A logical schema will always point to a physical schema.
    The aim of the logical schema is to ensure the portability of the procedures and models on the different physical schemas. In this way, all developments in Designer are carried out exclusively on logical schemas.
    A logical schema can have one or more physical implementations on separate physical schemas, but they must be based on data servers of the same technology. A logical schema is always directly linked to a technology.
    To be usable, a logical schema must be declared in a context. Declaring a logical schema in a context consists of indicating which physical schema corresponds to the alias - logical schema - for this context.
    Thanks,
    Sutirtha

  • Procedure not selecting SQL Server database specified in Physical Schema

    Hi all,
    I'm still green with ODI and couldn't find an answer to this after searching through the forum. I have 10.1.3.6.2 installed right now.
    I have my Physical Architecture setup with Microsoft SQL Server, Essbase, and some others. The SQL Server connection is JDBC, using integrated security (I don't think integrated security is supported but I got it to work, although I had to hack up the 10.1.3.6 upgrade script to get the patch to work...)
    Under the Physical Architecture I have several data servers. Right now the only field I specify is the Database (Catalog) and a context; everything else is standard (even Owner (Schema) is just <Undefined> right now, but changing it doesn't seem to make a difference).
    So, I have my job all designed in Designer and I can run it, and everything works just fine, except one little snag: the database in SQL Server doesn't seem to get selected automatically (by virtue of being specified in the physical schema). However, if I add a step to the job that executes "USE database_name", then everything works.
    Normally I could live with this, but the databases are named differently on different servers, so I'd prefer to get it to work from the physical schema instead of jumping through hoops with variables.
    Also, I thought about specifying the database in the JDBC URL (databaseName=X), but this won't work either (and it's probably poor form anyway).
    So... shouldn't specifying the database/catalog in the physical schema make it so that it gets selected as default and then SQL queries are running against that?
    Thanks for any help/insight, as always,
    Jason

    Jason
    Using the database option in the URL is the right way to do it, the URL I use is:
    jdbc:sqlserver://myserver:1433;database=ODIM;selectMethod=cursor
    Craig

  • Variables in physical schema of File data server

    I want to avoid hard-coding the folder name where source files are kept, as it can vary from environment to environment (e.g. dev to production).
    Can you please suggest the best way to implement this?
    How can we store it in a parameter table, and how can it be fetched in the production environment?
    Please advise.
    Thanks in advance.
    Regards,
    Dinesh.

    Hi Dinesh,
    I believe you can substitute a variable in the physical schema as well, but you cannot test it by clicking the Test button there.
    This will only work when you have declared/refreshed the variable at the beginning of a package step; the physical path is resolved at run time.
    Hope you got it now.
    The same thing works for the password field of any Oracle data server as well.
    Thanks.
    http://bhabaniranjan.com/
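    A minimal sketch of the parameter-table approach (all names invented for illustration): store the folder per environment, refresh an ODI variable from it, and reference the variable (e.g. #PROJ.SRC_DIR) in the physical schema's directory field.

        -- One row per environment; the refreshing query picks the right one.
        CREATE TABLE etl_params (
            env_code    VARCHAR2(10),
            param_name  VARCHAR2(50),
            param_value VARCHAR2(400)
        );
        INSERT INTO etl_params VALUES ('DEV',  'SRC_DIR', '/data/dev/incoming');
        INSERT INTO etl_params VALUES ('PROD', 'SRC_DIR', '/data/prod/incoming');

        -- Refreshing query of the ODI variable. I believe odiRef.getContext
        -- resolves the current context code in a refreshing query; if not in
        -- your release, filter on a value you set per environment instead.
        SELECT param_value
          FROM etl_params
         WHERE param_name = 'SRC_DIR'
           AND env_code   = '<%=odiRef.getContext("CTX_CODE")%>';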
