Import of mappings into a test schema

hi,
I have a problem importing mappings with OWB 10g Release 1 and I have not found an answer so far:
I developed all mappings in one schema and now I want to test the whole DWH application in a test schema (on the same server). Is it correct that the import via MDL only works for the same schema?
I built a new runtime repository and runtime repository user, and created the test schema with them. Then I tried to compile the packages of the mappings, but this ran into errors (ORA-01924: role not granted or does not exist).
So I think I am totally wrong ... Could you tell me how to transfer the mappings from one schema into another?
best regards,
Klaus

Hi
Create a test runtime and target, and deploy to this environment from the original design repository using a new runtime connection.
Karesz

Similar Messages

  • How to handle refreshing TEST schema with PRODUCTION schema ?

    - we have database 10g ( standard edition), database name : ABC
    - Schema name: ABC.PRODUCTION (which is our production schema)
    - Schema name: ABC.TEST (which is our testing schema, where developers work)
    Both the production & Test schemas exist in the same database.
    Now once a week I wanted to refresh TEST schema with PRODUCTION data
    Here is what I have been doing all these years:
    => Take a logical backup (EXPDP) of PRODUCTION schema (prod.dmp)
=> Drop user TEST cascade (I don't need a backup of this TEST schema)
    => Create user TEST
    => Import PROD.DMP data into TEST schema
    All the above 4 steps are being done manually.
    Questions:
    ======
1. Is there an easier way of doing the above steps using some tool?
2. Does Oracle Enterprise Manager, which comes free with the database installation (http://localhost:118/em),
have any utility or tool to do this job?
3. I want everything to be refreshed (all database objects, including data).
    Thanks
    John P

This is crazy. One inadvertent typo and you'll overwrite your Production schema. Plus, what happens if a developer 'tests' against the test schema and slows the Production database to a crawl?
I presume you know all about this, though, and can't make the case to management. I hope it's not a business-critical Production database!
Anyway, your method is decent. I would advise against doing it automatically, to be honest, especially when your system is so precariously set up. But if you insist, you could encapsulate all the steps into a script and use crontab to automate the process. Personally, I wouldn't use DBMS_SCHEDULER, as you have to be careful with priorities and workload (at least in my experience) and your export/import might end up clashing with other jobs in the system if you don't pay attention.
Here are the steps I would use (a minimal sketch of the core commands follows the list):
1. Create a 'create user' script for the test schema based on dynamic SQL. That way you can be sure you have all the grants necessary for the user, in case things change.
2. Drop the test user (use EXTRA caution and be defensive when coding this part!).
3. Export the schema using FLASHBACK_SCN to ensure you have a consistent export.
4. Run your 'create user' script to recreate the test user.
5. Import the schema with the REMAP_SCHEMA option (use EXTREME caution with this!).
6. Compile invalid objects.
7. Compare object counts, excluding any recycle-bin objects, and send an email alert if the counts differ.
8. Compare invalid objects. Any objects that are not invalid in Production should be flagged if they are invalid in test.
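A minimal sketch of steps 3, 5 and 6 (assuming Data Pump, an existing directory object DMP_DIR, and placeholder credentials; FLASHBACK_TIME stands in for FLASHBACK_SCN, and the defensive drop/create-user logic is deliberately omitted):
# step 3: consistent export of the PRODUCTION schema
expdp system/***@ABC schemas=PRODUCTION directory=DMP_DIR dumpfile=prod.dmp logfile=prod_exp.log flashback_time=systimestamp
# step 5: import into TEST, remapping the schema name
impdp system/***@ABC directory=DMP_DIR dumpfile=prod.dmp logfile=test_imp.log remap_schema=PRODUCTION:TEST
# step 6: recompile invalid objects in TEST (run in SQL*Plus)
EXEC DBMS_UTILITY.compile_schema(schema => 'TEST')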
    Again, it's absolute insanity to have a test schema in a Production database. You absolutely must insist on addressing that with management.
    Mark

  • How to synchronize test schema objects with the prod schema objects.

    Hi,
I have a requirement to synchronize test schema objects with the production schema objects. Please let me know:
1. if there is a standardized method for such an activity,
2. if there are Oracle utilities for this task,
3. if I had to do this job manually, what the checklist would be.
    Thanks
    Purushotham M

    http://www.oracle.com/technetwork/issue-archive/2012/12-sep/o52sqldev-1735911.html
You could try the database diff tool in SQL Developer (but there are some licence restrictions).
I don't know your database version, but you could also try the DBMS_COMPARISON package.
Look at this link: http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_comparison.htm
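A hedged sketch of the DBMS_COMPARISON route (assuming 11g, a table T1 that exists in both schemas with a usable index, and a DB link named db_link_to_other_database; all names are placeholders):
BEGIN
  DBMS_COMPARISON.create_comparison(
    comparison_name => 'CMP_T1',
    schema_name     => 'TEST',
    object_name     => 'T1',
    dblink_name     => 'db_link_to_other_database');
END;
/
DECLARE
  scan_info  DBMS_COMPARISON.comparison_type;
  consistent BOOLEAN;
BEGIN
  -- compare rows and record differences for later inspection
  consistent := DBMS_COMPARISON.compare(
    comparison_name => 'CMP_T1',
    scan_info       => scan_info,
    perform_row_dif => TRUE);
  IF NOT consistent THEN
    DBMS_OUTPUT.put_line('Differences found, scan id ' || scan_info.scan_id);
  END IF;
END;
/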
Another solution is to create a DB link between the test and production databases, and then try queries like:
    select table_name from user_tables
    minus
    select table_name from user_tables@db_link_to_other_database
    And you can do this for columns, indexes and so on.
But you need proper DDL scripts for this, to generate a sync script.
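For example, the same idea applied to column definitions (a sketch, reusing the db_link_to_other_database name from above):
select table_name, column_name, data_type, data_length
from user_tab_columns
minus
select table_name, column_name, data_type, data_length
from user_tab_columns@db_link_to_other_database;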
Also, there is a question about your work process: you are doing the sync in reverse order (from production to test). The test DB is for testing; after testing you go to the production DB with proper DDL and DML scripts, so these schemas shouldn't be different in the first place (talking about schema, not data, here).

  • Deploying mappings in non-target schema

    Hi,
How can I deploy my mappings in a schema other than the target schema (where all the target facts/dimensions are present)? I want the OWB-generated mapping packages in a different schema. How can I accomplish this?
    Thanks,
    Srinivas.

    Hi Srinivas,
    You connect to the runtime repository using the runtime access user.
Then you register your actual target users (you would now have 2). When you deploy objects to the target locations (a.k.a. users), they will end up in that user, not in the one you log on with...
If you want to do all of this nicely (just realized that), you should have 2 modules in the design. If you do this you do not need the synonyms...
    So you would have a module called packages, a module called schema_objects. Then you have 2 locations related to these 2 modules (package_location and object_location respectively).
    When you go to the deployment manager, you will see a tree with package_location and object_location. Click register for those.
    Package_location will be user PACK
    object_location will be user OBJECTS_HERE
    Deploy the lot (packages into PACK of course) and make sure to deploy the connector (this allows OWB to generate the required schema references, replacing the synonyms I suggested before).
    Jean-Pierre

  • Import Data Table from Different DB Schema in BI Administration

    Here is my scenario:
Tables are imported from different DB schemas in Administration. They are then joined together.
When I select fields from more than one schema in BI Answers, an incorrect result is returned.
    Below is the SQL it returns:
    -------------------- Sending query to database named DW_TESTING (id: <<15875>>):
    select D1.c4 as c1,
    D1.c5 as c2,
    D1.c2 as c3,
    D1.c1 as c4,
    D1.c3 as c5
    from
(select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3,
    D1.c4 as c4,
    D1.c5 as c5
    from
(select sum(T34.PRODUCT_SALES) as c1,
    T52.DESCRIPTION as c2,
    T52.BE_TYPE as c3,
    T137.YEAR as c4,
    T137.QUARTER as c5,
    ROW_NUMBER() OVER (PARTITION BY T52.BE_TYPE, T137.QUARTER ORDER BY T52.BE_TYPE ASC, T137.QUARTER ASC) as c6
    from
    BEADMIN.BE_MASTER T52,
    DWADMIN.DATE_MASTER T137,
    BEADMIN.BE_DAILY_SALES T34
    where  ( T34.BE_TYPE = T52.BE_TYPE and T34.SALES_DATE = T137.CALENDAR_DATE )
    group by T52.BE_TYPE, T52.DESCRIPTION, T137.YEAR, T137.QUARTER
) D1
    where  ( D1.c6 = 1 )
) D1
    order by c2
    +++Administrator:7e0000:7e001f:----2009/06/08 17:22:09
    -------------------- Sending query to database named DW_TESTING (id: <<15938>>):
    select D2.c2 as c1,
    D2.c3 as c2,
    D2.c1 as c3
    from
(select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3
    from
(select sum(T109.PRODUCT_SALES) as c1,
    T137.YEAR as c2,
    T137.QUARTER as c3,
    ROW_NUMBER() OVER (PARTITION BY T137.QUARTER ORDER BY T137.QUARTER ASC) as c4
    from
    DWADMIN.DATE_MASTER T137,
    DWADMIN.DAILY_SALES T109
    where  ( T109.SALES_DATE = T137.CALENDAR_DATE )
    group by T137.YEAR, T137.QUARTER
) D1
    where  ( D1.c4 = 1 )
) D2
    order by c2
It seems that the query has been separated into two, with no relationship between them.
However, I also tested the same situation in another environment, and it returns the correct result. There the query is not separated into two but issued as one. The query is as below:
    -------------------- Sending query to database named BI (id: <<2392>>):
    WITH
    SAWITH0 AS (select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3,
    D1.c4 as c4
    from
(select sum(T29.PRODUCT_SALES) as c1,
    T96.DESCRIPTION as c2,
    T120.YEAR as c3,
    T120.QUARTER as c4,
    ROW_NUMBER() OVER (PARTITION BY T96.DESCRIPTION, T120.QUARTER ORDER BY T96.DESCRIPTION ASC, T120.QUARTER ASC) as c5
    from
    SIMULATION.BE_MASTER T96,
    SIMULATION.DATE_MASTER T120,
    BIEE.BE_DAILY_SALES T29
    where  ( T29.BE_TYPE = T96.BE_TYPE and T29.SALES_DATE = T120.CALENDAR_DATE )
    group by T96.DESCRIPTION, T120.YEAR, T120.QUARTER
) D1
    where  ( D1.c5 = 1 ) ),
    SAWITH1 AS (select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3
    from
(select sum(T102.PRODUCT_SALES) as c1,
    T120.YEAR as c2,
    T120.QUARTER as c3,
    ROW_NUMBER() OVER (PARTITION BY T120.QUARTER ORDER BY T120.QUARTER ASC) as c4
    from
    SIMULATION.DATE_MASTER T120,
    SIMULATION.DAILY_SALES T102
    where  ( T102.SALES_DATE = T120.CALENDAR_DATE )
    group by T120.YEAR, T120.QUARTER
) D1
    where  ( D1.c4 = 1 ) )
    select distinct case  when SAWITH0.c3 is not null then SAWITH0.c3 when SAWITH1.c2 is not null then SAWITH1.c2 end  as c1,
    case  when SAWITH0.c4 is not null then SAWITH0.c4 when SAWITH1.c3 is not null then SAWITH1.c3 end  as c2,
    SAWITH0.c2 as c3,
    SAWITH1.c1 as c4,
    SAWITH0.c1 as c5
    from
    SAWITH0 full outer join SAWITH1 On nvl(SAWITH0.c4 , 8) = nvl(SAWITH1.c3 , 8) and nvl(SAWITH0.c4 , 9) = nvl(SAWITH1.c3 , 9)
    order by c1, c2, c3
This query returns the correct result and seems reasonable.
Assuming the setup in BI Administration is the same for both environments, why does the first one generate separated queries?

I have observed the same behavior, but not with two different environments and the same design.
What can influence the construction of two queries?
Here are some ideas:
- if you have two connection pools (I assume not)
- if, in your database features (double-click the database icon and choose the Features tab), full outer join is not supported
But I have these same two requirements on my laptop and I can see that two SQLs are also fired.
What you can do is compare your two repositories.
In the Administration Tool, use File/Compare.
The result will be very interesting.

  • Import of BaseView to a different schema in database

    Hi All,
We have developed some BAM reports. Now the base view and meta view refer to a schema name (say SchemaName1) in the database.
In the plans we are using "SQL query" to get data from the database, so we have queries like "Select SchemaName1.table1.column1 ... from ......".
Now we have taken an export of the base view, meta view, plans and all other components of BAM.
All is fine till now... But now I need to import all of this into a different database. My problem is that in the new database instance I cannot use the same schema name (i.e. SchemaName1). I have a different schema name there (say SchemaName2) and I need to use that... There is no way I can use SchemaName1.
Will the import of base view/meta view/plans work, or do we need to make some configuration changes during import?
Do we have to create the base view for the new schema names from scratch, or can the imported code be used?
The plans have SQL queries referring to SchemaName1, but we need them to refer to the other schema (SchemaName2). What is the possible way of referring to the new schema without developing the plans again?
    Waiting for your valuable feedback...

    ***Backup your repository database***
    ***Please review all steps before attempting***
    Sarputil Export
    1.     Start | Run, type “sarputil”
    2.     Do a partial export. (Note down directory that will contain files to be exported.).
    3.     Select your Baseview, Metaview, Security:Login Profile, and Plan. (Note, the check boxes are in front of each object name and might appear very light on some clients.)
    4.     On the XML dialog and with the question on what you would like to do – select Execute Now.
    5.     Finish the wizard and from the directory noted above, verify .csv files were created. (Typically C:\OracleBAM\EnterpriseLink\Data\rp\export).
    6.     Copy all .csv files and paste to different temporary folder on 2nd server.
    Sarputil Import
    1.     Start | Run, type “sarputil” on the 2nd server where you’ll import into.
    2.     Partial import
    3.     Under Repository Information dialog, set the Data Source information on the right and remember to change the directory now containing your .csv files.
    4.     Like before, select “Execute now” and Finish the wizard.
    Oracle BAM Admin
    Modify connect string (host string or tns name) if you need to:
    1.     Connect in Oracle BAM Admin
    2.     Expand Baseviews and select the imported Baseview
    3.     Check under “Server” on the left if the connect string is correct.
    4.     If not, click Modify button and change.
    Modify Baseview Login if you need to:
    1.     With your Baseview still selected on the left side, click on “Baseview Logins” tab
2.     Logins on the left correspond to actual database user IDs and passwords. Add a new Login with a database User ID and Password to access the new schema.
    3.     Associate or Set the new Login (right) to the BAM User on the left.
    Sarpbv Modification
    Use sarpbv utility to change references of the old schema name to a new schema name.
    General syntax for Oracle:
    sarpbv /R"username:pwd:Oracle:TNS Name::DB UserID:DBUser pwd” /B"BaseView name" /O"NewSchema"
    **Use Capital letters for New Schema name!
    **Notice 2 colons (::) after TNS Name (because you do not have a database name in Oracle).
    1.     Open DOS
    2.     Type your sarpbv command. Below is an example where I changed a Baseview called “scott” to use a new schema name called Jack.
    sarpbv /R"sa::ORACLE:baminst::sagent:sagent" /B"scott" /O"JACK"
    3.     Open Design Studio, locate the plan and check the SQL Query now reflects the new schema.
    4.     Test the plan to ensure it’s pulling data correctly from the new schema.

  • Export and Import of mappings/process flows etc

    Hi,
    We have a single repository with multiple projects for DEV/UAT and PROD of the same logical project. This is a nightmare for controlling releases to PROD and in fact we have a corrupt repository as a result I suspect. I plan to split the repository into 3 separate databases so that we have a design repos for DEV/UAT and PROD. Controlling code migrations between these I plan to use the metadata export and subsequent import into UAT and then PROD once tested. I have used this successfully before on a project but am worried about inherent bugs with metadata export/imports (been bitten before with Oracle Portal). So can anyone advise what pitfalls there may be with this approach, and in particular if anyone has experienced loss of metadata between export and import. We have a complex warehouse with hundreds of mappings, process flows, sqlldr flatfile loads etc. I have experienced process flow imports that seem to lose their links to the mappings they encapsulate.
    Thanks for any comments,
    Brandon

This should do the trick for you: it looks for "PARALLEL", so it only removes the APPEND PARALLEL hint and leaves other hints as they are...
#set current location
set path "C:/TMP"
# Project parameters
set root "/MY_PROJECT"
set one_Module "MY_MODULE"
set object "MAPPINGS"
# OMBPLUS and tcl related parameters
set action "remove_parallel"
set datetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
set ext ".log"
set sep "_"
set ombplus "OMBplus"
set omblogname $path/$one_Module$sep$object$sep$datetime$sep$ombplus$ext
set OMBLOG $omblogname
set logname $path/$one_Module$sep$object$sep$datetime$ext
set log_file [open $logname w]
set word "PARALLEL"
set i 0
# Helper: write a timestamped message to both the screen and the log file
proc log_msg {log_file msg} {
    set now [clock format [clock seconds] -format "%Y%m%d %H:%M:%S"]
    puts "$now --> $msg"
    puts $log_file "$now --> $msg"
}
#Connect to OWB Repository
OMBCONNECT .... your connect string
#Ignore errors that occur in any command that is part of the script and move to the next command.
set OMBCONTINUE_ON_ERROR ON
OMBCC "'$root/$one_Module'"
#Search mappings for PARALLEL in loading/extraction hint properties
log_msg $log_file "Searching for Loading/Extraction Operators set at Parallel"
log_msg $log_file "Module: $one_Module, Object_type: Mapping"
foreach mapName [OMBLIST MAPPINGS] {
    # TABLE, DIMENSION, CUBE and VIEW operators all carry the same two hint properties
    foreach opType {TABLE DIMENSION CUBE VIEW} {
        foreach opName [OMBRETRIEVE MAPPING '$mapName' GET $opType OPERATORS] {
            foreach hint {LOADING_HINT EXTRACTION_HINT} {
                foreach prop [OMBRETRIEVE MAPPING '$mapName' OPERATOR '$opName' GET PROPERTIES ($hint)] {
                    # Only clear hints that actually contain PARALLEL; other hints are left as is
                    if { [regexp $word $prop] == 1 } {
                        incr i
                        log_msg $log_file "Mapping: $mapName, Operator: $opName, $hint: $prop"
                        OMBALTER MAPPING '$mapName' MODIFY OPERATOR '$opName' SET PROPERTIES ($hint) VALUES ('')
                        OMBCOMMIT
                    }
                }
            }
        }
    }
}
if { $i == 0 } {
    log_msg $log_file "Module: $one_Module, Object_type: Mapping"
    log_msg $log_file "Not found any Loading/Extraction Operators set at Parallel"
} else {
    log_msg $log_file "Module: $one_Module, Object_type: Mapping"
    log_msg $log_file "Fixed $i Loading/Extraction Operators set at Parallel"
}
close $log_file
    Enjoy!
    Michel

  • JDBC Lookup - Import table data from a different schema in same DB

    Hi XI Experts,
    We are facing an issue while importing a Database table into the external definition in PI 7.1.
    The details are as below:
I have configured user 'A' in the PI communication channel to access the database, but the table that I want to access is present in schema "B". Due to this, the table I have to import does not appear in the available list.
In other words, I am trying to access a table present in a different schema in the same database. Please note that my user has been given all the required permissions to access the different schema. Even then, I am unable to access the table in the other schema.
Kindly provide your valuable suggestions as to how I can import a table which is present in another schema but in the same database.
    Regards,
    Subbu

If you are using PI 7.1, then you can do a JDBC Lookup to import JDBC metadata (table structures from the DB). Configure a JDBC receiver communication channel where you specify a username and password with permission to access schema A and schema B of the database. Specify the database name in the connection string. Then you should be able to import from both schemas.
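For example (a sketch for an Oracle backend; host, port, SID and the schema prefix are placeholders):
Connection string: jdbc:oracle:thin:@dbhost:1521:ORCL
Lookup SQL, qualifying the table with its schema: select column1 from B.table1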
    Please refer these links
    SAP PI 7.1 Mapping Enhancements Series: Graphical Support for JDBC and RFC Lookups
    How to use JDBC Lookup in PI 7.1 ?

  • Performance slows down when moving from stage to test schema within same instance with same database table and objects

We have created a stage schema and tested the application, which works fine. When we move it to another schema for further testing (this schema is created using the same scripts that were used to create the objects in the staging schema), the performance of the application (developed in .NET) slows down drastically.
Some of the stored procedures we have checked at the database/SQL Developer level give almost the same performance, but at the application level there is a lot of difference.
Can you please help?
We are using Oracle 11g Database.

    Are you using the Database Cloud Service?  You cannot create schemas in the Database Cloud Service, which makes me think you are not.  This forum is only for the Database Cloud Service.
    - Rick Greenwald

  • How to export and import dependent tables from 2 different schema

I have a setup where schema1 has table1 and schema2 has table2, and table1 from schema1 depends upon table2 from schema2.
I would like to export and import these tables only, and not any other tables from these 2 schemas, with all information like grants and constraints.
Also, will the same method work for Oracle 10g R1/R2 and Oracle 11g?
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
Looking at this, for table mode it says:
    Also, as in schema exports, cross-schema references are not exported
    Not sure what this means.
As I am interested in only 2 tables, I think I need to use table mode. But if I try to run the export with both table names, it says table mode supports only one schema at a time. Not sure then how the constraints would get exported in that case.
    -Rohit

It worked the first time I tried:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
    Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    Current user changed to DBADMIN
    . . exporting table                          TEMP1         10 rows exported
    EXP-00091: Exporting questionable statistics.
    Current user changed to SCOTT
    . . exporting table                            EMP         14 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
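The import side could then be a single run of classic imp (a sketch, assuming the target users already exist; the FROMUSER and TOUSER lists are matched positionally):
imp file=table2.dmp fromuser=dbadmin,scott touser=dbadmin,scott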

  • SQL Server 2012 Import-Module 'sqlps' breaks the "Test-Path" PowerShell cmdlet

I've run into something that is "very" frustrating with the new SQL Server 2012 PowerShell module. When I import the module, it breaks the "Test-Path" cmdlet when trying to test a UNC path to a directory.
    For example:  
    "Test-Path -path \\server\dirname" returns true as expected before the sqlps module is imported.  But after you import the SQL Server module "Import-Module 'sqlps' –DisableNameChecking" the same Test-Path
    now returns false.
    If I run the following in Windows PowerShell ISE I see the following results:
    Test-Path -path "\\server\directoryname"
Import-Module 'sqlps' -DisableNameChecking
    Test-Path -path "\\server\directoryname"
    True
    False
    Anyone have any idea what's going on?
UPDATE: after more testing, it looks like the problem happens with any cmdlet that references a UNC path. New-Item has the same problem. Before importing 'sqlps', New-Item is able to create a directory at the UNC path specified, but after importing 'sqlps', New-Item fails.
    Thanks!
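A likely explanation (an assumption, not confirmed from the logs above) is that importing sqlps switches the current location to the SQLSERVER: provider, so UNC paths get resolved against it. Two workarounds in that case: qualify the path with the FileSystem provider, or switch back to a file-system drive first:
Import-Module 'sqlps' -DisableNameChecking
# workaround 1: fully qualify the UNC path with the FileSystem provider
Test-Path -Path "Microsoft.PowerShell.Core\FileSystem::\\server\directoryname"
# workaround 2: change back to a file-system drive before touching UNC paths
Set-Location C:
Test-Path -Path "\\server\directoryname"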

    Hi Mikea730,
    Sqlps.exe doesn't take advantage of a couple of these nice PowerShell V2 cmdlets without doing a bit of configuring in your environment. 
Please refer to the following references to configure your environment:
    http://www.maxtblog.com/2010/11/denali-get-your-sqlpsv2-module-set-to-go/
    http://www.simple-talk.com/sql/database-administration/practical-powershell-for-sql-server-developers-and-dbas-%E2%80%93-part-1/
    http://sev17.com/2010/07/making-a-sqlps-module/
    Thanks,
    TechNet Subscriber Support
    Iric Wen
    TechNet Community Support

  • Export/import of mappings using OMBPLUS and Collections?

    Hi,
    We've got some good TCL code to export (and import) mappings for our projects along the lines of:
    OMBEXPORT MDL_FILE '$projname.mdl' \
    PROJECT '$projname' \
    CONTROL_FILE '$controlfile.txt' \
    OUTPUT LOG '$projname.log'
    and now we're moving to organizing mappings by OWB COLLECTION and need to export/import by collection instead. Thought it might be as simple as this:
    OMBEXPORT MDL_FILE '$collname.mdl' \
    FROM COMPONENTS ( COLLECTION '$collname')
    CONTROL_FILE '$controlfile.txt' \
but that's not working (it says "Project $collname does not exist."), so it seems to still be treating what I thought was a collection name as a project name. Can someone point out the right syntax to use COLLECTIONs in OMBEXPORT/OMBIMPORT?
    All thoughts/tips appreciated...
    Thanks,
    Jim C.

    Hi,
If we use the below tcl script for the entire project export, does it also export the locations, control center and configurations along with it?
If yes, then can you give me the script for altering the locations and control center/configuration connection details?
    OMBEXPORT MDL_FILE '$projname.mdl' \
    PROJECT '$projname' \
    CONTROL_FILE '$controlfile.txt' \
    OUTPUT LOG '$projname.log'
Can you also provide me the tcl for a project import?
    Thanks

  • JAX-WS generated WSDL uses xsd:import and no option to inline schemas

    Is it possible to configure JAX-WS 2.0/2.1 so that when generating WSDL at runtime it will inline the schema definitions instead of using the <xsd:import> method? Unfortunately, some clients (Adobe Flex, for example) do NOT know how to process schema imports. I realize that Adobe (and others) should fix the problem on their end and become fully spec compliant, but the cold reality is that they are not.
    Any advice, suggestions, or solutions would be greatly appreciated.
    Regards,
    Todd

    You should post this question either to the JAXB 2.0 and JAX-WS 2.0 forum (http://forums.java.net/jive/forum.jspa?forumID=46) or the JAX-WS dev mailing list ([email protected]) so that the team is aware of the issue.
That functionality currently does not exist. The only workaround that I am aware of is to generate the WSDL before deployment and rearrange it as necessary.
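For example, a hypothetical offline flow with the JDK's wsgen tool (the class name and paths are placeholders):
wsgen -cp build/classes -wsdl -r wsdl-out com.example.MyServiceImpl
# then hand-edit the generated WSDL, pasting the .xsd contents into <wsdl:types> before packaging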

  • Export/Import with apps user or EUL schema owner ?

    Hi,
    I am working with a migration plan to move Discoverer 9 to Discoverer 10.
    It is an apps mode EUL, where business areas and workbooks have been granted to Apps Users & Responsibilities.
    Customer has only maintained the EUL with the database user of the EUL owner.
What is recommended for exporting/importing the EUL from 9 to 10:
to connect to Administrator as the EUL owner, or to connect as an apps user (SYSADMIN)?
Will all the grants to apps users/responsibilities work when importing as the schema owner?
I tried to create an entire EUL export file connected as the EUL owner, and for the workbooks the export log contained:
    1234#.Workbook name1
    2345#.Workbook name 2
1234 is the user_id of the workbook owner (all workbooks have been created by an apps user).
Will the import process manage to convert this user_id and set the correct owner for the imported workbooks?
(I will use the option 'Only take ownership if original owner not found'.)
I am not sure, but I think I have seen the following syntax in other projects when exporting an apps-mode EUL:
    SYSADMIN.Workbook name 1
    SYSADMIN.Workbook name 2

    >
What is recommended for exporting/importing the EUL from 9 to 10:
to connect to Administrator as the EUL owner, or to connect as an apps user (SYSADMIN)?
Will all the grants to apps users/responsibilities work when importing as the schema owner?
    Hi,
The best that I know is that you should export using the DB user, so that you will not have problems with workbooks or BAs that you are not granted access to (or any other grants or privileges).
The import should be done using the APPS user (the super user you use).
That way, if you import a workbook whose owner is not found, you will take the ownership and can deal with it after migration.
If you do that with the DB (EUL owner) user, after migration you will not have access to those workbooks.
Anyway, regarding the workbooks, I suggest you save them as .DIS files in case the import of the workbooks or the owner association for them fails.

  • Unable to Export/Import between different CPS Enviornments( Test& QAS)

    HI all,
I am facing some issues while doing export and import of objects from one CPS host to another.
I did see another thread mentioning the same, but it was still not clear.
I am trying to export an application from one host to another.
My case:
Steps taken to export:
> Selected SAP System in Application under SAP and did Export Tree. The export is successful. Copied the *.car file locally to the desktop.
I go to the other CPS host and do Import Rule Set. I don't put any rule there but just submit the job. It asks to upload the file to the server; I give the path to the *.car file. Once done, the import job runs but goes to error status.
    I have attached the logs.
    ERROR 2011-07-22 15:31:40,977 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.2835 - Import failed, no objects have been imported.
    -- JOB RUN STACK TRACE --
    JCS-113051: Exception while parsing source jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_2835carin.car.car!/SAPSystem/GLOBAL.QSR.xml at line 22: JCS-113004: No such field LoadThreshold on object type SAPSystem. See cause for further information
         at com.redwood.scheduler.model.imprt.BaseImporter.importObject(BaseImporter.java:297)
         at com.redwood.scheduler.model.imprt.ImportScanner.importObject(ImportScanner.java:52)
         at com.redwood.scheduler.model.imprt.BaseImporter.importObject(BaseImporter.java:185)
         at com.redwood.scheduler.model.imprt.CronacleArchiveReader.scanArchive(CronacleArchiveReader.java:111)
         at com.redwood.scheduler.model.imprt.CronacleArchiveReader.importAll(CronacleArchiveReader.java:84)
         at com.redwood.scheduler.system.jobs.CronacleArchiveImport.runInternal(CronacleArchiveImport.java:396)
         at com.redwood.scheduler.system.jobs.CronacleArchiveImport.run(CronacleArchiveImport.java:192)
         at com.redwood.scheduler.system.jobs.CommonSystemJob.execute(CommonSystemJob.java:53)
         at com.redwood.scheduler.systemjobservice.impl.JobWorker.doWork(JobWorker.java:242)
         at com.redwood.scheduler.infrastructure.workqueue.Worker.run(Worker.java:74)
I do feel I have to use an import rule, but what field do I need to include to replace as per the new host?
Secondly, when I try to export EVENTS, I am able to do so successfully by giving an export rule for specific events.
But when I import it, I run into issues.
    Error Log:
    INFO  2011-07-22 10:06:24,091 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.2685 - Starting to import all definitions from archive.
    INFO  2011-07-22 10:06:24,163 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.2685 - Scanning : jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_2685carin.car.car!/EventDefinition/GLOBAL.QFS_Detect_Z_C6_ABSA_119020.xml
    ERROR 2011-07-22 10:06:24,662 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.2685 - Unable to parse input source: jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_2685carin.car.car!/EventDefinition/GLOBAL.QFS_Detect_Z_C6_ABSA_119020.xml line 9
I guess again I should be using the import rule correctly.
What fields should I be replacing in the import rule? Is it going to be Application (as a field) or partition or anything else?
Please advise if I am doing something wrong.
    Thank You
    Deepak

    Hi ,
There was a version mismatch between the target CPS host and the source CPS host.
I brought the source CPS host to the same version as the target CPS (Build: M33.21-49924). But still, when I do the application export from the source CPS to the target CPS, I receive the same error as below:
    INFO  2011-07-25 10:12:37,009 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.3913 - Starting to import all definitions from archive.
    INFO  2011-07-25 10:12:37,055 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.3913 - Scanning : jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_3913carin.car.car!/JobDefinition/GLOBAL.Z_SRM-220_I_POW_UPD_ERP_DAT_POWL.xml
    INFO  2011-07-25 10:12:37,057 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.3913 - Scanning : jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_3913carin.car.car!/Application/GLOBAL.ESKOM_QUALITY_SRM_SYSTEM.xml
    ERROR 2011-07-25 10:12:37,069 Africa/Harare [Redwood Job Thread Pool: GLOBAL.System.System worker 0] job.System_Import_Archive.3913 - Unable to parse input source: jar:file:/C:/usr/sap/QRP/JC00/j2ee/cluster/server0/scheduler/JOB_3913carin.car.car!/Application/GLOBAL.ESKOM_QUALITY_SRM_SYSTEM.xml line 7
    JCS-113007: Cannot set field User/Role on Grant on Application <Name Not Set> to <Grantee Subject Not Set>:<Grantee Subject Not Set> to Subject:Role.Everyone
Please advise.
    Regards
    Deepak
