Importing a BaseView into a different schema in the database

Hi All,
We have developed some BAM reports. The Base View and Meta View currently refer to a schema name (say SchemaName1) in the database.
In the plans we are using "SQL query" to get data from the database, so we have queries like "Select SchemaName1.table1.column1 ... from ......".
We have now taken an export of the Base View, Meta View, plans and all other BAM components.
All is fine till now... But now I need to import all of this into a different database instance. My problem is that in the new database instance I cannot use the same schema name (i.e. SchemaName1). I have a different schema name there (say SchemaName2) and I need to use that; there is no way I can use SchemaName1.
Will the import of the Base View / Meta View / plans work, or do we need to make configuration changes during the import?
Do we have to create the Base View for the new schema from scratch, or can the imported definitions be reused?
The plans have SQL queries referring to SchemaName1, but we need them to refer to the other schema (SchemaName2). What is the possible way of referring to the new schema without developing the plans again?
Waiting for your valuable feedback...

***Back up your repository database***
***Please review all steps before attempting***
Sarputil Export
1.     Start | Run, type “sarputil”
2.     Do a partial export. (Note down the directory that will contain the exported files.)
3.     Select your Baseview, Metaview, Security:Login Profile, and Plan. (Note, the check boxes are in front of each object name and might appear very light on some clients.)
4.     On the XML dialog and with the question on what you would like to do – select Execute Now.
5.     Finish the wizard and from the directory noted above, verify .csv files were created. (Typically C:\OracleBAM\EnterpriseLink\Data\rp\export).
6.     Copy all .csv files to a temporary folder on the 2nd server.
Sarputil Import
1.     Start | Run, type “sarputil” on the 2nd server where you’ll import into.
2.     Partial import
3.     On the Repository Information dialog, set the Data Source information on the right and remember to change the directory to the one now containing your .csv files.
4.     Like before, select “Execute now” and Finish the wizard.
Oracle BAM Admin
Modify connect string (host string or tns name) if you need to:
1.     Connect in Oracle BAM Admin
2.     Expand Baseviews and select the imported Baseview
3.     Check under “Server” on the left whether the connect string is correct.
4.     If not, click the Modify button and change it.
Modify Baseview Login if you need to:
1.     With your Baseview still selected on the left side, click on “Baseview Logins” tab
2.     Logins on the left correspond to actual database user IDs and passwords. Add a new Login with a database User ID and Password that can access the new schema.
3.     Associate or Set the new Login (right) to the BAM User on the left.
Sarpbv Modification
Use sarpbv utility to change references of the old schema name to a new schema name.
General syntax for Oracle:
sarpbv /R"username:pwd:Oracle:TNS Name::DB UserID:DBUser pwd” /B"BaseView name" /O"NewSchema"
**Use Capital letters for New Schema name!
**Notice 2 colons (::) after TNS Name (because you do not have a database name in Oracle).
1.     Open a Command Prompt (DOS) window.
2.     Type your sarpbv command. Below is an example where I changed a Baseview called “scott” to use a new schema name called JACK.
sarpbv /R"sa::ORACLE:baminst::sagent:sagent" /B"scott" /O"JACK"
3.     Open Design Studio, locate the plan and check that the SQL Query now reflects the new schema.
4.     Test the plan to ensure it’s pulling data correctly from the new schema.
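For reference, a minimal sketch (with made-up table and column names) of how a plan's SQL query would read before and after the schema change:

Before:  Select SchemaName1.table1.column1, SchemaName1.table1.column2 from SchemaName1.table1 where ...
After:   Select SchemaName2.table1.column1, SchemaName2.table1.column2 from SchemaName2.table1 where ...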

Similar Messages

  • JDBC Lookup - Import table data from a different schema in same DB

    Hi XI Experts,
    We are facing an issue while importing a Database table into the external definition in PI 7.1.
    The details are as below:
    I have configured user 'A' in the PI communication channel to access the database, but the table that I want to access is present in schema "B". Because of this, I am unable to see the table that I have to import in the list of available tables.
    In other words, I am trying to access a table present in a different schema in the same database. Please note that my user has been given all the required permissions to access the other schema. Even then, I am unable to access the table in the different schema.
    Kindly provide your valuable suggestions as to how I can import a table which is present in another schema but in the same database.
    Regards,
    Subbu

    If you are using PI 7.1, then you can do a JDBC Lookup to import JDBC metadata (table structures from the DB). Configure a JDBC receiver communication channel where you specify a username and password that has permission to access both schema A and schema B of the database, and specify the database name in the connection string. Then you should be able to import from both schemas.
    Please refer these links
    SAP PI 7.1 Mapping Enhancements Series: Graphical Support for JDBC and RFC Lookups
    How to use JDBC Lookup in PI 7.1 ?
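    As a generic fallback (not specific to the PI import wizard, and assuming the channel user has been granted SELECT on the other schema), the table can also be referenced with a schema-qualified name directly in the SQL statement. A minimal sketch with placeholder names:
    SELECT col1, col2 FROM B.target_table WHERE key_col = ?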

  • How to export and import dependent tables from 2 different schemas

    I have a setup where schema1 has table1 and schema2 has table2, and table1 from schema1 depends on table2 from schema2.
    I would like to export and import only these tables (and not any other tables from these 2 schemas) with all related information such as grants and constraints.
    Also, will the same method work for Oracle 10g R1, R2 and Oracle 11g?
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
    Looking at this, for table mode it says:
    Also, as in schema exports, cross-schema references are not exported
    I am not sure what this means.
    As I am only interested in 2 tables, I think I need to use table mode. But if I try to run the export with both table names, it says table mode supports only one schema at a time, so I am not sure how the constraints would get exported in that case.
    -Rohit

    It worked the first time I tried:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
    Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    Current user changed to DBADMIN
    . . exporting table                          TEMP1         10 rows exported
    EXP-00091: Exporting questionable statistics.
    Current user changed to SCOTT
    . . exporting table                            EMP         14 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
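    A hedged sketch of the matching legacy import on the target database (the dump already contains only the two tables, so a full import of the file brings them in together with their grants and constraints; file and log names are placeholders):
    imp file=table2.dmp full=y grants=y constraints=y log=table2_imp.log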

  • How can I import tables from a different schema into an existing relational model?

    How can I import tables from a different schema into the existing relational model, so that these tables are added to the existing relational/logical model?
    Note: I already have the relational/logical model ready from one schema, and I need to add a few more tables to it.
    Can I import the same way as I did previously?
    But even if I do, how can I add the tables to the model, given that the logical model has already been engineered?
    Please help.
    thanks

    Hi,
    Before you start, you should probably take a backup copy of your design (the .dmd file and associated folder), in case the update does not work out as you had hoped.
    You need to use Import > Data Dictionary again, to start the Data Dictionary Import Wizard.
    In step 1 use a suitable database connection that can access the relevant table definitions.
    In step 2 select the schema (or schemas) to import.  The "Import to" field in the lower left part of the main panel allows you to select which existing Relational Model to import into (or to specify that a new Relational Model is to be created).
    In step 3 select the tables to import.  (Note that if there are any Foreign Key constraints between the new tables and any tables you had previously imported, you should also include the previous tables, otherwise the Foreign Key constraints will not be imported.)
    After the import itself has completed, the "Compare Models" dialog is displayed.  This shows the differences between the model being imported and the previous state of the model, and allows you to select which changes are to be applied.
    Just selecting the Merge button should apply all the additions and changes in the new import.
    Having updated your Relational Model, you can then update your Logical Model.  To do this you repeat the "Engineer to Logical Model".  This displays the "Engineer to Logical Model" dialog, which shows the changes which will be applied to the Logical Model, and allows you to select which changes are to be applied.
    Just selecting the Engineer button should apply all the additions and changes.
    I hope this helps you achieve what you want.
    David

  • How do I import a dump file into a different schema

    I think this is an easy question.
    I just exported a schema for userA to a file using expdp. I now want to import this data for userA to a different schema for userB.
    What parameters do I need in the impdp command?
    thanks,
    jj

    Oh OK thanks.
    Let me extend my question. I see that I can change the schema and I did that but then I realized that I also need to change my tablespace.
    Can this be done in a single command? I got an error when I tried to remap both the schema and tablespace.
    thanks.
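    For what it's worth, REMAP_SCHEMA and REMAP_TABLESPACE can normally be combined in a single impdp command. A minimal sketch, assuming a directory object and dump file (all names are placeholders):
    impdp system DIRECTORY=dp_dir DUMPFILE=userA.dmp REMAP_SCHEMA=userA:userB REMAP_TABLESPACE=ts_old:ts_new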

  • How to restore a single table from a DP Export from a different schema?

    Environment:
    Oracle 11.2.0.3 EE on Solaris
    I was looking at the documentation on DP Import trying to find the correct syntax to import a single table from a DP Export of a different schema.
    So, I want to load table USER1.TABLE1 into USER2.TABLE1 from a DP Export.
    Looking at the REMAP_TABLE options:
    REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename
    OR
    REMAP_TABLE=[schema.]old_tablename[:partition]:new_tablename
    I can't see where to specify the target schema name. The examples had the new table name residing in the same schema with just a new name.
    I looked at the REMAP_SCHEMA but the docs say that will import the entire schema into the new schema and I only want one (1) table.
    Any suggestions are most welcome!
    -gary

    > I thought I tried that combination and it seemed to me that the REMAP_SCHEMA somehow over-rode the TABLES= parameter and started loading all the objects.
    If it does fail (and it should not) then please post the details here and I will try to see what is happening.
    > Let me get back into the sandbox and try it again. I admit I was in a bit of a hurry when I did it the first time.
    We are all in a hurry, no worries. If it fails, please post the details and the log file.
    > Does it make any sense that one parameter would override another?
    No, this should never happen. We have tons of checks to make sure the job can't have multiple meanings. For example, you can't say
    full=y schemas=foo --- which do you want, a full export or a list of schema exports, etc.
    > Your suggestion was the first thing I thought would work.
    This should work. If not, please post the log file with the command and the results.
    Dean
    Thanks again for the help and stay tuned for my new attempt.
    -gary
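    For reference, the combination discussed above (load just one table from USER1's export into USER2) would look roughly like this, assuming a directory object and dump file name:
    impdp system DIRECTORY=dp_dir DUMPFILE=user1_exp.dmp TABLES=USER1.TABLE1 REMAP_SCHEMA=USER1:USER2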

  • Code and core tables in different schemas

    Hi,
    My db version : Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    I would like to understand, the pros and cons of the following situation:
    We have a single application.
    The db design for this application has 44 tables out of which 28 are core tables related by PK and FK relationships. The remaining 16 are look up and reference tables which are not related to any.
    The team decided to place the 28 core tables in one schema A and the remaining 16 in another schema B within the same database. (The reason for this is that it was done this way in other projects, so let's do it here too.)
    Now coming to the code (stored procs, functions, packages etc.): the team wants to place most of the code in schema B, the one that has the 16 reference tables (the reason again being the same).
    What are the pros and cons of doing this?
    Please advise.
    PS:
    I have googled and found something along these lines:
    cons: 
    o harder to manage
    o harder to upgrade
    o harder to patch
    o harder to maintain
    o causes your shared pool size to increase 1,000 times (shared sql goes down the tubes)
    o takes more space
    o queries against the dictionary will be impacted
    o latching on the shared pool goes WAY up (latching = locks = serialization device =
    slows you down)
    pros:
    o none that I can think of.

    >
    I would like to understand, the pros and cons of the following situation:
    Yes I am straining to find more points (was not good at it though).
    >
    You just want to understand? Are you sure? Your thread reads more like you just want to do things your way and are looking for support.
    >
    The team decided to place the 28 core tables in one schema A and the remaining 16 in another schema B within the same database. (The reason for this is that it was done this way in other projects, so let's do it here too.)
    Now coming to the code (stored procs, functions, packages etc.): the team wants to place most of the code in schema B, the one that has the 16 reference tables (the reason again being the same).
    >
    My question to you is: what PROBLEM are you trying to solve? If the 'team' already uses this approach and there haven't been any substantive problems then why try to change things now? Why have you chosen to fight this battle?
    Your 'team' has already decided and now, after that decision, you want to argue about it with them? The time to present arguments for/against a given plan is BEFORE the decisions are made, not after. Once a decision is made you need to be a team player and implement that decision to the best of your ability.
    One thing I'm certain of. If you try to support your argument using things like that 'AskTom' link you posted any credibility you had will go out the window. That link, as you already hinted yourself, is not your use case at all. All it takes is for one of your 'team' members to point that out and everyone will pretty much stop listening to any other arguments you make.
    People are generally not going to 'change their ways' unless you can show them:
    a) there is something seriously wrong with the way they are now doing things or
    b) a new way of doing things provides some substantial benefits
    Choice 'a' above is where you need to start but you haven't provided ANY information in this post that you have identified any serious issues with the status quo.
    The main task for Oracle is to be able to FIND the objects being referenced. So, in my opinion, that is what you should focus on when looking for PRO/CON arguments.
    That is: What issues are there if an object being referenced is in a different schema than the session that needs to use the object?
    1. objects may need to be prefixed with the schema name
    2. public or private synonyms may need to be created/maintained to avoid having to deal with item #1 above
    3. new grants may be needed to implement/maintain the proper security
    4. new roles may need to be created to maintain proper security (see item #3 above)
    5. additional work will be needed to maintain the new roles in item #4 above
    6. PL/SQL code may not be able to reference the object or may reference the wrong object
    7. Roles are disabled in PL/SQL (see item #6 above) - this means that the new grants (see item #3) may need to be granted directly to the schema users that need access instead of to roles. That can make it harder to create and maintain a role-base security schema.
    If I were you I would spend my time on other more important things. But if I chose to fight this particular battle, then I would make a list of problems that occurred in the past with the current method of doing things, and also problems related to the above list of items, and then show how many of those problems would 'disappear' if the new method were used.
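    To illustrate items 1-3 above, a minimal sketch with hypothetical names (core table owned by schema A, code running in schema B):
    GRANT SELECT, INSERT, UPDATE, DELETE ON a.core_table TO b;
    CREATE SYNONYM b.core_table FOR a.core_table;
    -- without the synonym, code in B has to reference the table as a.core_table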

  • Creating dup app in different schema...

    I have found a few notes on exporting/importing applications, but they don't tell me whether the export is a copy or a move. I need to keep both schemas' versions of the same application. My original version is under my personal schema, and I am ready for office-wide testing. How do I copy my application so that I have both versions, but under different schemas? Exactly how do I do that? Is it simply exporting the application and then performing an import?

    Hello,
    >> but they don't tell me whether the export is a copy or move
    Export is a copy. APEX creates a SQL script which includes all the application metadata. Your original application stays intact.
    >> Is it simply exporting the application and then performing an import?
    Basically yes. You export your application on one server and import it on the other. As part of the import process, you can choose the schema in which to install the application.
    However, as you are installing your application on a new server/schema, you also need to copy and install all the supporting objects (at the database level) that your application uses. You can read more about how to deploy a new application in the following - http://download-uk.oracle.com/docs/cd/B32472_01/doc/appdev.300/b32471/deploy.htm#BABGJIAD .
    Regards,
    Arie.

  • Loading the data in different schemas in different environments

    Hi all,
    My source technology is File and Target is Oracle.
    In the Development environment we developed the code and successfully loaded the data into the target Oracle DB using schema A.
    In the UAT environment I have to load a different schema, schema B, in the same Oracle DB.
    I imported the code from Development to UAT and am trying to load the data into schema B.
    I created the physical connection and pointed it to the same logical name which I used for schema A.
    I'm getting an error.
    Could anyone please help with this, and correct me if I'm doing something wrong?
    Thanks in advance 

    Hi,
    can you please paste the error here.
    Thanks

  • Compare DB objects between different schemas

    Hi,
    I want to compare db objects between 2 different schemas, like QA and PROD.
    Please suggest me a tool that compares table definitions,stored procedures and other db objects between different schemas.

    Did you check the video link shared above?
    Yes, I'd agree with you; I have also faced these situations when checking which objects I need to export from Dev to Quality (to analyze what is already correct and what has to be imported again).
    Generally Quality & Prod are in sync.
    I will also be waiting along with you to see if anyone comments on such a tool.
    Regards,
    Krishna Tangudu
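    If the database is Oracle and both schemas are visible from one connection, the dictionary views can give a quick first pass (this only lists objects missing on one side, not definition differences; the owner names are placeholders):
    SELECT object_type, object_name FROM all_objects WHERE owner = 'QA'
    MINUS
    SELECT object_type, object_name FROM all_objects WHERE owner = 'PROD';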

  • Same Application connecting to different schemas

    I currently have a single version of my application that is tied to a workspace/schema. My client is now requesting that the same application connect to additional test and training schemas so they can play (without changing live data).
    What is the best approach for setting this up? Do I create a new workspace for each schema they want and import my application into it? Do I need to change the application ID when I import so the URL is different for each connection?

    Hello:
    You probably want to create a workspace for each schema. I don't think you can re-use application ids in your situation where the workspaces are all in a single APEX installation.
    You could specify an appropriate alias for your application in each workspace and address the application using the alias
    http://apexhost:7777/f?p=TEST
    http://apexhost:7777/f?p=TRAIN
    etc
    Varad

  • Accessing tables from different schema in CDS and AMDP

    Hi All,
    We are working on a HANA system which has several schemas replicated from SAP R/3 and non-SAP systems. We have BW 7.4 SP9 deployed on the same system and access the HANA views using the latest BW virtual objects such as Open ODS views, Composite Providers etc.
    We are also using the BW system for a few ABAP-based data processing developments. We currently access HANA views in ABAP programs by creating dictionary views based on external HANA views.
    We would however like to use the recent possibilities of CDS and AMDP for better lifecycle management of ABAP-based solutions. The openSAP course on this subject was of very good help. Thanks a lot "openSAP team" for that. I however have a few open questions:
    As I understand it, AMDP gives us the full flexibility of writing SQL procedures within the ABAP development environment, but can we access tables from a different schema in AMDP code? If yes, sample code would help.
    If the answer to the first question is yes, how do we manage transports between development and production systems where the schema names are different? Currently, in open HANA developments, such transports are managed using schema mapping.
    Can we also use tables from a different schema in CDS views?
    We are updating a few tables in the ABAP dictionary after applying processing logic in an ABAP program, as detailed in step 1. With the new approach using AMDP, can we directly update database schema tables, which would give us an optimization advantage?
    New ABAP HANA program interfaces are quite promising and we would like to use them to optimize many data intensive applications.
    Thanks & Regards,
    Anil

    Hi Anil,
    I can only answer 1. and 2. (and would be interested into 3. as well):
    1.
    Yes you can access tables from a different schema and also HANA views. In this case no 'using' is needed.
    Examples:
        RESULT = SELECT ...
        FROM
              "SAP_ECC"."T441V" AS t,
              "_SYS_BIC"."tmp.package/AFPO" AS a
        WHERE ... ;
    2. In this case, if you need schema mapping, you could use HANA (projection) views which just forward to a different schema; see the sketch at the end of this reply.
    Best regards,
    Christoph
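    A minimal sketch of such a forwarding (projection) view, with a made-up local schema and view name -- only the view definition has to change when the source schema differs per system:
    CREATE VIEW "MY_SCHEMA"."V_T441V" AS SELECT * FROM "SAP_ECC"."T441V";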

  • Moving Subpartitions to a duplicate table in a different schema.

    +NOTE: I asked this question on the PL/SQL and SQL forum, but have moved it here as I think it's more appropriate to this forum. I've placed a pointer to this post on the original post.+
    Hello Ladies and Gentlemen.
    We're currently involved in an exercise at my workplace where we are in the process of attempting to logically organise our data by global region. For information, our production database is currently at version 10.2.0.3 and will shortly be upgraded to 10.2.0.5.
    At the moment, all our data 'lives' in the same schema. We are in the process of producing a proof of concept to migrate this data to identically structured (and named) tables in separate database schemas; each schema to represent a global region.
    In our current schema, our data is range-partitioned on date, and then list-partitioned on a column named OFFICE. I want to move the OFFICE subpartitions from one schema into an identically named and structured table in a new schema. The tablespace will remain the same for both identically-named tables across both schemas.
    Do any of you have an opinion on the best way to do this? Ideally in the new schema, I'd like to create each new table as an empty table with the appropriate range and list partitions defined. I have been doing some testing in our development environment with the EXCHANGE PARTITION statement, but this requires the destination table to be non-partitioned.
    I just wondered if, for partition migration across schemas with the table name and tablespace remaining constant, there is an official "best practice" method of accomplishing such a subpartition move neatly, quickly and elegantly?
    Any helpful replies welcome.
    Cheers.
    James

    You CAN exchange a subpartition into another table using a "temporary" (staging) table as an intermediary.
    See :
    SQL> drop table part_subpart purge;
    Table dropped.
    SQL> drop table NEW_part_subpart purge;
    Table dropped.
    SQL> drop table STG_part_subpart purge;
    Table dropped.
    SQL>
    SQL> create table part_subpart(col_1  number not null, col_2 varchar2(30))
      2  partition by range (col_1) subpartition by list (col_2)
      3  (
      4  partition p_1 values less than (10) (subpartition p_1_s_1 values ('A'), subpartition p_1_s_2 values ('B'), subpartition p_1_s_3 values ('C'))
      5  ,
      6  partition p_2 values less than (20) (subpartition p_2_s_1 values ('A'), subpartition p_2_s_2 values ('B'), subpartition p_2_s_3 values ('C'))
      7  )
      8  /
    Table created.
    SQL>
    SQL> create index part_subpart_ndx on part_subpart(col_1) local;
    Index created.
    SQL>
    SQL>
    SQL> insert into part_subpart values (1,'A');
    1 row created.
    SQL> insert into part_subpart values (2,'A');
    1 row created.
    SQL> insert into part_subpart values (2,'B');
    1 row created.
    SQL> insert into part_subpart values (2,'B');
    1 row created.
    SQL> insert into part_subpart values (2,'C');
    1 row created.
    SQL> insert into part_subpart values (11,'A');
    1 row created.
    SQL> insert into part_subpart values (11,'C');
    1 row created.
    SQL>
    SQL> commit;
    Commit complete.
    SQL>
    SQL> create table NEW_part_subpart(col_1  number not null, col_2 varchar2(30))
      2  partition by range (col_1) subpartition by list (col_2)
      3  (
      4  partition n_p_1 values less than (10) (subpartition n_p_1_s_1 values ('A'), subpartition n_p_1_s_2 values ('B'), subpartition n_p_1_s_3 values ('C'))
      5  ,
      6  partition n_p_2 values less than (20) (subpartition n_p_2_s_1 values ('A'), subpartition n_p_2_s_2 values ('B'), subpartition n_p_2_s_3 values ('C'))
      7  )
      8  /
    Table created.
    SQL>
    SQL> create table STG_part_subpart(col_1  number not null, col_2 varchar2(30))
      2  /
    Table created.
    SQL>
    SQL> -- ensure that the Staging table is empty
    SQL> truncate table STG_part_subpart;
    Table truncated.
    SQL> -- exchanging a subpart out of part_subpart
    SQL> alter table part_subpart exchange subpartition
      2  p_2_s_1 with table STG_part_subpart;
    Table altered.
    SQL> -- exchanging the subpart into NEW_part_subpart
    SQL> alter table NEW_part_subpart exchange subpartition
      2  n_p_2_s_1 with table STG_part_subpart;
    Table altered.
    SQL>
    SQL>
    SQL> select * from NEW_part_subpart subpartition (n_p_2_s_1);
         COL_1 COL_2
            11 A
    SQL>
    SQL> select * from part_subpart subpartition (p_2_s_1);
    no rows selected
    SQL>
    I have exchanged subpartition p_2_s_1 out of the table part_subpart into the table NEW_part_subpart -- even with a different name for the subpartition (n_p_2_s_1) if so desired.
    NOTE : Since your source and target tables are in different schemas, you will have to move (or copy) the staging table STG_part_subpart from the first schema to the second schema after the first "exchange subpartition" is done. You will have to do this for every subpartition to be exchanged.
    Hemant K Chitale
    Edited by: Hemant K Chitale on Apr 4, 2011 10:19 AM
    Added clarification for cross-schema exchange.
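    A hedged sketch of that cross-schema hop for one subpartition, reusing the names from the example above and assuming schema1 owns the source and schema2 owns the target (plus the necessary grants):
    -- copy the exchanged-out staging rows into a staging table owned by the target schema
    CREATE TABLE schema2.stg_part_subpart AS SELECT * FROM schema1.stg_part_subpart;
    -- then exchange that copy into the target table's subpartition
    ALTER TABLE schema2.new_part_subpart EXCHANGE SUBPARTITION n_p_2_s_1 WITH TABLE schema2.stg_part_subpart;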

  • MapViewer metadata problem - accessing spatial data in a different schema.

    I have a MapViewer application that uses data from three different schemas.
    1. Dynamic Themes come from schema A.
    2. Static Themes come from schema B.
    3. A newly added static theme in B whose data comes from schema C.
    The mapviewer datasource points to schema B where the static themes, data and metadata are defined while the dynamic themes have their own datasource specified as part of addJDBCTheme(...).
    To get the newly added map to work I've had to add a view in schema B that points to C, instead of referencing the table directly, and I've had to add the metadata twice, once for schema B and once for schema C.
    If I put the metadata in just one of the two schemas I get the following errors.
    08/11/21 13:58:57 ERROR [oracle.sdovis.ThemeTable] cannot find entry in ALL_SDO_GEOM_METADATA table for theme: AMBITOS_REST
    08/11/21 13:58:57 ERROR [oracle.sdovis.ThemeTable] java.sql.SQLException: Invalid column index
    OR
    08/11/21 13:53:39 ERROR [oracle.sdovis.theme.pgtp] java.sql.SQLException: ORA-29902: error in executing ODCIIndexStart() routine
    ORA-13203: failed to read USER_SDO_GEOM_METADATA view
    It's not a big deal but I'd like to know if anyone else has had similar problems.
    Saludos,
    Lew.
    Edited by: Lew2 on Nov 21, 2008 6:42 AM

    Hi Lew,
    if you are using a recent version (10.1.3.1 or later) there is no need to use a view or to create the metadata in both schemas.
    You need to grant SELECT on the tables between the schemas.
    You can try the following. Assume you have the MVDEMO schema (from MapViewer kit) and SCOTT schema.
    1) grant select on MVDEMO Counties table to SCOTT
    SQL> grant select on counties to scott;
    2) Now you are ready to create a predefined theme in schema SCOTT using the MVDEMO Counties table.
    - Open MapBuilder and load the SCOTT schema.
    - On the Data navigator (bottom left tree), go to Geometry tables and you should see the MVDEMO node and the COUNTIES node inside it.
    - Start a wizard to create a geometry theme based on this Counties table.
    - At the end you should see that the base table name is MVDEMO.COUNTIES. Therefore MapViewer will use the metadata in MVDEMO schema and there is no need to replicate it in SCOTT schema.
    Joao

  • HTML DB application using table in a different schema

    I need to create a new HTML DB application. The table is in a different schema than mine. The DBA granted me rights to see it and I can see it from iSQL. HTML DB does not give me the option of selecting this table.

    This thread covers your issue.
    New to HTML DB, have questions about service admin and schemas
    Ask your DBA to add workspace to schema mapping.
    Sergio
