Migrate test EUL to production

Hello
I am new to the whole Discoverer thing… and have been named the Discoverer expert (lol)…
Can someone send me a link or a PDF on migrating EULs from a test environment to production?
I assume this can be done. An EUL can be moved from a test server with test tables to a production server and have its tables point at the real deal… would this affect any worksheets that were made in the test environment?
Thanks,
Rob

Do I use the EUL Export Wizard?
If so, when I import it into a new database, is it easy to change from the test tables to the prod tables?

Similar Messages

  • Issue after migration of test environment to production

    Hi guys,
After I migrate the test environment to the production environment I am getting the following errors:
    java.sql.SQLException: ORA-00001: unique constraint (SNPPW.PK_PACKAGE) violated
    java.sql.SQLException: ORA-00001: unique constraint (SNPPW.PK_SNP_GRP_STATE) violated
    java.sql.SQLException: ORA-00001: unique constraint (SNPPW.PK_FOLDER) violated
    java.sql.SQLException: ORA-00001: unique constraint (SNPPW.PK_TRT) violated
    The version of ODI used is ODI 10.1.3.5
    Any help will be appreciated.
    Thanks in advance.

    Hi,
    How exactly are you doing the migration?
    Regards,
    K

  • How do I copy whole farm in 2013 Test to 2013 Production or is this more trouble than worth?

    Hi,
We are migrating from a SharePoint 2010 farm to a TEST SharePoint 2013 farm, as Microsoft recommends. The test farm is almost complete. The Host URL Site Collections under the main Web Application all use a SharePoint UAT prefix-style name, i.e. SUAT<unique name>. I want to be able to move everything from the TEST farm to Production in one fell swoop, i.e. Service Applications. How can I do this - farm backup? Or should I update the Service Applications and restore my Host URL Site Collections without the prefix using a PowerShell script I have, or is this more trouble than it's worth?
Thanks.
John.
    Thanks.
    John.

    Hi John,
Step 1 is straightforward; you are upgrading content from the 2010 to the 2013 version. I agree with the recommendation to move to a testing environment as a best practice. However, setting up your pre-production environment and moving the content there will save effort.
Moving content and containers from 2013 test to 2013 production is a matter of copying and attaching databases. The trick is the farm-level configurations, as they cannot be easily "copied." There are some steps that save effort - for instance, the Term Store can be exported (https://gallery.technet.microsoft.com/office/PowerShell-for-SharePoint-a838b5d0). However, you really need to plan for building two sets of service applications.
Also, make sure you plan for deploying any Apps or Features between test and production.
There are 3rd-party tools to simplify this; my firm has a tool, DocAve Deployment Manager. But before taking the plunge on any of this additional effort, be sure that using a pre-production farm is not best for you. This is especially true if it will be offline until after the full migration.
    Hope this helps,
    Al
    Al Lombardi, MCSE, CSM
    Premier Field Engineer – AvePoint, Inc.
    www.AvePoint.com
    View my LinkedIn Profile

  • Problem exporting workbooks from development EUL to test EUL

I have created some workbooks in a Development EUL. We want to promote them to a test EUL for users to test on. I shared my workbooks both to my boss's user name and to the responsibility that he uses when he does Discoverer Admin work. In development, my boss is able to see these workbooks; he can open them and run them. So my boss did one export out of development and into test for the business area. That went just fine, and he can see the business area and its folders in the test EUL. Next he did a new export out of development, picking my workbooks. As the workbook owner, I do NOT exist at this time in the test EUL. When my boss did the import of that workbook .eex file into the test EUL, we got a security warning message on each workbook. Since I did not exist, on the import it should not matter which option you pick - either make the admin user the owner outright, or make him the owner only if the owner does not exist. My boss is not able to see the workbooks we imported. The main Discoverer administrator says the workbooks did get imported, so we are puzzled as to why he cannot see them. Has anyone run into a situation like this before? Any ideas on what might be happening to us? Is it possible that we have run across a programming error in Discoverer Admin? Thanks for any time and assistance on this.

Well, what we ended up doing is that in the development EUL, I shared the workbooks to the responsibility that my boss uses. My boss then exported the workbooks out of development and imported them into the test EUL. In the import wizard he specified to rename the existing objects. The import log shows the objects being renamed and then a successful import completion. My boss can now see the workbooks in the test EUL. So while he could see and run them in development when shared to his user name, it seems as if, when importing, the responsibility somehow takes over and is required. We are waiting to see if the test user can see the workbooks (currently shared to user name), or if we will need to have the test user give us a responsibility to share to. I am beginning to wonder why Oracle even gives us the option to share to a user name if it won't always work.
    John Dickey

  • How to Synchronize Test Instance with Production database on regular basis.

Hello,
How can I synchronize a Test Instance with the Production database on a regular basis?
I want to update my Test Instance from the Production ERP on a daily basis without downtime.

    Hi,
I recommend configuring a physical standby database for your production system. Whenever you require it, open the standby in read/write mode using Flashback Database, so that you can revert all the changes back and put the DB back in sync. Please follow the Metalink notes below.
    Business Continuity for Oracle E-Business Release 12 Using Oracle 11g Physical Standby Database (Doc ID 1070033.1)
    Business Continuity for Oracle E-Business Release 11i Using Oracle 11g Physical Standby Database - Single Instance and Oracle RAC (Doc ID 1068913.1)
    Business Continuity for Oracle Applications Release 11i, Database Releases 9i and 10g (Doc ID 216212.1)
Regarding updating the Test Instance from the Production ERP on a daily basis without downtime: with zero downtime it is not possible, I think.
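The open-for-testing/flashback cycle those notes describe can be sketched as follows. This is an illustrative outline only: the restore point name is a placeholder, and the exact commands vary by database version, so verify against the Metalink notes above before using it.

```sql
-- Sketch of the flashback cycle, run on the standby.
-- Before opening the standby for testing:
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
CREATE RESTORE POINT before_testing GUARANTEE FLASHBACK DATABASE;
ALTER DATABASE ACTIVATE STANDBY DATABASE;  -- standby becomes read/write

-- ... testing happens here ...

-- To discard the test changes and put the standby back in sync:
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
FLASHBACK DATABASE TO RESTORE POINT before_testing;
ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT;
```

On 11g, ALTER DATABASE CONVERT TO SNAPSHOT STANDBY (and later CONVERT TO PHYSICAL STANDBY) wraps essentially this same cycle into a single command pair.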

  • (part 2 of 2): 3.x Dataflow migration test. Any hints on this problem?

    Hi Experts
    You may see this link for part 1 of this question for the full details if the continuation question below is not clear :
    (part 1 of 2): 3.x Dataflow migration test. Any hints on this problem?
The eventual goal here is to migrate the 3.x dataflow so as to use the transformations, etc.
4:
Instruction: After successful activation of the transfer rule, right-click on the Data Source and choose Migrate.
i.e. I am still under the InfoSource tree; I expanded the transfer rule to see the Data Source under it…
Right-click & select Migrate, WITH EXPORT
--finally got "DataSource 0PLANT_ATTR(E75CLNT800) was migrated"
5:
Instruction: To convert Update Rules to a Transformation
The instructions state "… Now go to InfoProvider and search for the 3.x old Data Source. Right click on Update Rules and say Create Transformation. …"
My issue: Under the InfoProvider tree, the original datasource 0Plant_Attr could not be found.
I then performed a FIND and checked the box for DataSources 3.x. It gave no datasource from the source system (E75CLNT800) I am dealing with.
BUT if I ignore the instruction and do a FIND for DataSources (without the 3.x), I find the right 0Plant_Attr for my source system.
The problem is that it still did not show the Update Rule under it…
--I then went to the Datasource tree, Logistics, … followed the path to the 0Plant_Attr datasource and identified the Update Rule,
but it does not have an option for "Additional Functions" to "Create Transformation".
I am therefore stuck on page 9 of the above link providing me the instructions. Any hints?
6.
I searched under the InfoProvider tree and found 0Plant, but there were multiple Transfer Rules and none of them showed for the E75CLNT800 I was interested in.
I saw Transfer Rules and Update Rules for other source systems on the IDES, e.g. E69CLNT190, with the datasource etc. What happened to mine?

    Hi,
Great - I followed the comments on the board and restored the datasource to a 3.x datasource. Then, contrary to the instructions in the link I was following, I migrated the Transfer Rules and Update Rules before the datasource (0PLANT_ATTR) migration.
I am happy everything is fine, but I encountered some issues which I would like your comments on:
1.
I have access to the ECC E75CLNT800 and I saw that some developers were also loading to the same 0Plant attribute with a datasource, e.g. 0WS_TAROBJ_ATTR, from other systems, e.g. E69CLNT190.
If I were to test this second option as a BI Consultant, do I need the involvement of Basis to establish a connection from BI to E69CLNT190, or am I as a BI consultant expected to do that myself?
2.
During the migration of the Transfer Rule it failed to activate with the error:
"Rule (target: 0GN_R3_SSY, group: 01 Standard Group): Syntax error in routine"
Later, I chose "Delete Rule" for 0GN_R3_SSY (Source Systems for R3 Entity), before I was able to activate.
What is this about, and what is the implication of deleting this rule? There was nothing I could map it to, and I was not sure why the rule was "Routine", so I changed it to "No Transformation".
3.
How come that under the InfoProvider tree, in the data flow for 0Plant_Attr, I don't have the DTP I created under the Datasource tree? Shouldn't it appear in both places?
--I executed the DTP created under the Datasource tree and data went from PSA to the InfoObject (0Plant) all right.
Thanks.
    Edited by: AmandaBaah on Aug 11, 2011 5:43 PM

  • A SQL tuning issue-sql runs much slower in test than in production?

    Hi Buddies,
I am working on a SQL tuning issue: a SQL statement runs much slower in test than in production.
I compared the two explain plans in test and production;
it seems that in test, the CBO refuses to use the index SUBLEDGER_ENTRY_I2.
We rebuilt it and re-gathered that index's statistics, then ran again - still slow.
I compared init.ora parameters like hash_area_size and sort_area_size in test; they are the same as in production.
I wonder if any expert friend can shed some light.
In production:
    SQL> set autotrace traceonly
SQL> SELECT rpt_horizon_subledger_entry_vw.onst_offst_cd,
       rpt_horizon_subledger_entry_vw.bkng_prd,
       rpt_horizon_subledger_entry_vw.systm_afflt_cd,
       rpt_horizon_subledger_entry_vw.jrnl_id,
       rpt_horizon_subledger_entry_vw.ntrl_accnt_cd,
       rpt_horizon_subledger_entry_vw.gnrl_ldgr_chrt_of_accnt_nm,
       rpt_horizon_subledger_entry_vw.lgl_entty_brnch_cd,
       rpt_horizon_subledger_entry_vw.crprt_melob_cd AS corp_mlb_cd,
       rpt_horizon_subledger_entry_vw.onst_offst_cd, SUM (amt) AS amount
     FROM rpt_horizon_subledger_entry_vw
     WHERE rpt_horizon_subledger_entry_vw.bkng_prd = '092008'
       AND rpt_horizon_subledger_entry_vw.jrnl_id = 'RCS0002100'
       AND rpt_horizon_subledger_entry_vw.systm_afflt_cd = 'SAFF01'
     GROUP BY rpt_horizon_subledger_entry_vw.onst_offst_cd,
       rpt_horizon_subledger_entry_vw.bkng_prd,
       rpt_horizon_subledger_entry_vw.systm_afflt_cd,
       rpt_horizon_subledger_entry_vw.jrnl_id,
       rpt_horizon_subledger_entry_vw.ntrl_accnt_cd,
       rpt_horizon_subledger_entry_vw.gnrl_ldgr_chrt_of_accnt_nm,
       rpt_horizon_subledger_entry_vw.lgl_entty_brnch_cd,
       rpt_horizon_subledger_entry_vw.crprt_melob_cd,
       rpt_horizon_subledger_entry_vw.onst_offst_cd;
491 rows selected.
    491 rows selected.
    Execution Plan
0      SELECT STATEMENT Optimizer=CHOOSE (Cost=130605 Card=218764 Bytes=16407300)
1  0     SORT (GROUP BY) (Cost=130605 Card=218764 Bytes=16407300)
2  1       VIEW OF 'RPT_HORIZON_SUBLEDGER_ENTRY_VW' (Cost=129217 Card=218764 Bytes=16407300)
3  2         SORT (UNIQUE) (Cost=129217 Card=218764 Bytes=35877296)
4  3           UNION-ALL
5  4             HASH JOIN (Cost=61901 Card=109382 Bytes=17719884)
6  5               TABLE ACCESS (FULL) OF 'GNRL_LDGR_CHRT_OF_ACCNT' (Cost=2 Card=111 Bytes=3774)
7  5               HASH JOIN (Cost=61897 Card=109382 Bytes=14000896)
8  7                 TABLE ACCESS (FULL) OF 'SUBLEDGER_CHART_OF_ACCOUNT' (Cost=2 Card=57 Bytes=1881)
9  7                 HASH JOIN (Cost=61893 Card=109382 Bytes=10391290)
10  9                  TABLE ACCESS (FULL) OF 'HORIZON_LINE' (Cost=34 Card=4282 Bytes=132742)
11  9                  HASH JOIN (Cost=61833 Card=109390 Bytes=7000960)
12 11                    TABLE ACCESS (BY INDEX ROWID) OF 'SUBLEDGER_ENTRY' (Cost=42958 Card=82076 Bytes=3611344)
13 12                      INDEX (RANGE SCAN) OF 'SUBLEDGER_ENTRY_I2' (NON-UNIQUE) (Cost=1069 Card=328303)
14 11                    TABLE ACCESS (FULL) OF 'HORIZON_SUBLEDGER_LINK' (Cost=14314 Card=9235474 Bytes=184709480)
15  4            HASH JOIN (Cost=61907 Card=109382 Bytes=18157412)
16 15              TABLE ACCESS (FULL) OF 'GNRL_LDGR_CHRT_OF_ACCNT' (Cost=2 Card=111 Bytes=3774)
17 15              HASH JOIN (Cost=61903 Card=109382 Bytes=14438424)
18 17                TABLE ACCESS (FULL) OF 'SUBLEDGER_CHART_OF_ACCOUNT' (Cost=2 Card=57 Bytes=1881)
19 17                HASH JOIN (Cost=61899 Card=109382 Bytes=10828818)
20 19                  TABLE ACCESS (FULL) OF 'HORIZON_LINE' (Cost=34 Card=4282 Bytes=132742)
21 19                  HASH JOIN (Cost=61838 Card=109390 Bytes=7438520)
22 21                    TABLE ACCESS (BY INDEX ROWID) OF 'SUBLEDGER_ENTRY' (Cost=42958 Card=82076 Bytes=3939648)
23 22                      INDEX (RANGE SCAN) OF 'SUBLEDGER_ENTRY_I2' (NON-UNIQUE) (Cost=1069 Card=328303)
24 21                    TABLE ACCESS (FULL) OF 'HORIZON_SUBLEDGER_LINK' (Cost=14314 Card=9235474 Bytes=184709480)
    Statistics
    25 recursive calls
    18 db block gets
    343266 consistent gets
    370353 physical reads
    0 redo size
    15051 bytes sent via SQL*Net to client
    1007 bytes received via SQL*Net from client
    34 SQL*Net roundtrips to/from client
    1 sorts (memory)
    1 sorts (disk)
    491 rows processed
In test:
    SQL> set autotrace traceonly
SQL> SELECT rpt_horizon_subledger_entry_vw.onst_offst_cd,
       rpt_horizon_subledger_entry_vw.bkng_prd,
       rpt_horizon_subledger_entry_vw.systm_afflt_cd,
       rpt_horizon_subledger_entry_vw.jrnl_id,
       rpt_horizon_subledger_entry_vw.ntrl_accnt_cd,
       rpt_horizon_subledger_entry_vw.gnrl_ldgr_chrt_of_accnt_nm,
       rpt_horizon_subledger_entry_vw.lgl_entty_brnch_cd,
       rpt_horizon_subledger_entry_vw.crprt_melob_cd AS corp_mlb_cd,
       rpt_horizon_subledger_entry_vw.onst_offst_cd, SUM (amt) AS amount
     FROM rpt_horizon_subledger_entry_vw
     WHERE rpt_horizon_subledger_entry_vw.bkng_prd = '092008'
       AND rpt_horizon_subledger_entry_vw.jrnl_id = 'RCS0002100'
       AND rpt_horizon_subledger_entry_vw.systm_afflt_cd = 'SAFF01'
     GROUP BY rpt_horizon_subledger_entry_vw.onst_offst_cd,
       rpt_horizon_subledger_entry_vw.bkng_prd,
       rpt_horizon_subledger_entry_vw.systm_afflt_cd,
       rpt_horizon_subledger_entry_vw.jrnl_id,
       rpt_horizon_subledger_entry_vw.ntrl_accnt_cd,
       rpt_horizon_subledger_entry_vw.gnrl_ldgr_chrt_of_accnt_nm,
       rpt_horizon_subledger_entry_vw.lgl_entty_brnch_cd,
       rpt_horizon_subledger_entry_vw.crprt_melob_cd,
       rpt_horizon_subledger_entry_vw.onst_offst_cd;
    no rows selected
    Execution Plan
0      SELECT STATEMENT Optimizer=CHOOSE (Cost=92944 Card=708 Bytes=53100)
1  0     SORT (GROUP BY) (Cost=92944 Card=708 Bytes=53100)
2  1       VIEW OF 'RPT_HORIZON_SUBLEDGER_ENTRY_VW' (Cost=92937 Card=708 Bytes=53100)
3  2         SORT (UNIQUE) (Cost=92937 Card=708 Bytes=124962)
4  3           UNION-ALL
5  4             HASH JOIN (Cost=46456 Card=354 Bytes=60180)
6  5               TABLE ACCESS (FULL) OF 'SUBLEDGER_CHART_OF_ACCOUNT' (Cost=2 Card=57 Bytes=1881)
7  5               NESTED LOOPS (Cost=46453 Card=354 Bytes=48498)
8  7                 HASH JOIN (Cost=11065 Card=17694 Bytes=1362438)
9  8                   HASH JOIN (Cost=27 Card=87 Bytes=5133)
10  9                    TABLE ACCESS (FULL) OF 'HORIZON_LINE' (Cost=24 Card=87 Bytes=2175)
11  9                    TABLE ACCESS (FULL) OF 'GNRL_LDGR_CHRT_OF_ACCNT' (Cost=2 Card=111 Bytes=3774)
12  8                   TABLE ACCESS (FULL) OF 'HORIZON_SUBLEDGER_LINK' (Cost=11037 Card=142561 Bytes=2566098)
13  7                 TABLE ACCESS (BY INDEX ROWID) OF 'SUBLEDGER_ENTRY' (Cost=2 Card=1 Bytes=60)
14 13                   INDEX (UNIQUE SCAN) OF 'SUBLEDGER_ENTRY_PK' (UNIQUE) (Cost=1 Card=1)
15  4            HASH JOIN (Cost=46456 Card=354 Bytes=64782)
16 15              TABLE ACCESS (FULL) OF 'SUBLEDGER_CHART_OF_ACCOUNT' (Cost=2 Card=57 Bytes=1881)
17 15              NESTED LOOPS (Cost=46453 Card=354 Bytes=53100)
18 17                HASH JOIN (Cost=11065 Card=17694 Bytes=1362438)
19 18                  HASH JOIN (Cost=27 Card=87 Bytes=5133)
20 19                    TABLE ACCESS (FULL) OF 'HORIZON_LINE' (Cost=24 Card=87 Bytes=2175)
21 19                    TABLE ACCESS (FULL) OF 'GNRL_LDGR_CHRT_OF_ACCNT' (Cost=2 Card=111 Bytes=3774)
22 18                  TABLE ACCESS (FULL) OF 'HORIZON_SUBLEDGER_LINK' (Cost=11037 Card=142561 Bytes=2566098)
23 17                TABLE ACCESS (BY INDEX ROWID) OF 'SUBLEDGER_ENTRY' (Cost=2 Card=1 Bytes=73)
24 23                  INDEX (UNIQUE SCAN) OF 'SUBLEDGER_ENTRY_PK' (UNIQUE) (Cost=1 Card=1)
    Statistics
    1134 recursive calls
    0 db block gets
    38903505 consistent gets
    598254 physical reads
    60 redo size
    901 bytes sent via SQL*Net to client
    461 bytes received via SQL*Net from client
    1 SQL*Net roundtrips to/from client
    34 sorts (memory)
    0 sorts (disk)
    0 rows processed
    Thanks a lot in advance
    Jerry

    Hi
    Basically there are two kinds of tables
    - fact
    - lookup
    The number of records in a lookup table is usually small.
    The number of records in a fact table is usually huge.
    However, in test systems the number of records in a fact table is often also small.
    This results in different execution plans.
I notice again that you don't post version and platform info, and you didn't make sure your explain plan was properly indented.
Please read the FAQ to make sure it is properly indented.
Also, using the word 'buddies' is, as far as I am concerned, nearing disrespect and rudeness.
    Sybrand Bakker
    Senior Oracle DBA
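One way to produce a plan that stays readable when posted is DBMS_XPLAN. The sketch below uses the view from this thread (with the select list abbreviated); run it the same way on test and production so the two plans are directly comparable.

```sql
-- Generate a formatted plan for the problem query.
-- Only a subset of the original select list is shown.
EXPLAIN PLAN FOR
SELECT v.onst_offst_cd, v.bkng_prd, v.systm_afflt_cd, v.jrnl_id,
       SUM(v.amt) AS amount
FROM   rpt_horizon_subledger_entry_vw v
WHERE  v.bkng_prd       = '092008'
AND    v.jrnl_id        = 'RCS0002100'
AND    v.systm_afflt_cd = 'SAFF01'
GROUP BY v.onst_offst_cd, v.bkng_prd, v.systm_afflt_cd, v.jrnl_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Unlike hand-copied autotrace output, DBMS_XPLAN.DISPLAY emits a fixed-width table that survives copy-and-paste with its indentation intact.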

  • How to handle refreshing TEST schema with PRODUCTION schema ?

    - we have database 10g ( standard edition), database name : ABC
    - Schema name: ABC.PRODUCTION (which is our production schema)
    - Schema name: ABC.TEST (which is our testing schema, where developers work)
    Both the production & Test schemas exist in the same database.
    Now once a week I wanted to refresh TEST schema with PRODUCTION data
    Here is what I have been doing all these years:
    => Take a logical backup (EXPDP) of PRODUCTION schema (prod.dmp)
    => Drop user TEST cascade ( i don't need a backup of this TEST schema)
    => Create user TEST
    => Import PROD.DMP data into TEST schema
    All the above 4 steps are being done manually.
    Questions:
    ======
1. Is there any easier way of doing the above steps using some tool?
2. Does Oracle Enterprise Manager, which comes free with the database installation (http://localhost:118/em), have any utility or tool to do this job?
3. I want everything to be refreshed (all database objects, including data).
    Thanks
    John P
    Edited by: johnpau2013 on Feb 23, 2011 4:32 AM

This is crazy. One inadvertent typo and you'll overwrite your Production schema. Plus, what happens if a developer 'tests' against the test schema and slows the Production database to a crawl?
I presume you know all about this, though, and can't make the case to management. I hope it's not a business-critical Production database!
Anyway, your method is decent. I would advise against doing it automatically, to be honest, especially when your system is so precariously set up. But if you insist, you could encapsulate all the steps into a script and use crontab to automate the process. Personally, I wouldn't use DBMS_SCHEDULER, as you have to be careful with priorities and workload sometimes (at least in my experience), and you might end up having your export/import clash with other jobs in the system if you don't pay attention.
    Here are the steps I would use:
    Create a 'create user' script for the test schema based on dynamic SQL. That way you can be sure you have all the grants necessary for the user, in case things change.
    Drop the test user (use EXTRA caution and be defensive when coding this part!)
    Export the schema using FLASHBACK_SCN to ensure you have a consistent export
    Run your 'create user' script to create the test user
    Import the schema with a REMAP_SCHEMA option (use EXTREME caution with this!!!!)
    Compile invalid objects
    Compare objects and exclude any recycle_bin objects. Send an email alert if the object counts are different.
    Compare invalid objects. Any objects which aren't invalid in Production should be flagged up if they're invalid in test.
    Again, it's absolute insanity to have a test schema in a Production database. You absolutely must insist on addressing that with management.
    Mark
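The export and import steps above could be sketched as two Data Pump parameter files. This is illustrative only: the directory object, dump file names, and SCN are placeholder assumptions (take the real SCN from SELECT current_scn FROM v$database just before the export).

```text
# export.par -- Data Pump export of the PRODUCTION schema
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=prod.dmp
LOGFILE=prod_exp.log
SCHEMAS=PRODUCTION
# Placeholder: replace with the result of SELECT current_scn FROM v$database;
FLASHBACK_SCN=1234567

# import.par -- Data Pump import into the freshly recreated TEST schema
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=prod.dmp
LOGFILE=test_imp.log
REMAP_SCHEMA=PRODUCTION:TEST
```

In practice the two parameter sets live in separate files and would be invoked along the lines of expdp system@ABC PARFILE=export.par and impdp system@ABC PARFILE=import.par, between the drop-user and recompile steps of the script.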

  • Need to refresh Test environment from Production

    Hi All,
    I need to refresh our Test environment from Production.
    Can anyone help me with the list of special tools from different vendors to do it?
    Any kind of help is appreciated.

    Hi Manoj,
    Thanks for your response. I am aware of the system copy.
Can you help me with some tools from different vendors that can be an alternative to a system copy?
    we have categorized the task into 3 categories:
    Database restore
    Client Copy
    Special tools
    The third one includes tools from different vendors (for example: TDMS from SAP itself).

  • Data migration test: Deletion of POs

    Hello gurus,
    we would like to do a migration test for open orders.
    Is there a better way to delete the imported POs afterwards than SE16N (e.g. a report)? We would like to do several tests and delete the imported data after each one.
    Thanks
    Alicia

    http://sap-img.com/materials/how-to-use-me98-to-delete-po-from-system-completely.htm

  • Migrating SOA configuration to production

I have developed VC apps and consumed a WS in the model. I am getting ready to migrate to production. However, I have a question regarding QA and production. Once I migrate the VC app to QA and then to production, what do I have to do in the ECC system where my service resides? Is there any SID-specific information in the VC model when I transport it, or is this all transparent?
    Has anyone migrated to QA or production and what has their experience been?
    Thanks
    Mikie

You didn't give much information about the services you are consuming. Are they "out of the box" or did you develop them using Java or ABAP? For ABAP services you developed yourself, you transport the Function Module and something called a Virtual End Point. However, you still need to run transaction SOAMANAGER in QA and PRD to create the rest of the service. This is the process for NetWeaver 7.1, and for 7.0 after (I believe) SP 15.

  • Segregation of testing and the production environments- Oracle EBS implementation project

I am currently auditing an Oracle EBS implementation and need to test the segregation of the testing and production environments. How can I approach that?

In Oracle EBS environments, TEST and PRODUCTION are different systems altogether - you could say two entities that are identical in features/data but physically different. So you may have to test the points below:
1.) DB-level passwords (different in both instances) follow the standard corporate policies.
2.) Unwanted application users are end-dated in TEST instances.
3.) Data is masked appropriately in TEST instances.
    regards
    PRavin
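As an illustrative starting point for point 2, a query along these lines against the standard EBS FND_USER table could flag application accounts in the TEST instance that are not end-dated. Treat it as a sketch to adapt, not a complete audit procedure.

```sql
-- Application users in this instance that are still active
-- (no end date, or an end date in the future).
SELECT user_name, end_date
FROM   fnd_user
WHERE  end_date IS NULL
   OR  end_date > SYSDATE
ORDER  BY user_name;
```

Running the same query against TEST and PRODUCTION and diffing the results makes it easy to spot accounts that should have been end-dated in TEST.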

  • How to migrate custom user attributes (UDF) from test environment to production when a sandbox is published

    Hi all,
I would like to migrate custom attributes from my test environment to my production environment. I read the OIM documentation and tried to follow the steps, but I cannot export a sandbox and import it, because all sandboxes are published in the test environment.
I exported and imported the user metadata by Deployment Manager only. Now all migrated attributes are in the OIM DB, but I do not see these attributes in the Administration Console.
How can I solve this issue? Is it possible to export a published sandbox and import it into the other environment?
    Thank you.
    Milan

In OIM 11g R2 you need to export the sandbox containing your custom user fields from the DEV or TEST environment before publishing it.
Then import the exported sandbox in the other environment.
If you didn't export the custom user fields sandbox from your TEST or DEV, then you need to create those fields again in the other environment manually. There is no other option for this in OIM 11g R2.

  • Migrating from Test Database to Production Database

    Hi,
    I have a situation where I created my Portal Pages on test database and now, I want to move everything from test database to my production database. The portal version on both the sites is same (3.0.9.8). I have content areas and applications in my existing test site. What is the best and easiest way to achieve this?
    Thanks in advance.
    Regards,
    Jatinder

    Hi,
the easiest way is to make an export of the portal schema and import it into your production database.
    greetings

  • Create test env from production system

    Hi,
Apologies in advance -
I am new to this area of Oracle Applications 11.5.10 and I need help with a couple of questions.
    We have production system with Oracle Applications 11.5.10, APPS tier on 32bit Red Hat Linux EE 4 server, and 10.2.0.3 DB tier on 64bit Red Hat Linux EE 4 server.
    We get the 2 new servers for test Oracle Applications 11.5.10 environment with same versions of OS like production.
    First test 32bit for APPS,
    Second test 64bit for DB tier.
    Now we need to create test environment, same like production.
    Questions>
1) What is the best way to create the same env in this situation (32bit app tier, 64bit database tier)? I suppose that cloning using the Rapid Clone procedure isn't possible on these clean machines.
    2) If I need to install Oracle Applications 11.5.10 software first on both machines, is it possible to use 32bit rapidwiz installer (existing in stage) to create test db tier on this 64bit node, or not.
    I have read in the document Oracle Applications 11.5.10 - Installation Update Notes for Linux x86
    You can only install Oracle Applications on an x86-64 architecture server if the operating system is 32-bit Linux or Windows. If your operating system is 64-bit, contact your operating system vendor to obtain a 32-bit operating system before installing Oracle Applications.
3) How to know from which stage production was created? When I try to create the stage for my test environment using the perl command perl /mnt/cdrom/Disk1/rapidwiz/adautostg.pl
    I get these options:
    1 - to choose Oracle Applications
    2 - to choose Oracle Applications with NLS
    3 - to choose Oracle Database technology stack (RDBMS)
    4 - to choose Oracle Applications database (Databases)
    5 - to choose Oracle Applications technology stack (Tools)
    6 - to choose APPL_TOP
    7 - to choose National Language Support (NLS) Languages
Because I haven't seen the directory oraNLS, is option 1 a good choice for the stage?
    Thanks, and sorry because I am new in this area.
    Regards
    Edited by: user12009428 on Sep 30, 2010 12:12 PM

    Hi,
1) What is the best way to create the same env in this situation (32bit app tier, 64bit database tier)?
Use Rapid Clone.
Rapid Clone Documentation Resources, Release 11i and 12 [ID 799735.1]
FAQ: Cloning Oracle Applications Release 11i [ID 216664.1]
2) Is it possible to use the 32bit rapidwiz installer (existing in the stage) to create the test db tier on the 64bit node?
Type "linux32 bash" -- see this thread for details:
How to install 11i on Red Hat Linux 64 bit
Re: How to install 11i on Red Hat Linux 64 bit
Regarding the note that you can only install Oracle Applications on an x86-64 server if the operating system is 32-bit Linux or Windows: what is the database version?
To migrate the database from 32-bit to 64-bit you need to follow the steps in these docs (depending on your version):
Using Oracle Applications with a Split Configuration Database Tier on Oracle 9i Release 2 [ID 304489.1]
Using Oracle Applications with a Split Configuration Database Tier on Oracle 10g Release 2 [ID 369693.1]
Using Oracle EBS with a Split Configuration Database Tier on 11gR2 [ID 946413.1]
3) Since you haven't seen the directory oraNLS, is option 1 a good choice for the stage?
oraNLS is only required when you want to install additional languages in addition to the base English one. If you have no installed languages, you can skip it.
Please run md5sum as per MD5 Checksums for 11i10.2 Rapid Install Media [ID 316843.1] to verify the integrity of the stage area directory before you run Rapid Install.
    Thanks,
    Hussein
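The checksum verification Hussein mentions can be scripted. The sketch below is generic: the manifest is assumed to hold the standard "hash  filename" lines that md5sum -c expects, which may not match the exact layout published in note 316843.1.

```shell
#!/bin/sh
# Re-check every file in a stage directory against a checksum manifest.
# The manifest holds standard "md5hash  filename" lines, as produced by md5sum.
verify_stage() {
    dir=$1
    manifest=$2
    # --quiet suppresses the per-file "OK" lines; failures are still reported
    if ( cd "$dir" && md5sum --quiet -c "$manifest" ); then
        echo "stage OK"
    else
        echo "stage CORRUPT" >&2
        return 1
    fi
}

# Demo on a throwaway "stage" directory:
stage=$(mktemp -d)
echo "disk1 payload" > "$stage/dr1.zip"
( cd "$stage" && md5sum dr1.zip > stage_md5.txt )
verify_stage "$stage" stage_md5.txt   # prints "stage OK"
```

For a real stage area, point verify_stage at the staging directory and the checksum file downloaded with the media.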
