Golden Gate Initial Load of a 3 TB Schema

Hi,
My source database is a 9i RDBMS on Solaris 5.10. I would like to build an 11gR2 database on Oracle Enterprise Linux.
How can I do the initial load of a 3 TB schema from my source to my target, which is a different platform and a different RDBMS version?
Thanks

A couple of options:
Use old export/import to do the initial load. While that is taking place, turn on change capture on the source so any transactions that occur during the exp/imp timeframe are captured in the trails. Once the initial load is done, start Replicat with the trails that have accumulated since the export started. Once source and target are fully synchronized, do your cutover to the target system.
Or do an in-place upgrade of your 9i source to at least 10g. Reason: you can then use transportable tablespaces (or you can go with expdp/impdp). If you go the TTS route, you will also have to take into account the endian/byte ordering of the datafiles (Solaris = big, Linux = little), and that will involve time to run RMAN CONVERT. You can test this out ahead of time both ways. Plus, you can get to 10g on your source via TTS since you are staying on the same platform. When you do all of this for real, you'll also be starting change capture so trails can be applied to the target (not so much the case with TTS, but definitely with Data Pump).
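For the first option, the GoldenGate side looks roughly like this (a minimal sketch; the group names, trail path, and schema are placeholders, not from the thread):

-- On the 9i source (GGSCI): start change capture before the export begins
ADD EXTRACT ext1, TRANLOG, BEGIN NOW
ADD RMTTRAIL /ggs/dirdat/rt, EXTRACT ext1
START EXTRACT ext1
-- Take the snapshot while Extract is capturing, e.g.:
--   exp system/... OWNER=app_schema FILE=app_schema.dmp CONSISTENT=y
-- On the 11gR2 target (GGSCI), once the import has finished:
ADD REPLICAT rep1, EXTTRAIL /ggs/dirdat/rt
START REPLICAT rep1

Running Replicat with HANDLECOLLISIONS until it has worked past the export point is the usual way to absorb the overlap between the snapshot and the captured changes; turn it off once source and target are in sync. For the TTS route in the second option, the endian conversion is one RMAN command per datafile (paths are placeholders; platform names must match V$TRANSPORTABLE_PLATFORM):

RMAN> CONVERT DATAFILE '/stage/users01.dbf'
        FROM PLATFORM 'Solaris[tm] OE (64-bit)'
        FORMAT '/u01/oradata/ORCL/users01.dbf';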

Similar Messages

  • Golden Gate Initial Load - Performance Problem

    Hello,
    I'm using the fastest method of initial load, Direct Bulk Load, with additional parameters:
    BULKLOAD NOLOGGING PARALLEL SKIPALLINDEXES
    Unfortunately, the load of a big table, 734 billion rows (around 30 GB), takes about 7 hours. The same table loaded with a normal INSERT statement in parallel via DB link takes 1 hour 20 minutes.
    Why does it take so long using Golden Gate? Am I missing something?
    I've also noticed that the load time with and without the PARALLEL parameter for BULKLOAD is almost the same.
    Regards
    Pawel

    Hi Bobby,
    It's an Extract/Replicat pair using SQL*Loader, created with the following commands:
    ADD EXTRACT initial-load_Extract, SOURCEISTABLE
    ADD REPLICAT initial-load_Replicat, SPECIALRUN
    The Extract parameter file:
    USERIDALIAS {:GGEXTADM}
    RMTHOST {:EXT_RMTHOST}, MGRPORT {:REP_MGR_PORT}
    RMTTASK replicat, GROUP {:REP_INIT_NAME}_0
    TABLE Schema.Table_name;
    The Replicat parameter file:
    REPLICAT {:REP_INIT_NAME}_0
    SETENV (ORACLE_SID='{:REPLICAT_SID}')
    USERIDALIAS {:GGREPADM}
    BULKLOAD NOLOGGING NOPARALLEL SKIPALLINDEXES
    ASSUMETARGETDEFS
    MAP Schema.Table_name, TARGET Schema.Table_tgt_name,
    COLMAP(USEDEFAULTS),
    KEYCOLS(PKEY),
    INSERTAPPEND;
    Regards,
    Pawel

  • Golden Gate - Initial Load using parallel process group

    Dear all,
    I am new to GG and I was wondering if GG can support initial load with parallel process groups. I have managed to do an initial load using "Direct Bulk Load" and "File to Replicat", but I have several big tables and Replicat is not catching up. I am aware that GG is not ideal for initial loads, but it is complicated to explain why I am using it.
    Is it possible to use the @RANGE function while performing the initial load, regardless of which method is used (File to Replicat, Direct Bulk Load, ...)?
    Thanks in advance

    You may use Data Pump for the initial load of large tables.
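    @RANGE does also work for the initial load: you run several Replicat groups, each taking a slice of the rows. A rough sketch for one of three parallel groups (the group name and key column are made up for illustration; the other two groups are identical except for the range number):
    REPLICAT rini1
    USERIDALIAS ggrepadm
    ASSUMETARGETDEFS
    -- this group keeps only its own third of the rows, hashed on the key column
    MAP Schema.Table_name, TARGET Schema.Table_tgt_name,
    FILTER (@RANGE (1, 3, PKEY));
    Every group must use the same total count (3 here) and a distinct range number, and the split column should be the key so the rows hash evenly across the groups.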

  • Initial Load from ABAP Failed

    Hi.
    The initial load from ABAP described in the following PDF document failed.
    Identity Management for SAP System Landscapes: Configuration Guide
    http://service.sap.com/~sapidb/011000358700001449662008E
    System log Error
    lang.ExceptionInInitializerError: JCO.classInitialize(): Could not load middleware layer 'com.sap.mw.jco.rfc.MiddlewareRFC'
    JCO.nativeInit(): Could not initialize dynamic link library sapjcorfc [C:\WINDOWS\system32\sapjcorfc.dll: Can't find dependent libraries]. java.library.path [C:\Program Files (x86)\Java\j2re1.4.2_16\bin;.;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\system32;C:\WINDOWS;......
    Environment is
    IdM 7.1
    NW AS JAVA 7.0 SP 17
    Windows 2003 Server 64bit
    SQL Server
    Any help will be most appreciated !

    Hi You Nakamura,
    We are facing the same problem during the AS ABAP initial load, with the same error message.
    I downloaded the msvcp71.dll file and placed it in the System32 folder, but the problem remains the same (even after a server restart).
    The system log shows the same error message and no entries in job log.
    Please let me know if you had followed any different way to install the dll file.
    Regards,
    Vijay.K

  • Initial load from CRM to CDB

    Hi,
    I'm trying to do the initial download from CRM to CDB for a mobile scenario, using transaction R3AS.
    In this transaction I can select the load object and the source site, but I am unable to select the destination site as CDB. Is there any configuration that needs to be done before we do the initial download?
    An important point: there is already an entry for CDB in the table SMOHSITEID.
    Please help me in this regard.
    Regards,
    Praveen

    Hi Praveen,
    You forgot the basic thing: you need to create a site for the CDB as well. The site you created previously is for the CRM database, not for the CDB. As you know, the CDB is connected to the CRM database through the middleware. First create the site, subscription, and publication for the CDB too, and then start the download with R3AS before go-live. If your mobile clients are already live, use R3AC4 for deltas. You also have to set up filters and parallel processing to reduce the download time.
    Reward with points if helpful.
    Venkat

  • Time-out in 0CRM_SRV_IBASE_ATTR initial load from CRM5.0

    Hi All,
    we are facing the following problem.
    On our staging system, an initial load of the 0CRM_SRV_IBASE_ATTR extraction times out in the source system. The strange thing is that the same extraction does work in our development system.
    When I start the InfoPackage with the same parameters as in the dev system (in background, PSA and data targets in parallel, default data transfer settings) and check the source system, the job runs in background on dev, but as a dialog process on staging!
    Does anybody know how we can solve this issue?
    Kind regards
    Immanuel

    Hi,
    Which system are you checking in?
    In the BI system it occupies a dialog process only, but in the R/3 system it should run in background.
    So check in the R/3 system. If it still runs in dialog there, check the user name the job is running under and whether that user is permitted to run background jobs, i.e. check the permissions for that user.
    As said, you can check with Basis.
    Regards,
    Krishna Rao

  • Initial load from ECC/R3 to MDM

    Hi,
    Could anyone tell me the process for a one-time initial load from R/3 to MDM, preferably without XI?
    How do I extract data from R/3?
    In what format?
    How do I map the source to the MDM repository?
    Farah

    Hi Farah,
    There are two transactions to extract the data from R/3:
    MDMGX - for extracting lookup data
    MDM_CLNT_EXTR - for extracting main table data
    MDMGX steps:
    1. Go to transaction MDMGX.
    2. Define Object Types: there are some standard objects defined in MDMGX; you can use them as per your requirement, or you can create new objects.
    3. Define Repositories and associate them with the object created in the previous step. Also add the FTP server details.
    4. Maintain Ports and Check Tables: add the codes of the ports and select the corresponding table from which you want to extract the data.
    5. Generate XSD: generates the XSD file to use in the Console while creating the port.
    6. Start Extraction: extracts the data from R/3 to MDM via the FTP server.
    MDM_CLNT_EXTR steps:
    Refer to the link below, which contains the entire configuration needed to extract the main table data.
    [https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50260ae5-5589-2910-4791-fc091d9bf797]
    Regards,
    Jitesh Talreja

  • Data load from Oracle schema

    Hi ,
    This is a simple question.
    I am planning to load data from Oracle. What connection info do I need to achieve this?
    When I select 'Open SQL' from the EAS console, the Open SQL datasource dialog box contains server name, application name, and database name. What should be entered for these?

    Hi,
    Please refer the below link to create an ODBC connection in Linux/Unix box.
    http://publib.boulder.ibm.com/infocenter/c8bi/v8r4m0/index.jsp?topic=/com.ibm.swg.im.cognos.rdm_cr_updates.8.4.0.doc/rdm_cr_updates_id1183SettingUpODBCConnections.html
    Regarding "the Open SQL datasource dialog box contains server name, application name and database name, what should be entered for these?":
    Here you have to provide the address or complete name of the server where Oracle is installed; for the application name and database name, provide the database name and the table name you created in Oracle.
    Hope this helps.
    Greetings
    SST.....

  • Initial Load for Master Data from R/3 to CRM

    Hi Middleware Experts,
    I have a question about the initial load of enhanced master data from R/3 to CRM. The current situation is:
    1) I have already enhanced master data on the R/3 side with some additional fields. Shall I go ahead with the standard initial load of this master data to CRM without creating middleware interfaces to map those enhanced fields on the CRM side?
    2) Also, after doing the initial load from R/3 to CRM, if I develop middleware interfaces for mapping these new master data fields and then do the initial load again so that the new fields also get mapped, will the initial load be done properly given that I have already done it once? Will it properly overwrite the previous initial load data on the CRM side?
    Please suggest whether it can be done or not, and which is the better option.
    Regards,
    Rahul

    Hi Rahul,
    Since you have not done the mapping yet, the middleware will not bring those fields to CRM; it will work just like a standard initial load, without considering the enhanced master data fields.
    When you develop the middleware interface and also enhance the CRM data structure, you will see the exchange of those fields through the middleware. You can start the initial load again at that point; it's not that you can do the initial load just once. But the better option is to avoid running the initial load several times, because it is a time-consuming activity. Do all the enhancements first and then perform the initial load.
    Allot points if my post helps!
    Best regards,
    Vikash.

  • Golden Gate - purging process in source DB should not replicate to target DB

    Oracle : 11.2.0.3
    Server: Linux 64-bit
    I have a requirement in my Golden Gate environment that some purging processes in the source database should not be reflected in the target database
    (a few tables will be purged once every 3 months; those purges should not be replicated to the target).
    Golden Gate is configured for a few schemas.
    Could anyone please help me solve this?
    Thanks in advance,

    Osama, thanks for your prompt response and help.
    I think the link you provided is not suitable for my scenario (http://mverzijl.wordpress.com/2011/10/15/handle-truncates-in-goldengate/).
    In my scenario, we want to delete about 2 years of records from a table in the source DB, but these deletes should not be replicated to the target; we want to keep the records in the target DB.
    Find below my Extract and Replicat parameters.
    EXTRACT ext1     
    USERID ggs_owner, PASSWORD ggs_owner
    RMTHOST lnxdev.hq.company.com, MGRPORT 7809
    RMTTRAIL /oradata/ORACLE/goldengate/dirdat/rt
    DDL INCLUDE MAPPED          
    TABLE CLG_OWNR.*;     
    TABLE LOC_OWNR.*;     
    SEQUENCE CLG_OWNR.*;
    SEQUENCE LOC_OWNR.*;
    -- Replicat parameter file
    REPLICAT rep1
    ASSUMETARGETDEFS
    USERID ggs_owner, PASSWORD ggs_owner
    MAP CLG_OWNR.*, TARGET CLG_OWNR.*;
    MAP LOC_OWNR.*, TARGET LOC_OWNR.*;
    DDLERROR DEFAULT IGNORE RETRYOP
    thanks & regards,
    Feroz Khan
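    One clean way to do this, assuming the purge job can run as its own dedicated database user (the user name below is made up), is to exclude that user in the Extract so the purge DML is never captured in the first place:
    EXTRACT ext1
    USERID ggs_owner, PASSWORD ggs_owner
    RMTHOST lnxdev.hq.company.com, MGRPORT 7809
    RMTTRAIL /oradata/ORACLE/goldengate/dirdat/rt
    -- skip all DML performed by the dedicated purge user
    TRANLOGOPTIONS EXCLUDEUSER purge_admin
    DDL INCLUDE MAPPED
    TABLE CLG_OWNR.*;
    TABLE LOC_OWNR.*;
    SEQUENCE CLG_OWNR.*;
    SEQUENCE LOC_OWNR.*;
    Alternatively, IGNOREDELETES/GETDELETES toggles around MAP statements in the Replicat can suppress deletes per table, but the purged tables then have to be kept out of the wildcard maps so each is mapped only once. And if the purge ever uses TRUNCATE instead of DELETE, it travels as DDL rather than as delete records, so it needs separate handling (the Extract above has DDL INCLUDE MAPPED, so truncates would still come through).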

  • Initial Load of LDAP Groups

    I am running an initial load from LDAP using the template job.
    The users have been successfully loaded into the identity store table, but the group read pass does not do anything.
    What should the Source and Destination tabs look like for the read-groups pass?
    Thanks
    S.

    Hi
    In my case the initial load jobs for ADS/LDAP had some information missing in the pass "ReadGroupOfUniqueNamesFromLdap".
    In the Source tab, the LDAP URL should look like this:
    LDAP://%$rep.LDAP_HOST%:%$rep.LDAP_PORT%/%$rep.LDAP_STARTING_POINT_GROUPS%?*?SUB?%$rep.LDAP_FILTER_GROUPS%
    For that you should create the additional repository constants "LDAP_FILTER_GROUPS" and "LDAP_STARTING_POINT_GROUPS", which look like this in my case:
    LDAP_FILTER_GROUPS=(objectclass=group)
    LDAP_STARTING_POINT_GROUPS=ou=groups,ou=idm,dc=example,dc=com
    I didn't change anything on the Destination tab.
    Hope this helps...

  • Handling passwords during the initial load process

    Dear Experts,
    This is about password handling in IDM. During the initial load, I do not want to bring passwords from my target systems (AD/SAP) into IDM.
    So which password(s) will users use to log in to the target systems (AD/SAP) after the initial load? What can be achieved with the pass "update system privilege trigger attribute" which is available in the initial load job?
    Is it something like this: IDM creates a default password on initial load, which is sent back to the target systems (from which the initial load was done), changing the password on the target systems to this new default password?
    Can we prevent this default password from being sent to the target systems with the help of the pass "update system privilege trigger attribute" in the initial load, so that this default password is not sent to the target systems?
    If the default password is not sent back to the target systems after the initial load, users will keep using their existing passwords to log in to the target systems. After that, if I need to assign the UMEJAVA-only privilege to users, the password for the target systems will be changed, with the default password being sent by email to the users. Since the password in AD is now changed, how are the users going to log in to AD to check their email for the new password?
    It seems I have written a BIG query here, sorry for that. But please let me know if anything above does not make sense.
    Also, please share your views/expertise/best practices on the same.
    Many thanks in advance!
    Naveen
    Version: IDM 7.2

    Hi Naveen,
    Firstly, to answer your query on what basis we decide the backend type for your UMEJAVA repository: it's the business. If you want users to authenticate against AD when they try to log in to the IDM UI, you have to configure LDAP as the backend and choose the datasource Microsoft ADS (Deep Hierarchy) + Database (UME database).
    If you want users to authenticate against the UME database, UME points to the UME database by default and you need to create the users in the UME database.
    So, if you have configured AS Java with ADS + Database, in IDM you have to select the repository type SAP NetWeaver AS Java (LDAP backend).
    In the repository constants there is an attribute called BACKED_REPOSITORY, which should be the name of the AD repository you configured.
    If you have a look at the AS Java connectors in the provisioning framework, in the create-user plugin IDM first checks the backend type. If it is an LDAP backend, it just sets the Java account attribute; if the backend type is DB, IDM creates the user in the UME database.
    Considering your system details, I would suggest the approach below.
    1. Configure your UMEJAVA with Microsoft ADS (Deep Hierarchy) + Database (UME database). For more information on how to configure your UMEJAVA with an LDAP backend, refer to this link.
    2. Now the users who try to log in to the IDM UI, or any app on AS Java, will be authenticated against your Active Directory.
    3. Perform the initial load from HR.
    4. Perform the initial load from AD.
    5. Perform the initial load from your AS Java UME.
    6. Now all the user information and role assignment information is loaded into IDM.
    7. Now let's discuss password management. There are two things here:
      a. Change password (by user): the user changes the password in IDM, the password change is provisioned to AD, and the user can log in with the new password.
      b. Password reset self-service: the user resets the password in IDM, and the password changes in AD (as UME is configured to use AD).
    Change password (by user)
    By default, users who are successfully authenticated when they log in to the IDM UI get access to the self-services tab. To allow users to change their password on their own, create the corresponding ordered tasks and maintain the access control tab for the self-service, so that when users want to change their passwords they can do so on their own.
    How IDM provisions the new password to the target system is logic you have to configure.
    Password reset self-service
    Users can reset their password on their own if they cannot remember it. To implement this, look at this document: http://scn.sap.com/docs/DOC-17111
    Hope this helps. Please let me know of any further queries.
    ~ Krishna.

  • Document currency is blank in 2nd ODS after loaded from 1st ODS!

    The 1st ODS contains the document currency field, and each record has a value for it. In the update rule from the 1st ODS to the 2nd ODS, only two fields are populated: one is the document currency (the update type is Overwrite for this unit; the only two options for this unit are Overwrite and No Update), and the other is a key figure field whose update type is Addition. After performing the initial load from the 1st ODS to the 2nd ODS, the key figure values are fine, but the document currency value is blank in every record. The document currency field in the update rule uses the update method "Source field" from the 1st ODS.
    Any clue why the document currency is blank in the 2nd ODS, and any solution?
    Thanks!

    hi Ravi,
    There is no transfer rule from one ODS to another. In the update rule, our first post already clearly stated that the update type is Overwrite for document currency. Sorry, we can't give you any reward points since you didn't carefully read our post!
    Thanks

  • Data Guard Vs Golden Gate

    Hi Experts,
    I am looking for a high availability and disaster recovery architecture for my data layer, i.e. Oracle Database 11g R2.
    We have two physical locations and the distance between two sites is around 20 miles.
    Site 1:
    We have already implemented a two-node RAC setup at Site 1.
    Site 2:
    We are going to implement a standalone database (not RAC).
    My requirements:
    1. Both databases at Site 1 & Site 2 should be replicas of each other.
    2. Both databases should be in sync always.
    3. Site 1 is active and Site 2 is stand by.
    4. Client applications on Site 1 & Site 2 should always talk to RAC database on Site1.
    5. Only if RAC at Site 1 goes down completely should client apps connect to the Site 2 database, without human intervention.
    How can I achieve my requirements? I was doing some research and found two solutions: 1. Active Data Guard, 2. Golden Gate.
    Questions:
    1. Do Data Guard and Golden Gate offer the same features?
    2. Which product offers solutions to all my requirements, or do I need to use both?
    3. If Data Guard and Golden Gate are different from each other, what is the difference between them and what are the overlapping features?
    Thanks

    1. Do Data Guard and Golden Gate offer the same features?
    No. There's a simple comparison here:
    http://www.oracle.com/technetwork/database/features/availability/dataguardgoldengate-096557.html
    2. Which product offers solutions to all my requirements, or do I need to use both?
    Data Guard will work and you don't need anything else. I cannot speak to Golden Gate.
    3. If Data Guard and Golden Gate are different from each other, what is the difference between them and what are the overlapping features?
    Again, this document:
    http://www.oracle.com/technetwork/database/features/availability/dataguardgoldengate-096557.html
    1. Both databases at Site 1 & Site 2 should be replicas of each other.
    Data Guard can do this.
    2. Both databases should be in sync always.
    Data Guard can do this.
    3. Site 1 is active and Site 2 is standby.
    Data Guard can do this.
    4. Client applications on Site 1 & Site 2 should always talk to the RAC database on Site 1.
    You can set your tnsnames to handle this and more. Using DBMS_SERVICE you can create a service alias to handle this.
    Ex.
    ernie =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = primary.host)(PORT = 1521))
          (ADDRESS = (PROTOCOL = TCP)(HOST = standby.host)(PORT = 1521))
        )
        (CONNECT_DATA =
          (SERVICE_NAME = ernie)
        )
      )
    5. Only if RAC at Site 1 goes down completely should client apps connect to the Site 2 database, without human intervention.
    You can set your tnsnames to handle this and more.
    Best Regards
    mseberg
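    As a concrete sketch of mseberg's DBMS_SERVICE suggestion (not from the original post; the service name 'ernie' is reused from the example above and the trigger name is made up), the service is created once and started only where the database runs in the primary role:
    -- Create the role-aware service once on the primary; on a physical
    -- standby the definition arrives via redo apply.
    BEGIN
      DBMS_SERVICE.CREATE_SERVICE(service_name => 'ernie',
                                  network_name => 'ernie');
    END;
    /
    -- Start the service only when this database is running as primary,
    -- so clients using the tnsnames entry above always land on the primary.
    CREATE OR REPLACE TRIGGER ernie_on_startup
    AFTER STARTUP ON DATABASE
    DECLARE
      v_role VARCHAR2(30);
    BEGIN
      SELECT database_role INTO v_role FROM v$database;
      IF v_role = 'PRIMARY' THEN
        DBMS_SERVICE.START_SERVICE('ernie');
      END IF;
    END;
    /
    After a switchover or failover, the service stops answering on the old primary and starts on the new one, so the connect-time failover in the client's address list finds the right database without manual intervention.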

  • Initial Load ISU / CRM Issues

    Hi all
    I have a  few questions.
    1. When you start an initial load for BUPA_MAIN in R3AS (from IS-U to CRM), it seems only one dialog work process is used (SM50). Is this normal?
    2. In R3AM1 you can monitor the objects. Everything seems to be loaded, but the status is still 'Running'. Is this because of BDoc validation errors?
    Kind regards
    Lucas

    Hello Lucas,
    There's a parameter (CRM_MAX_QUEUE_NUMBER_INITIAL) you can maintain in the table CRMPAROLTP of the source system (maintain this in the ERP system if you're doing an initial load from ERP to CRM).
    Reference Notes:
    350176
    765236
    The PARVAL1 for parameter CRM_MAX_QUEUE_NUMBER_INITIAL should be an integer (let's say 5) instead of the character 'X'. In that case 5 queues would be formed for the initial load.
    Here's an example :
    Parameter name CRM_MAX_QUEUE_NUMBER_INITIAL
    Param. Name 2   <Object name>    " for example CUSTOMER_MAIN
    Param. Name 3
    User            <User>           " for example CRM, if CRM is connected
    Param. Value    <no. of queues>  " for example 5
    Param. Value 2
    So, using the example above, 5 R3AI_CUSTOMER_MAIN queues would be formed on the ERP side and sent to CRM for processing.
    The number of dialog work processes (SM51) available to the qRFC scheduler in the CRM system should be able to handle the parallel processing of the queues, so consult your Basis administrators on this front.
    Regards,
    Rohit
