Initial Load Extract (Date Format)

Hi,
I'm doing an initial-load Extract (file to Replicat) for a module of about 30 GB, and I'm getting the following error on the extracted tables:
ERROR   OGG-00665  OCI Error error executing fetch with error code 1801  (status = 1801-ORA-01801: date format is too long for internal buffer), SQL<SELECT ........
My concerns are:
1- How can I overcome this error without updating the source data?
2- How do I deal with it on the target side (once the data is replicated), so it won't affect the business needs?
Thanks so much.

Thanks, 960104, for your interest.
>> Source GG
Version 11.1.1.1.2
DB is 10.2.0.4
++++++++++++++++++++++++++++++++++
Target GG
Version 12.1.2.1.0
DB is 12.1.0.2.0
The report file is only for the Extract, as I'm pushing the trails to the remote side only, without running the Replicat at the moment.
=================================================================
2015-01-25 10:23:53  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000001.
2015-01-25 10:23:56  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000002.
2015-01-25 10:24:00  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000003.
551606 records processed as of 2015-01-25 10:24:00 (rate 49204,delta 49204)
2015-01-25 10:24:05  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000004.
2015-01-25 10:24:08  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000005.
1062596 records processed as of 2015-01-25 10:24:10 (rate 50097,delta 51098)
2015-01-25 10:24:12  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000006.
2015-01-25 10:24:16  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000007.
2015-01-25 10:24:20  INFO    OGG-01026  Rolling over remote file /u04/GG_TRAILS/ff000008.
1471164 records processed as of 2015-01-25 10:24:21 (rate 46541,delta 39289)
Source Context :
  SourceModule            : [ggdb.ora.sess]
  SourceID                : [/scratch/pradshar/view_storage/pradshar_bugdbrh40_12927937/oggcore/OpenSys/src/gglib/ggdbora/ocisess.c]
  SourceFunction          : [OCISESS_try]
  SourceLine              : [501]
  ThreadBacktrace         : [10] elements
: [/u01/GG/extract(CMessageContext::AddThreadContext()+0x26) [0x664446]]
: [/u01/GG/extract(CMessageFactory::CreateMessage(CSourceContext*, unsigned int, ...)+0x7b2) [0x65aee2]]
: [/u01/GG/extract(_MSG_ERR_ORACLE_OCI_ERROR_WITH_DESC_SQL(CSourceContext*, int, char const*, char const*, char const*, CMessageFactory::MessageDisposition)+0xb2) [0x613232]]
: [/u01/GG/extract(OCISESS_try(int, OCISESS_context_def*, char const*, ...)+0x48b) [0x5a3c2b]]
: [/u01/GG/extract(DBOCI_get_query_row(file_def*, int, int*)+0x95e) [0x923558]]
: [/u01/GG/extract(gl_get_query_row(file_def*)+0x10) [0x933e2c]]
: [/u01/GG/extract [0x87d18d]]
: [/u01/GG/extract(main+0x11e1) [0x527aa1]]
: [/lib64/libc.so.6(__libc_start_main+0xf4) [0x392f81d994]]
: [/u01/GG/extract(__gxx_personality_v0+0x1ea) [0x4f32ca]]
2015-01-25 10:24:21  ERROR   OGG-00665  OCI Error error executing fetch with error code 1801  (status = 1801-ORA-01801: date format is too long for internal buffer), SQL<SELECT x."ID",x."STATUS",x."STATUS_DATE",x."UPDATE_DATE",x."CREATED_BY",x."CREATION_DATE",x."UPDATED_BY",x."DIR_ID",x."LOC_ID",x."TICKET_DATE",x."TICKET_TIME",x."ROAD_SPEED",x."VEHICLE_SPEED",x."RADAR>.
2015-01-25 10:24:21  ERROR   OGG-01668  PROCESS ABENDING.
=================================================================
End of Report file
Thanks
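
For the first question (overcoming ORA-01801 without touching the source data), one workaround often suggested is to force a short, explicit NLS_DATE_FORMAT in the Extract's session via SETENV, so the fetch does not inherit an oversized date mask from the environment. This is only a sketch; the process login, host, and trail names below are placeholders, not taken from the report above, and it is worth testing against one of the failing tables first:

```
SOURCEISTABLE
SETENV (NLS_DATE_FORMAT = "YYYY-MM-DD HH24:MI:SS")
USERID gguser, PASSWORD *****
RMTHOST targethost, MGRPORT 7809
RMTFILE /u04/GG_TRAILS/ff, MEGABYTES 100
TABLE myschema.*;
```

Whether SETENV resolves this particular ORA-01801 depends on where the long format string comes from, so treat this as a first diagnostic step rather than a guaranteed fix.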

Similar Messages

  • SQL*LOADER and date formatted data

    Hi there, I don't want to load complex things in my table - just a simple date....
    The control file is:
    load data
    infile 'c:\data\mydata.csv'
    into table test_table
    fields terminated by "," optionally enclosed by '"'
    ( sampledate, name )
    The mydata.csv file is:
    30-12-2003, Test1
    31-12-2003, Test2
    However this format is not being accepted by the database... I tried other combinations of date (eg, 30122003, 20-DEC-2003, 30/12/2004, etc...) but always had a 'data error'
    Thanks for any suggestions.

    Example:
    Control File for Case Study 3
    This control file loads the same table as in case 2, but it loads three additional columns (hiredate, projno, and loadseq). The demonstration table emp does not have columns projno and loadseq. To test this control file, add these columns to the emp table with the command:
    ALTER TABLE emp ADD (projno NUMBER, loadseq NUMBER);
    The data is in a different format than in case 2. Some data is enclosed in quotation marks, some is set off by commas, and the values for deptno and projno are separated by a colon.
    1) -- Variable-length, delimited, and enclosed data format
    LOAD DATA
    2) INFILE *
    3) APPEND
    INTO TABLE emp
    4) FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
    (empno, ename, job, mgr,
    5) hiredate DATE(20) "DD-Month-YYYY",
    sal, comm, deptno CHAR TERMINATED BY ':',
    projno,
    6) loadseq SEQUENCE(MAX,1))
    7) BEGINDATA
    8) 7782, "Clark", "Manager", 7839, 09-June-1981, 2572.50,, 10:101
    7839, "King", "President", , 17-November-1981,5500.00,,10:102
    7934, "Miller", "Clerk", 7782, 23-January-1982, 920.00,, 10:102
    7566, "Jones", "Manager", 7839, 02-April-1981, 3123.75,, 20:101
    7499, "Allen", "Salesman", 7698, 20-February-1981, 1600.00,
    (same line continued) 300.00, 30:103
    7654, "Martin", "Salesman", 7698, 28-September-1981, 1312.50,
    (same line continued) 1400.00, 3:103
    7658, "Chan", "Analyst", 7566, 03-May-1982, 3450,, 20:101
    Joel Pèrez
    http://otn.oracle.com/experts
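
    Applying the case-study idea directly to the original two-line mydata.csv, the usual fix is an explicit date mask on the sampledate field. A minimal control file under that assumption (table and file names as in the question):

```
LOAD DATA
INFILE 'c:\data\mydata.csv'
INTO TABLE test_table
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
( sampledate DATE "DD-MM-YYYY",
  name )
```

    Without the mask, SQL*Loader falls back to the default date format, which does not match 30-12-2003 and produces the "data error" described above.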

  • SQL Loader with date formatting

    Hi,
    I'm trying to get SQL*Loader to insert a date into a column. After much browsing, reading, and trial and error, I still get an array of errors.
    I'm using oracle XE
    my control file looks like this
    LOAD DATA
    INFILE 'posmeters/meters.csv'
    INTO TABLE position_meters
    FIELDS terminated by ","
    ID CONSTANT '0',
    POSITION_ID,
    DATETIME DATE "DD/MM/YYYY HH24:MI:SS",
    CASH_IN,
    CASH_OUT,
    NOTES_IN,
    CHANGE_OUT,
    WINNINGS,
    VTP,
    REFILL,
    TOKEN_IN,
    TOKEN_OUT,
    ELEC_PAY,
    ELEC_CREDIT,
    REMOTE_PAY,
    REMOTE_CREDIT,
    INSERT_TS EXPRESSION "TO_CHAR(SYSDATE, 'DD/MM/YYYY HH24:MI:SS')",
    FIFTY_PND,
    TWENTY_PND,
    TEN_PND,
    FIVE_PND,
    TWO_PND,
    ONE_PND,
    FIFTY_P,
    TWENTY_P,
    TEN_P,
    FIVE_P
    It is the DATETIME field which gives me grief. I have a test data file that looks like this
    0,1010,29/09/2011 10:23:24,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24
    and my table is defined as follows
    ID NUMBER NOT NULL,
    POSITION_ID NUMBER,
    DATETIME TIMESTAMP(6) DEFAULT localTIMESTAMP NOT NULL,
    CASH_IN NUMBER,
    CASH_OUT NUMBER,
    NOTES_IN NUMBER,
    CHANGE_OUT NUMBER,
    WINNINGS NUMBER,
    VTP NUMBER,
    REFILL NUMBER,
    TOKEN_IN NUMBER DEFAULT (0) NOT NULL,
    TOKEN_OUT NUMBER DEFAULT (0) NOT NULL,
    ELEC_PAY NUMBER DEFAULT (0) NOT NULL,
    ELEC_CREDIT NUMBER DEFAULT (0) NOT NULL,
    REMOTE_PAY NUMBER DEFAULT (0) NOT NULL,
    REMOTE_CREDIT NUMBER DEFAULT (0) NOT NULL,
    INSERT_TS TIMESTAMP(6) DEFAULT (localtimestamp) NOT NULL,
    FIFTY_PND NUMBER DEFAULT 0,
    TWENTY_PND NUMBER DEFAULT 0,
    TEN_PND NUMBER DEFAULT 0,
    FIVE_PND NUMBER DEFAULT 0,
    TWO_PND NUMBER DEFAULT 0,
    ONE_PND NUMBER DEFAULT 0,
    FIFTY_P NUMBER DEFAULT 0,
    TWENTY_P NUMBER DEFAULT 0,
    TEN_P NUMBER DEFAULT 0,
    FIVE_P NUMBER DEFAULT 0
    I have tried defining the control file with
    DATETIME DATE "DD/MM/YYYY HH24:MI:SS",
    DATETIME EXPRESSION "TO_DATE(:DATETIME, 'DD/MM/YYYY HH24:MI:SS')",
    DATETIME EXPRESSION "TO_TIMESTAMP(:DATETIME, 'DD/MM/YYYY HH24:MI:SS')",
    I get errors such as
    Record 1: Rejected - Error on table "SITE_MAIN"."POSITION_METERS", column DATETIME.
    ORA-01861: literal does not match format string
    SQL*Loader-291: Invalid bind variable DATETIME in SQL string for column DATETIME.
    Any help would be gratefully appreciated.

    It seems that the problem was caused by the constant at the beginning of the record and had nothing to do with date formats.
    My control file now looks like this
    LOAD DATA
    INFILE 'posmeters/meters.csv'
    APPEND INTO TABLE position_meters
    FIELDS terminated by ","
    POSITION_ID          ,
    DATETIME      date "DD/MM/YYYY HH24:MI:SS",
    CASH_IN          ,
    CASH_OUT          ,
    NOTES_IN          ,
    CHANGE_OUT          ,
    WINNINGS          ,
    VTP               ,
    REFILL          ,
    TOKEN_IN          ,
    TOKEN_OUT          ,
    ELEC_PAY          ,
    ELEC_CREDIT          ,
    REMOTE_PAY          ,
    REMOTE_CREDIT     ,
    INSERT_TS      "TO_TIMESTAMP(SYSDATE, 'DD/MM/YYYY HH24:MI:SS')",
    FIFTY_PND          ,
    TWENTY_PND          ,
    TEN_PND          ,
    FIVE_PND          ,
    TWO_PND          ,
    ONE_PND          ,
    FIFTY_P          ,
    TWENTY_P          ,
    TEN_P          ,
    FIVE_P          
    all is good :o)
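
    One caveat on the working file above: TO_TIMESTAMP(SYSDATE, 'DD/MM/YYYY HH24:MI:SS') first converts SYSDATE to a string using the session's NLS_DATE_FORMAT and then reparses it, which breaks whenever that setting differs from the mask. Since the column only needs to capture the load time, a simpler and more robust expression (a suggestion, not part of the original post) would be:

```
INSERT_TS EXPRESSION "SYSTIMESTAMP"
```

    Alternatively, omitting INSERT_TS from the control file entirely lets the column's DEFAULT (localtimestamp) take effect.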

  • Initial load of data to UCM for Customer Hub PIP

    Hi,
    What is the recommended approach to have XREF tables populated during an initial load to UCM (via EIM), when the Accounts already exist in Siebel CRM?
    Our approach as of now, using EIM, is as follows:
    1) Set required customer information in EIM_UCM_ORG
    2) Look up the customer's existing Row_ID in Siebel, and populate EIM_UCM_ORG.EXT_SYSTEM_NUM and EIM_UCM_ORG.UCM_EXT_ID) accordingly
    3) Run the EIM job and UCM Batch Process to import the customer into UCM
    The account then appears in UCM with the correct reference to siebel/row_id under the "external account IDs" tab. HOWEVER, it also contains a reference to a newly created duplicate record for that account in Siebel. Looking at the XREF tables, there is no reference to the existing Siebel row_id specified in the EIM batch load, and our hypothesis is that this is the reason the account cannot be found (and a duplicate is created). What we want to achieve here is to tell UCM that the accounts we are inserting do in fact already exist in Siebel CRM, and can be identified by the row_id that we pass along.
    The relevant system versions are Customer Hub PIP 11g with AIA 3. Siebel CRM and Siebel UCM are on patch 8.1.1.7 (and pertinent ACRs have been incorporated in the two Siebel instances).
    Any hints or suggestions on how to approach this would be appreciated
    -M
    Edited by: 968713 on Nov 1, 2012 5:05 AM
    Edited by: 968713 on Nov 1, 2012 5:06 AM

    Do you really need to populate the XREF table/transaction History table for initial load?

  • Multiple Initial Load

    Good day!
    I have 27 schemas (each containing 1000+ tables) and I need to replicate all of them to our DR machine. My question: is it possible to create multiple initial-load Extracts so I can finish the initial load in a shorter amount of time?
    Thanks!!
    Regards,
    Mela

    I'm having the same kind of problem.
    I need to migrate a database from Solaris with raw devices to AIX RAC with ASM.
    As it is a production server and cannot spend too much time down transferring the huge data set using expdp from one server to another, the only way I found to do it is using GoldenGate.
    Can somebody help me?
    Regards,
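
    On the original question of parallelizing the 27 schemas: initial-load Extracts are independent SOURCEISTABLE processes, so the schemas can be split across several of them and run concurrently. A hedged sketch of one such parameter file (names, host, and paths are placeholders):

```
-- load01.prm: one of several parallel initial-load Extracts,
-- each owning a disjoint subset of the schemas
SOURCEISTABLE
USERID gguser, PASSWORD *****
RMTHOST drhost, MGRPORT 7809
RMTFILE /ggs/dirdat/i1, MEGABYTES 100
TABLE schema01.*;
TABLE schema02.*;
```

    Each such Extract writes to its own remote file set and is served by its own Replicat, so throughput scales roughly with the number of processes, subject to source I/O and network capacity.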

  • Problem to extract data using DS 2LIS_11_VAHDR & 2LIS_11_VAITM

    Hello Experts,
    my requirement is to extract the SD sales item & sales header data into BI. While executing the reports in BI I find some inconsistencies in the data.
    Ex: for some sales organizations
          0CRM_NUMDOC = 17 and the corresponding 0NETVALORD = 0,00
    and for some sales organizations
         0CRM_NUMDOC = 0 and the corresponding 0NETVALORD = 5200,00
    We are using the DataSource 2LIS_11_VAHDR to load document header data and the DataSource 2LIS_11_VAITM to get item net value data, and we used two transformations to the same SAP standard cubes 0CSAL_C05 & 0CSAL_C03.
    As per my analysis in the development system I find that
           VBAK contains 1405 records but the DS 2LIS_11_VAHDR extracts only 1235 records,
    but
           VBAP contains 2665 records and the DS 2LIS_11_VAITM extracts all 2665 records.
    Could this be causing the problem?
    I deleted the setup tables and filled them again, yet I still have the same problem.
    If you have any idea regarding this issue, please share it with me.
    Thanks in advance.
    Cheers.

    HI,
    Indeed, check the extraction in your source system with RSA3.
    Also perform the load and check your data in the PSA (before any transformation takes place).
    Are you going to use delta or full loading? I assume it's delta? If you perform your initial load with data, no use of the setup table is required...
    If you have too much data to do this, try an initial load without data, then a first delta load, and next a full repair load (which will read your setup table). Do not forget to fill your setup table with all the historical data you need in transaction SBIW.
    Hope this helps.
    Best Regards,
    Michel Hoefkens

  • Initial Load Error

    Hi All
    I am trying to run the initial load but I am receiving the following error.
    However, the trail file has been created on the target in the dirdat folder, about 2 GB in size.
    Source and destination: Windows 2003, 32-bit.
    GG version is 11.2.1.0.1 OGGCORE_11.2.1.0.1_PLATFORMS_120423.0230
    Do you have any idea?
    Source Context :
    SourceModule : [er.extrout]
    SourceID : [er/extrout.c]
    SourceFunction : [complete_tcp_msg]
    SourceLine : [1480]
    2013-01-09 18:00:57 ERROR OGG-01033 There is a problem in network communication, a remote file problem, encryption keys for target and source do not match (if using ENCRYPT) or an unknown error. (Remote file used is ./dirdat/INITLOAD01.DAT, reply received is Error 0 (The operation completed successfully.) getting position in ./dirdat/INITLOAD01.DAT).
    2013-01-09 18:00:57 ERROR OGG-01668 PROCESS ABENDING.
    ggserr.log on target
    OGG-01223 Oracle GoldenGate Collector for Oracle: Error 0 (The operation completed successfully.) getting position in ./dirdat/INITLOAD01.DAT.
    INFO OGG-01670 Oracle GoldenGate Collector for Oracle: Closing ./dirdat/INITLOAD01.DAT.
    INFO OGG-01676 Oracle GoldenGate Collector for Oracle: Terminating after client disconnect

    Why do you keep naming the file?
    Enter the parameters listed in Table 26 in the order shown, starting a new line for each parameter statement. The following is a sample initial-load Extract parameter file for
    loading data from file to Replicat.
    <your extract name>
    SOURCEISTABLE
    SOURCEDB mydb, USERID ogg, PASSWORD
    AACAAAAAAAAAAAJAUEUGODSCVGJEEIUGKJDJTFNDKEJFFFTC &
    AES128, ENCRYPTKEY securekey1
    RMTHOST ny4387, MGRPORT 7888, ENCRYPT AES 192 KEYNAME mykey
    ENCRYPTTRAIL AES192, KEYNAME mykey1
    RMTFILE /ggs/dirdat/initld, MEGABYTES 2, PURGE
    TABLE hr.*;
    TABLE sales.*;

  • CRM business role - user mapping initial load

    Hi,
    I'm trying to do an initial load of the data from the CRM systems into IDM. I'm able to get all data of the users except their business role mapping (a parameter in CRM).
    In the read pass of the user from the ABAP initial load, the attribute for the parameter is as follows:
    Target: sap%$rep.$NAME%Parameter1:Info:VARCHAR:255|        Source: parameter1
    but no data is stored in the parameter1 table, even though the user has some parameters mapped in the CRM system.
    Can anyone please help to load these user-business-role mapping details?
    Thanks in Advance.
    Regards,
    Pricy

    Hello Pricy,
    can you just give me some hints on what you are trying to do exactly?
    My assumptions:
    - CRM business role is stored as user parameter in ABAP SU01 user data, right?
    - you want to read all the ABAP user data from your CRM system INCLUDING the user parameter data from ABAP SU01 user data, right?
    If that is the case, at least the loading part should work fine. I just tried this on my local system and had no issues; all user parameters of my existing ABAP users were loaded into one temporary table.
    -> Pass: ReadABAPUsers -> table "sap%$rep.$NAME%Parameter1:Info:VARCHAR:255|"
    Did you find this temp table created on your database correctly? My table is there and is called "sapT01_001Parameter1" (where my repository is named T01_001) and it contains all the existing user parameters.
    What exactly is your issue?
    Regards,
    René

  • Error when extracting data from ETL - Missing messages and missing InfoIdoc

    Hi All,
    We are using BW 3.0 and extracting data from other source systems through Informatica PowerCenter (ETL). The system is working fine, but when we try to extract data on 31st Dec, we get the following error. Only this load gives the error; all the other loads to other data targets are going fine. All the data is mapped one-to-one from ETL to BW.
    Error messages from Monitor -Status tab:-
       "InfoIdocs missing; external system
       Diagnosis :- InfoIDocs are missing. The IDocs are created in BW with non-SAP systems as source    
       systems that transfer the data and metadata to BW using BAPIs. Since the IDocs are missing, a   
       short dump in BW probably occurred.
       System response:  No selection information arrived from the source system"
    Error messages from Monitor -Details tab:-
        Missing message: Number of sent records,   Missing message: Selection completed
    I highly appreciate your suggestions.
    Vinod.CH

    Hi Rathy Moorthy,
    Thank you very much for your reply. The source system connections are OK and we are able to load data to other data targets from ETL; we have an issue only with this particular load. The load extracts data, and I have loaded the data from ETL to the PSA and checked the data content, and it is correct. But when I update the same data to the target I get this error. I have also tried to update from the PSA to the target, and also directly from ETL to the target.
    I appreciate your suggestions.

  • Middleware initial load error on funcloc segment type (ADR2) does not exist

    Hello,
    We are doing initial loads of data into a new CRM 7.0 system.
    So far all objects have been done except Functional locations (FUNCLOC).
    During the initial load of the functional locations the various BDOCs all go into error state.
    The error is the same everywhere: segment type adr2 does not exist.
    If we deactivate the adr2 table in transaction R3AC1 and process again, we get the error message for the next table
    (like: segment type IFLOTX does not exist), and so on.
    Are there any settings we can use to manage these 'segments', or would there be another way to solve this?
    Thanks.
    Joram

    Hello,
    Take a look at note 1271465, it does not talk about the ADR2 segment but about segment errors in SMW01 for functional location download. It might be useful. Observe especially the constraint in ERP version number, it might be that your ERP system is out-of-date/incompatible for this scenario.
    Second thing to check, go to SPRO>CRM>Master Data>Installed Base>Replication of technical objects from ERP
    There you'll find a mandatory customizing activity for Functional location download.
    And in the documentation of the activity "Replication of Functional locations: Prerequisites and General information" you'll find all the necessary actions to perform.
    This helped me in setting up funcloc replication in CRM 7.0.
    Hope this helps,
    Kind regards,
    Joost

  • Get date format given a locale

    Hello All
    If I have a Locale, how can I extract the date format attached to that Locale?
    Is there any API support? I do not want to use SimpleDateFormat or similar classes...
    The method should return a date format, e.g. "M/d/yyyy" for the US locale.
    Can anyone help?
    Thanks,
    prash

    Primarily, why do you not want to use the SimpleDateFormat class or related classes? It would make formatting pretty easy. For example, I'm using the DateFormat class to give you a rough indication of the formatting:
    Date dt = new Date();
    DateFormat df = DateFormat.getDateInstance(DateFormat.SHORT, Locale.US);
    String formattedDate = df.format(dt);
    System.out.println("Formatted date: " + formattedDate);
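
    If the goal is specifically the pattern string rather than a formatted date, the factory's return value can usually be cast to SimpleDateFormat, whose toPattern() exposes it. A sketch assuming standard JDK locale data (note it still touches SimpleDateFormat under the hood; java.text offers no separate pattern-lookup API):

```java
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Locale;

public class LocaleDatePattern {
    public static void main(String[] args) {
        // On standard JDKs, getDateInstance returns a SimpleDateFormat,
        // whose toPattern() yields the locale's date pattern string.
        DateFormat df = DateFormat.getDateInstance(DateFormat.SHORT, Locale.US);
        if (df instanceof SimpleDateFormat) {
            // Typically "M/d/yy" for Locale.US; the exact string
            // depends on the JDK's locale data (CLDR vs legacy).
            System.out.println(((SimpleDateFormat) df).toPattern());
        }
    }
}
```

    The same cast works for the MEDIUM and LONG styles, and for getDateTimeInstance, as long as the runtime actually returns a SimpleDateFormat, hence the instanceof guard.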

  • How to extract data from multiple flat files to load into corresponding tables in SQL Server 2008 R2 ?

    Hi,
              I have to implement the following scenario in SSIS but don't know how, since I have never worked with SSIS before. Please help me.
              I have 20 different text files in a single folder and 20 corresponding tables, one per text file, in a SQL Server 2008 R2 database. I need to extract the data from each text file and
    load it into the corresponding table in the SQL Server database. Please guide me on how many ways I can do this and which is the best way to implement this job. I actually have to automate this job. A few files are in the same format (with the same column names
    and datatypes) while the others are not.
    1. Do I need to create 20 different projects ?
                   or
        Can I implement this in only one project by having 20 packages?
                 or
        Can I do this in one project with only one package?
    Thanks in advance.

    As I said, I don't know how to use the Object data type; I just gave it a shot as below. I know the following code has errors; can you please correct it for me?
    Public Sub Main()
        ' Scan each file for the first line starting with "DATE"
        ' and capture the 8-character date value that follows.
        Dim s1 As StreamReader
        Dim rline As String
        Dim date1(2) As String
        Dim Filelist(1) As String
        Dim i As Integer = 1
        Filelist(0) = "XYZ"
        Filelist(1) = "123"
        For Each FileName As String In Filelist
            s1 = File.OpenText(FileName)
            rline = s1.ReadLine()
            While rline IsNot Nothing
                If Left(rline, 4) = "DATE" Then
                    date1(i) = Mid(rline, 7, 8)
                    i = i + 1
                    Exit While
                End If
                rline = s1.ReadLine()
            End While
            s1.Close()
        Next
        Dts.Variables("date").Value = date1(1)
        Dts.Variables("date1").Value = date1(2)
        Dts.TaskResult = ScriptResults.Success
    End Sub

  • Extract Data from XML and Load into table using SQL*Loader

    Hi All,
    We have an XML file (sample.xml) which contains credit card transaction information. We have a standard SQL*Loader control file which loads the data from a flat file; the control file is written using the position-based method. Our requirement is to use this control file for our purpose (i.e., load the data into the table from our XML file). We need help either converting the XML to a flat file, or extracting the data from the XML tags and passing it to the control file so that it loads the table.
    Your suggestion is highly appreciated.
    Thanks in advance

    Hi,
    First of all, go to PSA maintenance (where you will see PSA records).
    Go to List ---> Save ---> File ---> Spreadsheet (choose the radio button).
    Give the proper file name where you want to download it, and then ---> Generate.
    You will get your PSA data in Excel format.
    Thanks
    Mayank

  • Missing dimension members - - initial extract date question

    We are configuring and implementing the out-of-the-box analytics. In the DAC, we set the initial extract date to be Jan 1 2006. It appears that when we run our initial full load ETL, it only loads suppliers that have been modified after Jan 1 2006. Suppliers prior to that are missing.
    Considering that there are facts that can be associated to suppliers before that initial Jan 1 2006 date, we are noticing a problem. Is my understanding correct and how do we resolve this?
    Thanks in advance.

    Hi Ed,
    Yes, I have set this $$INITIAL_EXTRACT_DATE parameter as you mentioned, but this is working fine in case 1 and NOT working in case 2.
    I am using the same versions of DAC, Informatica and BI Apps in both the cases. Only the source system is different.
    In case 1, its not an initial full load.
    in case 2, its an initial load.
    Thanks,
    Harish.

  • Initial load failed to transfer the clob data

    Hi Experts
    I am trying to move my huge database from 10g on Windows to 11g on Linux through a GoldenGate initial load. It has CLOB, LONG, and BLOB datatypes. When I try to move it with the parameters below, I get an error.
    Error:
    The trail file cannot be used with the SPECIALRUN parameter, and when I create a normal Replicat process to replicate the data, it displays an error for the log_csn, log_xid and log_cmplt_csn columns under ggs_checkpointtable (unable to populate these columns).
    --Loading data from file to Replicat (Transfer Method)
    Source Database Server:
    1. EDIT PARAMS load1
    2. Add below parameter into parameter files with name load1
    SOURCEISTABLE
    USERID gguser@orcl, PASSWORD test
    RMTHOST 10.8.18.189, MGRPORT 7810
    RMTFILE /ora01/initialload/pt, MAXFILES 10000, MEGABYTES 10
    TABLE test.*;
    3. EDIT PARAMS load2
    4. Add below parameter into parameter files with name load2
    SPECIALRUN
    USERID gguser@orcl1, PASSWORD test
    EXTTRAIL /ora01/initialload/pt
    ASSUMETARGETDEFS
    MAP test.*, TARGET test.*;
    END RUNTIME
    5. Start the extract process on the source database:
    cmd> ogg_directory > extract paramfile dirprm\load1.prm reportfile c:\load1.rpt
    6. Start the replicat process on the target Database Server:
    $ ogg_directory> replicat paramfile dirprm/load2.prm reportfile /ora01/load1.rptt

    A checkpoint table is not needed for an initial-load Replicat. You could do the following:
    load2.prm
    REPLICAT LOAD2
    USERID gguser@orcl1, PASSWORD test
    -- EXTTRAIL /ora01/initialload/pt
    ASSUMETARGETDEFS
    MAP test.*, TARGET test.*;
    -- END RUNTIME
    ggsci> add rep load2, exttrail /ora01/initialload/pt, nodbcheckpoint
    ggsci> start rep load2
    Thanks,
    Rajesh
