ERROR   OGG-01163  Bad column length (8)

Hi,
I am receiving the following error on the target side (the tables have LONG data types; one of the table names is IMAGE_TABLE) when replicating from 9i to 11g. The table structure is the same on both sides, but the following error still persists on the REPLICAT. Please help.
2012-02-20 10:07:45 ERROR OGG-01163 Bad column length (8) specified for column SIGN_LENGTH in table TBAADM.IMAGE_TABLE, maximum allowable length is 2.
Thanks,
NC

Any chance you have an out-of-sync DEFGEN file (if you are using DEFGEN, that is)?
We recently had an issue where we were getting column size mismatch errors: a developer changed something on the source DB, the corresponding change was not made on the target DB, and we needed to refresh the DEFGEN. After the refresh we were able to continue processing without an error. Might be something to check.
-Dave
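
If a stale definitions file turns out to be the cause, regenerating it is quick. Below is a minimal sketch, assuming the definitions file lives under dirdef and using placeholder login details; only the TABLE entry comes from the error in this thread.
-- dirprm/defgen.prm (paths and credentials are placeholders)
DEFSFILE ./dirdef/tbaadm.def, PURGE
USERID ogguser@sourcedb, PASSWORD *****
TABLE TBAADM.IMAGE_TABLE;
Run "defgen paramfile dirprm/defgen.prm" on the source system, copy the generated file to the target, and point the replicat at it with SOURCEDEFS ./dirdef/tbaadm.def.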

Similar Messages

  • How to solve "Bad Content-Length value" error that shows in ADF Mobile

    I was developing an application that uses a web service from my original ADF project, and I cannot fetch data via the AMX page; a pop-up shows the error "Bad Content-Length value". How do I solve this problem?
    My Web-service configuration
    - "Find" is only basic operation that in view instance
    - None of View criteria
    As only find is basic operation, that seems required "findCriteria" and "findControl" parameter(as seen in "Panel From layout" generated in AMX Page)
    After drag generate form view in AMX Page and after to deploy to simulator, I was found pop-up error was said Bad Content-Length value as shown below.
    I am also using HTTP Analyzer and didn't found any request from ADF Mobile Application
    http://img844.imageshack.us/img844/7720/screenshot20130305at163.png

    Hi,
    I also ran into a problem with this tutorial: after touching the "Salary Upgrade" button I got the same error.
    http://docs.oracle.com/cd/E18941_01/tutorials/BuildingMobileApps/ADFMobileTutorial_2.html
    My conclusion is that developing ADF Mobile on OS X may be related to this problem, since the tutorial itself is supposed to work and nobody else has reported the issue.
    My current development machine is OSX 10.8.2 with jDeveloper 11.1.2.3.0
    Edited by: meddlesome on Mar 7, 2013 2:49 AM

  • ERROR OGG-01148 programming error, data type not supported for column

    I am getting the following error when I put NULL in an insert statement:
    2011-03-31 18:30:45 ERROR OGG-01148 programming error, data type not supported for column TXID in table advoss.tblaudittrail.
    I am replicating MySQL 5.5.9 to Oracle 11g Release 2 via GoldenGate 11.

    I was able to diagnose what was causing the problem:
    the UNSIGNED flag was the culprit.
    I am able to insert NULL after removing the UNSIGNED flag.
    thank you very much for your kind support
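    For reference, a minimal sketch of that change on the MySQL side; the BIGINT width is an assumption, since the post does not show the original column definition:
    -- MySQL source: redefine TXID without the UNSIGNED attribute (column width is assumed)
    ALTER TABLE advoss.tblaudittrail MODIFY COLUMN TXID BIGINT NULL;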

  • External Table error: KUP-04043: table column not found in external source

    I am trying to get the syntax correct for an external table.
    I keep getting this error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SITE
    29913. 00000 - "error in executing %s callout"
    *Cause:    The execution of the specified callout caused an error.
    *Action:   Examine the error messages take appropriate action.
    Data looks like: (some of one of many files, where the character field widths are variable)
    ZZ,ANYOLDDATA,77777,25002000,201103,12,555.555,11.222
    ZZ,ANYOLDDATA,77777,25002300,201103,34,602.162,8.777
    ZZ,ANYOLDDATA,77777,25002400,201103,12,319.127,9.666
    ZZ,OTHERDATA,77121,55069600,201103,34,25.544,1.332
    ZZ,OTHERDATAS,77122,55069600,201103,22, 1.011,0.293
    External table def I have:
    CREATE TABLE MY_INPUT (
    FIRST_CODE VARCHAR2(10),
    SECOND_CODE VARCHAR2(20),
    MY_NUMBER VARCHAR2(20),
    THIRD_CODE VARCHAR2(20),
    YEARMO VARCHAR2(6),
    N NUMBER,
    MEAN NUMBER,
    SD NUMBER
    ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY INPUT_DIR
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline
    BADFILE INPUT_LOGDIR:'bad.bad'
    LOGFILE INPUT_LOGDIR:'log.log'
    DISCARDFILE INPUT_LOGDIR:'discards.log'
    fields terminated by ',' LRTRIM
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ( THIRD_CODE,N,MEAN,SD) )
    LOCATION ( 'myfile.rpt')
    NOPARALLEL
    REJECT LIMIT UNLIMITED;
    I have the directories INPUT_DIR and INPUT_LOGDIR defined, and read/write access granted to the user who creates the table and tried to query from it.
    I have tried various combinations of VARCHAR2 lengths and NUMBER vs VARCHAR2 for some of the numeric fields.
    I am not getting any Bad, Log or Discard files.
    I can do a GET from the SQL prompt, and see the data:
    SQL> GET 'C:\temp\input_dir'myfile.rpt'
    and I see the data.
    Windows 7
    Oracle 11.2
    I am not positive of the newline record delimiter - these files are generated by an automated system. Probably generated on a UNIX machine.
    Any suggestions on what to try would be helpful.
    The KUP-04043 error message says to check the syntax... I am running out of things to check.
    Thank you - Karen

    And the GET (I created the sanitized file, so we have a real, working-but-failing, sanitized example):
    SQL> get c:\Inputfiles\myfile.rpt
    1 ZZ,ANYOLDDATA,77777,25002000,201103,12,555.555,11.222
    2 ZZ,ANYOLDDATA,77777,25002300,201103,34,602.162,8.777
    3 ZZ,ANYOLDDATA,77777,25002400,201103,12,319.127,9.666
    4 ZZ,OTHERDATA,77121,55069600,201103,34,25.544,1.332
    5* ZZ,OTHERDATAS,77122,55069600,201103,22, 1.011,0.293
    So the full series is:
    CREATE DIRECTORY INPUT_DIR AS 'C:\InputFiles';
    -- grant READ and WRITE
    GRANT READ ON DIRECTORY INPUT_DIR TO ILQC;
    GRANT WRITE ON DIRECTORY INPUT_DIR TO ILQC;
    -- As SYS, create the bad/log/discard directory:
    CREATE DIRECTORY LOGDIR AS 'C:\InputFiles\Logs';
    -- grant READ and WRITE
    GRANT READ ON DIRECTORY LOGDIR TO ILQC;
    GRANT WRITE ON DIRECTORY LOGDIR TO ILQC;
    CREATE TABLE MY_INPUT (
    FIRST_CODE VARCHAR2(10),
    SECOND_CODE VARCHAR2(20),
    MY_NUMBER VARCHAR2(20),
    THIRD_CODE VARCHAR2(20),
    YEARMO VARCHAR2(6),
    N NUMBER,
    MEAN NUMBER,
    SD NUMBER
    ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY INPUT_DIR
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY newline
    BADFILE INPUT_LOGDIR:'bad.bad'
    LOGFILE INPUT_LOGDIR:'log.log'
    DISCARDFILE INPUT_LOGDIR:'discards.log'
    fields terminated by ',' LRTRIM
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    ( THIRD_CODE,N,MEAN,SD) )
    LOCATION ( 'myfile.rpt')
    NOPARALLEL
    REJECT LIMIT UNLIMITED;
    SELECT * FROM my_input;
    and GET is as above.
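    For comparison, here is a sketch of the same definition with the column list closed before ORGANIZATION EXTERNAL and with the fields clause naming every column in the order it appears in the file; the log-file clauses point at the LOGDIR directory created above rather than INPUT_LOGDIR. This is only a syntax sketch under those assumptions, not a confirmed fix for the KUP-04043 above.
    CREATE TABLE MY_INPUT (
        FIRST_CODE   VARCHAR2(10),
        SECOND_CODE  VARCHAR2(20),
        MY_NUMBER    VARCHAR2(20),
        THIRD_CODE   VARCHAR2(20),
        YEARMO       VARCHAR2(6),
        N            NUMBER,
        MEAN         NUMBER,
        SD           NUMBER
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY INPUT_DIR
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            BADFILE LOGDIR:'bad.bad'
            LOGFILE LOGDIR:'log.log'
            DISCARDFILE LOGDIR:'discards.log'
            FIELDS TERMINATED BY ',' LRTRIM
            MISSING FIELD VALUES ARE NULL
            REJECT ROWS WITH ALL NULL FIELDS
            (FIRST_CODE, SECOND_CODE, MY_NUMBER, THIRD_CODE, YEARMO, N, MEAN, SD)
        )
        LOCATION ('myfile.rpt')
    )
    REJECT LIMIT UNLIMITED;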

  • 30EA1: Determine column lengths before csv-Import

    Hello out there,
    I tried to import a csv (actually semicolon separated and enclosed in double quotes) into a new table using SQL Developer.
    I got error messages that some columns are too short (ORA-12899). Now the question is: is there an option to have SQL Developer determine the maximum length of the data and propose a suitable column width?
    When I convert the CSV to Excel and import that, I get similar errors, but much later in the process. So the Excel import seems to look much farther ahead than the CSV import.
    What I need is SQL Developer scanning the whole file to determine the maximum length of the data, and not only the first row or the first few thousand rows.
    Can this be achieved somehow?
    Regards,
    dhalek

    Hi Dhalek,
    Not sure if your problem is that we are not calculating the size of the columns correctly, we are not generating the inserts correctly or if we are not reading the rows with the columns that are large during the preview phase which is where we profile the data and determine the size of the columns. Can you see the rows that are rejected in the preview? When the table is created, are the sizes of the columns correct? Is it possible that your csv file is having the same problem with the single quote as the excel file? Once we have read the data, the generation is handled in the same way for both so the inserts should look the same.
    SQL Developer uses a Java batch insert for the Insert method, which is different from a standard insert or inserting via a script. The rules are slightly different, so rejections may differ. Also, failures reject the entire batch, not just the bad row. For now, to limit the number of rejected rows, you can set Preferences -> Database -> Advanced -> SQL Array Fetch Size to 1 for the import. This will put only bad rows in the bad file and should make it easier to pinpoint the problem. You'll want to restore that value afterwards because it also affects any fetches done by SQL Developer. I'll need to research our generation of the single quote in the data.
    Also, you may want to try setting Preferences -> Database -> Load -> Preview Characters Limit to a very high number if you think the problem is that we are not reading the rows with the large columns during the preview. This is the overall limit of characters that will be read from the file during the preview.
    You can also try the insert script method to load your data which will eliminate any java batch insert issues.
    Thanks,
    Joyce

  • Need help in modifying the column length

    Hi Experts,
    I need to modify the length of a column from CHAR(10) to CHAR(5), but this statement:
    ALTER TABLE appr_group_table MODIFY (APPR_TBLNO CHAR(5));
    returns the following error:
    ERROR at line 1:
    ORA-01441: cannot decrease column length because some value is too big
    I've checked in the database, but no value in this column has more than 5 characters:
    SQL> SELECT APPR_TBLNO FROM appr_group_table;
    APPR_TBLNO
    ANTAB
    ANTAB
    ANTST
    SCCTR
    SOTAB
    SOTAB
    SOTAB
    SOTAB
    SOTAB
    SOTAB
    SOTAB
    11 rows selected.
    Thanks in advance for helping me out.

    CHAR is a fixed-length type: when you declare CHAR(10), Oracle reserves 10 characters for your data, so the trailing blanks you see in the output are also part of the data. That is the reason it is not allowing you to reduce the size.
    A better solution is to create a new column with a VARCHAR2 or CHAR(5) data type, transfer this column's data into the new column, and then drop the original column, as sketched below.
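    A minimal sketch of that approach, using the table and column names from the post (verify the data first; RTRIM strips the CHAR padding, and any constraints or indexes on the column would need to be recreated):
    -- add the narrower column, copy the trimmed data, then swap the names
    ALTER TABLE appr_group_table ADD (appr_tblno_new VARCHAR2(5));
    UPDATE appr_group_table SET appr_tblno_new = RTRIM(appr_tblno);
    COMMIT;
    ALTER TABLE appr_group_table DROP COLUMN appr_tblno;
    ALTER TABLE appr_group_table RENAME COLUMN appr_tblno_new TO appr_tblno;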

  • Printing custom error Messages in Messages column in WEBADI Excel Template

    I have created a Custom PL/SQL API and Custom integrator to upload data in our Custom Table for Receipts and Payments.
    I'm printing error messages using RAISE_APPLICATION_ERROR in the Messages column of the WebADI upload spreadsheet. However, the error message is getting truncated to 100 characters. Does anyone know why?
    e.g. in my PL/SQL API the call is RAISE_APPLICATION_ERROR(-20001, V_MESSAGE);
    I saw in the Oracle documentation that the message length can be 2048 bytes (512 characters), so why is it truncating it?
    Thanks

    I think I've resolved this myself (with reference to a few different forum posts!) by changing the javascript in the Error Page Template to:
    <script language="javascript">
    function getElementsByClass(searchClass,node,tag) {
    var classElements = new Array();
    if ( node == null ) node = document;
    if ( tag == null ) tag = '*';
    return node.getElementsByTagName(tag);
    function show_error_text() {
    errorElements = getElementsByClass("ErrorPageMessage",document,"div");
    if (errorElements.length > 0 ) {
    errorText = errorElements[0].innerHTML;
    search_term = "SSL";
    str_check = errorText.indexOf(search_term);
    if (str_check==-1) {
    search_term = "QMS";
    str_check = errorText.indexOf( search_term );
    if (str_check > 0) {
    errorText = errorText.substr(str_check);
    errorText = errorText.substr(0,errorText.indexOf("ORA")-1);
    errorText = errorText.replace(/#/g,"~");
    url_location = "f?p=&APP_ID.:200:&APP_SESSION.::NO::P200_ERROR_MESSAGE:"+errorText;
    window.location = url_location;
    show_error_text();
    </script>
    We know our error codes will always start with SSL or QMS. So this will pass the bit of the Oracle error we're interested in to P200_ERROR_MESSAGE or the full Oracle error if it hasn't been raised by some of our code.

  • How to decrease the column length in working time (CATS) table

    Hi all,
    Can you please tell me how we can decrease the column length in the CATS table in the portal? Is it something we can do in Web Dynpro or in the back end?
    Thanks in advance.
    RK Reddy

    I was looking for something similar and ran across OSS Note 989453. I would recommend checking out this note to see if it helps. Apparently, there is a JAVA program error that makes the columns wider than they need to be.
    We have not yet tried this so if you happen to use it, let me know how it works for you.
    Mark Musser
    County of Sacramento

  • Bad Strand Length Assert

    Hi all,
    I am getting the following asserts while trying to open a document that was created using my plug-in. I have some text boxes in which I have placed text from a Word document (which contains some track-change data, such as struck-through text). With normal text placed in these text boxes I don't get any asserts and the document opens correctly, but with the documents containing Word data I get the following asserts and InDesign crashes. Can anybody guide me on this?
    ASSERTS
    Bad strand length, class id = kFrameListBoss, strandLength = 535, modelLength = 541
    ValidateTextModel - length of strands are out of sync.
    waxLine starts before story thread boundry
    wax line crosses story thread boundry
    no thread?
    Wax is screwy. See WaxWorld.txt for details.
    Thanks
    Madhu

    This is not a bug on tabular forms. You will have to add a length check using one of these methods. (I am not being rigorous here on syntax, you may need to search the forum for better examples.)
    You can create a page validation (pl/sql body) after submit, that checks the length of a g_fxx(i) column in a loop. Search the forum for g_fxx examples.
    or apply a little JavaScript. Under the query's report attributes, you can add this JS to the column's Element Attributes: onfocus="this.setAttribute('maxLength','80');"
    or dive into Dynamic actions using jQuery selectors...
    td[headers='my column name'] input
    (execute javascript action)
    this.triggeringElement.setAttribute('maxLength','80');
    The Apex team may provide a method for setting the length in the future but these are the options you have now.
    Kelly
    Edited by: klsharpe on Mar 8, 2011 9:27 AM

  • Error OGG-01168 abending replicat process

    Hello All,
    Greetings !
    I have tables that do not have any unique key in the source and target databases. The replicat process works fine in this case for some tables, but for certain tables it stops and generates the error:
    "ERROR OGG-01168 Encountered an update for target table SAPSR3.BTCCTL, which has no unique key defined. KEYCOLS can be used to define a key. Use ALLOWNOOPUPDATES to process the update without applying it to the target database. Use APPLYNOOPUPDATES to force the update to be applied using all columns in both the SET and WHERE clause."
    What does it mean exactly? I do not understand, since both tables have the same structure. Please help.
    Best Regards,
    R.Kapil

    Hi,
    1. How have you added supplemental log data on the database? At the database level or the table level?
    2. Did you execute ADD TRANDATA <schemaname.*> on the source side?
    3. Have you used any COLMAP or TOKENS clauses in your parameter files?
    4. If you paste the extract and replicat prm files, that would help in checking the details.
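    For point 2, enabling table-level supplemental logging from GGSCI looks roughly like this; the login is a placeholder and the table name is the one from the error above:
    GGSCI> DBLOGIN USERID ogguser, PASSWORD *****
    GGSCI> ADD TRANDATA SAPSR3.BTCCTL
    GGSCI> INFO TRANDATA SAPSR3.BTCCTL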

  • How to change (decrease) primary key column length?

    Hi all,
    I have plenty of data in the table and I need to decrease the primary key column from CHAR(17) to CHAR(13).
    I tried:
    ALTER TABLE xx MODIFY (prmy_key (13));
    but Oracle gives:
    ERROR at line 1:
    ORA-01441: column to be modified must be empty to decrease column length
    Can anyone give me advice on what is the best way to decrease column length? Thanks in advance for your help.

    To add to the above, I have done a simple test along the lines of the solution I was suggesting for this problem:
       CREATE TABLE testA
       (pk_col   varchar2(20), jd date, constraint pk1 PRIMARY KEY(pk_col));
       --Insert data into testA
       create table testB as select * from testA;
       TRUNCATE table testA;
       ALTER TABLE testA modify(pk_col VARCHAR2(10));   
       INSERT INTO testA SELECT substr(pk_col, 1, 10), jd from testB;
       DROP TABLE testB;
       Select constraint_name, constraint_type, status from all_constraints
       Where table_name = 'TESTA';
       CONSTRAINT_NAME                C STATUS
       PK1                            P ENABLED
    The ALTER command is successful because the table is empty. TRUNCATE flushes the data and resets the high-water mark while keeping all the constraints. If we want to use TRUNCATE safely as far as storage is concerned:
    TRUNCATE table table_name reuse storage;
    I think this would accomplish what has been asked.
    Let me know,
    SriDHAR

  • How to increase code column length in SE38

    Hi,
    I am trying to copy and paste code (SE38) from one server to another, but it shows an error about the code column length while pasting. On the previous server the column length was more than 125 characters per line, but here I am getting only 68 characters per line.
    How can I increase the number of characters per line?
    Please help me.
    Manoj

    hi,
    follow this
    REPORT  zsample  NO STANDARD PAGE HEADING
                                      LINE-SIZE 132
                                      LINE-COUNT 65
                                      MESSAGE-ID yy.
    ~linganna

  • ERROR OGG-00868 Error code 1291, error message: ORA-01291: missing logfile

    OGG Version 12.1.2.1.0 OGGCORE_12.1.2.1.
    DB : 11.2.0.4.3
    I am getting the error below:
    2014-12-31 09:53:09  ERROR   OGG-00868  Error code 1291, error message: ORA-01291: missing logfile
      (Missing Log File <unknown>. Read Position SCN: 2585.802983323 (11103293443483)).
    Our solution uses ADG and OGG.
    We have a source and a target where ADG was set up. We later broke the ADG setup and made the Oracle DBs on source and target standalone.
    As part of our solution, during the deployment window we break ADG, i.e. we make both the source and target independent DBs in PRIMARY read/write mode.
    - Take a guaranteed restore point on the source DB so that we can flash back at a later stage (as of now we have not executed the flashback command).
    - Once the ADG config is disabled, we start the OGG extract, which was already configured. We are at this stage now, and we are hit with errors and the extract is not starting.

    Hi,
    The error shows the process is waiting for a logfile. The integrated extract mainly needs two things to be available:
    1. Archive logs.
    2. Trail files.
    Both should be retained to the required level.
    Please execute the query below and check the status of the extract / capture process.
    It displays information about each capture process in the database:
    COLUMN CAPTURE_NAME HEADING 'Capture|Name' FORMAT A7
    COLUMN PROCESS_NAME HEADING 'Capture|Process|Number' FORMAT A7
    COLUMN SID HEADING 'Session|ID' FORMAT 9999
    COLUMN SERIAL# HEADING 'Session|Serial|Number' FORMAT 9999
    COLUMN STATE HEADING 'State' FORMAT A20
    COLUMN TOTAL_MESSAGES_CAPTURED HEADING 'Redo|Entries|Evaluated|In Detail' FORMAT 9999999
    COLUMN TOTAL_MESSAGES_ENQUEUED HEADING 'Total|LCRs|Enqueued' FORMAT 9999999999
    SELECT c.CAPTURE_NAME,
           SUBSTR(s.PROGRAM,INSTR(s.PROGRAM,'(')+1,4) PROCESS_NAME,
           c.SID,
           c.SERIAL#,
           c.STATE,
           c.TOTAL_MESSAGES_CAPTURED,
           c.TOTAL_MESSAGES_ENQUEUED
      FROM V$STREAMS_CAPTURE c, V$SESSION s
      WHERE c.SID = s.SID AND
            c.SERIAL# = s.SERIAL#;
    Also run this query to check which logfile the capture is waiting for:
    COLUMN CONSUMER_NAME HEADING 'Capture|Process|Name' FORMAT A15
    COLUMN SOURCE_DATABASE HEADING 'Source|Database' FORMAT A10
    COLUMN SEQUENCE# HEADING 'Sequence|Number' FORMAT 99999
    COLUMN NAME HEADING 'Required|Archived Redo Log|File Name' FORMAT A40
    SELECT r.CONSUMER_NAME,
           r.SOURCE_DATABASE,
           r.SEQUENCE#,
           r.NAME
      FROM DBA_REGISTERED_ARCHIVED_LOG r, DBA_CAPTURE c
      WHERE r.CONSUMER_NAME =  c.CAPTURE_NAME AND
            r.NEXT_SCN      >= c.REQUIRED_CHECKPOINT_SCN;
    The above query clearly shows for which logfile the Extract / Capture process is waiting. Check if that logfile is available in your system.
    Regards,
    Veera

  • SSH Disconnecting: Bad packet length

    If I log into my new Xserves (running 10.4.4) using an invalid username, after I submit a password, ssh will hang for a long time then report:
    Disconnecting: Bad packet length xxxxxx
    My older Xserves (running 10.2 and 10.3) don't have this problem; they will just say "Permission denied, please try again" and prompt for the password again. Eventually they'll disconnect you, but not with an error.
    Has Apple supplied a new version of SSH in 10.4 that is broken in some way?
    I have noticed this on both OS X server, and the desktop version (All at 10.4.4)
    This doesn't seem to be a problem if you use a valid login, only when you use an invalid userid.
    Any one else run into this? Any idea how to make ssh behave?
    thanks
    -jason

    In this case I was attempting to connect via my laptop (a powerbook) that was connected to the servers via a gigabit switch (the only intervening piece of equipment). Needless to say, these are about the most reliable connections you could expect to have.
    However, I've seen this behavior, when connecting to:
    2 different Xserves running 10.4.4 Server
    1 DualG5 powermac running 10.4.4 and
    1 Powerbook also running 10.4.4
    However when connecting to 10.2.x or 10.3.x servers, I receive the Permission denied response as one would expect.
    I just had an epiphany ....
    While attempting to disable password authentication on the 2 xserves altogether, I had to set not only "PasswordAuthentication no", but also "UsePAM no".
    So, perhaps the problem is that UsePAM is set to yes by default....
    I just attempted to log in to my workstation (10.4.4 with an unmodified sshd_config file) using a bogus username and I received:
    "Disconnecting: Bad packet length 4185019582" after an extended delay
    Then I changed the config file to set "UsePAM no" and tried the login again. This time I received:
    "Permission denied, please try again." almost immediately
    So, it appears that the default configuration with PAM enabled for sshd is the problem here.
    Thanks for having me revisit this after sitting on it for a couple of days. It led me to the solution.
    -jason

  • Update statements encountered errors OGG-01168 & SQL error 1403 mapping

    Hi Experts,
    Update statements are throwing the errors below:
    2012-11-07 10:40:11  WARNING OGG-00869  Oracle GoldenGate Delivery for Oracle, rep1.prm:  No unique key is defined for table AAAA. All viable columns will be used to represent the key, but may not guarantee uniqueness.  KEYCOLS may be used to define the key.
    2012-11-07 10:40:12  WARNING OGG-00869  Oracle GoldenGate Delivery for Oracle, rep1.prm:  No unique key is defined for table AAAA. All viable columns will be used to represent the key, but may not guarantee uniqueness.  KEYCOLS may be used to define the key.
    2012-11-07 10:40:15  WARNING OGG-00869  Oracle GoldenGate Delivery for Oracle, rep1.prm:  No unique key is defined for table BBBB. All viable columns will be used to represent the key, but may not guarantee uniqueness.  KEYCOLS may be used to define the key.
    2012-11-07 10:40:15  WARNING OGG-00869  Oracle GoldenGate Delivery for Oracle, rep1.prm:  No unique key is defined for table BBBB. All viable columns will be used to represent the key, but may not guarantee uniqueness.  KEYCOLS may be used to define the key.
    2012-11-07 10:40:16  WARNING OGG-01004  Oracle GoldenGate Delivery for Oracle, rep1.prm:  Aborted grouped transaction on 'abc.BBBB', Database error 100 (retrieving bind info for query).
    2012-11-07 10:40:16  WARNING OGG-01003  Oracle GoldenGate Delivery for Oracle, rep1.prm:  Repositioning to rba 17466 in seqno 1384.
    2012-11-07 10:40:16  WARNING OGG-01154  Oracle GoldenGate Delivery for Oracle, rep1.prm:  SQL error 1403 mapping abc.BBBB to abc.BBBB.
    2012-11-07 10:40:16  WARNING OGG-01003  Oracle GoldenGate Delivery for Oracle, rep1.prm:  Repositioning to rba 20104 in seqno 1384.
    2012-11-07 10:40:16  ERROR   OGG-01296  Oracle GoldenGate Delivery for Oracle, rep1.prm:  Error mapping from abc.BBBB to abc.BBBB.
    2012-11-07 10:40:16  ERROR   OGG-01668  Oracle GoldenGate Delivery for Oracle, rep1.prm:  PROCESS ABENDING.
    If I use KEYCOLS:
    --extract parameter file:
    TABLE abc.INDIA, KEYCOLS (ID);
    --replicat parameter file:
    MAP abc.INDIA, TARGET abc.INDIA, KEYCOLS (ID);
    I encountered the error below:
    2012-11-09 00:37:54  WARNING OGG-00869  Oracle GoldenGate Delivery for Oracle, rep1.prm:  No unique key is defined for table INDIA. All viable columns will be used to represent the key, but may not guarantee uniqueness.  KEYCOLS may be used to define the key.
    2012-11-09 00:48:34  ERROR   OGG-01168  Oracle GoldenGate Delivery for Oracle, rep1.prm:  Encountered an update for target table abc.INDIA, which has no unique key defined.  KEYCOLS can be used to define a key.  Use ALLOWNOOPUPDATES to process the update without applying it to the target database.  Use APPLYNOOPUPDATES to force the update to be applied using all columns in both the SET and WHERE clause.
    2012-11-09 00:48:34  ERROR   OGG-01668  Oracle GoldenGate Delivery for Oracle, rep1.prm:  PROCESS ABENDING.
    Any clues, please?
    Regards,
    Edited by: user13403707 on 19 Nov, 2012 10:55 PM

    If you know of any columns that can be used to identify the records you want to replicate changes to at the target table, use them as KEYCOLS.
    Then add supplemental logging using those columns as
    ADD TRANDATA owner.table_name, COLS (col1, col2).
    If you can't think of any such columns, just do
    ADD TRANDATA owner.table_name;
    It will add supplemental logging with the below warning
    2012-11-19 09:59:13 WARNING OGG-00869 No unique key is defined for table table_name. All viable columns will be used to represent the key, but may not guarantee uniqueness. KEYCOLS may be used to define the key.
    Logging of supplemental redo data enabled for table owner.table_name.
