Getting ORA-17629 and ORA-17627 while setting up Data Guard

Hi All,
I am working on a Data Guard setup and getting ORA-17629 and ORA-17627. The password file created with orapwd was copied to the DR server, but I still get the same errors when running the DUPLICATE database command (the usual checks are sketched after the RMAN output below).
RMAN> run{
2> allocate channel c1 type disk;
3> allocate channel c2 type disk;
4> allocate auxiliary channel a1 type disk;
5> allocate auxiliary channel a2 type disk;
6> duplicate target database for standby from active database
7> spfile
8> set db_unique_name='DRTELPRD'
9> set standby_file_management='AUTO'
10> set FAL_CLIENT='DRTELPRD'
11> set FAL_SERVER='TELRPD'
12> nofilenamecheck;
13> }
allocated channel: c1
channel c1: SID=950 device type=DISK
allocated channel: c2
channel c2: SID=942 device type=DISK
allocated channel: a1
channel a1: SID=97 device type=DISK
allocated channel: a2
channel a2: SID=96 device type=DISK
Starting Duplicate Db at 26-AUG-11
contents of Memory Script:
backup as copy reuse
file 'e:\app\product\11.1.0\db_1\DATABASE\PWDtelprd.ORA' auxiliary format
'e:\app\product\11.1.0\db_1\DATABASE\PWDdrtelprd.ORA' file
'E:\APP\PRODUCT\11.1.0\DB_1\DATABASE\SPFILETELPRD.ORA' auxiliary format
'E:\APP\PRODUCT\11.1.0\DB_1\DATABASE\SPFILEDRTELPRD.ORA' ;
sql clone "alter system set spfile= ''E:\APP\PRODUCT\11.1.0\DB_1\DATABASE\SPF
ILEDRTELPRD.ORA''";
executing Memory Script
Starting backup at 26-AUG-11
RMAN-03009: failure of backup command on c1 channel at 08/26/2011 14:44:17
ORA-17629: Cannot connect to the remote database server
ORA-17627: ORA-01031: insufficient privileges
ORA-17629: Cannot connect to the remote database server
continuing other job steps, job failed will not be re-run
released channel: c1
released channel: c2
released channel: a1
released channel: a2
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of Duplicate Db command at 08/26/2011 14:44:17
RMAN-03015: error occurred in stored script Memory Script
RMAN-03009: failure of backup command on c2 channel at 08/26/2011 14:44:17
ORA-17629: Cannot connect to the remote database server
ORA-17627: ORA-01031: insufficient privileges
ORA-17629: Cannot connect to the remote database server
RMAN>
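
For what it is worth, this ORA-17627 / ORA-01031 combination during an active duplicate usually means the auxiliary instance's password file does not match the primary's SYS password, or that RMAN is not connected to both instances AS SYSDBA through Oracle Net. A minimal sketch of the usual checks, assuming the primary is TELPRD and the standby is DRTELPRD as in the log above (the SYS password and the TNS aliases are placeholders):

rem On the DR host, recreate the password file with the SAME SYS password as the primary
orapwd file=e:\app\product\11.1.0\db_1\database\PWDdrtelprd.ora password=<sys_password> force=y

rem Connect to both instances over Oracle Net (not with OS authentication) before running DUPLICATE
rman target sys/<sys_password>@TELPRD auxiliary sys/<sys_password>@DRTELPRD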

Thanks for your instructions. It worked very well. But I am now getting errors for the redo01.log files: the online redo log files were not created on the standby (a sketch of the usual fix follows the trace output below).
ORA-00313: open failed for members of log group 1 of thread 1
ORA-00312: online log 1 thread 1: 'E:\ORACLE\ORADATA\TELPRD\REDO01.LOG'
ORA-27041: unable to open file
OSD-04002: unable to open file
O/S-Error: (OS 2) The system cannot find the file specified.
Errors in file e:\app\diag\rdbms\drtelprd\drtelprd\trace\drtelprd_mrp0_352.trc:
ORA-00313: open failed for members of log group 1 of thread 1
ORA-00312: online log 1 thread 1: 'E:\ORACLE\ORADATA\TELPRD\REDO01.LOG'
ORA-27041: unable to open file
OSD-04002: unable to open file
O/S-Error: (OS 2) The system cannot find the file specified.
Clearing online redo logfile 1 E:\ORACLE\ORADATA\TELPRD\REDO01.LOG
Clearing online log 1 of thread 1 sequence number 3418
Errors in file e:\app\diag\rdbms\drtelprd\drtelprd\trace\drtelprd_mrp0_352.trc:
ORA-00313: open failed for members of log group 1 of thread 1
ORA-00312: online log 1 thread 1: 'E:\ORACLE\ORADATA\TELPRD\REDO01.LOG'
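
These messages are typical on a freshly duplicated standby: the standby controlfile still points at the primary's online redo log path (E:\ORACLE\ORADATA\TELPRD\...), which does not exist on the DR host, so MRP cannot clear the logs. A rough sketch of the usual fix, assuming the standby keeps its files under E:\ORACLE\ORADATA\DRTELPRD (adjust the paths and group numbers to your layout):

-- On the standby, map the primary's paths to the standby's paths (static parameters, restart required)
ALTER SYSTEM SET log_file_name_convert='E:\ORACLE\ORADATA\TELPRD','E:\ORACLE\ORADATA\DRTELPRD' SCOPE=SPFILE;
ALTER SYSTEM SET db_file_name_convert='E:\ORACLE\ORADATA\TELPRD','E:\ORACLE\ORADATA\DRTELPRD' SCOPE=SPFILE;

-- After restarting the standby in MOUNT, stop recovery and let Oracle recreate the online logs
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE CLEAR LOGFILE GROUP 1;   -- repeat for each group reported in the alert log
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE USING CURRENT LOGFILE DISCONNECT;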

Similar Messages

  • My husband and I both have iPhones on the same account. When he did his update yesterday, he chose both his number and my number during set up. Now he is getting my messages, too. Can someone please help me fix this?

    What do you mean he chose his and your number during setup? Chose them for what? Is your number showing up in the Settings > Messages > "You can be reached by iMessage at" section? If it is, he needs to remove it.
    Cheers,
    GB

  • Logical standby goes down for some reason and it can't be recovered (Data Guard)

    This is not an issue, just wanted to know how this can be done.
    The logical standby goes down and it can't be recovered. How can one recreate the logical standby?
    Is this a fresh restore of the primary? What are the steps to follow?
    Thanks in advance.

    For a physical standby, since it is block-for-block identical to the primary, you can use a backup of the primary to recover the physical standby; there is usually no need to back up the physical standby itself.
    For a logical standby, since it can host other schemas in addition to the one protected by Data Guard, it is a good idea to have a separate backup besides the primary's.
    Otherwise, rebuilding from the primary is always an option; a rough sketch of those steps follows.
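    A rough outline of the rebuild-from-primary path, assuming you first restore or duplicate a physical standby from a primary backup (the new database name is a placeholder, and this is only a sketch, not the full procedure):
    -- On the primary: put the LogMiner dictionary into the redo stream
    EXECUTE DBMS_LOGSTDBY.BUILD;
    -- On the restored physical standby: convert it to a logical standby and start apply
    ALTER DATABASE RECOVER TO LOGICAL STANDBY mylsby;   -- mylsby = new DB name (placeholder)
    ALTER DATABASE OPEN RESETLOGS;
    ALTER DATABASE START LOGICAL STANDBY APPLY IMMEDIATE;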

  • Getting ora-01756 during IMPORT of (packaged) application

    Hi experts,
    I am trying to install one of the packaged apps from OTN into my APEX 3.1 on XE under WinXP and get the following error:
    ORA-20001: GET_BLOCK Error. ORA-20001: Execution of statement was unsuccessful. ORA-01756: ...
    Error installing application.
    The same thing happens when IMPORTING an app into the same workspace from which I have EXPORTED it.
    It installs properly on apex.oracle.com!!
    Possibly it has to do with the NLS settings for Switzerland? The group separator for numeric characters could be the apostrophe (').
    Any ideas, anyone with the same problem?
    Thanks in advance,
    Lutz

    I have found a workaround:
    instead of using a Swiss OS I installed a German version of Windows XP, and now everything works fine.
    I guess it has to do with a ' which is missing somewhere in the code.
    With the Swiss locale ' is used as the group separator, which might be what causes the ORA-01756 thrown by the installer; a session-level alternative is sketched below.
    BR,
    Lutz
    =;-)
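    If the Swiss group separator really is the culprit, a hedged, lighter-weight workaround than swapping the OS is to change the session's numeric characters before running the installation/import script (the values shown are simply the common defaults):
    -- Use '.' as the decimal separator and ',' as the group separator for this session only
    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,';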

  • Hi, I am getting ORA-600 during XML DB installation.

      SQL> declare
      2   stmt_basiclob   varchar2(3000);
      3   stmt_seclob     varchar2(3000);
      4  begin
      5    stmt_basiclob := ' create table XDB.XDB$RESOURCE of sys.xmltype ' ||
      6                     ' xmlschema "http://xmlns.oracle.com/xdb/XDBResource.xsd" ' ||
      7                     '      id ''' || '8758D485E6004793E034080020B242C6' || ''' ' ||
      8                     ' element "Resource" id 734 ' ||
      9                     ' type XDB.XDB$RESOURCE_T ';
    10    stmt_seclob := stmt_basiclob || ' lob (xmldata.xmllob) store as securefile ';
    11
    12    if (:usesecfiles = 'YES') then
    13     execute immediate stmt_seclob;
    14    else
    15     execute immediate stmt_basiclob;
    16    end if;
    17  end;
    18  /
    declare
    ERROR at line 1:
    ORA-00600: internal error code, arguments: [ktsladdfcb-bsz], [1], [], [], [], [], [], [], [], [], [], []
    ORA-06512: at line 13
    and the objects below are invalid:
    Warning: XDB now invalid, invalid objects found:
    object_name                                 object_type
    DBMS_XDB_ADMIN                             PACKAGE BODY
    DBMS_RESCONFIG                             PACKAGE BODY
    I have tried multiple times with DBCA and with manual installation, but still no luck. Requesting some expert advice. Thanks

    In a forum of volunteers there is no such thing as urgent. Your use of "urgent" must be considered abuse of this forum.
    Also this is not an Oracle issue.
    Sybrand Bakker
    Senior Oracle DBA

  • I am getting an ORA-02063 error while copying bulk data over a DB link

    Hi all,
    I am trying to copy data over a DB link; whenever I try to pull a larger amount of data, the DB link breaks down.
    I have tried putting the /*+ APPEND */ hint in the INSERT statement. Below is the INSERT code from my procedure,
    which is where I get the following error:
    "Error in copy_data_pkg-3113 ORA-03113: end-of-file on communication channel
    ORA-02063: preceding line from TESTDBLINK"
    Insert code:
    INSERT /*+ APPEND */ INTO target_tab
    SELECT col1,
           col2,
           col3,
           col4,
           col5,
           o_time
      FROM padmin.V_tab1@testdblink
     WHERE trunc(sysmodtime) = trunc(sysdate) - 1
       AND o_time >= al_time
       AND (clear_time IS NULL OR clear_time >= occur_time)
       AND (res_code IS NULL OR (UPPER(res_code) NOT LIKE '%TEST TT%' AND UPPER(res_code) NOT LIKE '%TEST SITE%'))
       AND (UPPER(brief_desc) NOT LIKE '%TEST TT%' AND UPPER(brief_desc) NOT LIKE '%TEST SITE%')
       AND (o_time + (330/(24*60))) <= trunc(SYSDATE) - 1/24/60/60
       AND catvar IN ('POS')
       AND UPPER(action_var) NOT LIKE '%SC_NET%' AND UPPER(action) NOT LIKE '%ROAM_NET%' AND UPPER(action) NOT LIKE '%PUP_NET%';
    COMMIT;
    Please suggest why I am getting this error.

    Can you elaborate on that? I don't think /*+ APPEND */ is working for me; I am still getting the same error.
    If you have any other suggestion, I would like to hear it.
    Should I commit after every 500 records or so? Right now I commit once after the whole data set is inserted. A batched sketch follows.
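    If one large INSERT ... SELECT keeps dropping the link, a hedged alternative is to pull the rows in limited batches and commit per batch; a minimal PL/SQL sketch, assuming target_tab has the same six columns as the query (the batch size of 500 is arbitrary, and the remaining filters from the original statement would go into the cursor's WHERE clause):
    DECLARE
      CURSOR c_src IS
        SELECT col1, col2, col3, col4, col5, o_time
          FROM padmin.V_tab1@testdblink
         WHERE trunc(sysmodtime) = trunc(sysdate) - 1;   -- add the other predicates here
      TYPE t_rows IS TABLE OF c_src%ROWTYPE;
      l_rows t_rows;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 500;  -- 500 rows per round trip
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO target_tab VALUES l_rows(i);
        COMMIT;                                          -- commit each batch
      END LOOP;
      CLOSE c_src;
    END;
    /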

  • Getting the context and host name during initialization

    I am trying to find a way to get the context path and the host name during the ContextInitialized event, but without any luck. I think it is very strange that you cannot even find the context path or the host name of the application you are running yourself.
    Does anybody have an idea?

    What I mean are the configuration elements which are put in the server.xml
    The BIG problem with getServletContextName is that it returns the name which is put in the web.xml. But I have a web application which is generic, and only some config parameters should be altered. That's why I need to know in which context I am running.
    I think the deployment of web applications is wrong by design. I have to specify deployment information inside the application package, which in fact should be presented to the servlet container separately. It's like installing new applications on a computer: it should be possible to create one installation file (read: a web archive) and then specify the location where you want to install it while deploying it (read: specify the host name, context location, etc.). Now these two are mixed together, which I think is wrong.
    Besides the above problem, I think it is also not correct to let users specify properties while configuring a context (either in the server.xml or the web.xml) and then not have an equivalent in the object model (read: the ServletContext object).
    Morten

  • Get Old Value and the new value based on the date

    Hi
    I have a table called roster created below with following insert statements.
    CREATE TABLE ROSTER
    (
      ROSTER_EMPLOYEE_DEF_ID    NUMBER,
      EMPLOYEE_ID               NUMBER,
      DEFINITION_REGION_CODE    NUMBER,
      DEFINITION_DISTRICT_CODE  NUMBER,
      DEFINITION_TERRITORY_CODE NUMBER,
      START_DATE                DATE,
      END_DATE                  DATE
    );
    INSERT INTO ROSTER
    (ROSTER_EMPLOYEE_DEF_ID,EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE)
    VALUES
    (1,299,222,333,444,'1-JUN-2011','30-JUN-2011');
    INSERT INTO ROSTER
    (ROSTER_EMPLOYEE_DEF_ID,EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE)
    VALUES
    (2,299,223,334,445,'1-JUL-2011','20-JUL-2011');
    INSERT INTO ROSTER
    (ROSTER_EMPLOYEE_DEF_ID,EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE)
    VALUES
    (3,299,224,335,446,'1-AUG-2011','30-AUG-2011');
    INSERT INTO ROSTER
    (ROSTER_EMPLOYEE_DEF_ID,EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE)
    VALUES
    (4,300,500,400,300,'1-JUN-2011','20-JUN-2011');
    INSERT INTO ROSTER
    (ROSTER_EMPLOYEE_DEF_ID,EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE)
    VALUES
    (5,300,501,401,301,'1-JUL-2011','20-JUL-2011');
    In the above table we have columns like
    EMPLOYEE_ID,DEFINITION_REGION_CODE,DEFINITION_DISTRICT_CODE,DEFINITION_TERRITORY_CODE,START_DATE,END_DATE
    The result I am looking for from the above table is based on EMPLOYEE_ID, START_DATE and END_DATE.
    I need to get OLD_DEFINITION_REGION_CODE and NEW_DEFINITION_REGION_CODE,
    similarly OLD_DEFINITION_DISTRICT_CODE and NEW_DEFINITION_DISTRICT_CODE,
    and OLD_DEFINITION_TERRITORY_CODE and NEW_DEFINITION_TERRITORY_CODE.
    I need to get one row of data for each employee showing the old value and the new value.
    For employee 299 there are 3 records; it must give the new record, which is the one with the latest dates, i.e. start date 1-AUG-2011 and end date 30-AUG-2011, and the old record will be the one with start date 1-JUL-2011 and end date 20-JUL-2011.
    For the above table data I need to get the result as below:
    EMPLOYEE_ID OLD_DEFINITION_REGION_CODE NEW_DEFINITION_REGION_CODE OLD_DEFINITION_DISTRICT_CODE NEW_DEFINITION_DISTRICT_CODE START_DATE END_DATE
    299 223 224 334 335 20-JUL-11 30-AUG-11
    300 500 501 400 401 20-JUN-11 20-JUL-11
    Please suggest how to get the above result based on this data. Please let me know if my posts are not clear.
    Thanks
    Sudhir

    SELECT  EMPLOYEE_ID,
            OLD_DEFINITION_REGION_CODE,
            NEW_DEFINITION_REGION_CODE,
            OLD_DEFINITION_DISTRICT_CODE,
            NEW_DEFINITION_DISTRICT_CODE,
            OLD_DEFINITION_TERRITORY_CODE,
            NEW_DEFINITION_TERRITORY_CODE,
            START_DATE,
            END_DATE
      FROM  (
             SELECT  EMPLOYEE_ID,
                     ROW_NUMBER() OVER(PARTITION BY EMPLOYEE_ID ORDER BY START_DATE DESC) RN,
                     LAG(DEFINITION_REGION_CODE) OVER(PARTITION BY EMPLOYEE_ID ORDER BY START_DATE) OLD_DEFINITION_REGION_CODE,
                     DEFINITION_REGION_CODE NEW_DEFINITION_REGION_CODE,
                     LAG(DEFINITION_DISTRICT_CODE) OVER(PARTITION BY EMPLOYEE_ID ORDER BY START_DATE) OLD_DEFINITION_DISTRICT_CODE,
                     DEFINITION_DISTRICT_CODE NEW_DEFINITION_DISTRICT_CODE,
                     LAG(DEFINITION_TERRITORY_CODE) OVER(PARTITION BY EMPLOYEE_ID ORDER BY START_DATE) OLD_DEFINITION_TERRITORY_CODE,
                     DEFINITION_TERRITORY_CODE NEW_DEFINITION_TERRITORY_CODE,
                     LAG(END_DATE) OVER(PARTITION BY EMPLOYEE_ID ORDER BY START_DATE) START_DATE,
                     END_DATE
                FROM  ROSTER
            )
      WHERE RN = 1
    EMPLOYEE_ID OLD_DEFINITION_REGION_CODE NEW_DEFINITION_REGION_CODE OLD_DEFINITION_DISTRICT_CODE NEW_DEFINITION_DISTRICT_CODE OLD_DEFINITION_TERRITORY_CODE NEW_DEFINITION_TERRITORY_CODE START_DAT END_DATE
            299                        223                        224                          334                          335                           445                           446 20-JUL-11 30-AUG-11
            300                        500                        501                          400                          401                           300                           301 20-JUN-11 20-JUL-11
    SQL>
    SY.

  • ORA-04031 during export on CTX-index

    Hi !
    I get ORA-04031 during export when an interMedia CTX-Index should get exported. The exact error message is:
    unable to allocate 4072 bytes of shared memory ("shared pool", "DBMS_SYS_SQL", "PL/SQL MPCODE", "BAMIMA: BAM Buffer")
    followed by
    ORA-06508: PL/SQL: could not find program unit being called
    ORA-06512: in "SYS.DBMS_SQL", line 9
    ORA-06512: in "SYS.DBMS_EXPORT_EXTENSION", line 244
    Any hints for me what I could do?
    TIA,
    Stefan

    Interesting.  You should open a  thread with a more relevant title about views with pk / fk constraints.
    I don't have a way to solve your problem --- to identify such views.
    Hemant K Chitale
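    For the ORA-04031 itself, the usual first step is to give the shared pool more room (or flush it as a stop-gap) before re-running the export; a hedged sketch, with placeholder sizes:
    -- How big is the shared pool right now?
    SELECT pool, ROUND(SUM(bytes)/1024/1024) AS mb FROM v$sgastat WHERE pool = 'shared pool' GROUP BY pool;
    -- Increase it (value is a placeholder; on very old releases edit init.ora instead), then restart
    ALTER SYSTEM SET shared_pool_size = 300M SCOPE=SPFILE;
    -- Short-term measure only:
    ALTER SYSTEM FLUSH SHARED_POOL;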

  • ORA-12704: character set mismatch odbc

    Hi, I have a query which is working fine on most machines through an ODBC connection. On one Oracle server I am getting "ORA-12704: character set mismatch":
    SELECT '' as Col1, Fname from CustTable
    Where FPrefix = ''

    user8971316 wrote:
    Hi, I have a query which is working fine on most machines through an ODBC connection. On one Oracle server I am getting "ORA-12704: character set mismatch":
    SELECT '' as Col1, Fname from CustTable
    Where FPrefix = ''
    bcm@bcm-laptop:~$ oerr ora 12704
    12704, 00000, "character set mismatch"
    // *Cause: One of the following:
    //         - The string operands(other than an nlsparams argument) to an
    //           operator or built-in function do not have the same character
    //           set.
    //         - An nlsparams operand is not in the database character set.
    //         - String data with character set other than the database character
    //           set is passed to a built-in function not expecting it.
    //         - The second argument to CHR() or CSCONVERT() is not CHAR_CS or
    //           NCHAR_CS.
    //         - A string expression in the VALUES clause of an INSERT statement,
    //           or the SET clause of an UPDATE statement, does not have the
    //           same character set as the column into which the value would
    //           be inserted.
    //         - A value provided in a DEFAULT clause when creating a table does
    //           not have the same character set as declared for the column.
    //         - An argument to a PL/SQL function does not conform to the
    //           character set requirements of the corresponding parameter.
    // *Action:

  • I get Ora-00600 when I try to fetch large data via Xml

    Hi all,
    I use Oracle8i ver. 8.1.6. When I execute the following procedure:
    PROCEDURE TTG_EXEC_SQL (Select_Statment IN  VARCHAR2,
                            Result          OUT CLOB,
                            Err_No          OUT NUMBER) IS
    -- This procedure executes the SELECT command and returns the result
    -- data through the Result XML CLOB parameter.
      QueryCtx  DBMS_XMLQuery.CtxType;
      ErrorNum  NUMBER;
      ErrorMsg  VARCHAR2(200);
    BEGIN
      QueryCtx := DBMS_XMLQuery.newContext(Select_Statment);
      DBMS_XMLQuery.setRaiseException(QueryCtx, TRUE);
      DBMS_XMLQuery.setRaiseNoRowsException(QueryCtx, TRUE);
      DBMS_XMLQuery.propagateOriginalException(QueryCtx, TRUE);
      Result := DBMS_XMLQuery.getXML(QueryCtx);
      DBMS_XMLQuery.closeContext(QueryCtx);
    EXCEPTION
      WHEN OTHERS THEN
        DBMS_XMLQuery.getExceptionContent(QueryCtx, ErrorNum, ErrorMsg);
        Err_No := NVL(ErrorNum, 0);
    END;
    I get an ORA-00600 error when the retrieved data is more than 800 records.
    Can anyone help me, please?
    Thanks,
    Husam

    Can you get the query result with SQL command alone?
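    If a single getXML() call on the whole result set is what hits the ORA-00600, a hedged variant is to page the output with DBMS_XMLQuery.setMaxRows / setSkipRows and fetch the CLOB in smaller chunks (the query and chunk size below are placeholders):
    DECLARE
      QueryCtx  DBMS_XMLQuery.CtxType;
      Chunk     CLOB;
    BEGIN
      QueryCtx := DBMS_XMLQuery.newContext('select * from emp');  -- placeholder query
      DBMS_XMLQuery.setMaxRows(QueryCtx, 500);    -- rows per chunk
      DBMS_XMLQuery.setSkipRows(QueryCtx, 0);     -- offset; raise by 500 and call getXML again for the next chunk
      Chunk := DBMS_XMLQuery.getXML(QueryCtx);
      DBMS_XMLQuery.closeContext(QueryCtx);
    END;
    /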

  • Problem during the setting the query data source name in forms 6i

    We are doing assignments in Forms 6i, and there is a problem with setting the query data source name at run time.
    The problem is
    declare
      v_query varchar2(200);
    begin
      v_query := '(select empno,ename,deptno from emp where hiredate = ' || control.value1 || ')';
      go_block('emp');
      set_block_property('emp', query_data_source_name, v_query);
    end;
    The concatenated value is the problem: it is a VARCHAR value, and single quotes need to be embedded around it. How can I embed single quotes in that concatenation?
    As it is now, the query will run like: select empno,ename,deptno from emp where hiredate = control.value
    But we need: select empno,ename,deptno from emp where hiredate = 'control.value'

    Thanks for your response.
    We fixed the error,
    and the answer is:
    declare
      v_query varchar2(200);
    begin
      v_query := '(select empno,ename,deptno from emp where hiredate = ''' || control.value1 || ''')';
      go_block('emp');
      set_block_property('emp', query_data_source_name, v_query);
    end;
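    A slightly safer variant of the same idea (hypothetical, and it assumes control.value1 is a string in DD-MON-YYYY format) avoids relying on the session date format by wrapping the value in TO_DATE with an explicit mask:
    v_query := '(select empno,ename,deptno from emp where hiredate = to_date(''' || control.value1 || ''', ''DD-MON-YYYY''))';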

  • OS and DB upgrades to databases under data guard

    Hello,
    Here is the scenario. I have an OPS db and a physical standby of OPS. Platform is solaris 9 with both a 9i instance and a 10g instance. I want to be able to do a switchover of the ops to the standby and then upgrade the standby. I believe that I can upgrade the OS to solaris 10 with no problems. However, I also want to be able to upgrade the 9i instance to 10gR2. Is it possible to do an upgrade of this degree to a standby under data guard? At the point that the db upgrade has completed data guard will no longer have the same configuration. This scenario is starting to send off alarms to me. Thanks in advance for your responses.

    Hi,
    Is it possible to do an upgrade of this degree to a standby under data guard? Yes. Steve Karam does it all the time, and he has these notes:
    "Dataguard's physical backups are a GREAT DR situation for a failed upgrade. You will have an exact physical copy of your database ready to switchover to if anything goes wrong with the upgrade. Basically you switch your final archive log and shut down your primary, then begin the standard upgrade.
    If something goes wrong, you can perform a failover on your standby server and you're back up with no possibility of inconsistencies since it's a physical backup. In my opinion, the only method better than DataGuard for a backup for upgrade scenario would be to use SAN tools like NetApp's snapMirror or Sun's
    Availability Suite in order to back it up; these tools can snapshot a DB in a matter of seconds and are good for this sort of situation: if the upgrade goes wrong, simply restore the snapshot.
    If you do it right, you can set up dataguard with only 5 minutes of downtime, then perform your database upgrade with only 5 minutes of downtime. I have done this method for clients in the past and they have been very happy with the results."
    Bipul Kumar has a book devoted to using Data Guard, that you may find helpful, and he notes the steps for a Data Guard-based upgrade:
    http://www.rampant-books.com/book_2004_2_dataguard.htm
    "Upgrading to Oracle10g database software in a Data Guard environment involves the following activities:
    If the Data Guard configuration is managed by Data Guard broker, the broker configuration needs to be removed using the existing version of database software. In other words, remove the Oracle9i configuration before upgrading to Oracle10g. Execute the following command after connecting to any site in the configuration using DGMGRL to remove the configuration:
    REMOVE CONFIGURATION;
    Install Oracle10g database software. The Data Guard feature is not available with the Standard edition, so the Enterprise edition must be installed. Additionally, install Oracle Enterprise Manager (OEM) 10.1 in order to manage the configuration using Data Guard Manager GUI interface, which is tightly integrated with OEM.
    The primary and standby databases should be upgraded to Oracle10g. The log management services should be stopped during the primary database upgrade. All of the redo logs generated during the database upgrade should be applied manually on each standby database. For more information on upgrading to Oracle10g, refer to the documentation, “Oracle Database Upgrade Guide 10g Release 1(10.1)”.
    Update the initialization parameter files or server parameter files on the primary database and all the standby databases to include the parameter db_unique_name. The database unique name must be identical on all the databases in a configuration.
    Once all the databases in the Data Guard configuration are upgraded to Oracle10g, the broker configuration file will have to be created in order to manage it using Data Guard broker. Execute the following command, connecting to the primary database using DGMGRL, to recreate the broker configuration file and to add standby databases"
    Also, see the docs, with their instructions:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28294/upgrades.htm
    Hope this helps. . .
    Donald K. Burleson
    Oracle Press author
    Author of "Oracle Tuning: The Definitive Reference":
    http://www.dba-oracle.com/bp/s_oracle_tuning_book.htm

  • Data Guard and XMLDB

    Has anyone set up, or can anyone provide advice on, using Data Guard and XMLDB?
    Right off the top of my head I have a couple of questions:
    Can we use Active Data Guard and XMLDB?
    Can we use Logical Data Guard and XMLDB?

    Yes and yes!
    On our primary database we have a schema dedicated to XML.
    We use XML in both structured and unstructured mode (I mean XML DB with the XML table that lets you load the XML into the database and modify it, and the other mode that just stores the XML in a LOB).
    You just set up Data Guard for this database like for any other one. It works for us without anything special to do.

  • Ora-1555 during exports and imports. possible causes. ?

    From my understanding, this error occurs when the undo retention is sized too small; or rather, I should put it that increasing this parameter should help fix the issue.
    What is not clear is below:
    Qn. Is it possible for ORA-1555 errors to occur during an import even if no other sessions are connected and performing any transactions/DML?
    Qn. Also, why does an ORA-1555 occur during an export? Is it for the same reason, i.e. there could be DML occurring at the same time?

    Hello,
    About your first question:
    "Is it possible that ORA-1555 errors can occur during import even if no other sessions are connected and performing any transactions/DML?"
    I've never got this error during import, but I always take care to have enough space in the UNDO tablespace.
    With classical import you have a commit after each table's import (by default), and a commit after each row's import if COMMIT=Y, so as to use less space in the rollback segment.
    With Data Pump, I often decrease the undo_retention parameter before importing so as to use less space in the UNDO tablespace.
    About the second question:
    "Also, why does an ORA-1555 occur during an export? Is it for the same reason, i.e. there could be DML occurring?"
    To get a consistent image of the exported data with the classical export you may use the parameter CONSISTENT=Y, while with Data Pump you may use the FLASHBACK_TIME parameter (which means the undo_retention should be large enough when exporting).
    Both use the undo entries, so I imagine it is possible to get such an error (maybe ORA-01555) if you don't have enough space in your UNDO tablespace.
    It is possible (thanks to the rollback segments) to have concurrent DML on the database while exporting.
    Anyway, from my point of view, while exporting or importing, if you have enough space in your UNDO tablespace and a correct undo_retention setting (not too large when importing, not too small when exporting), it should be fine; a short sketch follows.
    Hope this helps.
    Best regards,
    Jean-Valentin
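    As a concrete illustration of the points above (all names, passwords and values are placeholders): size the undo retention comfortably above the expected export run time, and ask the export for a consistent image.
    -- Give undo enough retention for the duration of the export (seconds; placeholder value)
    ALTER SYSTEM SET undo_retention = 7200;
    -- Classical export: consistent image of all tables as of the export start
    exp system/***** owner=SCOTT file=scott.dmp consistent=y
    -- Data Pump: same idea via FLASHBACK_TIME (easiest in a parameter file to avoid shell quoting)
    expdp system/***** schemas=SCOTT directory=DATA_PUMP_DIR dumpfile=scott.dmp parfile=exp.par
    -- where exp.par contains:  FLASHBACK_TIME="TO_TIMESTAMP('2011-08-26 14:00:00','YYYY-MM-DD HH24:MI:SS')"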
