DEADLOCK AND INITRANS (ORA-60 when processing different ranges of data in the same table)

Product : ORACLE SERVER
Date written : 2000-09-06
DEADLOCK AND INITRANS (ORA-60 when processing different ranges of data in the same table)
================================================================
General information on deadlocks is summarized in <Bul:11742>. When transactions
modify the same data concurrently, the resulting deadlock usually has to be
resolved by correcting the application logic.
However, ORA-60 (deadlock detected while waiting for resource) can also occur
when transactions running concurrently against the same table each process
completely different data.
For example, one transaction processes the January data of table A while, at the
same time, another transaction processes the February data of the same table A.
Even in such a case, ORA-60 can occur if INITRANS is set too low. This note
explains how ORA-60 arises between transactions that process different data,
and what can be done about it.
1. About transaction entries
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Before DML such as UPDATE/DELETE/INSERT can be performed on any block belonging
to a table or index, the transaction must first obtain a transaction entry in
that block in which to store its transaction information; only then can the
intended work proceed.
The size of a transaction entry is OS dependent but is usually 23 bytes, and the
INITRANS option of the table or index determines how many transaction entries
are pre-allocated per block. The default is 1 for a table and 2 for an index.
Once a transaction entry has been assigned to a particular transaction, no other
transaction can use it until that transaction commits or rolls back. For another
transaction to perform DML on the same block, it must either allocate a new
23-byte transaction entry from the remaining free space in the block or, if no
space is left, wait for the transaction already using an entry to commit or
roll back.
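For reference, the INITRANS values currently recorded in the dictionary for a
table and its indexes can be checked with a query such as the following
(a minimal sketch; replace DEPT with the table of interest):
SQL>select table_name, ini_trans, max_trans
    from   user_tables
    where  table_name = 'DEPT';
SQL>select index_name, ini_trans, max_trans
    from   user_indexes
    where  table_name = 'DEPT';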
2. When ORA-60 occurs
~~~~~~~~~~~~~~~~~~~~~
Even if the transactions involved in the deadlock process completely different
data (in the same table), ORA-60 can occur when that data happens to be stored
in the same block. In other words, if there is not enough free space left in the
block at the moment a transaction tries to obtain a transaction entry, each
transaction can end up waiting for the other to finish, and a deadlock becomes
possible.
A detailed scenario based on a concrete example is laid out below.
1. The ORDER table holds more than one million rows that are inserted, updated,
   and deleted by date. Monthly statistics jobs are run against this table, and
   six transactions were started concurrently so that six months are processed
   at once: T1 processes the January data, T2 the February data, and so on up to
   T6, which processes the June data.
2. The ORDER table was created with INITRANS 1, and 1000 blocks are currently
   allocated to it.
3. Block 100 contains February, March, and May data together.
   Block 200 contains February and March data.
4. Transaction T2 stores its transaction information in the single transaction
   entry already pre-allocated in block 100, and then processes the February
   data in that block.
5. Transaction T3 stores its transaction information in the 23-byte transaction
   entry pre-allocated by INITRANS 1 in block 200, and then processes the March
   data in that block.
6. To process the remaining February data, transaction T2 accesses block 200 and
   tries to obtain a 23-byte transaction entry of its own, but no free space is
   left in the block, so it waits for transaction T3 to commit.
   Once T3, which in step 5 took the entry pre-allocated by INITRANS, finishes,
   T2 will be able to use that entry.
7. Likewise, transaction T3 tries to obtain an additional transaction entry in
   block 100 in order to process the March data there, but since there is no
   space it, in turn, waits for transaction T2, which is already using block 100,
   to finish.
8. Because of steps 6 and 7, T2 and T3 are each waiting for the other to finish:
   a deadlock. Transaction T3, which triggered the deadlock, terminates with
   ORA-60 and is rolled back. The transaction entry it held in block 200 is
   released and becomes available to other transactions.
9. Transaction T2, which has been waiting since step 6, obtains the 23-byte
   transaction entry released in block 200 in step 8 and continues its work.
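The scenario above can be reproduced on a small scale with two sessions. The
following is only an illustrative sketch: the table, column sizes, and row
numbers are made up, and which rows actually share a block should be verified
(for example with dbms_rowid.rowid_block_number) before the crossed updates
will collide. If the blocks still have free space, Oracle simply allocates an
additional transaction entry and no wait occurs; the collision only happens
when the blocks are genuinely full.
SQL>create table itl_test (n number, pad char(2000))
    pctfree 0 initrans 1;
SQL>insert into itl_test
    select rownum, 'x' from all_objects where rownum <= 20;
SQL>commit;
-- Session 1 (plays T2): takes the only transaction entry in block A
SQL>update itl_test set pad = 'a' where n = 1;
-- Session 2 (plays T3): takes the only transaction entry in block B
SQL>update itl_test set pad = 'b' where n = 10;
-- Session 1: now needs an entry in block B and waits for session 2
SQL>update itl_test set pad = 'a' where n = 11;
-- Session 2: now needs an entry in block A and waits for session 1;
-- shortly afterwards one of the two sessions receives ORA-60
SQL>update itl_test set pad = 'b' where n = 2;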
3. How to avoid the deadlock
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In short, even when transactions process different data in the same table,
ORA-60 can occur if many transactions run concurrently and the INITRANS value
is small.
If, for performance or other reasons, you want to run transactions that process
different data in the same table concurrently, increase the table's INITRANS
value so that no transaction has to wait for space even when several
transactions perform DML on the same block.
If INITRANS is set to n, then 23*n bytes are always pre-allocated for
transaction entries; for example, INITRANS 10 reserves about 230 bytes per
block. Space reserved this way cannot be used to store data, so setting the
value too high wastes space. If a block is never touched by several transactions
at the same time, the pre-allocated transaction entries are simply never used,
and they only make operations such as full table scans slower.
Set INITRANS to at most the number of transactions that work on the table
concurrently. Setting it equal to that number guarantees that the deadlock
described above cannot occur, but considering space it is common to choose a
slightly smaller value.
INITRANS can be specified or changed with the CREATE or ALTER statement for a
table or index. With ALTER, however, the new value does not affect blocks that
have already been allocated, so export/import is needed to apply the new setting
throughout the segment.
The following example uses the scott.dept table. The column definitions and
storage parameters are arbitrary example values, and INITRANS 3 is likewise only
an example.
os>exp scott/tiger file=test.dmp tables=dept
os>sqlplus scott/tiger
SQL>drop table dept;
SQL>create table dept (deptno number(2), dname varchar2(10))
initrans 3
storage(initial 10m next 2m pctincrease 0);
os>imp scott/tiger file=test.dmp tables=dept ignore=y
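For reference, on releases that support ALTER TABLE ... MOVE (8i and later), an
in-place rebuild is an alternative to export/import for re-creating the blocks
with the new INITRANS; a sketch (the index name is hypothetical):
SQL>alter table dept move initrans 3;
-- any index on the table becomes unusable after a MOVE and must be rebuilt
SQL>alter index dept_pk rebuild initrans 3;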

Similar Messages

  • Unable to edit table data, but not for all tables

    I have multiple tables in a schema. For some tables, I am able make edits to table data directly, i.e., context menu Table | Open, and the Data tab. When I am able to edit, I do get a pencil icon inside the cell I am editing/typing (and am able to commit the changes). When I am not able to edit, it does nothing (no error messages, sound, or visual cue). I thought it had to do with who owns the table object, but I log in as the same owner of the affected table objects.
    Any pointers would be greatly appreciated so I am equipped when asking the DBA.
    Thanks,
    OS: Windows XP Professional SP2
    Java(TM) Platform: 1.6.0_11
    Oracle IDE: 2.1.1.64.45
    Versioning Support: 2.1.1.64.45
    Edited by: New2OWB10gR2 on Jun 23, 2010 12:20 PM

    Hello again,
    Here you are the DDL of the offending table:
    CREATE TABLE "DBADMEX"."T50SEC82"
    "COD_EMPRESA" CHAR(4 BYTE) DEFAULT ' ' NOT NULL ENABLE,
    "COD_EMPR_CONT" CHAR(4 BYTE) DEFAULT ' ' NOT NULL ENABLE,
    "COD_SECT_CONT" CHAR(2 BYTE) DEFAULT ' ' NOT NULL ENABLE,
    "NUM_CUEN_CONT" CHAR(18 BYTE) DEFAULT ' ' NOT NULL ENABLE,
    "COD_PAIS" CHAR(4 BYTE) DEFAULT ' ' NOT NULL ENABLE,
    "COD_SECTOR" CHAR(6 BYTE) DEFAULT ' ' NOT NULL ENABLE
    PCTFREE 10 PCTUSED 40 INITRANS 50 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
    INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT
    TABLESPACE "TS_50" ;
    CREATE UNIQUE INDEX "DBADMEX"."I5000082" ON "DBADMEX"."T50SEC82"
    "COD_EMPRESA", "COD_EMPR_CONT", "COD_SECT_CONT", "NUM_CUEN_CONT"
    PCTFREE 10 INITRANS 50 MAXTRANS 255 COMPUTE STATISTICS STORAGE
    INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT
    TABLESPACE "TS_50" ;
    We are using the following versions:
    Oracle database: 11.1.0.7.0
    Oracle Client: 11.2.0.1.0
    Windows (where the client runs): XP SP3 (version 5.1 Build 2600_spsp_sp3_gdr.080814-1236) in spanish.
    SQL Developer: 2.1.1.64 (MAIN-64.45)
    I think I haven't forgotten anything.
    Thanks in advance for your help!

  • Importing partitioned table data into non-partitioned table

    Hi Friends,
    SOURCE SERVER
    OS:Linux
    Database Version:10.2.0.2.0
    i have exported one partition of my partitioned table like below..
    expdp system/manager DIRECTORY=DIR4 DUMPFILE=mapping.dmp LOGFILE=mapping_exp.log TABLES=MAPPING.MAPPING:DATASET_NAP
    TARGET SERVER
    OS:Linux
    Database Version:10.2.0.4.0
    Now when i am importing into another server i am getting below error
    Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 17 January, 2012 11:22:32
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "MAPPING"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "MAPPING"."SYS_IMPORT_FULL_01":  MAPPING/******** DIRECTORY=DIR3 DUMPFILE=mapping.dmp LOGFILE=mapping_imp.log TABLE_EXISTS_ACTION=APPEND
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace 'MAPPING_ABC' does not exist
    Failing sql is:
    CREATE TABLE "MAPPING"."MAPPING" ("SAP_ID" NUMBER(38,0) NOT NULL ENABLE, "TG_ID" NUMBER(38,0) NOT NULL ENABLE, "TT_ID" NUMBER(38,0) NOT NULL ENABLE, "PARENT_CT_ID" NUMBER(38,0), "MAPPINGTIME" TIMESTAMP (6) WITH TIME ZONE NOT NULL ENABLE, "CLASS" NUMBER(38,0) NOT NULL ENABLE, "TYPE" NUMBER(38,0) NOT NULL ENABLE, "ID" NUMBER(38,0) NOT NULL ENABLE, "UREID"
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_TG_ID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."PK_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_UREID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_V2" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_PARENT_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."CKC_SMAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."PK_MAPPING_ITM" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
    Processing object type TABLE_EXPORT/TABLE/COMMENT
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TG" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_PART" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_TIME_T" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_DAY" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_BTMP" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2_T" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Job "MAPPING"."SYS_IMPORT_FULL_01" completed with 52 error(s) at 11:22:39Please help..!!
    Regards
    Umesh Gupta

    yes, i have tried that option as well.
    but when i write one tablespace name in the REMAP_TABLESPACE clause, it gives an error for the second one, and if i include the 1st and 2nd tablespaces it gives an error for the 3rd one.
    one option i know of is to write all the tablespace names in REMAP_TABLESPACE, but that is a lengthy process.. is there any other way possible????
    Regards
    Umesh
    AFAIK the option you have is what i recommend ... though it is lengthy :-(
    Wait for some EXPERT and GURU's review on this issue .........
    Good luck ....
    --neeraj
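    For reference, impdp accepts REMAP_TABLESPACE more than once, so the full list can at least be kept in a parameter file instead of being typed on the command line. A sketch (the target tablespace USERS and the second and third source tablespace names are made up):
    # impdp_mapping.par
    DIRECTORY=DIR3
    DUMPFILE=mapping.dmp
    LOGFILE=mapping_imp.log
    TABLE_EXISTS_ACTION=APPEND
    REMAP_TABLESPACE=MAPPING_ABC:USERS
    REMAP_TABLESPACE=MAPPING_DEF:USERS
    REMAP_TABLESPACE=MAPPING_GHI:USERS
    impdp MAPPING/******** PARFILE=impdp_mapping.par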

  • How to get the plsql table data into output cursor

    Hi,
    Could anybody please help me.
    Below is an example of the scenario..
    CREATE OR REPLACE PACKAGE chck IS
    PROCEDURE getdata(dept_no IN VARCHAR2,oc_result_cursor OUT sys_REFCURSOR);
    TYPE get_rec is record (ename varchar2(20),
    eno number(12));
    TYPE t_recs IS TABLE OF get_rec INDEX BY BINARY_INTEGER;
    emp_tab t_recs;
    END chck;
    CREATE OR REPLACE PACKAGE BODY chck AS
    PROCEDURE getdata(dept_no IN VARCHAR2,oc_result_cursor OUT sys_REFCURSOR)
    is
    BEGIN
    select ename, eno
    bulk collect into emp_tab
    from emp;
    open oc_result_cursor for select * from table(emp_tab); -- I believe something is wrong here ....
    END;
    END chck;
    the above package is giving me an error:
    LINE/COL ERROR
    10/29    PL/SQL: SQL Statement ignored
    10/43    PL/SQL: ORA-22905: cannot access rows from a non-nested table item
    let me know what needs to be changed
    Thanks
    Manju

    manjukn wrote:
    once i get the data into a plsql table, how to get this plsql table data into the cursor?
    There is no such thing as a PL/SQL table - it is an array.
    It is nothing at all like a table. It cannot be indexed, partitioned, clustered, etc. It does not exist in the SQL engine as an object that can be referenced. It resides in expensive PGA memory and needs to be copied (lock, stock and barrel) to the SQL engine as a bind variable.
    It is an extremely primitive structure - and should never be confused with a table.
    Its use in SQL statements is also an exception to the rule. Sound and valid technical reasons are needed to justify pushing a PL/SQL array to the SQL engine in order to run SELECTs against it.
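    For what it's worth, TABLE() can only query a collection whose type is a schema-level SQL type; below is a sketch under that assumption (the get_rec_t/t_recs_t names are made up, and it relies on a release that accepts a PL/SQL collection variable inside TABLE()). If no intermediate processing is needed, the simplest fix is to open the ref cursor directly against emp.
    CREATE OR REPLACE TYPE get_rec_t AS OBJECT (ename VARCHAR2(20), eno NUMBER(12));
    /
    CREATE OR REPLACE TYPE t_recs_t AS TABLE OF get_rec_t;
    /
    CREATE OR REPLACE PACKAGE BODY chck AS
      PROCEDURE getdata(dept_no IN VARCHAR2, oc_result_cursor OUT SYS_REFCURSOR)
      IS
        emp_tab t_recs_t;
      BEGIN
        -- build the collection with the object type constructor
        SELECT get_rec_t(ename, eno)
          BULK COLLECT INTO emp_tab
          FROM emp;
        -- a schema-level collection type can be queried with TABLE()
        OPEN oc_result_cursor FOR
          SELECT * FROM TABLE(emp_tab);
      END;
    END chck;
    /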

  • Error in import table data using oracle datapump

    i am trying to import table data using oracle datapump
    CREATE TABLE emp_xt (
      id   NUMBER,
      name VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY backup
      LOCATION ('a.dmp')
    );
    it returns the following error
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04084: The ORACLE_DATAPUMP access driver does not support the ROWID column.
    ORA-06512: at "SYS.ORACLE_DATAPUMP", line 19
    please help me

    The .dmp file was generated by the original exp utility, not by ORACLE_DATAPUMP, so the ORACLE_DATAPUMP access driver cannot read it.
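    For reference, a dump file that the ORACLE_DATAPUMP access driver can read has to be written by Data Pump itself. A sketch of unloading on the source side with an external table (the scott.emp columns empno/ename are assumed):
    -- on the source database: write a Data Pump format file
    CREATE TABLE emp_unload
      ORGANIZATION EXTERNAL (
        TYPE ORACLE_DATAPUMP
        DEFAULT DIRECTORY backup
        LOCATION ('a.dmp')
      )
      AS SELECT empno AS id, ename AS name FROM emp;
    -- on the target database: the same file can now be queried
    CREATE TABLE emp_xt (
      id   NUMBER,
      name VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_DATAPUMP
      DEFAULT DIRECTORY backup
      LOCATION ('a.dmp')
    );
    SELECT * FROM emp_xt;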

  • Package with table data type as out parameter.

    Hi there, I managed to compile the package without error, but when I try to test this package
    I keep facing the error message below.
    Am I using the table data type correctly as an OUT parameter? I have no idea what is wrong with the package or how to fix it.
    Pls. help and advise me.  Thanks.
    Error starting at line 1 in command:
    DECLARE
    p_stmodel VARCHAR2(40):=null;
    p_item_number VARCHAR(40):='9BX158-300';
    p_item_id NUMBER:=0;
    l_attribute_out test_common_api.l_item_attr_tab:=test_common_api.l_item_attr_tab();
    BEGIN
    test_common_api.test_attribute(p_stmodel,p_item_number,p_item_id,l_attribute_out);
    END;
    Error report:
    ORA-06550: line 8, column 18:
    PLS-00302: component 'TEST_ATTRIBUTE' must be declared
    ORA-06550: line 8, column 2:
    PL/SQL: Statement ignored
    06550. 00000 -  "line %s, column %s:\n%s"
    *Cause:    Usually a PL/SQL compilation error.
    *Action:
    ---------Package.
    CREATE OR REPLACE PACKAGE test_common_api
    AS
    TYPE item_attr_rec IS RECORD (CONFIGURATION VARCHAR2(20),
                                   PRODUCTTYPE VARCHAR2(30),
                                   INTERNALPRODUCTNAME VARCHAR2(20),
                                   NUMBEROFHEADS VARCHAR2(2),
                                   NUMBEROFDISCS VARCHAR2(2),
                                   GENERATION  VARCHAR2(10),
                                   FACTORYAPPLICATION VARCHAR2(150),
                                   PRODUCTFAMILY  VARCHAR2(60),
                                   FORMFACTOR VARCHAR2(10),
                                   FORMATTEDCAPACITY NUMBER,
                                   FORMATTEDCAPACITY_UOM VARCHAR2(20),
                                   INTERFACE  VARCHAR2(30),
                                   SPINDLESPEEDRPM  NUMBER,
                                   PRODUCTCACHE VARCHAR2(10),
                                   WARRANTYMONTHS   VARCHAR2(2),
                                   PHYSICAL_SECTOR_SIZE  NUMBER,
                                   MODELHEIGHT  VARCHAR2(10),
                                   ENCRYPTION_TYPE VARCHAR2(40));
    TYPE l_item_attr_tab IS TABLE OF item_attr_rec;
    END test_common_api;
    show errors
    create or replace package body test_common_api
    AS
    PROCEDURE test_attribute (p_stmodel IN VARCHAR2,
                               p_item_number IN VARCHAR2,
                               p_item_id IN NUMBER,
                               l_item_attr_list OUT l_item_attr_tab)
    IS
    l_stmodel st.stmodelnumber%TYPE;
    l_market_segment VARCHAR2(10) ;
    l_sub_market_segment VARCHAR2(10) ;
    l_app_segment VARCHAR2(10);
    l_market_name VARCHAR2(40) ;
    l_ccitem seaeng_ccitemnumber.ccitemnumber%TYPE;
    l_query_item  VARCHAR2(1000);
    l_query_model VARCHAR2(1000);
    l_where VARCHAR2(1000);
    l_bind_var1      VARCHAR2(40);
    l_bind_var2      VARCHAR2(40);
    l_sql            NUMBER:=0;
    l_config VARCHAR2(40):=null;
    BEGIN
       IF p_stmodel is not null THEN
           l_where :='WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber
                      AND sc.pricingdescriptor=''MODEL''
                      AND sc.ccdashnumber =''000''
                      AND sp.detailedproductname=''GENERIC''
                      AND st.stmodelnumber= :1';
                  IF p_item_number is null AND p_item_id is null THEN
                      l_config :='null';
                  ELSE
                      l_config :='sc.configuration';
                  END IF;
            l_bind_var1 :=p_stmodel;
            l_sql :=1;
        ELSE
            IF p_item_id is null and p_item_number is not null THEN
                l_where := 'WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber = :1';  
                l_bind_var1 :=p_item_number;
                l_sql:=2;
            ELSIF p_item_id is NOT null and p_item_number is null THEN
                l_where := 'WHERE sc.ccmodel= sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber in ( select msi.segment1
                            from mtl_system_items msi
                            where msi.inventory_item_id = :1)';
                l_bind_var1 := p_item_id;
                l_sql:=2;
            ELSIF p_item_id is not null and p_item_number is not null THEN
                l_where :='WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber in (select msi.segment1
                           from mtl_system_items msi
                           where msi.inventory_item_id = :1
                           AND msi.segment1=:2)';
                l_sql:=3;
                l_bind_var1 := p_item_id ;
                l_bind_var2 :=p_item_number;
            END IF;
        END IF;
    l_query_item :='SELECT sc.configuration,st.producttype, sp.internalproductname, sp.numberofheads , sp.numberofdiscs,
      sp.generation,sp.factoryapplication, st.productfamily,st.formfactor , st.formattedcapacity , st.formattedcapacity_uom,
      st.interface,st.spindlespeedrpm,st.productcache,st.warrantymonths, st.physical_sector_size, st.modelheight, st.encryption_type
       FROM  pm sp , st st, seaeng_ccitemnumber sc ';
    l_query_model :='SELECT '|| l_config|| ' , st.producttype, null,null , null, null, sp.factoryapplication, null,st.formfactor, st.formattedcapacity,
      st.formattedcapacity_uom,st.interface , st.spindlespeedrpm, st.productcache, st.warrantymonths,st.physical_sector_size, st.modelheight,
      st.encryption_type
      FROM  pm sp , st st, seaeng_ccitemnumber sc ';
       IF l_sql = 1 THEN
          EXECUTE IMMEDIATE  l_query_model ||l_where
                             BULK COLLECT INTO l_item_attr_list
                             USING l_bind_var1 ;
                               dbms_output.put_line(l_query_model||l_where);
        ELSIF
             l_sql =2 THEN
               EXECUTE IMMEDIATE l_query_item || l_where
                                 BULK COLLECT INTO l_item_attr_list
                                 using l_bind_var1;
                                 dbms_output.put_line(l_query_item||l_where);
        ELSE
           EXECUTE IMMEDIATE  l_query_item ||l_where
                             BULK COLLECT INTO l_item_attr_list
                             USING l_bind_var1, l_bind_var2 ; 
                               dbms_output.put_line(l_query_item||l_where);
        END IF;
    END test_attribute;
    END test_common_api;
    show errors

    I think you forgot to declare the "test_attribute" procedure in your package specification (and the TYPE it uses must be declared before the procedure), like:
    CREATE OR REPLACE PACKAGE test_common_api
    AS
    TYPE item_attr_rec IS RECORD (CONFIGURATION VARCHAR2(20),
                                   PRODUCTTYPE VARCHAR2(30),
                                   INTERNALPRODUCTNAME VARCHAR2(20),
                                   NUMBEROFHEADS VARCHAR2(2),
                                   NUMBEROFDISCS VARCHAR2(2),
                                   GENERATION  VARCHAR2(10),
                                   FACTORYAPPLICATION VARCHAR2(150),
                                   PRODUCTFAMILY  VARCHAR2(60),
                                   FORMFACTOR VARCHAR2(10),
                                   FORMATTEDCAPACITY NUMBER,
                                   FORMATTEDCAPACITY_UOM VARCHAR2(20),
                                   INTERFACE  VARCHAR2(30),
                                   SPINDLESPEEDRPM  NUMBER,
                                   PRODUCTCACHE VARCHAR2(10),
                                   WARRANTYMONTHS   VARCHAR2(2),
                                   PHYSICAL_SECTOR_SIZE  NUMBER,
                                   MODELHEIGHT  VARCHAR2(10),
                                   ENCRYPTION_TYPE VARCHAR2(40));
    TYPE l_item_attr_tab IS TABLE OF item_attr_rec;
    PROCEDURE test_attribute (p_stmodel IN VARCHAR2,
                               p_item_number IN VARCHAR2,
                               p_item_id IN NUMBER,
                               l_item_attr_list OUT l_item_attr_tab);
    END test_common_api;
    show errors
    create or replace package body test_common_api
    AS
    PROCEDURE test_attribute (p_stmodel IN VARCHAR2,
                               p_item_number IN VARCHAR2,
                               p_item_id IN NUMBER,
                               l_item_attr_list OUT l_item_attr_tab)
    IS
    l_stmodel st.stmodelnumber%TYPE;
    l_market_segment VARCHAR2(10) ;
    l_sub_market_segment VARCHAR2(10) ;
    l_app_segment VARCHAR2(10);
    l_market_name VARCHAR2(40) ;
    l_ccitem seaeng_ccitemnumber.ccitemnumber%TYPE;
    l_query_item  VARCHAR2(1000);
    l_query_model VARCHAR2(1000);
    l_where VARCHAR2(1000);
    l_bind_var1      VARCHAR2(40);
    l_bind_var2      VARCHAR2(40);
    l_sql            NUMBER:=0;
    l_config VARCHAR2(40):=null;
    BEGIN
       IF p_stmodel is not null THEN
           l_where :='WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber
                      AND sc.pricingdescriptor=''MODEL''
                      AND sc.ccdashnumber =''000''
                      AND sp.detailedproductname=''GENERIC''
                      AND st.stmodelnumber= :1';
                  IF p_item_number is null AND p_item_id is null THEN
                      l_config :='null';
                  ELSE
                      l_config :='sc.configuration';
                  END IF;
            l_bind_var1 :=p_stmodel;
            l_sql :=1;
        ELSE
            IF p_item_id is null and p_item_number is not null THEN
                l_where := 'WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber = :1';  
                l_bind_var1 :=p_item_number;
                l_sql:=2;
            ELSIF p_item_id is NOT null and p_item_number is null THEN
                l_where := 'WHERE sc.ccmodel= sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber in ( select msi.segment1
                            from mtl_system_items msi
                            where msi.inventory_item_id = :1)';
                l_bind_var1 := p_item_id;
                l_sql:=2;
            ELSIF p_item_id is not null and p_item_number is not null THEN
                l_where :='WHERE sc.ccmodel = sp.productmodelnumber AND sp.stmodelnumber = st.stmodelnumber and sc.ccitemnumber in (select msi.segment1
                           from mtl_system_items msi
                           where msi.inventory_item_id = :1
                           AND msi.segment1=:2)';
                l_sql:=3;
                l_bind_var1 := p_item_id ;
                l_bind_var2 :=p_item_number;
            END IF;
        END IF;
    l_query_item :='SELECT sc.configuration,st.producttype, sp.internalproductname, sp.numberofheads , sp.numberofdiscs,
      sp.generation,sp.factoryapplication, st.productfamily,st.formfactor , st.formattedcapacity , st.formattedcapacity_uom,
      st.interface,st.spindlespeedrpm,st.productcache,st.warrantymonths, st.physical_sector_size, st.modelheight, st.encryption_type
       FROM  pm sp , st st, seaeng_ccitemnumber sc ';
    l_query_model :='SELECT '|| l_config|| ' , st.producttype, null,null , null, null, sp.factoryapplication, null,st.formfactor, st.formattedcapacity,
      st.formattedcapacity_uom,st.interface , st.spindlespeedrpm, st.productcache, st.warrantymonths,st.physical_sector_size, st.modelheight,
      st.encryption_type
      FROM  pm sp , st st, seaeng_ccitemnumber sc ';
       IF l_sql = 1 THEN
          EXECUTE IMMEDIATE  l_query_model ||l_where
                             BULK COLLECT INTO l_item_attr_list
                             USING l_bind_var1 ;
                               dbms_output.put_line(l_query_model||l_where);
        ELSIF
             l_sql =2 THEN
               EXECUTE IMMEDIATE l_query_item || l_where
                                 BULK COLLECT INTO l_item_attr_list
                                 using l_bind_var1;
                                 dbms_output.put_line(l_query_item||l_where);
        ELSE
           EXECUTE IMMEDIATE  l_query_item ||l_where
                             BULK COLLECT INTO l_item_attr_list
                             USING l_bind_var1, l_bind_var2 ; 
                               dbms_output.put_line(l_query_item||l_where);
        END IF;
    END test_attribute;
    END test_common_api;
    show errors

  • Import table data in right order to avoid violating foreign key constraints

    Gentlemen
    I am trying to import table data into an existing 10g schema using datapump import in table mode.
    However, in order to avoid violating foreign key constraints, the tables must be loaded in a specified order. I tried specifying the order in the TABLES parameter:
    TABLES=table1,table2,table3 etc.
    However, datapump seems to chose its own order leading to errors like the following:
    ORA-31693: Table data object "SCHEMAX"."TABLE3" failed to load/unload and is being skipped due to error:
    ORA-02291: integrity constraint (SCHEMAX.TABLE3_TABLE1#FK) violated - parent key not found
    I want to try to avoid having to disable all foreign keys because there are hundreds of them.
    Any advice?
    Yours
    Claus Jacobsen, Denmark

    Thanks Anantha.
    Since I am only loading data (the constraints are already defined in the target database), I am not sure whether this approach would work. Meanwhile I have solved the problem of moving data from one system to another using another, tedious and far from elegant approach that I would prefer not to elaborate on :-)
    However, I have also discovered another probable reason why the foreign key constraints were violated, other than the wrong order of table data loading. It turns out almost every single table in the schema contains a trigger that is supposed to generate a unique row ID from a sequence on insert, such as:
    CREATE OR REPLACE TRIGGER "SCHEMAX"."TABLEX#B_I_R"
    BEFORE INSERT
    ON TABLEX
    FOR EACH ROW
    DECLARE
    BEGIN
    SELECT tablex_seq.nextval INTO :NEW.ID FROM dual;
    END;
    If the import mechanism fires this trigger, and the sequences in the source and the target systems are not synchronized, then I guess the referenced records are more than likely to end up with wrong IDs compared to the row IDs in the referring rows?
    Spooky. Anybody can confirm this theory?
    Yours
    Claus
    Message was edited by:
    user586249
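    If the sequence triggers are indeed the culprit, one common workaround (a sketch only; the SCHEMAX owner name is taken from the post above) is to disable them for the duration of the load and re-enable them afterwards:
    -- generate DISABLE statements for every trigger in the schema
    SELECT 'ALTER TRIGGER "' || owner || '"."' || trigger_name || '" DISABLE;'
      FROM dba_triggers
     WHERE owner = 'SCHEMAX';
    -- run the generated statements, perform the import, then generate and run
    -- the matching ENABLE statements in the same way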

  • How to copy table data from onde DB to another DB using clipboard

    HI,
    i copied table data from one DB to another DB, but it displays the error "policy with check option violation" when inserting the table data.. so how do i resolve the problem.. thanks in advance.

    DECLARE
    log_utl_dir VARCHAR2(100) :=('/apps/home/cmsftp/log/gaa');
    CURSOR tb_compy_cur is
    select tb.compy_acronym
    -- QC 158113 - added below
    ,tb.ivr_plan_num
    from tb_fc_compy tb,tb_xop_entitlements te
    where tb.grant_award_accept_flag = 'Y'
    and tb.ivr_plan_num = te.ivr_plan_num
    and te.entitle_name = 'GAA_RECONCILED'
    union all
    select compy_acronym
    -- QC 158113 - added below
    ,tb.ivr_plan_num
    from tb_fc_compy tb
    where tb.res_stock_flag = 'Y'
    --and   (tb.res_auto_lapse_flag = 2 OR
    --tb.res_auto_lapse_flag = 3)
    and exists (select entitle_name from tb_xop_entitlements te
    where tb.ivr_plan_num = te.ivr_plan_num
    and te.entitle_name = 'GAA_RES_FLAG'
    and te.optionee = 'Y'
    and te.psrep = 'Y'
    and te.sponsor = 'Y'
    and te.advisor = 'Y');
    v_xopgrantz_insertcount NUMBER := 0;
    -- QC 158113 - added below
    v_xopgrantz_accpt_count NUMBER := 0;
    v_user_id VARCHAR2(30);
    insert_file_id UTL_FILE.FILE_TYPE;
    insert_log_file varchar2(45) := 'xop_grantz_insertstats.log';
    BEGIN
    DBMS_OUTPUT.PUT_LINE('success1');
    insert_file_id := UTL_FILE.fopen(log_utl_dir,insert_log_file,'w');
    UTL_FILE.put_line(insert_file_id,'Starting the Process at '|| CURRENT_TIMESTAMP);
    UTL_FILE.put_line(insert_file_id,'INSERTING ROWS FOR Companies turned on for GAA_RECONCILE and GAA/RESSTOCK');
    for compy_rec in tb_compy_cur loop
    v_user_id := 'CMS'||compy_rec.compy_acronym||'_USER';
    ctx_set_session.set_user_session(v_user_id);
    dbms_output.put_line ('success2'||''|| v_user_id);
    INSERT into xop_grantz(grant_num,
    user_id,
    last_user_id,
    restrict_grant,
    child_symbol,
    parent_grant_flag,
    bulking_overide_flag,
    exerrestrict_code,
    rounding_method,
    exercisiable_dt,
    def_res_units_flag,
    opt_gain_def_elig_flag,
    opt_gain_deferred_flag,
    opt_gain_deferred_dt,
    opts_accepted,
    lst_updtby_usercd,
    accepted_type,
    GAA_eligible,
    GAA_LST_UPDTBY)
    select g.grant_num,
    v_user_id,
    'GRNTACCPT',
    'N',
    'N',
    (select code
    from tb_xop_exerrestrict_codes
    where cash_allowed = 'Y'
    and cashlesshold_allowed = 'Y'
    and cashlesssell_allowed = 'Y'
    and stockswap_allowed = 'Y'
    and restricted_allowed = 'Y'
    and sar_allowed = 'Y'
    and cashmargin_allowed = 'Y'
    and cashpartial_allowed = 'Y'
    and sarsale_allowed = 'Y'),
    NULL,
    'N',
    'N',
    'N',
    NULL,
    NULL,
    NULL,
    'N',
    NULL,
    NULL
    from grantz g
    where not exists(select 1
    from xop_grantz xg
    where xg.grant_num = g.grant_num);
    v_xopgrantz_insertcount := SQL%ROWCOUNT;
    dbms_output.put_line ('1');
    -- QC158113 - Optimisation fix--starts
    DELETE FROM gt_xop_grant_accpt_type;
    INSERT INTO gt_xop_grant_accpt_type
    SELECT g.grant_num,e.ivr_plan_num,
    pk_xop_grntaccpt.fn_get_accpt_type (v_user_id,
    g.plan_num,
    g.grant_dt,
    g.opt_num,
    g.grant_cd,
    g.plan_type,
    'Y')
    FROM grantz g,tb_xop_entitlements e
    WHERE plan_type IN (2, 4, 5, 7, 8)
    and g.user_id = v_user_id
    and e.ivr_plan_num = compy_rec.ivr_plan_num
    and entitle_name = 'GAA_RES_FLAG' ;
    dbms_output.put_line ('success3');
    v_xopgrantz_accpt_count := SQL%ROWCOUNT;
    UTL_FILE.put_line(insert_file_id,'Inserted count in gt_xop_grant_acceptance '|| v_user_id||v_xopgrantz_accpt_count);
    -- QC158113 - Optimisation fix--ends
    COMMIT;
    UTL_FILE.put_line(insert_file_id,'Inserted count in XOP_GRANTZ for USER_ID '|| v_user_id||v_xopgrantz_insertcount);
    ctx_set_session.set_user_session('');
    dbms_output.put_line ('process completed');
    end loop;
    UTL_FILE.fclose(insert_file_id);
    EXCEPTION
    when others then
    rollback;
    dbms_output.put_line ('Code '||SQLCODE||':'||SQLERRM||' at '||v_user_id||' .pr_xopgrantz_insert');
    pr_xop_log_errors('Code '||SQLCODE||':'||SQLERRM||' at '||v_user_id||' .pr_xopgrantz_insert');
    pr_xop_log_errors('Code '||SQLCODE||':'||SQLERRM||'INSERTING into xop_grantz for ALL grants');
    END;
    i received this error when running the procedure also, so the table gt_xop_grant_accpt_type is not populated
    {Code -28115:ORA-28115: policy with check option violation at CMSFB_USER .pr_xopgrantz_insert}

  • ALTER TYPE MODIFY ATTRIBUTE cascade including table data

    Hi,
    does anybody know, why I get "ORA-00932: inconsistent datatypes: expected REF TYPE1_T got REF TYPE1_T"
    after ALTER TYPE MODIFY ATTRIBUTE cascade including table data when the altered type contains a nested table of type REFs.
    according to the documentation this should work? This works when the type contains a nested table of types!
    ORACLE Version: 9i EE 9.2.0.3.0 64 bit
    OS: HP-UX 11
    -- create type1
    CREATE OR REPLACE TYPE TYPE1_T AS OBJECT (
      T1COL1   NUMBER,
      T1COL2   VARCHAR2(35),
      TYPE2REF REF TYPE2_T
    );
    /
    -- create coll of type1 refs
    CREATE OR REPLACE TYPE TYPE1COLL_T IS TABLE OF REF TYPE1_T;
    /
    -- create type2
    CREATE OR REPLACE TYPE TYPE2_T AS OBJECT (
      T2COL1    NUMBER,
      TYPE1COLL TYPE1COLL_T
    );
    /
    -- create table of type1_t
    CREATE TABLE TYPE1_OTB OF TYPE1_T;
    -- create table of type2_t
    CREATE TABLE TYPE2_OTB OF TYPE2_T
      NESTED TABLE TYPE1COLL STORE AS TYPE1COLL_NTB;
    -- populate type1_otb
    INSERT INTO type1_otb VALUES (1, 'ABCDEF', NULL);
    -- populate type2_otb
    INSERT INTO type2_otb VALUES (1, TYPE1COLL_T());
    select * from type1_otb;
    T1COL1 T1COL2
    TYPE2REF
    1 ABCDEF
    select * from type2_otb;
    T2COL1
    TYPE1COLL
    1
    TYPE1COLL_T()
    ALTER TYPE type1_t MODIFY ATTRIBUTE t1col2 varchar2(50) cascade including table data;
    Type altered.
    select * from type1_otb;
    T1COL1 T1COL2
    TYPE2REF
    1 ABCDEF
    select * from type2_otb;
    select * from type2_otb
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected REF TYPE1_T got REF TYPE1_T

    Hi John,
    I am also facing the same problem after executing the command
    SQL> alter type product_object modify attribute (NAME VARCHAR2(80)) cascade;
    with the following error
    ORA-00932: inconsistent datatypes: expected REF WOLFOBJECTS.EMPLOYEE_OBJECT got
    WOLFOBJECTS.EMPLOYEE_OBJECT
    Could you please suggest any alternative or workaround for this issue?
    Thanks
    Sara

  • Utl_smtp.data ORA-06502: PL/SQL: numeric or value error

    Hi there,
    I am using utl_smtp to send an HTML email. I am passing a CLOB to utl_smtp.data(); it works fine for relatively small emails but not for larger ones, which return the following error:
    ORA-06502: PL/SQL: numeric or value error
    Does anybody know what the limitation is on the size of the body parameter to utl_smtp.data(), and is there an alternative way of sending big emails?
    Thanks...

    Hi Aparm,
    What is in that big mail? What data are you sending in the mail?
    If you are sending table data, you can divide it into small chunks and pass each chunk to utl_smtp.
    And I think the maximum size of the VARCHAR2 that can be passed to utl_smtp.data() is 32767.
    Thanks,
    Shankar
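    A minimal sketch of streaming a large CLOB body with open_data/write_data instead of a single utl_smtp.data() call (which takes a VARCHAR2 and is therefore limited to 32767 bytes); the host, addresses, and chunk size are placeholders:
    DECLARE
      l_conn   UTL_SMTP.connection;
      l_body   CLOB := '<html><body> ... large message ... </body></html>';
      l_offset PLS_INTEGER := 1;
      l_amount PLS_INTEGER := 1900;  -- stay well under the VARCHAR2 limit per write
      l_len    PLS_INTEGER;
    BEGIN
      l_conn := UTL_SMTP.open_connection('mailhost.example.com', 25);
      UTL_SMTP.helo(l_conn, 'example.com');
      UTL_SMTP.mail(l_conn, 'sender@example.com');
      UTL_SMTP.rcpt(l_conn, 'recipient@example.com');
      UTL_SMTP.open_data(l_conn);
      UTL_SMTP.write_data(l_conn, 'MIME-Version: 1.0' || UTL_TCP.crlf);
      UTL_SMTP.write_data(l_conn, 'Content-Type: text/html' || UTL_TCP.crlf);
      UTL_SMTP.write_data(l_conn, 'Subject: large report' || UTL_TCP.crlf || UTL_TCP.crlf);
      l_len := DBMS_LOB.getlength(l_body);
      WHILE l_offset <= l_len LOOP
        -- write the CLOB in small VARCHAR2 chunks
        UTL_SMTP.write_data(l_conn, DBMS_LOB.substr(l_body, l_amount, l_offset));
        l_offset := l_offset + l_amount;
      END LOOP;
      UTL_SMTP.close_data(l_conn);
      UTL_SMTP.quit(l_conn);
    END;
    /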

  • Create XML file from table data

    Dear All,
    with dataservice 4.0, I want to create an XML file from a table data.
    Table have a single column but more record, for example:
    0001000488;100;EUR;
    0001000489;200;EUR;
    0001000450;300;EUR;
    My desired XML output:
    <Data>
      0001000488;100;GBP;
      0001000489;200;EUR;
      0001000450;300;EUR;
    </Data>
    I try with a sample query but the sistem write only the last record in XML file:
    <Data>
      0001000450;300;EUR;
    </Data>
    Can everyone help me?
    Thank in advance.
    Simone

    Hello
    That is a very simple (also odd) XML document structure, and as such it doesn't require use of the XML target.  It can easily be achieved by writing a normal file with a header and footer, using a row_generation and a query to generate the hard-coded open and close tags.
    Michael

  • Table data does not refresh

    Using LV2010.
    A table displays the test configuration that has been selected by the user.  This appeared to work fine until recently. 
    Nothing in that area of the code has changed .
    The issue is that although valid data exists on the wire and it even gets written to file, nothing is displayed in the table.
    The operator can click multiple times and nothing is displayed.  Even when running with highlight execution turned ON, the table does not get refreshed.
    Unfortunately, each time the operator clicks the button to insert the configuration, it does insert.  But it is not displayed.  The same list goes to the table.  You would expect the subsequent attempts to cause all the items to be displayed when it finally does, but no... only the last selection gets displayed.
    I've recently taken over the project and did notice that a previously working feature was not working.  That feature was to allow multiple selections to be inserted at once.  I suspect the feature still works, but the table only displays a single line of data.
    I did find a thread that started to discuss a similar behavior with a link to a description of the bug, but that page appears to have disappeared from the website.  It was discussed in 2005.
    Is there a way to force a refresh of a table's display?  Another thought... Could it be that the table is displaying data from a portion further down the list, which makes it appear as if there is no data?  As I said earlier, this section of code was not touched, and it is the only area where the table data is updated and the display refreshed.
    Has anyone else seen this behavior?
    As can be seen above, the probe does "see" the data on the wire.  The screen capture was taken after the data flow had completed the entire state. The wire itself claims to have a 2D array of 1 X 11 elements.  Normally, this data would be displayed.  I can't think of why it wouldn't be displayed.  If I could, I wouldn't be posting this.. 
    I am curious if this is a LV bug...
    Attachments:
    TableDataInvisible.PNG ‏21 KB

    You know me & locals... 
    Plus the property node was used for something else.
    I fear using the VI Analyzer would... well... euh..  hummm...  how to say this,...
    blow up.. 
    LOL!! 

  • Unable to display table data in Review View

    Hi Experts,
    I have one main view and one review view. In my main view I have one table (normal table) and several other input fields and text views....
    If I click on the review button in my main view, I am able to see all the data in the review view except the table data.
    How can I get the table data which I have entered in the main view to display in the review view as well?
    I bound the same node in both main and review, and the node is defined under the component controller.
    Any ideas?
    Regards
    Farooq.

    Hi,
    I think you already bound the same node in the review view, right? So the data will move automatically and display in the
    review view as well. In WDDOINIT of the review view, read the data from that node and use bind_table.
    Cheers,
    Kris.

  • How to loop and read repeating table data of infoPath form in Visual studio workflow.

    Hi,
    I am trying to read InfoPath form repeating table data in a Visual Studio workflow.
    Could anyone briefly explain how to loop through the repeating table and read all row values one by one in the workflow?
    Any help would be more than welcome. 
    Thanks...

    Hi Rohan,
    According to your description, my understanding is that you want to create a Visual Studio workflow to get data from info path repeating table.
    I suggest you can submit Repeating Table to a SharePoint List and then you can create a .NET workflow to read data from the SharePoint List.
    Here are some detailed articles for your reference:
    Codeless submitting InfoPath repeating table to a SharePoint list
    Create a Workflow using Visual Studio 2010
    Best Regards
    Zhengyu Guo
    TechNet Community Support

  • Excel issues with importing CSV or HTML table data from URL - Sharepoint? Office365?

    Greetings,
    We have a client who is having issues importing CSV or HTML table data as one would do using Excel's Web Query import from a reporting application.  As the error message provided by Excel is unhelpful I'm reaching out to anyone who can help us begin to
    troubleshoot problems affecting what is normal standard Excel functionality.  I'd attach the error screenshot, but I can't because my account is not verified....needless to say it says "Microsoft Excel cannot access  the file https://www.avantalytics.com/reporting_handler?func=wquery&format=csv&logid=XXXX&key=MD5
    Where XXXX is a number and MD5 is an md5 code.  The symptoms stated in the error message are:
    - the file name or path does not exist
    -The file is being used by another program
    -The workbook you are trying to save has the same name as a currently open workbook.
    None of these symptoms are the case, naturally. The user encountered this with Excel2010, she was then upgraded to Excel2013 and is still experiencing the same issue. The output of this URL in a browser (IE, Chrome, Firefox) is CSV data for the affected
    user, so it is not a network connectivity issue.  In our testing environment using both Excel2010 or 2013 this file is imported successfully, so we cannot replicate.  The main difference I can determine between our test environment and the end-user
    is they have a Sharepoint installation and appear to have Office365 as well.
    So,  my question might more appropriately be for Sharepoint or Office365 folks, but I can't be sure they're  a culprit.  Given this - does anyone have any knowledge of issues which might cause this with Sharepoint or Office365 integrated with
    Excel and/or have suggestions for getting more information from Excel or Windows other than this error message?  I've added the domain name as a trusted publisher in IE as I thought that might be the issue, but that hasn't solved anything.  As you
    can see its already https and there is no authentication or login - the md5 key is the authentication.  The certificate for the application endpoint is valid and registered via GoDaddy CA.
    I'm at a loss and would love some suggestions on things to check/try.
    Thanks  -Ross

    Hi Ross,
    >> In our testing environment using both Excel 2010 and 2013 this file is imported successfully, so we cannot replicate.
    I suspect it is caused by the difference of web server security settings.
    KB: Error message when you use Web query to a secure Web page (HTTPS://) in Excel: "Unable to open"
    Hope it will help.
    By the way, this forum is mainly for discussing questions about Office Development (VSTO, VBA and Apps for Office .etc.). For Office products feature specific questions, you could consider posting them on
    Office IT Pro forum or Microsoft Office Community.
    Regards,
    Jeffrey


    Morning. I've opened a Word file in Pages. The Word file contains in the header of the first page an image (corporate logo). In Pages 3.0.2 the image isn't adjusted on the right place but one line beneath. The problem is: I can not select the image t