ROW COMPRESSION - table /SAPAPO/BOP

Hello guys,
I have a big performance problem with table /SAPAPO/BOP in our SCM system. I run /SAPAPO/BOP_DELETE to delete entries older than 7 days. In the database I see many executions as sequential reads, updates and deletes during SPP and GATP backorder processing. Since ROW COMPRESSION is active for this table, I would like to know whether it could be contributing to the performance issue.
Could you please advise about this?
Many thanks,
Carlos.

Hi
Please check the note below; it may help you:
Note 1416044 - BOP performance improvement for updates back from ERP to SCM
Thanks
Sadiq

Similar Messages

  • Space reusage after deletion in compressed table

    Hi,
    Some sources say that free space left after a DELETE in a compressed table is not reused.
    For example: http://www.trivadis.com/uploads/tx_cabagdownloadarea/table_compression2_0411EN.pdf
    Is this true?
    Unfortunately, I cannot reproduce it.

    Unfortunately the question is still open.
    In Oracle 9i, space freed by a DELETE in a compressed block was not reused by subsequent inserts, wasn't it?
    I have seen plenty of evidence from other people; one link is given above.
    But in Oracle 10g I see different figures: after deleting rows from compressed blocks and then inserting into those blocks, the block is defragmented!
    If anyone knows of documentation about this change in behavior, please post links.
    p.s.
    in 10g:
    1. CTAS with COMPRESS; the block is full.
    2. Then deleted four out of every five rows:
    avsp=0x3b
    tosp=0x99e
    0x24:pri[0]     offs=0xeb0
    0x26:pri[1]     offs=0xea8 -- deleted
    0x28:pri[2]     offs=0xea0 -- deleted
    0x2a:pri[3]     offs=0xe98 -- deleted
    0x2c:pri[4]     offs=0xe90 -- deleted
    0x2e:pri[5]     offs=0xe88 -- live
    0x30:pri[6]     offs=0xe80 -- deleted
    0x32:pri[7]     offs=0xe78 -- deleted
    0x34:pri[8]     offs=0xe70 -- deleted
    0x36:pri[9]     offs=0xe68 -- deleted
    0x38:pri[10]     offs=0xe60 -- live
    0x3a:pri[11]     offs=0xe58 -- deleted
    0x3c:pri[12]     offs=0xe50 -- deleted
    0x3e:pri[13]     offs=0xe48 -- deleted
    0x40:pri[14]     offs=0xe40 -- deleted
    0x42:pri[15]     offs=0xe38  -- live
    0x44:pri[16]     offs=0xe30 -- deleted
    0x46:pri[17]     offs=0xe28 -- deleted
    0x48:pri[18]     offs=0xe20 -- deleted
    0x4a:pri[19]     offs=0xe18 -- deleted
    0x4c:pri[20]     offs=0xe10 -- live
    3. insert into t select ... from ... where rownum < 1000;
    The inserted rows went into several blocks. The total number of non-empty blocks did not change, and no row chaining occurred.
    The block shown above now looks as follows:
    avsp=0x7d
    tosp=0x7d
    0x24:pri[0]     offs=0xeb0
    0x26:pri[1]     offs=0x776 - new
    0x28:pri[2]     offs=0x84b - new
    0x2a:pri[3]     offs=0x920 - new
    0x2c:pri[4]     offs=0x9f5 - new
    0x2e:pri[5]     offs=0xea8 - old
    0x30:pri[6]     offs=0xaca - new
    0x32:pri[7]     offs=0xb9f - new
    0x34:pri[8]     offs=0x34d - new
    0x36:pri[9]     offs=0x422 - new
    0x38:pri[10]     offs=0xea0 - old
    0x3a:pri[11]     offs=0x4f7 - new
    0x3c:pri[12]     offs=0x5cc - new
    0x3e:pri[13]     offs=0x6a1 - new
    0x40:pri[14]     sfll=16  
    0x42:pri[15]     offs=0xe98 - old
    0x44:pri[16]     sfll=17
    0x46:pri[17]     sfll=18
    0x48:pri[18]     sfll=19
    0x4a:pri[19]     sfll=21
    0x4c:pri[20]     offs=0xe90 -- old
    0x4e:pri[21]     sfll=22
    0x50:pri[22]     sfll=23
    0x52:pri[23]     sfll=24
    0x54:pri[24]     sfll=26
    As we can see, the old rows were defragmented, repacked, and moved to the bottom of the block.
    New rows (inserted after the table was compressed) fill the remaining space.
    So the deleted space was reused.
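    The behaviour seen in the two block dumps can be mimicked with a toy free-slot model (purely illustrative; real Oracle space management in compressed blocks is far more involved):

```python
# Toy model of row slots in a data block: a DELETE marks the slot free,
# and a later INSERT reuses the first free slot instead of extending the block,
# matching the 10g observation above that freed space is reused.
class Block:
    def __init__(self, nslots):
        self.slots = [None] * nslots  # None = free slot

    def insert(self, row):
        for i, s in enumerate(self.slots):
            if s is None:              # reuse freed space
                self.slots[i] = row
                return i
        raise RuntimeError("block full")

    def delete(self, i):
        self.slots[i] = None

blk = Block(5)
for r in "abcde":
    blk.insert(r)
blk.delete(1)
blk.delete(3)
# New rows land in the freed slots 1 and 3, not past the end of the block.
print(blk.insert("x"), blk.insert("y"))  # 1 3
```

    The 9i behaviour the poster describes would correspond to `insert` skipping freed slots and only appending at the end.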

  • 11.2.0.3.3  impdp compress table

    Hi ML:
    Source DB: 10.2.0.3, with compressed tables.
    Target: 11.2.0.3.3. When the source's compressed tables are imported with impdp, are they still compressed tables on the target?
    Previously, when importing into a 10g database directly via impdp over a dblink, I found that the imported tables needed a manual move compress.
    The MOS document's test shows that import automatically maintains compressed tables starting with 10g:
    Oracle Server - Enterprise Edition - Version: 9.2.0.1 to 11.2.0.1 - Release: 9.2 to 11.2
    Information in this document applies to any platform.
    Symptoms
    The original import utility bypasses table compression, i.e. the data is not compressed even if the table is pre-created as compressed. The following example demonstrates this.
    connect / as sysdba
    create tablespace tbs_compress datafile '/tmp/tbs_compress01.dbf' size 100m;
    create user test identified by test default tablespace tbs_compress temporary tablespace temp;
    grant connect, resource to test;
    connect test/test
    -- create compressed table
    create table compressed (
    id number,
    text varchar2(100)
    ) pctfree 0 pctused 90 compress;
    -- create non-compressed table
    create table noncompressed (
    id number,
    text varchar2(100)
    ) pctfree 0 pctused 90 nocompress;
    -- populate compressed table with data
    begin
    for i in 1..100000 loop
    insert into compressed values (1, lpad ('1', 100, '0'));
    end loop;
    commit;
    end;
    /
    -- populate non-compressed table with identical data
    begin
    for i in 1..100000 loop
    insert into noncompressed values (1, lpad ('1', 100, '0'));
    end loop;
    commit;
    end;
    /
    -- compress the table COMPRESSED (previous insert doesn't use the compression)
    alter table compressed move compress;
    Let's now take a look at data dictionary to see the differences between the two tables:
    connect test/test
    select dbms_metadata.get_ddl ('TABLE', 'COMPRESSED') from dual;
    DBMS_METADATA.GET_DDL('TABLE','COMPRESSED')
    CREATE TABLE "TEST"."COMPRESSED"
    ( "ID" NUMBER,
    "TEXT" VARCHAR2(100)
    ) PCTFREE 0 PCTUSED 90 INITRANS 1 MAXTRANS 255 COMPRESS LOGGING
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "TBS_COMPRESS"
    1 row selected.
    SQL> select dbms_metadata.get_ddl ('TABLE', 'NONCOMPRESSED') from dual;
    DBMS_METADATA.GET_DDL('TABLE','NONCOMPRESSED')
    CREATE TABLE "TEST"."NONCOMPRESSED"
    ( "ID" NUMBER,
    "TEXT" VARCHAR2(100)
    ) PCTFREE 0 PCTUSED 90 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
    TABLESPACE "TBS_COMPRESS"
    1 row selected.
    col segment_name format a30
    select segment_name, bytes, extents, blocks from user_segments;
    SEGMENT_NAME BYTES EXTENTS BLOCKS
    COMPRESSED 2097152 17 256
    NONCOMPRESSED 11534336 26 1408
    2 rows selected.
    The table COMPRESSED needs less storage space than the table NONCOMPRESSED. Now let's export the tables using the original export utility:
    #> exp test/test file=test_compress.dmp tables=compressed,noncompressed compress=n
    About to export specified tables via Conventional Path ...
    . . exporting table COMPRESSED 100000 rows exported
    . . exporting table NONCOMPRESSED 100000 rows exported
    Export terminated successfully without warnings.
    and then import them back:
    connect test/test
    drop table compressed;
    drop table noncompressed;
    #> imp test/test file=test_compress.dmp tables=compressed,noncompressed
    . importing TEST's objects into TEST
    . . importing table "COMPRESSED" 100000 rows imported
    . . importing table "NONCOMPRESSED" 100000 rows imported
    Import terminated successfully without warnings.
    Verify the extents after original import:
    col segment_name format a30
    select segment_name, bytes, extents, blocks from user_segments;
    SEGMENT_NAME BYTES EXTENTS BLOCKS
    COMPRESSED 11534336 26 1408
    NONCOMPRESSED 11534336 26 1408
    2 rows selected.
    => The table compression is gone.
    Cause
    This is expected behaviour. Import does not perform bulk load/direct path operations, so the data is not inserted compressed.
    Only direct path operations, such as CTAS (Create Table As Select) or SQL*Loader direct path, will compress data. These operations include:
    •Direct path SQL*Loader
    •CREATE TABLE and AS SELECT statements
    •Parallel INSERT (or serial INSERT with an APPEND hint) statements
    Solution
    The way to compress data after it is inserted via a non-direct operation is to move the table and compress the data:
    alter table compressed move compress;
    Beginning with Oracle 10g, the Data Pump utilities (expdp/impdp) perform direct path operations, so table compression is maintained, as in the following example:
    - after creating/populating the two tables, export them with:
    #> expdp test/test directory=dpu dumpfile=test_compress.dmp tables=compressed,noncompressed
    Processing object type TABLE_EXPORT/TABLE/TABLE
    . . exported "TEST"."NONCOMPRESSED" 10.30 MB 100000 rows
    . . exported "TEST"."COMPRESSED" 10.30 MB 100000 rows
    Master table "TEST"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    and re-import after deletion with:
    #> impdp test/test directory=dpu dumpfile=test_compress.dmp tables=compressed,noncompressed
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."NONCOMPRESSED" 10.30 MB 100000 rows
    . . imported "TEST"."COMPRESSED" 10.30 MB 100000 rows
    Job "TEST"."SYS_IMPORT_TABLE_01" successfully completed at 12:47:51
    Verify the extents after DataPump import:
    col segment_name format a30
    select segment_name, bytes, extents, blocks from user_segments;
    SEGMENT_NAME BYTES EXTENTS BLOCKS
    COMPRESSED 2097152 17 256
    NONCOMPRESSED 11534336 26 1408
    2 rows selected.
    => The table compression is kept.
    ===========================================================
    1. Does 11.2.0.3 support impdp automatically maintaining compressed tables when importing via a dblink?
    2. Quoting the MOS note again:
    "This is an expected behaviour. Import is not performing a bulk load/direct path operations, so the data is not inserted as compressed.
    Only Direct path operations such as CTAS (Create Table As Select), SQL*Loader Direct Path will compress data. These operations include:
    •Direct path SQL*Loader
    •CREATE TABLE and AS SELECT statements
    •Parallel INSERT (or serial INSERT with an APPEND hint) statements
    Solution
    The way to compress data after it is inserted via a non-direct operation is to move the table and compress the data"
    Does the above mean that before 10g the table had to be moved/compressed manually on the target side, and that automatic compression is supported from 10g on? It seems 10g also required a manual move.

    ODM TEST:
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> create table nocompres tablespace users as select * from dba_objects;
    Table created.
    SQL> create table compres_tab tablespace users as select * from dba_objects;
    Table created.
    SQL> alter table compres_tab compress 3;
    Table altered.
    SQL> alter table compres_tab move ;
    Table altered.
    select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
    BYTES/1024/1024 SEGMENT_NAME
                  3 COMPRES_TAB
                  9 NOCOMPRES
    C:\Users\ML>expdp  maclean/oracle dumpfile=temp:COMPRES_TAB2.dmp  tables=COMPRES_TAB
    Export: Release 11.2.0.3.0 - Production on Fri Sep 14 12:01:12 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "MACLEAN"."SYS_EXPORT_TABLE_01":  maclean/******** dumpfile=temp:COMPRES_TAB2.dmp tables=COMPRES_TAB
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 3 MB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    . . exported "MACLEAN"."COMPRES_TAB"                     7.276 MB   75264 rows
    Master table "MACLEAN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for MACLEAN.SYS_EXPORT_TABLE_01 is:
      D:\COMPRES_TAB2.DMP
    Job "MACLEAN"."SYS_EXPORT_TABLE_01" successfully completed at 12:01:20
    C:\Users\ML>impdp maclean/oracle remap_schema=maclean:maclean1 dumpfile=temp:COMPRES_TAB2.dmp
    Import: Release 11.2.0.3.0 - Production on Fri Sep 14 12:01:47 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "MACLEAN"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "MACLEAN"."SYS_IMPORT_FULL_01":  maclean/******** remap_schema=maclean:maclean1 dumpfile=temp:COMPRES_TAB2.dmp
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "MACLEAN1"."COMPRES_TAB"                    7.276 MB   75264 rows
    Job "MACLEAN"."SYS_IMPORT_FULL_01" successfully completed at 12:01:50
      1* select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
    SQL> /
    BYTES/1024/1024 SEGMENT_NAME
                  3 COMPRES_TAB
    SQL> drop table compres_tab;
    Table dropped.
    C:\Users\ML>exp maclean/oracle tables=COMPRES_TAB file=compres1.dmp
    Export: Release 11.2.0.3.0 - Production on Fri Sep 14 12:03:19 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export done in ZHS16GBK character set and AL16UTF16 NCHAR character set
    About to export specified tables via Conventional Path ...
    . . exporting table                    COMPRES_TAB      75264 rows exported
    Export terminated successfully without warnings.
    C:\Users\ML>
    C:\Users\ML>imp maclean/oracle  fromuser=maclean touser=maclean1  file=compres1.dmp
    Import: Release 11.2.0.3.0 - Production on Fri Sep 14 12:03:45 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export file created by EXPORT:V11.02.00 via conventional path
    import done in ZHS16GBK character set and AL16UTF16 NCHAR character set
    . importing MACLEAN's objects into MACLEAN1
    . . importing table                  "COMPRES_TAB"      75264 rows imported
    Import terminated successfully without warnings.
    SQL> conn maclean1/oracle
    Connected.
      1* select bytes/1024/1024 ,segment_name from user_segments where segment_name like '%COMPRES%'
    SQL> /
    BYTES/1024/1024 SEGMENT_NAME
                  8 COMPRES_TAB
    My understanding: a direct load always preserves compression.
    But imp defaults to the conventional path, i.e. ordinary INSERTs through the buffer cache, so it cannot preserve compression.
    impdp, on the other hand, preserves compression regardless of whether access_method is external_table or direct_path.

  • How to add column to compressed table

    Hi gurus,
    Can any one help me how to add a column to compressed tables
    Thanks in advance

    The only difference is if the added column has a default value. In that case:
    SQL> create table tbl(id number,val varchar2(10))
      2  /
    Table created.
    SQL> insert into tbl
      2  select level,lpad('X',10,'X')
      3  from dual
      4  connect by level <= 100000
      5  /
    100000 rows created.
    SQL> select bytes
      2  from user_segments
      3  where segment_name = 'TBL'
      4  /
         BYTES
       3145728
    SQL> alter table tbl move compress
      2  /
    Table altered.
    SQL> select bytes
      2  from user_segments
      3  where segment_name = 'TBL'
      4  /
         BYTES
       2097152
    SQL> alter table tbl add name varchar2(5) default 'NONE'
      2  /
    alter table tbl add name varchar2(5) default 'NONE'
    ERROR at line 1:
    ORA-39726: unsupported add/drop column operation on compressed tables
    SQL> alter table tbl add name varchar2(5)
      2  /
    Table altered.
    SQL> update tbl set name = 'NONE'
      2  /
    100000 rows updated.
    SQL> commit
      2  /
    Commit complete.
    SQL> select bytes
      2  from user_segments
      3  where segment_name = 'TBL'
      4  /
         BYTES
       7340032
    SQL> select compression from user_tables where table_name = 'TBL'
      2  /
    COMPRESS
    ENABLED
    SQL> alter table tbl move compress
      2  /
    Table altered.
    SQL> select bytes
      2  from user_segments
      3  where segment_name = 'TBL'
      4  /
         BYTES
       2097152
    SY.
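    The size swings in the demo above are worth spelling out; a small sketch (byte counts copied from the user_segments queries above):

```python
# Segment sizes from the demo above (bytes). The mass UPDATE effectively
# decompresses the touched rows, so the segment balloons until the table
# is moved and compressed again.
sizes = {
    "initial (nocompress)": 3_145_728,
    "after ALTER TABLE ... MOVE COMPRESS": 2_097_152,
    "after adding the column and UPDATE": 7_340_032,
    "after a second MOVE COMPRESS": 2_097_152,
}

base = sizes["initial (nocompress)"]
for step, nbytes in sizes.items():
    print(f"{step}: {nbytes} bytes ({nbytes / base:.0%} of the initial size)")
```

    The third figure is the point of SY's example: a mass UPDATE of a compressed table can leave the segment larger than it ever was uncompressed, which is also why heavy UPDATE/DELETE activity on a compressed table (as in the /SAPAPO/BOP question above) deserves attention.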

  • DB2 Row compression activated - Any changes during System Copy process?

    Hello All,
    I have activated DB2 row compression.
    Now I want to do a system copy.
    Are there any changes to the regular system copy procedure due to this feature, such as using the R3load fastload COMPRESS options?
    Thanks,
    Bidwan

    Hello Bidwan,
    Please see the following blog regarding row compression:
    /people/johannes.heinrich/blog/2006/09/05/new-features-in-db2-udb-v9--part-4
    R3load has the new option '-loadprocedure fast COMPRESS'. Started with this option, it loads part of the data, performs a REORG to create the compression dictionary, and then loads the rest of the data. This way the table does not grow to its full size before it is compressed. For more details see OSS note 886231.
    Regards,
    Paul

  • Row compression test

    Hello, All
    I am researching and testing row compression on DB2 9.7.
    I have one BW table with 67M rows on our production system (DB2 9.1.7), which holds 793,558 pages.
    I created the same general table (ODSTAB) and a copy of it (ODSTAB_COMP) with COMPRESS YES on a sample DB2 9.7 database.
    Next, I loaded both tables with the 67M rows.
    I got strange table-size results after LOAD and RUNSTATS compared to the original uncompressed 9.1 table.
    The table size is estimated from the SYSIBMADM.ADMINTABINFO view (DATA_OBJECT_P_SIZE column):
    9.1.7 -  original uncompressed table          -  12698528
    9.7.0 -  ODSTAB (uncompressed table)          -  14281344  (Wow!!!)
    9.7.0 -  ODSTAB_COMP (compressed table)       -  12344928
    The compression ratio for the ODSTAB_COMP table is reported as 76% (in SYSIBMADM.ADMINTABCOMPRESSINFO), but judging by the allocated data pages that is not true (why?).
    In the sample 9.7 database the tables are each created in their own AUTORESIZE DMS table space with 16k page size, extent size 2 and INCREASESIZE 5M.
    I have two questions:
    1. Why does the native table size differ by ~1.5 GB between 9.1 and 9.7 (12698528 vs 14281344)?
    2. Why is the real compression ratio (14281344 vs 12344928) so small, and how can I increase it (are there any best practices or tricks)?
    Thank you for all advice!
    With best regards, Dmitry

    The second question is solved.
    After REORG ... RESETDICTIONARY the number of allocated pages was reduced by 74%.
    I think it is related to automatic dictionary creation when the table was LOADed (ROWS_SAMPLED was only 10132).
    With best regards, Dmitry
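    A quick back-of-the-envelope check of those figures (an illustrative sketch; the numbers are the DATA_OBJECT_P_SIZE values quoted above):

```python
# Compare the space actually saved by compression with the ratio DB2 reports.
def savings(uncompressed: int, compressed: int) -> float:
    """Fraction of the uncompressed size that compression saved."""
    return (uncompressed - compressed) / uncompressed

odstab = 14_281_344       # 9.7 ODSTAB, uncompressed
odstab_comp = 12_344_928  # 9.7 ODSTAB_COMP, compressed via the LOAD-built dictionary

print(f"{savings(odstab, odstab_comp):.1%}")  # ~13.6%, nowhere near the reported 76%
```

    That gap is what the follow-up resolves: the automatic dictionary built during LOAD sampled only ~10k rows, and a REORG ... RESETDICTIONARY with a properly built dictionary brought the expected ~74% reduction.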

  • 9ASVTTY (Safety Days Supply) UoM in table '/sapapo/tspaplob'

    Hello All,
    We faced an issue whereby the safety days' supply was being displayed in the base UoM of the product, e.g. 'EA'. After exploring this issue further and comparing with standard SAP planning areas, we found that a value of '10' has to be maintained against 9ASVTTY for field UOM_PLAN in table /sapapo/tspaplob.
    After making this change the safety days' supply is displayed in DAYS again. Although the issue is resolved, I still want to know the significance of maintaining the value '10' in this table. Also, can we maintain a unit like 'D' or 'H'? After maintaining this value in the table I don't see any corresponding change in the planning area for 9ASVTTY, field 'UoM'.
    An early reply is highly appreciated.
    Thank You.
    Abhi.

    Hi All,
    There is a function module called by the Planning Administration view, CONVERSION_EXIT_CUNIT_INPUT. Below is a code extract pointing to where exactly the system equates 'D' to '10':
    IF sw_use_buffer EQ space
        OR language NE sy-langu.
      SELECT * FROM t006a UP TO 1 ROWS WHERE spras EQ language
                          AND   mseh3 EQ input.
        output = t006a-msehi.
      ENDSELECT.
    ELSE.
      t006a_s_tab-mseh3 = input.
      READ TABLE t006a_s_tab WITH KEY t006a_s_tab-mseh3 BINARY SEARCH.
      output = t006a_s_tab-msehi.
    ENDIF.
    PERFORM get_t006b USING    language mseh3
                      CHANGING output l_return_code.
    From this code extract, we see that the value mapping of the UoM is actually stored in table T006A. If you match the UOM_PLAN field of the planning area against MSEHI of table T006A, the value in field MSEH3 of T006A is what is reflected in the planning book.
    Hope this helps

  • How to delete the row in table control with respect to one field in module pool programming?

    Hi,
    Could you let me know how to delete a row in a table control based on one field in module pool programming?
    Regards
    Darshan MS

    HI,
    I want to delete a row after the table control has been displayed. I have created a push button 'Delete Row'. When I click this push button, the selected row should be deleted.
    I have written this code:
    module USER_COMMAND_9000 input.
    DATA OK_CODE TYPE SY-UCOMM.
    OK_CODE = SY-UCOMM.
    CASE OK_CODE.
         WHEN 'DELETE'.
            LOOP AT lt_source INTO ls_source WHERE mark = 'X'.
                APPEND LS_SOURCE TO LT_RESTORE.
                DELETE TABLE LT_SOURCE FROM LS_SOURCE.
                SOURCE-LINES = SOURCE-LINES - 1.
            ENDLOOP.
    But I'm unable to delete the selected rows; it always deletes the last row, even when I select a different row.
    So I thought of doing it based on the field.

  • Merge two rows & show in a single row in table results

    Hi, I need to merge two rows having 3 columns into a single row in a table view.
    The columns are currently shown as:
    Project NO-------(Current_Month) Revenue----------(Prior_Month) Revenue
    123123 10000
    20000
    Revenue is a single column holding the revenue for different periods:
    10000 is for May
    20000 is for April
    The Project NO is the same for both rows; only the periods differ. Since I am not displaying the period, I need to merge the two rows and show:
    Project NO-------(Current_Month) Revenue----------(Prior_Month) Revenue
    123123 10000 20000
    Please let me know how we can achieve this.
    Thanks,
    Pankaj

    123123 is the project number.
    The rows above are not displayed properly because the forum strips the blank spaces.
    Please take that into account.
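    Outside the reporting tool, the intended pivot can be sketched in plain Python (the period labels and column layout are hypothetical):

```python
# Merge per-period revenue rows into one row per project.
# Input rows: (project_no, period, revenue); the period labels are assumptions.
rows = [
    ("123123", "May", 10000),    # current month
    ("123123", "April", 20000),  # prior month
]

merged = {}
for project, period, revenue in rows:
    merged.setdefault(project, {})[period] = revenue

# One output row per project: (project, current-month revenue, prior-month revenue)
result = [(p, v.get("May"), v.get("April")) for p, v in merged.items()]
print(result)  # [('123123', 10000, 20000)]
```

    In OBIEE itself this is typically done with a pivot table view or an aggregation keyed on the period, so the two period rows collapse into a single row per project.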

  • How to get all the index of "selected rows" in table control?

    Hi Gurus,
    I have a table control, wherein I need to get the selected row so that I can get its respective TABIX.
    I know that the event for capturing the selected row is in PAI.
    I also made sure that the name of the w/ selColumn field in Screen Painter is exactly the same as in my ABAP declaration.
    TOP INCLUDE
    TYPES: BEGIN OF Y_ZQID_CHECK,
            IDNUM           TYPE ZQID_CHECK-IDNUM,
            WERKS           TYPE ZQID_CHECK-WERKS,
            MATNR           TYPE ZQID_CHECK-MATNR,
            LICHA           TYPE ZQID_CHECK-LICHA,
            LIFNR           TYPE ZQID_CHECK-LIFNR,
            ECOA_S          TYPE ZQID_CHECK-ECOA_S,
            ID_STAT         TYPE ZQID_CHECK-ID_STAT,
            ID_DATE         TYPE ZQID_CHECK-ID_DATE,
            FLAG_MAILCOA(1) TYPE C,
            MARK(1)         TYPE C, "Name of w/ SelColumn in ScreenPainter: T_ZQIDCHECK_DISCH-MARK
           END   OF Y_ZQID_CHECK.
    DATA: T_ZQIDCHECK_DISCH TYPE STANDARD TABLE OF Y_ZQID_CHECK WITH HEADER LINE.
    PAI
    PROCESS AFTER INPUT.
    * MODULE USER_COMMAND_9004.
    LOOP AT T_ZQIDCHECK_DISCH.
      MODULE READ_TC_DISCH .
    ENDLOOP.
    module READ_TC_DISCH input.
      DATA: W_LINE_SEL TYPE SY-STEPL,
                  W_TABIX    LIKE SY-TABIX.
      GET CURSOR LINE W_LINE_SEL.
      W_TABIX = TC_ID_ONLY-TOP_LINE + w_LINE_SEL - 1.
      MODIFY T_ZQIDCHECK_DISCH INDEX TC_ID_ONLY-current_line.
    If I select a single row, I can see the correct index in the debugger.
    BUG:
    When I select multiple rows in the table control, only the last row is read inside the table control loop.
    Please see the screenshot:
    [url]http://img268.imageshack.us/img268/5739/tcselectedrows.jpg[/url]
    Notice in the debug screenshot that even in the first loop pass of the table control it already gets the 4th table control index instead of the 2nd.
    Helpful inputs will be appreciated.
    Thanks.
    Jaime
    Edited by: Jaime Cabanban on Dec 9, 2009 3:16 PM

    Hi,
    Are you sure you have enabled multiple-line selection for the table control in its property window?
    Flow logic:
    LOOP WITH CONTROL TC_01.
         Module Get_Marked.
    ENDLOOP.
    Module pool:
    MODULE get_marked.
    * Read the data from the internal table where mark = 'X';
    * this should give you only the selected records.
    ENDMODULE.
    Kindly check the tablecontrol property.
    Regards,
    Ranjith Nambiar

  • How to delete a row of table in Word using powershell.

    I want to search for a word that is present in a table. If that word is present, I want to delete its row from the table.
    Can anybody help me with that? The script I am using is:
    $objWord = New-Object -ComObject word.application
    $objWord.Visible = $True
    $objDoc = $objWord.Documents.Open("C:\temp\Recipe.docx")
    $FindText = "DP1"
    $objSelection.Find.Execute($FindText)
    $objWord.Table.Cells.EntireRow.Delete()
    $objDoc.SaveAs("C:\Temp\P.docx")
    $Doc.Close()

    Maybe try this:
    $objWord = New-Object -ComObject Word.Application
    $objWord.Visible = $True
    $objDoc = $objWord.Documents.Open("C:\temp\Recipe.docx")
    $FindText = "DP1"
    $objWord.Selection.Find.Execute($FindText) | Out-Null
    $objWord.Selection.SelectRow()
    $objWord.Selection.Cells.Delete()
    $objDoc.SaveAs("C:\Temp\P.docx")
    $objDoc.Close()
    $objWord.Quit()
    [System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$objWord) | Out-Null
    This definitely assumes the text you're trying to find only exists in a table, per your specified requirements.  If it exists anywhere else, or in multiple tables, the code above is inadequate.
    I hope this post has helped!

  • Unable to delete a row in table control

    Hi,
    I'm unable to delete a row in a table control.
    I have defined a selection column for my table control, but it does not receive the value 'X' when I select a row for deletion.
    Also, when I press Enter, some of the columns in the table control get initialized. I'm passing these values to the internal table along with the other columns.
    Please help.
    Regards,
    Manasee
    Message was edited by: Manasee Chandorkar

    Hi,
    kindly check this:
    PROCESS BEFORE OUTPUT.
    MODULE status_9010.
    LOOP WITH CONTROL tab_control.
    MODULE move_data_to_table.
    ENDLOOP.
    PROCESS AFTER INPUT.
    LOOP WITH CONTROL tab_control.
    MODULE move_data_from_table.
    ENDLOOP.
    *& Module move_data_to_table OUTPUT
    * Moves the data from the internal table to the table control.
    MODULE move_data_to_table OUTPUT.
    * zmpets_mode-modecode, zmpets_range-rangeid and zmpets_servfacto-factor
    * are column names of the table control.
      READ TABLE int_factor INDEX tab_control-current_line.
      IF sy-subrc = 0.
        zmpets_mode-modecode = int_factor-modecode.
        zmpets_range-rangeid = int_factor-rangeid.
        zmpets_servfacto-factor = int_factor-factor.
      ENDIF.
    ENDMODULE. " move_data_to_table OUTPUT
    *& Module move_data_from_table INPUT
    * Moves the data from the table control to the internal table.
    MODULE move_data_from_table INPUT.
    * Move the data from the table control to internal table INT_FACTOR.
      int_factor-modecode = zmpets_mode-modecode.
      int_factor-rangeid = zmpets_range-rangeid.
      int_factor-factor = zmpets_servfacto-factor.
      int_factor-chk = line.
    * If the row already exists in the internal table, modify it.
      MODIFY int_factor INDEX tab_control-current_line.
      IF sy-subrc NE 0. "not yet in the internal table, i.e. new data: append it
        APPEND int_factor.
        CLEAR int_factor.
      ENDIF.
    ENDMODULE. " move_data_from_table INPUT
    * Delete a line from the table control.
    MODULE user_command_9010 INPUT.
      CASE sy-ucomm.
    * When an entry is deleted, it is removed from the table control.
        WHEN 'DELETE'.
          PERFORM f_del_frm_tab_cntrl.
      ENDCASE.
    ENDMODULE.
    FORM f_del_frm_tab_cntrl .
      DELETE int_factor WHERE chk = 'X'.
      CLEAR int_factor.
    ENDFORM.
    For any clarification, please mail me.
    Please reward points if this helped you.
    Regards,
    Anversha
    [email protected]
    Message was edited by: Anversha s

  • How to select perticular row in table control for BDC

    Hi all,
    I want to select a particular row in a table control for deletion through BDC. My transaction is CA02; my input is material number and plant, and it then displays a table control with work centers. Now I want to select the W999 work center row and delete it through BDC.
    Please suggest; it's urgent.
    Thanks & Regards,
    RP


  • How to put Check box in every row in Table

    Hi Friends,
    I have a doubt in Web Dynpro for Java: how do I put a checkbox in every row of a table?
    For example: my requirement is that I call a BAPI in the ECC system. The input details are entered in the first view and the output details are shown in the second view. In the second view I have a table in which the data is displayed as rows, and the first column of each row should be a checkbox.
    When I select the checkbox of a particular row and click the GetData button, that row's data should be displayed in a popup window.
    Suppose there are 6 rows in the table; every row starts with a checkbox:
    empid, name, sal, firstname, last name
    empid, name, sal, firstname, last name
    empid, name, sal, firstname, last name
    How do I put a checkbox in every row of the table? Can you send any example applications?
    Regards
    Vijay

    Hi Friend,
    When we get a BAPI from the ECC system, that BAPI has nodes and attributes, and under such a node we can't create a "CheckBox" attribute.
    So I did the following: I created a checkbox attribute outside the node, with data type boolean.
    Next I created the table (which has rows and columns): right-click on the table → Insert GroupedColumn → right-click again on the grouped column → select Check Box.
    OK, here I am getting one problem: I have the checkboxes, but when I select the checkbox in the first row, all checkboxes get selected.
    I need it so that selecting the checkbox of the first row selects only the first row, and selecting the checkbox of the second row selects only the second row.
    Can you help me?
    Regards
    Vijay

  • How to get color in the final row of table view( table control)

    Hi,
    I have a table control displaying 10 records as output, and I need to give the final row a color, since it is a totals row, to distinguish it from the other records.
    Kindly advise me on this.
    Thanks & Regards,
    Nehru.

    Hi Nehru,
    Checkout [THIS|Re: set color for a particular row in table view] thread .
    [This |http://www.sapdesignguild.org/resources/htmlb_guidance/table.html#at] Might also help you.
    Regards,
    Anubhav
    Edited by: Anubhav Jain on Jan 4, 2009 7:34 AM
