Global Temp Table or Permanent Temp Tables

I have been doing research for a few weeks and trying to confirm theories with bench tests concerning which is more performant... GTTs or permanent temp tables. I was curious what others felt on this topic.
I used FOR loops to test insert performance, and at high row counts the permanent temp table often seemed much faster than the GTT; contrary to many white papers and case studies I have read claiming that GTTs are much faster.
All I did was run FOR loops iterating INSERT/VALUES up to 10 million records. At 10 million records, the permanent temp table was over 500k milliseconds faster...
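For reference, a minimal sketch of the kind of timing loop I mean (the table name, column list and literal values here are placeholders, not the exact ones from my tests):
declare
  l_start number;
begin
  l_start := dbms_utility.get_time;
  for i in 1 .. 10000000 loop
    insert into stage_tab (id, payload) values (i, 'row ' || to_char(i));  -- stage_tab = the GTT or the heap table
  end loop;
  commit;
  dbms_output.put_line('elapsed centiseconds: ' || to_char(dbms_utility.get_time - l_start));
end;
/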
Anyone have any useful tips or info that can help me determine which will be best in certain cases? The tables will be used as staging for ETL batch processing into a data warehouse. Rows within my fact and detail tables can reach into the millions before being moved to archives. Thanks so much in advance.
-Tim

> Do you have any specific experiences you would like to share?
I use both - GTTs and plain normal tables. The problem dictates the tools. :-)
I do have an exception though that does not use GTTs and still supports "restartability".
I need to continuously roll up (aggregate) data. Raw data collected for an hour gets aggregated into an hourly partition. Hourly partitions get rolled up into a daily partition. Several billion rows are processed like this monthly.
The eventual method I've implemented is a cross between materialised views and GTTs. Instead of dropping or truncating the source partition and running an insert to repopulate it with the latest aggregated data, I wrote an API that allows you to give it the name of the destination table, the name of the partition to "refresh", and a SQL (that does the aggregation - kind of like the select part of a MV).
It creates a brand new staging table using a CTAS, inspects the partitioned table, slaps the same indexes on the staging table, and then performs a partition exchange to replace the stale contents of the partition with that of the freshly built staging table.
No expensive delete. No truncate that results in an empty and query-useless partition for several minutes while the data is refreshed.
And any number of these partition refreshes can run in parallel.
Why not use a GTT? Because they cannot be used in a partition exchange. And the cost of writing data into a GTT has to be weighed against the cost of using that data by writing it (or some of it) into permanent tables. Ideally one wants to plough through a data set once.
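In outline the refresh looks something like this (table, partition and column names here are hypothetical; the actual API derives them dynamically and also copies the destination table's indexes onto the staging table first):
-- build a fresh staging table with the latest aggregation (CTAS)
create table stage_hr_2024010110 nologging as
select trunc(sample_time, 'HH24') as hr, metric_id, sum(val) as val
from   raw_data
where  sample_time >= timestamp '2024-01-01 10:00:00'
and    sample_time <  timestamp '2024-01-01 11:00:00'
group  by trunc(sample_time, 'HH24'), metric_id;

-- ... create the same indexes here as on the partitioned destination table ...

-- swap the stale partition contents for the freshly built staging table
alter table hourly_agg
  exchange partition p_2024010110 with table stage_hr_2024010110
  including indexes without validation;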
Oracle has a fairly rich feature set - and these can be employed in all kinds of ways to get the job done.

Similar Messages

  • How to replace a permanent "Temp" table?

    Hi,
    I'm joining a project, but I am still a novice at PL/SQL. One particular procedure does a search query. The approach can be summarized as:
    * Build a SQL string based on values passed in the list of parameters
    * Truncate a permanent table named Temp_SearchID
    * Execute the dynamic SQL string which will populate the Temp_SearchID table (INSERT INTO Temp_SearchID SELECT ...)
    * Populate the returned cursor by a SELECT joining other tables with the IDs in Temp_SearchID
    Q1. If the application load increases, there will be higher concurrency; let's assume there are several simultaneous calls to the procedure above. What would happen, as all these calls have different parameters and they all compete to use the Temp_SearchID table in their own way?
    Q2. If the approach using a permanent Temp_SearchID table is not recommended, is it possible to dump the IDs into a cursor that is local to the procedure, and then make a SELECT joining other tables with this cursor? In the SQL Server world, this is possible using a table variable or session-local temp tables. If it is possible to achieve this with PL/SQL, can you please show me the syntax or point me to some code snippets?
    Thank you very much for any help.

    CarbonFiber wrote:
    In the scenario I am dealing with, the business rules are a little bit complex. Writing the entire SQL query in a string would indeed solve the problem, but it would become hard to debug and test as the entire query now appears as a big string. I might resort to that. As I am used to SQL Server, I would like to know if Oracle has a memory-based structure that I can populate and then join with other tables to produce the final results. I don't know PL/SQL well enough to write a query joining a cursor with a table. Can you show me the PL/SQL syntax to do that?
    From a performance perspective in Oracle the single, plain SQL approach would probably be the best, depending on some other factors, e.g. how sharable the generated SQL would then be (or are you going to generate literally thousands of different large SQL statements as part of your process).
    As already discussed here you have in Oracle the option to create a GLOBAL TEMPORARY TABLE that holds data visible only to the session that's using it, but it's still writing/reading that data potentially from disk.
    The subquery factoring approach outlined by David might also be a quite good option if statement complexity is your main issue.
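    For illustration, subquery factoring lets the "search IDs" live as a named inline result set instead of a physical Temp_SearchID table (dictionary views stand in for the real tables here):
    with search_ids as (
      select object_id as productid
      from   user_objects
      where  object_type = 'TABLE'
    )
    select b.object_name
    from   search_ids a
    join   user_objects b on b.object_id = a.productid;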
    The in-memory approach in Oracle is a bit more complex. You would need to create an object type, a collection type on top of that, and then use the PL/SQL collection in the query, e.g. like this:
    create or replace type t_qualified_id as object (
      productid integer
    );
    /
    create or replace type t_qualified_id_array as table of t_qualified_id;
    /
    create or replace procedure test_collection as
      in_memory_table t_qualified_id_array;
    begin
      select t_qualified_id(object_id)
        bulk collect into in_memory_table
        from user_objects;
      for rec in (select object_id
                    from table(in_memory_table) a
                   inner join user_objects b
                      on a.productid = b.object_id) loop
        null;
      end loop;
    end;
    /
    Or, if you don't want to use the "object" type, you can just create an array of the built-in number/integer type:
    create or replace type t_qualified_id_array as table of integer;
    /
    create or replace procedure test_collection as
      in_memory_table t_qualified_id_array;
    begin
      select object_id
        bulk collect into in_memory_table
        from user_objects;
      for rec in (select object_id
                    from table(in_memory_table) a
                   inner join user_objects b
                      on value(a) = b.object_id) loop
        null;
      end loop;
    end;
    /
    But you can see that there are subtle differences, and you need to study the SQL and PL/SQL developer guides to get the details. It gets even more confusing because in Oracle you can have PL/SQL collection types and SQL collection types, which differ in various aspects in terms of usage.
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/
    Edited by: Randolf Geist on Sep 22, 2008 6:57 PM
    Subquery factoring mentioned

  • Where is the table stored? Temp tablespace is not freed

    I have created one table in Oracle 8.1.7 containing 500 million records.
    My questions are:
    1. Where is the table stored? The size of users.dbf is just 20 MB and only about 2 MB is used, so it must not be there.
    2. The size of tools.dbf in the TOOLS tablespace is somewhere around 13 GB. Is the table there? If yes, then why?
    3. Also, the size of temp01.dbf + temp02.dbf + temp03.dbf in the TEMP tablespace is somewhere around 14 GB and it keeps increasing. For example, when I give the command to create an index, after running for some 8 hours it gives an error that the disk space is full. When I run the command again after adding more files and space, the command runs, but the space used earlier in TEMP is not released.
    I have tried restarting the Oracle service, but it does not release the space used by TEMP. Also the tablespace is temporary, not permanent.
    IMP.
    4. Can I manually delete the data files in TEMP? Will it affect my database?
    Thanks in adv.

    Query user_segments as the table owner. This will tell you which tablespace (and also the size of your table allocation).
    Query dba_data_files; you may need a DBA account (e.g. SYSTEM). This will relate files to tablespaces.
    The files in dba_data_files and in v$logfile and the control files listed in init<sid>.ora comprise your database.
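    For example (MY_BIG_TABLE is a placeholder for your table name):
    -- which tablespace holds the table, and how big the allocation is
    select segment_name, tablespace_name, bytes/1024/1024 as size_mb
    from   user_segments
    where  segment_name = 'MY_BIG_TABLE';
    -- which data files belong to which tablespace
    select tablespace_name, file_name, bytes/1024/1024 as size_mb
    from   dba_data_files
    order  by tablespace_name;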
    Hope this helps
    Ken

  • Unable to extend the temp segment by 2560 in tablespace TEMP

    Hi,
    I am running a procedure that aggregates data; it takes at least an hour to complete. Before it completes, it throws an error:
    ORA-01652 - unable to extend temp segment by 2560 in tablespace TEMP.
    Note: around 5 GB of disk space is there.
    Please help and give me your suggestions.
    Thanks
    Sathya

    Well, I'll go out on a limb and suggest that the problem is that the procedure ran out of TEMP space. It's relatively easy to generate 5 GB of intermediate results to sort in the space of an hour. If other sessions were using TEMP space at the same time, that would obviously reduce the amount available to this procedure.
    Your two options would be to decrease the amount of sorting that the procedure needs to do or to allocate more TEMP space for it.
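    If you go the second route, adding temp space is a one-liner; for example (file path and sizes are placeholders):
    alter tablespace temp
      add tempfile '/u01/oradata/ORCL/temp04.dbf'
      size 2g autoextend on next 512m maxsize 8g;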
    Justin

  • Import IP Prefix from Global Table into a VRF Table

    Hello,
    Is it possible to import IP Prefix from Global Table into a VRF Table on ASR9001 with Version 4.3.0? Thanks.
    Regards,
    Eric

    hi Eric,
    In 4.3.1 there is a feature that will allow you to do this.
    The Border Gateway Protocol (BGP) dynamic route leaking feature provides the ability to import routes between the default-vrf (Global VRF) and any other non-default VRF, to provide connectivity between a global and a VPN host. The import process installs the Internet route in a VRF table or a VRF route in the Internet table, providing connectivity.
    You can follow the ASR9000 blog here to monitor when 4.3.1 will be posted to Cisco.com
    https://supportforums.cisco.com/blogs/asr9k
    or watch for the 4.3.1 Release Notes here
    http://www.cisco.com/en/US/products/ps5845/prod_release_notes_list.html
    regards,
    David

  • How to create another table with an existing table

    Hi,
    I want to create another table on top of an existing table. My existing table has a few constraints and checks, and I want the new table to copy all the checks and constraints from the existing table.
    For example:
    create table temp
    (idx number primary key,
     publish varchar2(250) not null,
     address varchar2(250) not null,
     rent number check (rent >= 0))
    -- Required output
    create table temp1
    as select * from temp
    where 1=2
    The structure of the table received as output is the following:
    CREATE TABLE TEMP1
    (
     IDX NUMBER,
     PUBLISH VARCHAR2(250 BYTE) NOT NULL,
     ADDRESS VARCHAR2(250 BYTE) NOT NULL,
     RENT NUMBER
    )
    Here I could not find the primary key and the check constraint, but I can see the NOT NULL checks. Is there any way to get all the checks when I create another table on top of an existing table?
    Best Regards,

    A CTAS (CREATE TABLE AS SELECT ...) probably isn't what you're looking for since that doesn't handle constraints.
    Depending on the Oracle version, you can probably use the DBMS_METADATA package to get the DDL for the table and then modify the DDL to create the copy, i.e.
      1* select dbms_metadata.get_ddl( 'TABLE', 'TEMP' ) from dual
    SQL> /
    DBMS_METADATA.GET_DDL('TABLE','TEMP')
      CREATE TABLE "SCOTT"."TEMP"
       (    "IDX" NUMBER,
            "PUBLISH" VARCHAR2(250) NOT NULL ENABLE,
            "ADDRESS" VARCHAR2(250) NOT NULL ENABLE,
            "RENT" NUMBER,
             CHECK (rent >=0) ENABLE,
             PRIMARY KEY ("IDX")
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"  ENABLE
       ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"
    You'd then just need to replace the "TEMP" with "TEMP1".
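    For example, a rough sketch of that idea (no error handling; on older releases you may need to convert the CLOB before EXECUTE IMMEDIATE):
    declare
      l_ddl clob;
    begin
      l_ddl := dbms_metadata.get_ddl( 'TABLE', 'TEMP' );
      l_ddl := replace( l_ddl, '"TEMP"', '"TEMP1"' );
      execute immediate l_ddl;
    end;
    /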
    Justin

  • How to insert table data into a temporary table

    Hi
    Can anyone help me insert data from a table into a temporary table?
    Thanks
    Navin

    If you could provide a (simplified) example of the data you have and the output you're attempting to get, that would probably be quite helpful. I'm not sure that I understand exactly what you're trying to do here...
    1) It sounds like you know the structure of the result set you're trying to generate. So it would be possible to create a temporary table once (at the same time that you create all your other tables) and write procedural PL/SQL code that would step through the data, write data to the temp table, select the data out of the temp table, and return a REF CURSOR. That would tend not to be the way that an Oracle developer would do things (there are exceptions, of course), but it would work (see the sketch after this list).
    2) I don't see any inherent problems in using sub-selects and inline views to do whatever aggregation you're trying to do on the secondary tables, which would allow you to get the output in a single query. For example, given an ORDERS table and an ORDER_DETAILS table,
    SELECT o.customer_id, o.invoice_number, SUM( od.line_item_cost ) total_cost
      FROM orders o,
           order_details od
     WHERE o.order_id = od.order_id
     GROUP BY o.customer_id, o.invoice_number
    3) If you do need to use procedural logic, I would tend to look into the use of pipelined table functions or to read the data into an in-memory collection and to manipulate and return that collection.
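    As a concrete sketch of option 1, the temporary table would be created once, up front, alongside the rest of the schema (names here are illustrative):
    create global temporary table staging_results (
      customer_id    number,
      invoice_number varchar2(30),
      total_cost     number
    ) on commit delete rows;
    Each session then sees only its own rows, and they disappear on commit (use ON COMMIT PRESERVE ROWS if they should survive until the session ends).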
    Justin

  • Transferring data from one table to a few new tables

    Hi,
    I have a table that contains millions of records. I need to transfer the records from that table to a few new tables. For future convenience, the data is being split into multiple new tables. I am planning to use a cursor to fetch the records from the old table and call a new procedure in a loop. The new procedure is the one doing the split and inserting into the different tables. I am also planning to commit every 10,000 records. This is because I plan to store the primary key in a temp table each time, and to store the error message in another temp log table, so that I can rerun the same process multiple times to complete the transfer. I know that doing this in a loop with frequent commits will affect performance, but I feel those two are required in my scenario. Can anyone suggest a better way, if any?
    Thanks

    Check if "create onesmalltable as select col1, col4, colwhatever from hugetable" (CTAS) works for you.
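    i.e. one statement per target table, along these lines (names and the split predicate are placeholders):
    create table onesmalltable nologging as
    select col1, col4, colwhatever
    from   hugetable
    where  record_type = 'A';   -- whatever condition defines this split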
    hth

  • Table control in internal table

    What should the sequence be to create a module pool, i.e. top include, PBO, PAI, screen layout, flow logic?
    I am confused about which sequence to follow.

    Hi,
    In the Object Browser, the module pool code belongs to one of the following categories:
    Global fields: data declarations that can be used by all modules in the module pool
    PBO modules: modules that are called before displaying the screen
    PAI modules: modules that are called in response to the user input
    Subroutines: subroutines that can be called from any position within the module pool
    By default, the system divides a module pool into one or several include programs. An include program can contain several modules of the same type (only PBO modules or only PAI modules). The main program then consists of a sequence of INCLUDE statements that link the modules to the module pool:
    *& Module pool      SAPMTZ10
    *& Display data of Table SPFLI
    * Global data
    INCLUDE MTZ10TOP.
    * PAI modules
    INCLUDE MTZ10I01.
    * PBO modules
    INCLUDE MTZ10O01.
    In the ABAP/4 editor, you can display the code hidden behind the INCLUDE statements by choosing Edit ---> More functions ---> EXPAND include. With all INCLUDE statements expanded, the module pool looks like this:
    *& Module pool      SAPMTZ10
    *&           FUNCTION: Display data from Table SPFLI
    INCLUDE MTZ10TOP (This is the TOP include:
    the TOP module contains global data declarations)
    PROGRAM SAPMTZ10.
         TABLES: SPFLI.
         DATA OK_CODE(4).
    INCLUDE MTZ10I01 (This is a PAI include.)
    *& Module USER_COMMAND_0100 INPUT
    * Retrieve data from SPFLI or leave transaction
    MODULE USER_COMMAND_0100 INPUT.
    CASE OK_CODE.
         WHEN 'SHOW'.
              CLEAR OK_CODE.
          SELECT SINGLE * FROM SPFLI WHERE CARRID = SPFLI-CARRID
                                       AND CONNID = SPFLI-CONNID.
         WHEN SPACE.
         WHEN OTHERS.
              CLEAR OK_CODE.
              SET SCREEN 0.
              LEAVE SCREEN.
         ENDCASE.
    ENDMODULE.
    INCLUDE MTZ10O01 (This is a PBO include.)
    *& Module STATUS_0100
    * Specify GUI status and title for screen 100
    MODULE STATUS_0100.
         SET PF-STATUS 'TZ0100'.
         SET TITLEBAR '100'.
    ENDMODULE.
    You use the ABAP/4 Dictionary to store frequently used data declarations centrally. Objects defined in the Dictionary are known throughout the system. Active Dictionary definitions can be accessed by any application. Data defined in the Dictionary can be included in a screen or used by an ABAP/4 program. You declare global data in the TOP module of the transaction, using the TABLES, STRUCTURE, LIKE statements and others
    Cheers,
    vasu.
    kindly reward if helpful.

  • Difference b/w normal tables and PL/SQL tables

    What is the difference between normal tables and PL/SQL tables?
    We already have tables, so what is the purpose of PL/SQL tables?
    What are the advantages of PL/SQL tables compared to normal tables?

    user10447332 wrote:
    what is the difference b/w normal tables and pl/sql tables.
    As BluShadow pointed out, "PL/SQL tables" are now called collections. They are memory structures that store data and go away when your program stops running - essentially arrays or "tables" in other programming languages. "Normal" ("heap") tables are stored on disk and are persistent - they store data permanently.
    already we have tables then what is the purpose of pl/sql tables.
    Collections are used to store data temporarily in memory, when it will be accessed shortly or repeatedly, for fast access. This works best with smaller data sets, because if you store too much data in a collection you can run out of memory.
    what are the advantages of pl/sql tables compared to normal tables.
    Collections give you faster access to data you need to read more than once during the same application. They work well for fast lookups, and for smaller data sets they can improve program performance with the use of the BULK COLLECT and FORALL PL/SQL operators under the right conditions.
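    A minimal sketch of such a collection (an associative array filled with BULK COLLECT; names are illustrative):
    declare
      type t_names is table of user_objects.object_name%type index by pls_integer;
      l_names t_names;
    begin
      select object_name
        bulk collect into l_names
        from user_objects;
      if l_names.count > 0 then
        dbms_output.put_line('first object: ' || l_names(1));
      end if;
    end;
    /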

  • How to synchronize between DHCP binding table and DHCP snooping table ?

    I cleared the DHCP snooping table with the command "clear ip dhcp snooping binding", and now the PC can't communicate with anything any more. So how do I synchronize between the DHCP binding table and the DHCP snooping table?
    dhcp-test#sh ip dhcp bind
    IP address Client-ID/ Lease expiration Type
    Hardware address
    99.1.65.32 0100.1125.353c.25 Mar 02 1993 01:05 AM Automatic
    99.1.65.33 0100.1438.059f.85 Mar 02 1993 12:01 AM Automatic
    dhcp-test#sh ip dhcp snooping binding
    MacAddress IpAddress Lease(sec) Type VLAN Interface
    Total number of bindings: 0
    thanks!

    ip dhcp snooping binding mac-address vlan vlan-id ip-address interface interface-id expiry seconds
    Add binding entries to the DHCP snooping binding database. The vlan-id range is from 1 to 4094. The seconds range is from 1 to 4294967295.
    Enter the above command for each entry that you add
    To delete the database agent or binding file, use the no ip dhcp snooping database interface configuration command. To reset the timeout or delay values, use the ip dhcp snooping database timeout seconds or the ip dhcp snooping database write-delay seconds global configuration command. To renew the database, use the renew ip dhcp snooping database privileged EXEC command.

  • Table JAVA$CLASS$MD5$TABLE is not created by loadjava

    When loading a class using loadjava, the table JAVA$CLASS$MD5$TABLE is not created (in the schema where the class is loaded) by the loadjava tool although it should do this according to the doc. Where is this table located?
    The class loads successfully and is skipped when loading a second time. The force option is not used.
    Version is: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit

    In order for a registered schema to be available to other users, the schema must be registered as a GLOBAL rather than a LOCAL schema. This is controlled by the third argument passed to registerSchema, and the default is local. Note that you will also need to explicitly grant appropriate permissions on any tables created by the schema registration process to other users who will be loading or reading data from these tables.

  • Excel 2010 Pivot Table VBA Not Refreshing Table

    My company recently upgraded from Excel 2003 to 2010. I had VBA written to take source data and convert it into a number of Pivot Tables on a number of worksheets. It has been working fine for years. After upgrading to 2010 the VBA crashed. I tracked it
    down to the fact that when my code was making changes to the Pivot Tables (changing fields, filters, etc...) the pivot table on the worksheet had no data, but the fields were there. I can manually go to the pivot table and manually refresh and all the data
    comes in.
    So I tried adding the VBA code to refresh the pivot table, but the pivot tables will not refresh with data.
    I tried:
    ActiveSheet.PivotTables("WO Pivot").RefreshTable
    and
    ActiveWorkbook.RefreshAll
    And these did not work.
    I also tried recording a macro for the manual steps to refresh and got:
     ActiveSheet.PivotTables("WO Pivot").PivotCache.Refresh
    This does not work either.
    The PivotTable name is correct; I tried using the index number as well, and the name works for other code manipulating the pivot table.
    e.g.:
    With ActiveSheet.PivotTables("WOPivot").PivotFields("Task Title")
        .Orientation = xlRowField
        .Position = 2
        .Subtotals = Array(False, False, False, False, False, False, _
                           False, False, False, False, False, False)
    End With
    Why isn't this working? Is there another way to refresh pivot table data in 2010?
    Thanks. P.S. I've tried formatting this so it is readable, but it comes out garbled. Hope this looks better.

    The solution above didn't work for me, but the following did the trick:
    ActiveSheet.PivotTables("WOPivot").PivotCache.Refresh
    By the way, I identified it by recording a macro, then going on the Pivot Table that needed refreshing and pressing F9 to refresh it. The line of VBA code above was the result.
    Cheers,
    Marco.

  • Copy selected values from a table control into another table control

    hi there,
    as seen in the subject, I need to copy selected values from a table control into another table control on the same screen. As I don't know much about table controls, I made 2 table controls with the wizard and started to change the code... right now I'm totally messed up. Nothing works anymore and I don't know where to start over.
    I looked at the forums and Google, but there is nothing to help me with this problem (or I suck at searching the internet for solutions).
    I have 2 buttons: one to push the selected data from the top table control into the bottom TC, and the other button is to push selected data from the bottom TC into the top TC. Does somebody have sample code to do this?

    you're funny
    I still don't get it... can't believe there is no tutorial or sample code around for how to copy multiple selected rows from a TC.
    here's my code, maybe you can tell me exactly were i have to change it:
    tc1 = upper table control
    tc2 = lower table control
    SCREEN 0100:
    PROCESS BEFORE OUTPUT.
      MODULE status_0100.
      MODULE get_nfo. --> gets data from the dictionary table
      MODULE tc1_change_tc_attr.
      LOOP AT   it_roles_tc1
           INTO wa_roles_tc1
           WITH CONTROL tc1
           CURSOR tc1-current_line.
      ENDLOOP.
      MODULE tc2_change_tc_attr.
      LOOP AT   it_roles_tc2
           INTO wa_roles_tc2
           WITH CONTROL tc2
           CURSOR tc2-current_line.
      ENDLOOP.
    PROCESS AFTER INPUT.
      LOOP AT it_roles_tc1.
        CHAIN.
          FIELD wa_roles_tc1-agr_name.
          FIELD wa_roles_tc1-text.
        ENDCHAIN.
        FIELD wa_roles_tc1-mark
          MODULE tc1_mark ON REQUEST.
      ENDLOOP.
      LOOP AT it_roles_tc2.
        CHAIN.
          FIELD wa_roles_tc2-agr_name.
          FIELD wa_roles_tc2-text.
        ENDCHAIN.
        FIELD wa_roles_tc2-mark
          MODULE tc2_mark ON REQUEST.
      ENDLOOP.
      MODULE ok_code.
      MODULE user_command_0100.
    INCLUDE PAI:
    MODULE tc1_mark INPUT.
      IF tc1-line_sel_mode = 2
      AND wa_roles_tc1-mark = 'X'.
        LOOP AT it_roles_tc1 INTO g_tc1_wa2
      WHERE mark = 'X'.    --> big problem here: no entry has an 'X' there
          g_tc1_wa2-mark = ''.
          MODIFY it_roles_tc1
            FROM g_tc1_wa2
            TRANSPORTING mark.
        ENDLOOP.
      ENDIF.
      MODIFY it_roles_tc1
        FROM wa_roles_tc1
        INDEX tc1-current_line
        TRANSPORTING mark.
    ENDMODULE.                    "TC1_MARK INPUT
    MODULE tc2_mark INPUT.
      IF tc2-line_sel_mode = 2
      AND wa_roles_tc2-mark = 'X'.
        LOOP AT it_roles_tc2 INTO g_tc2_wa2
      WHERE mark = 'X'.    --> same here, it doesn't get any data
          g_tc2_wa2-mark = ''.
          MODIFY it_roles_tc2
            FROM g_tc2_wa2
            TRANSPORTING mark.
        ENDLOOP.
      ENDIF.
      MODIFY it_roles_tc2
        FROM wa_roles_tc2
        INDEX tc2-current_line
        TRANSPORTING mark.
    ENDMODULE. 
    Thanks to anybody who can help with this!

  • Fetch the values from internal table inside an internal table (urgent!!)

    data : BEGIN OF PITB2_ZLINFO occurs 0,
             BEGDA LIKE SY-DATUM,
             ENDDA LIKE SY-DATUM,
             PABRJ(4) TYPE N,                       "Payroll Year
             PABRP(2) TYPE N,                       "Pay. Period
             ZL LIKE PC2BF OCCURS 0,
           END OF PITB2_ZLINFO.
    I have an internal table like this.
    How do I fetch the values from an internal table inside an internal table?
    Kindly Help me on this..
    Regards,
    Ram.

    Hi,
    Try this....
    DATA wa_zl LIKE pc2bf.
    LOOP AT PITB2_ZLINFO.
      LOOP AT PITB2_ZLINFO-ZL INTO wa_zl.
        " work with wa_zl here
      ENDLOOP.
    ENDLOOP.
    Thanks...
    Preetham S
