Rows to columns for a huge number of records

My database version is 10gR2.
I want to transpose rows to columns. I have seen examples for a small number of records, but how can it be done when there are more than 1,000 records in a table?
Here is the sample data that I would like to transpose:
SQL> /
NE:      10100000-1LU
RAISED:  22-FEB-2011 22:01:04/28-FEB-2011 01:25:22/
CLEARED: 22-FEB-2011 22:12:27/28-FEB-2011 02:40:06/
RING:    SR-10/ ER-16/ CR-25/ CR-29/ CR-26/ RIDM-1/ NER5/ CR-31/ RiC600-1

NE:      10100000-2LU
RAISED:  01-FEB-2011 12:15:58/06-FEB-2011 07:00:53/18-FEB-2011 22:04:56/19-FEB-2011 10:36:12/19-FEB-2011 11:03:13/19-FEB-2011 11:16:14/28-FEB-2011 01:25:22/
CLEARED: 05-FEB-2011 10:05:48/06-FEB-2011 10:49:18/18-FEB-2011 22:15:17/19-FEB-2011 10:41:35/19-FEB-2011 11:08:18/19-FEB-2011 11:21:35/28-FEB-2011 02:40:13/
RING:    RIMESH/ RiC342-1/ 101/10R#10/ RiC558-1/ RiC608-1

NE:      10100000-3LU
RAISED:  19-FEB-2011 20:18:31/22-FEB-2011 21:37:32/22-FEB-2011 22:01:35/22-FEB-2011 22:20:03/28-FEB-2011 01:25:23/
CLEARED: 19-FEB-2011 20:19:32/22-FEB-2011 21:48:06/22-FEB-2011 22:12:05/22-FEB-2011 22:25:14/28-FEB-2011 02:40:20/
RING:    INR-1/ ISR-1

NE:      10100000/10MU
RAISED:  06-FEB-2011 07:00:23/19-FEB-2011 11:01:50/19-FEB-2011 11:17:58/28-FEB-2011 02:39:11/01-FEB-2011 12:16:21/18-FEB-2011 22:03:27/
CLEARED: 06-FEB-2011 10:47:13/19-FEB-2011 11:07:33/19-FEB-2011 11:21:30/28-FEB-2011 04:10:56/05-FEB-2011 10:06:10/18-FEB-2011 22:13:50/
RING:    101/IR#10

NE:      10100000/11MU
RAISED:  01-FEB-2011 08:48:45/22-FEB-2011 21:59:28/22-FEB-2011 22:21:52/01-FEB-2011 08:35:46/
CLEARED: 02-FEB-2011 13:15:17/22-FEB-2011 22:08:49/22-FEB-2011 22:24:27/01-FEB-2011 08:38:42/
RTTS_NO: 1456129/
RING:    101IR11 RIMESH

NE:      10100000/12MU
RAISED:  22-FEB-2011 21:35:34/22-FEB-2011 22:00:04/22-FEB-2011 22:21:23/28-FEB-2011 02:39:53/
CLEARED: 22-FEB-2011 21:45:00/22-FEB-2011 22:08:21/22-FEB-2011 22:22:26/28-FEB-2011 02:41:07/
RING:    101IR12 KuSMW4-1

NE:      10100000/13MU
RAISED:  22-FEB-2011 21:35:54/22-FEB-2011 22:21:55/22-FEB-2011 22:00: (truncated)
CLEARED: 22-FEB-2011 21:42:58/22-FEB-2011 22:24:52/22-FEB-2011 22:10: (truncated)
RING:    LD MESH

Could you do something like this?
with t as (select '10100000-1LU' NE, '22-FEB-2011 22:01:04/28-FEB-2011 01:25:22/' raised, '22-FEB-2011 22:12:27/28-FEB-2011 02:40:06/' cleared from dual
           union
           select '10100000-2LU', '01-FEB-2011 12:15:58/06-FEB-2011 07:00:53/18-FEB-2011 22:04:56/19-FEB-2011 10:36:12/19-FEB-2011 11:03:13/19-FEB-2011 11:16:14/28-FEB-2011 01:25:22/',
                  '05-FEB-2011 10:05:48/06-FEB-2011 10:49:18/18-FEB-2011 22:15:17/19-FEB-2011 10:41:35/19-FEB-2011 11:08:18/19-FEB-2011 11:21:35/28-FEB-2011 02:40:13/' from dual)
select * from(
select NE,   regexp_substr( raised,'[^/]+',1,1) raised, regexp_substr( cleared,'[^/]+',1,1) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,2) , regexp_substr( cleared,'[^/]+',1,2) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,3) , regexp_substr( cleared,'[^/]+',1,3) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,4) , regexp_substr( cleared,'[^/]+',1,4) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,5) , regexp_substr( cleared,'[^/]+',1,5) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,6) , regexp_substr( cleared,'[^/]+',1,6) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,7) , regexp_substr( cleared,'[^/]+',1,7) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,8) , regexp_substr( cleared,'[^/]+',1,8) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,9) , regexp_substr( cleared,'[^/]+',1,9) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,10) , regexp_substr( cleared,'[^/]+',1,10) cleared  from t
union
select NE,   regexp_substr( raised,'[^/]+',1,11) , regexp_substr( cleared,'[^/]+',1,11) cleared  from t
)
where nvl(raised,cleared) is not null
order by ne
NE     RAISED     CLEARED
10100000-1LU     28-FEB-2011 01:25:22     28-FEB-2011 02:40:06
10100000-1LU     22-FEB-2011 22:01:04     22-FEB-2011 22:12:27
10100000-2LU     28-FEB-2011 01:25:22     28-FEB-2011 02:40:13
10100000-2LU     19-FEB-2011 10:36:12     19-FEB-2011 10:41:35
10100000-2LU     19-FEB-2011 11:03:13     19-FEB-2011 11:08:18
10100000-2LU     19-FEB-2011 11:16:14     19-FEB-2011 11:21:35
10100000-2LU     06-FEB-2011 07:00:53     06-FEB-2011 10:49:18
10100000-2LU     01-FEB-2011 12:15:58     05-FEB-2011 10:05:48
10100000-2LU     18-FEB-2011 22:04:56     18-FEB-2011 22:15:17

You should be able to do it without all those unions using a CONNECT BY, but I can't quite get it to work.
The following doesn't work, but maybe someone can fix it:
select NE,   regexp_substr( raised,'[^/]+',1,level) raised, regexp_substr( cleared,'[^/]+',1,level) cleared from t
connect by  prior  NE = NE and   regexp_substr( raised,'[^/]+',1,level) = prior regexp_substr( raised,'[^/]+',1,level + 1)
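For what it's worth, here is a sketch of the CONNECT BY variant that usually works for this kind of split (untested against your data; it assumes every list ends with a trailing '/', so the slash count equals the number of entries, and it uses the PRIOR SYS_GUID() trick to keep Oracle from flagging a cycle):

select ne,
       regexp_substr( raised, '[^/]+', 1, level) raised,
       regexp_substr( cleared,'[^/]+', 1, level) cleared
  from t
connect by prior ne = ne
       and prior sys_guid() is not null  -- non-deterministic PRIOR defeats cycle detection
       and level <= length(raised) - length(replace(raised, '/'))  -- one level per entry
 order by ne, raised;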
Here it is with the MODEL clause, which gets rid of all the unions:
WITH t
        AS (SELECT '10100000-1LU' NE,
                   '22-FEB-2011 22:01:04/28-FEB-2011 01:25:22/' raised,
                   '22-FEB-2011 22:12:27/28-FEB-2011 02:40:06/' cleared
              FROM DUAL
            UNION
            SELECT '10100000-2LU',
                   '01-FEB-2011 12:15:58/06-FEB-2011 07:00:53/18-FEB-2011 22:04:56/19-FEB-2011 10:36:12/19-FEB-2011 11:03:13/19-FEB-2011 11:16:14/28-FEB-2011 01:25:22/',
                   '05-FEB-2011 10:05:48/06-FEB-2011 10:49:18/18-FEB-2011 22:15:17/19-FEB-2011 10:41:35/19-FEB-2011 11:08:18/19-FEB-2011 11:21:35/28-FEB-2011 02:40:13/'
              FROM DUAL)
  SELECT *
    FROM (SELECT NE, raised, cleared
            FROM t
          MODEL RETURN UPDATED ROWS
             PARTITION BY (NE)
             DIMENSION BY (0 d)
             MEASURES (raised, cleared)
             RULES
                ITERATE (1000) UNTIL raised[ITERATION_NUMBER] IS NULL
                (raised [ITERATION_NUMBER + 1] =
                      REGEXP_SUBSTR (raised[0],
                                     '[^/]+',
                                     1,
                                     ITERATION_NUMBER + 1),
                cleared [ITERATION_NUMBER + 1] =
                      REGEXP_SUBSTR (cleared[0],
                                     '[^/]+',
                                     1,
                                     ITERATION_NUMBER + 1)))
   WHERE raised IS NOT NULL
ORDER BY NE

Similar Messages

  • Query using system parameter LEVEL returns incorrect huge number of records

We migrated our database from Oracle 9.2.0.6 to 11.2.0.1.
The query below throws "ORA-01788: CONNECT BY clause required in this query block".
select * from (
select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
from boards a, ALERT_MESSAGE_BOARD_TARGETS b
where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202) temp
start with board_parent_id = 0
connect by prior board_id = board_parent_id
ORDER SIBLINGS BY order_seq;
Based on online resources we modified "_allow_level_without_connect_by" by executing:
alter system set "_allow_level_without_connect_by"=true scope=spfile;
After performing the above, ORA-01788 is resolved.
The new issue is that the same query returns 9,015,853 records in 11g, whereas in 9i it returns 64 records. 9i returns the correct number of records, and the cause of 11g returning more records is the pseudocolumn LEVEL used in the query.
Why does 11g return an incorrectly huge number of records?
Any assistance to address this is greatly appreciated. Thanks!

The problem lies in the query.
The Oracle pseudocolumn LEVEL should not be used inside a subquery. After LEVEL is moved to the main query, the number of returned records is the same as in 9i.
select c.BOARD_ID, c.code, c.description, c.is_displayable, c.order_seq, c.board_parent_id, c.short_description, c.IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, c.top_parent_id, c.isTopSelected
    from (
    select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
    from boards a, ALERT_MESSAGE_BOARD_TARGETS b
    where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202
    ) c
    start with c.board_parent_id = 0
    connect by prior c.board_id = c.board_parent_id
    ORDER SIBLINGS BY c.order_seq
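A minimal reproduction of the rule (not from the original thread, just the two cases side by side): LEVEL is only meaningful in the query block that owns the CONNECT BY.

-- Fine: LEVEL and CONNECT BY live in the same block
SELECT LEVEL AS lvl FROM dual CONNECT BY LEVEL <= 3;   -- returns 1, 2, 3

-- ORA-01788: LEVEL sits in an inline view that has no CONNECT BY of its own,
-- even though the outer query has one
SELECT * FROM (SELECT LEVEL AS lvl FROM dual)
CONNECT BY lvl <= 3;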

  • BI Loading for Infoobject : Number of records

    Hi Experts,
When I load data from DataSource 0wbs_elemt_attr to InfoObject 0wbs_elemt, RSA3 shows 102117 records, but when I run the InfoPackage I see only 102035 records. Are there any checkpoints I can use to find out where the missing records are being dropped?
    Kindly advise.
    thanks & regards,
    M.S

    Hi experts,
I deleted everything and reloaded. It now reflects 1020117 records (I don't know how).
    Thankyou guys for your help. Assigned points.
    thanks & regards,
    M.S

  • Process huge number of records in XI

    Hello
I have a simple scenario where I send details from SAP to a legacy system on a weekly basis, with an SAP ABAP proxy as the sender and a file as the receiver.
My problem is that when the data is huge (around 600,000 records), it gives an RFC timeout error on the SAP ECC side. I tried sending a limited number of records per submit to XI, but that creates multiple files on the receiving FTP server.
Instead, can I use the collect pattern of BPM to gather the messages? Or would BPM be too much overhead here, given how huge the data is?
If someone has already tackled this problem, please respond.
    Thanks in advance.
    Regards
    Rajeev

    Hi Rajeev,
don't use BPM to solve this; BPM itself still has a lot of performance problems.
If possible, split the 600,000 records on the R/3 side into several proxy requests. If required, use the "add lines" mode of the file adapter to collect the messages into one file again. You should think about sending the messages Exactly Once In Order to avoid processing more than one message at the same time.
    Regards,
    Udo

  • Multiple columns for more than one record?

    Crystal Reports Xi
I have a list of addresses, etc., with attached route numbers. If an address has 2 different route numbers, it appears as two records. Is there a way to list the address once and have the route numbers grow across additional columns?
    Example - Change this...
    ADDRESS               ROUTE
    100 Main St.             1234
    100 Main St.             1235
    To This...
    100 Main St.            1234      1235
    Note:  An address may have one route number attached, or several route numbers attached.
I tried playing around with multiple columns, etc., but everything in the section kept becoming multiple columns. Any help is appreciated.

Multiple columns only work in the details section.
Place your address without the route in the first details section.
Then insert a second details section, place the route in it, and select multiple columns for that details section.
Then underlay the first details section and it should line up.
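If the report's data source is a database you control, you can also collapse the rows in SQL before they reach Crystal. A sketch against Oracle 11g or later; the table and column names are made up for the example:

SELECT address,
       LISTAGG(route, '   ') WITHIN GROUP (ORDER BY route) AS routes
  FROM address_routes
 GROUP BY address;
-- 100 Main St.   1234   1235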

  • CLIENT_TEXT_IO - Hanging on "PUT" for large number of records

I have successfully used CLIENT_TEXT_IO, but my users have run into an error where the form hangs and spits out details such as:
    "oracle.forms.net.HTTPNStream.doFlush"
    etc....
This happens when the number of records in the data block is high (e.g. 70,000 records). So my question is: is there a limit on how many lines you can write to a file?
I'm just creating a CSV file on the client's machine using CLIENT_TEXT_IO.PUT_LINE. It works fine on, say, a few thousand records, but after that it hangs.
I'm on Oracle Application Server 10g, Release 9.0.4, on Windows Server 2003, with forms compiled using Oracle Developer Suite 9.0.4.
    Thanks,
    Gio

    Hello,
When working with huge data, it is better to generate the file on the application server, then get it back to the client.
Read this article.
    Francois
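For reference, a minimal sketch of the server-side variant Francois suggests: the Forms TEXT_IO package has the same API shape as CLIENT_TEXT_IO but writes on the application server (the path and columns below are only placeholders):

DECLARE
  out_file TEXT_IO.FILE_TYPE;
BEGIN
  out_file := TEXT_IO.FOPEN('/tmp/export.csv', 'w');
  -- loop over the block or a cursor here, one PUT_LINE per record
  TEXT_IO.PUT_LINE(out_file, 'col1,col2,col3');
  TEXT_IO.FCLOSE(out_file);
END;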

  • How Calculate more than one value and store it into to different rows and column for each input?

Thanks guys, I am making progress in LabVIEW now.
But now I have new trouble. I have attached my LabVIEW file (LV 7.0.1/7.1) and an Excel form. I just couldn't calculate more than one input value: I want a different result for each value that I enter, stored in different rows and columns, but everything is stored in one row.
    Attachments:
    My Project.vi ‏31 KB
    rumus motor bakar.xls ‏14 KB

    duplicate post

  • Need help setting up rows and columns for a shop page in DW

I sell my books and design tees on my site, and I'm trying to set up a new page for a new site.
For some reason I can't get my products into a row-and-column layout.
I have attached a file so you can see what I actually mean. Thanks to all who give advice.

    Use a table for your product catalog.
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists
    www.alt-web.com/
    www.twitter.com/altweb
    www.alt-web.blogspot.com

  • Search row and column for return value

    Dear Sir/Madam,
I have a problem searching a spreadsheet and hope you can help me out a bit. I'm pretty new to LabVIEW and am currently using LabVIEW 8.0. My task is to search the attached spreadsheet row- and column-wise, then return the corresponding value. In my attempt (see the attached VI), I feed 'read from measurement file' into an array; using delete, index and search array, I find the index values for the relevant row and column, then feed those into an index array together with the original array from 'read from measurement file'.
So ultimately, when I enter a row value of 0.5 and a column value of 0.3, my output should be 1.688.
I can't see any mistakes in my logic, but I am getting really strange results: I can see that my data has been entered into an array, but when I try deleting the first column and putting the result into another array, the original array with nothing deleted is output, so my search returns -1. Could you take a look and give me any suggestions to solve the problem or improve the code a bit? Thank you for your time.
    Best Regards,
    Coato
P.S. For some reason I can't attach the .lvm file of my data, so I have attached the Excel version, but I think you need to convert it back to .lvm for the 'read from measurement file' function to work.
    Attachments:
    Backswing compensation.csv ‏10 KB
    Backswing comnpensation2.vi ‏109 KB

Your VI makes absolutely no sense to me, but maybe I don't understand what you are trying to do.
You seem to have dynamic data with 6 signals and 48 points/channel. You reshape this into an array of dynamic data with 4x13 elements, from which you slice out one row or column. "Delete from array" is NOT the correct tool for this; use "Index array" with one index unwired to get a row or column as a 1D array.
So you end up with two 1D arrays of dynamic data that you search for a DBL. It is difficult to understand how you want to search for an array element that corresponds to a scalar DBL value of 0.1: your array elements are NOT DBLs but dynamic data, each containing many signals!
There are two elements in all your data that are "3"; the rest are zero. You will never find anything that is 0.1.
Maybe you can convert your original dynamic data to a 2D array with "rows are signals" using "convert from dynamic data", then operate on the 2D array.
    Coato wrote:
                              So ultimately, when i enter a row value of 0.5 and a column value of 0.3, my output will be 1.688.
Sorry, please explain.
Please make a VI containing a simple 2D array as a diagram constant with e.g. 5x5 typical values, and let us know what kind of result you expect from your algorithm.
LabVIEW Champion. Do more with less code and in less time.

  • Co88 - for huge number of production orders -running for hours

    Hello All,
I have an issue: in our manufacturing plant we create a huge number of production orders every day, almost 1,200 per day. During month end it takes almost 8 to 10 hours to run settlement, which makes month-end closing very difficult. We also attempted parallel processing, but it errors out.
I have heard that the CO88 program looks at orders one by one, checking their status (e.g. closed), and that this is why it is so time consuming; I am not sure how far that is true.
I am sure this is a general issue that people have come across. Can anybody share their experience? How can we run settlement in the minimum possible time? Is there a note? Please guide me.
    thanks
    best regards
    gj

Yes, this is a generic issue. Most clients manage it by marking orders as closed/deleted, so that those orders are not considered for period-end settlement.
Also consider note 545932, and search notes for further help.

  • Function module for finding number of records in table

Do we have any function module to find the number of records in a table?
    Thanks,
    Shweta

Use EM_GET_NUMBER_OF_ENTRIES. You just have to write your table name into the input table IT_TABLES, in the TABNAME field.
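If you only need the count for one known table, a plain count does the same job without the function module; MARA here is just an example table:

SELECT COUNT(*) FROM mara;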

  • Performance issue fetching huge number of record with "FOR ALL ENTRIES"

    Hello,
We need to extract a huge amount of data (about 1,000,000 records) from the VBEP table, whose overall size is about 120 million records.
We currently use this statement:
    CHECK NOT ( it_massive_vbep[] IS INITIAL ) .
SELECT (list of fields) FROM vbep JOIN vbap
         ON vbep~vbeln = vbap~vbeln AND
            vbep~posnr = vbap~posnr
       INTO CORRESPONDING FIELDS OF w_sched
       FOR ALL ENTRIES IN it_massive_vbep
       WHERE vbep~vbeln = it_massive_vbep-tabkey-vbeln
         AND vbep~posnr = it_massive_vbep-tabkey-posnr
         AND vbep~etenr = it_massive_vbep-tabkey-etenr.
Notice that the internal table it_massive_vbep always contains records with a fully specified key.
Do you think this query could be further optimized?
    many thanks,
    -Enrico

There are 2 options to improve performance:
+ you should work in blocks of 10,000 to 50,000
+ you should check archiving options; does it really make sense to keep all of this online?
> VBEP table, whose overall size is about 120 million records.
Split it_massive_vbep into smaller blocks (it_vbep_2, "not so massive") and time each one:
CHECK NOT ( it_vbep_2[] IS INITIAL ).
GET RUN TIME FIELD start.
SELECT (list of fields)
       INTO CORRESPONDING FIELDS OF TABLE w_sched
       FROM vbep JOIN vbap
         ON vbep~vbeln = vbap~vbeln AND
            vbep~posnr = vbap~posnr
       FOR ALL ENTRIES IN it_vbep_2
       WHERE vbep~vbeln = it_vbep_2-vbeln
         AND vbep~posnr = it_vbep_2-posnr
         AND vbep~etenr = it_vbep_2-etenr.
GET RUN TIME FIELD stop.
t = stop - start.
WRITE: / t.
Be aware that even 10,000 will take some time.
Another question: how did you get the 1,000,000 records into it_massive_vbep? They were not typed in, but somehow selected.
Change the FAE into a JOIN and it will be much faster.
    Siegfried
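To illustrate that last point: if the keys in it_massive_vbep themselves come from a SELECT, fold that source into the join instead of driving through the internal table at all. A plain-SQL sketch, with driver_tab standing in for whatever query filled it_massive_vbep (hypothetical name):

SELECT ep.vbeln, ep.posnr, ep.etenr   -- plus whatever other fields you need
  FROM vbep ep
  JOIN vbap ap
    ON ap.vbeln = ep.vbeln
   AND ap.posnr = ep.posnr
  JOIN driver_tab d
    ON d.vbeln = ep.vbeln
   AND d.posnr = ep.posnr
   AND d.etenr = ep.etenr;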

  • Poor Response Time- Total on a column for n number of rows

    I have a table with column cost on my custom OAF page.
When I query for a configuration it returns many rows, so I set the default rows for the table to 10, with a Next button to page through the remaining rows.
I have enabled totalling on the cost column.
A Total row appears showing the sum of the costs only on the current page (only 10 rows). When I click Next from the dropdown, I see the total for the costs on the second page.
Ex: the table has 17 rows.
page 1: total row at the end saying 1000.00
page 2: total = 1500.00
I want to display the Total Cost by summing the costs for, say, 300 items returned by the query across all pages; in the above case, 2500.00.
I thought of a way to do it:
I added a sumVO with the query "select sum(...) from table", added a new region to my page, added a messageStyleText based on the sumVO, and pulled the total cost in.
It shows me the right result, but my problem is performance: it is getting very slow. I am using the same query as for displaying the results in the table, but summing the cost column.
Can I avoid writing the sum query and do it programmatically in OAF?
    Thanks in advance.

Even if you use a programmatic approach, what do you think the program will do?
It has to fetch all the rows into the middle tier and sum them in a for loop; that is not going to solve your problem.
First find out the reason for the slow performance using the Trace option, and fix the query.
If you are not able to fix it, try a materialized view for the summation query.
To take a SQL trace of an OAF page, refer to http://prasanna-adf.blogspot.com/2009/01/sql-trace.html
--Prasanna
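A sketch of the materialized view option (all names hypothetical; the refresh strategy depends on how fresh the total has to be):

CREATE MATERIALIZED VIEW cost_total_mv
  REFRESH COMPLETE ON DEMAND
AS
SELECT configuration_id, SUM(cost) AS total_cost
  FROM cost_lines
 GROUP BY configuration_id;

-- the summary VO then reads one precomputed row instead of re-aggregating:
SELECT total_cost FROM cost_total_mv WHERE configuration_id = :1;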

  • Table Properties: Tweaks for large number of records

    Hi experts,
I have a small table, about 5 columns, which will store records numbering in the millions (at least 7 million). Is there a specific list of properties or a checklist I should look into to ensure that the database stays healthy?
    Regards,
    Anand.

Make sure you have appropriate indexing for your query needs. Make sure you have sufficient and complete declarative integrity constraints in place; yes, foreign keys and check constraints matter and affect query plans. Seven million rows is not necessarily a lot; however, you might consider how the data is accessed, used, and purged, and see if partitioning is appropriate (if you are licensed for the option). Also make sure the SQL is as efficient as possible: for instance, if you update this table using a cursor loop, it will just get massively slower over time.
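For example, if the seven million rows arrive and age out by date, a range-partitioned layout with a local index keeps both queries and purging cheap. A sketch with made-up names, assuming a date column drives access:

CREATE TABLE event_log (
  id          NUMBER        NOT NULL,
  created_on  DATE          NOT NULL,
  payload     VARCHAR2(100),
  CONSTRAINT event_log_pk PRIMARY KEY (id)
)
PARTITION BY RANGE (created_on) (
  PARTITION p2010 VALUES LESS THAN (DATE '2011-01-01'),
  PARTITION p2011 VALUES LESS THAN (DATE '2012-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- a local index follows the partitioning, so old partitions can be dropped cheaply
CREATE INDEX event_log_created_i ON event_log (created_on) LOCAL;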

  • Time Limit exceeded Error while updating huge number of records in MARC

    Hi experts,
I have an interface requirement in which a third-party system sends a big file, say 3 to 4 MB, into SAP. In the proxy we use BAPI_MATERIAL_SAVEDATA to save the material/plant data. Because of the huge amount of data, the SAP queues are getting blocked, causing the time-limit-exceeded issue. As the BAPI can update a single material at a time, it is called for as many materials as we want to update.
Below is the relevant part of the code in my proxy:
* Call the BAPI to update the safety stock value.
    CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
      EXPORTING
        headdata    = gs_headdata
*       clientdata  =
*       clientdatax =
        plantdata   = gs_plantdata
        plantdatax  = gs_plantdatax
      IMPORTING
        return      = ls_return.

    IF ls_return-type <> 'S'.
      CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
      MOVE ls_return-message TO lv_message.
*     Populate the error table and process the next record.
      CALL METHOD me->populate_error
        EXPORTING
          message = lv_message.
      CONTINUE.
    ENDIF.
Can anyone please let me know the best possible approach for this issue?
Thanks in advance,
Jitender

Hi Raju,
Use the following routine to get fiscal year/period from a calendar day.
* Data definition:
DATA: l_arg1 TYPE rsfiscper,
      l_arg2 TYPE rsfo_date,
      l_arg3 TYPE t009b-periv.
* Calculation:
l_arg2 = tran_structure-post_date.  " <-- this is the date that you have to supply
l_arg3 = 'V3'.
CALL METHOD cl_rsar_function=>date_fiscper(
  EXPORTING
    i_date    = l_arg2
    i_per     = l_arg3
  IMPORTING
    e_fiscper = l_arg1 ).
RESULT = l_arg1.
Hope it will solve your problem!
Please assign points.
Best Regards,
SG
