Extract last date from DSO - Number ID through ABAP

Hi everyone.
I want to delete some records that I don't want written to the DSO, using an ABAP routine (in the transformation or the DTP). I am reading a number ID and a date (YYYYMMDD), but for a given number I have many dates, and I want to store only the maximum date of all of them. For example:
ID      Date
111   20111020
222   20111021
333   20111025
444   20111105
555   20111102
In the DSO I want to store only '444 20111105'. I tried this with a delta extraction, but the DSO just ends up with an arbitrary record; it seems the transformation reads the DSO at random.
I appreciate your response.
Thanks,
Ignacio

Perform a "selective deletion".
Right-click on the DSO and go to Manage. Next, go to the Contents tab and click the Selective Deletion button. This will ask you what criteria to use for the deletion. Enter your time restrictions and you're good to go.
Cheers,
B
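
For the ABAP-routine approach asked about in the original post, a minimal end-routine sketch is shown below. The field names ID and CALDAY are assumptions, as is the rule that only the newest record per ID should survive; RESULT_PACKAGE is the standard result table of a BW 7.x transformation end routine, so treat this as a sketch rather than the poster's actual model.

  " Keep only the newest record per ID: sort newest-first within each ID,
  " then drop every row after the first one of each ID.
  SORT RESULT_PACKAGE BY id ASCENDING calday DESCENDING.
  DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING id.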

Similar Messages

  • When I am extracting data from DSO to Cube by using DTP.

    When I extract data from a DSO to a Cube using a DTP, I get the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) (Message no. RSBK242).
    Data package processing terminated (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors' (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard business content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level. When I try to pull the data to the InfoProvider (cube) level using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, such as another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat file load.
    Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 installation.
    Please help me with this issue; I am not able to move forward, and it is urgent.

    hello
    Have you solved the problem?
    I have the same error...
    If you have solved it, can you help me, please? I have the same error.
    yimi castro garcia
    [email protected]

  • HT201210 the iPhone 4 will not connect through iTunes and has extracted all data from the device

    It seems like this could be an issue with the device. It will not connect long enough to extract the data.

    From the Community Etiquette page found here:
    1) Post only technical support questions and answers, unless otherwise noted.
    2) Be polite, constructive and stay on topic.
    3) Search for questions similar to your own before submitting a new question to the community.
    I'm certain if you try option 3 you'll find relevant information on how to proceed.

  • I am extracting the data from ECC to BW, but data loading is taking a long time

    Hi All,
    I am extracting data from ECC to the BI system, but the data load is taking a long time. The InfoPackage has been running for the last 6 hours and is still showing yellow. I manually set it to red, deleted the request, and applied a repeat of the last delta, but the same problem occurs. The status shows that the background job has not finished on the source system. We asked Basis, and they killed that job; we scheduled the chain again, and the same problem came back. How can I solve this issue?
    Thanks ,
    chandu

    Hi,
    There are different places to track your job. Once your job is triggered in BW, you can track where exactly the load is taking more time and why. Follow the steps below:
    1) After the InfoPackage is triggered, take the request number and go to the source system to check the extraction job status.
    You can get the job status by taking the request number from BW and going to transaction SM37 in ECC. Then give the request number with a leading '*' and a trailing '*'. Also give '*' as the user name.
    Job name:  REQ_XXXXXX
    User Name: *
    Check whether the job is completed, cancelled, or ended in a short dump. If the job is still running, check in SM66 whether you can see any process. If not, check ST22 or SM21 in ECC accordingly. If the job is complete, then do the same check on the BW side.
    2) Check whether the data arrived in the PSA; if not, check whether the transfer routines or start routines contain bad SQL or code. Do the same for the update rules.
    3) Once it is through the source system (ECC), transfer rules, and update rules, the next task is updating the data target, which can take more time depending on some parameters (e.g. the number of parallel processes updating the database). Check whether updating the database is taking more time; you may also need to check with the DBA.
    At all times you should see at least one process running in SM66 until your job completes. If not, you will see a log in ST22.
    Let me know if you still have questions.
    Assigning points is the only way of saying thanks in SDN.
    Thanks,
    Kumar.

  • Issue in Data from DSO to DSO Target with different Key

    Hello All,
    I am having an issue with data from a DSO to a DSO target with a different key.
    The source DSO has Employee + Wage Type + Post Number as its key, and the target has Employee + Wage Type as its key.
    DSO semantic grouping works like a GROUP BY clause in SQL; is my understanding right?
    Also, if someone could explain this with a small example, it would be great.
    Many Thanks
    Krishna

    Dear, as explained earlier, your issue has nothing to do with semantic grouping.
    Semantic grouping is only useful when you have written a routine in the transformation for calculations, and for error handling.
    Please go through this blog, which explains the use of semantic groups very clearly:
    http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
    Now coming to your question above:
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          cc          300
    If we have the semantic group as Employee, Employee as the key of the target DSO, and update type Summation,
    then the target DSO will have
    Emp                Amount
    100                 700
    200                 200
    In this case the wage type will come from the last record arriving in the data package. If the record 100  cc  300 arrives last, then the wage type will be cc.
    2) Case 2
    DSO 1
    Employee WageTyp Amount
    100          aa          200
    200          aa          200
    100          bb          400
    100          aa          300
    If we do semantic grouping with Employee and Wage Type, and we have Employee and Wage Type as the key of the target DSO with update type Summation,
    then the target DSO will have
    Emp     Wage      Amount
    100          aa          500
    200          aa          200
    100          bb          400
    Hope this helps.
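    To make the summation behaviour concrete, here is a small, self-contained ABAP sketch that rebuilds case 1 by hand. The structure and table names are invented for the illustration; in a real flow the target DSO key and update type "Summation" do this for you during activation.
    TYPES: BEGIN OF ty_rec,
             employee TYPE n LENGTH 8,
             amount   TYPE p LENGTH 8 DECIMALS 2,
           END OF ty_rec.
    DATA: lt_source TYPE STANDARD TABLE OF ty_rec,
          lt_target TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY employee,
          ls_rec    TYPE ty_rec.
    " Source rows from case 1 above (wage type omitted for brevity)
    ls_rec-employee = '00000100'. ls_rec-amount = 200. APPEND ls_rec TO lt_source.
    ls_rec-employee = '00000200'. ls_rec-amount = 200. APPEND ls_rec TO lt_source.
    ls_rec-employee = '00000100'. ls_rec-amount = 400. APPEND ls_rec TO lt_source.
    ls_rec-employee = '00000100'. ls_rec-amount = 300. APPEND ls_rec TO lt_source.
    " COLLECT adds up the numeric components of rows sharing the same key,
    " which mirrors update type "Summation" per target DSO key.
    LOOP AT lt_source INTO ls_rec.
      COLLECT ls_rec INTO lt_target.
    ENDLOOP.
    " lt_target now holds 100 -> 700 and 200 -> 200, as in the table above.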

  • How can I use Automator to extract specific Data from a text file?

    I have several hundred text files that contain a bunch of information. I only need six values from each file and ideally I need them as columns in an excel file.
    How can I use Automator to extract specific data from the text files and either create a new text file or an Excel file with the info? I have looked all over but can't find a solution. If anyone could please help, I would be eternally grateful! If there is another, better solution than Automator, please let me know!
    Example of File Contents:
    Link Time =
    DD/MMM/YYYY
    Random
    Text
    161 179  bytes of CODE    memory (+    68 range fill )
    16 789   bytes of DATA    memory (+    59 absolute )
    1 875    bytes of XDATA   memory (+ 1 855 absolute )
    90 783   bytes of FARCODE memory
    What I would like to have as a final file:
    EXCEL COLUMN1   Column 2    Column3   Column4   Column5   Column6
    MM/DD/YYYY      filename1   161179    16789     1875      90783
    MM/DD/YYYY      filename2   xxxxxx    xxxxx     xxxx      xxxxx
    MM/DD/YYYY      filename3   xxxxxx    xxxxx     xxxx      xxxxx
    Is this possible? I can't imagine having to go through each and every file one by one. Please help!!!

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder where to start searching for *.map files and then create a CSV file named "out.csv" on desktop which you may import to Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF"
    Hope this may help,
    H

  • Problem in transfering data from DSO to InfoCube

    Hi,
    I am trying to transfer data from a DSO to an InfoCube, but it gives the following message:
    Last delta update is not yet completed.
    Therefore, no new delta update is possible.
    You can start the request again
    if the last delta request is red or green in the monitor (QM activity).
    It is not letting me load data to the InfoCube from the DSO. I checked the request number, but there is no request number in the InfoCube. If the request number were there, I could delete the request from the InfoCube, delete the data mart status in the DSO, and reload again. But there is no request ID in the InfoCube, so please guide me on how I can transfer data to the InfoCube and overcome this problem.
    Actually, a further update was running from the DSO to the InfoCube, but it failed. By executing function module RSPC_PROCESS_FINISH, I set the chain to green before loading data to the InfoCube, so it started executing the next chain, and therefore there is no request ID. So please guide me on how I can retain this request ID.
    Edited by: Purushotham Reddy on Jan 28, 2008 11:19 AM

    Hi Reddy,
    Set the QM status of the failed request to red, whatever status it currently shows,
    and then run the same InfoPackage used in the chain. This will ask for a repeat of the last delta.
    If this process is clear to you, then fine; otherwise read below:
    The other option is to check whether this failed request is present in the PSA. If it is, then update the failed request from the PSA; in that case you won't need to do a repeat of the last delta. This option is easier. You can do it from the Manage tab of the DataSource, which I think you surely know already....
    Assign points if it helps you.
    Thanks,
    Nagesh.

  • Example on extracting time data from clusters

    Hi Experts,
    Please send me sample code for time data extraction from clusters, or the steps to extract time data from clusters.
    Thank you.
    Regards,
    Admir.
    Points will be rewarded.

    Hi
    See the sample report below for fetching time data.
    Note that time data is not fetched from clusters; only payroll data is fetched from clusters.
    Time-related infotypes are 2000 to 2999.
    REPORT zh_absence_quota
           NO STANDARD PAGE HEADING
           MESSAGE-ID zh_msg
           LINE-SIZE 169
           LINE-COUNT 60(1).
                   T A B L E S  D E C L A R A T I O N S
    TABLES:    pernr,    " Logical PNP
               t001p,    " Personnel Subarea
               t529u,    " Employment Status
               t500p,    " Personnel Area
               t501,     " Employee Group
               t503k,    " Employee Subgroup
               t549a,    " Payroll Area
               t554s,    " Absence Type
               t554t,    " Absence Type Texts
               t556a,    " Quota Type
               t527x,    " Orgn. Unit
               t556b,    " Quota Type Text
               pa0003.   " Payroll Status
    INFOTYPES:
               0000,   " Actions
               0001,   " Organizational Assignment
               2006,   " Absence Quota
               2001.   " Absences
                   T Y P E S  D E C L A R A T I O N S
    Employee Absence Structure
    TYPES: BEGIN OF s_2001,
             pernr TYPE persno,       " Personal Number
             awart TYPE awart,        " Absence Type
             subty TYPE subty,        " Sub Type
             endda TYPE endda,        " End date
             begda TYPE begda,        " Begin date
             abrtg TYPE abrtg,        " Absence days
             ename TYPE emnam,        " employee Name
             atext TYPE abwtxt,       " Absence Type Text
           END OF s_2001.
    Employee Absence Quota Structure
    TYPES: BEGIN OF s_2006,
             pernr TYPE persno,       " Personal Number
             ktart TYPE abwko,        " Absence Quota Type
             year(4) TYPE n,          " Year
             subty TYPE subty,        " Sub Type
             endda TYPE endda,        " End date
             begda TYPE begda,        " Begin date
             anzhl TYPE ptm_quonum,   " Absence Entitlement days
             ename TYPE emnam,        " employee Name
             ktext TYPE kotxt,        " Absence Type Text
             kverb TYPE kverb,        " Deduction Quota days
             anzhb TYPE ptm_quonum,   " Balance days
           END OF s_2006.
    Combined Employee Absence and Quota Structure
    TYPES: BEGIN OF s_rep,
             pernr TYPE persno,       " Personal Number
             ktart TYPE abwko,        " Absence Quota Type
             year(4) TYPE n,          " Year
             anzhl TYPE ptm_quonum,   " Absence Entitlement days
             kverb TYPE kverb,        " Deduction Quota days
             anzhb TYPE ptm_quonum,   " Balance days
             ktext TYPE kotxt,        " Quota Type Text
             awart TYPE awart,        " Absence Type
             abrtg TYPE abrtg,        " Absence days
             ename TYPE emnam,        " employee Name
             atext TYPE abwtxt,       " Absence Type Text
             endda TYPE endda,        " End date
             begda TYPE begda,        " Begin date
           END OF s_rep.
    Declaration of Variables
    DATA : gv_atext TYPE abwtxt,              " Absence Type Text
           gv_ktext TYPE kotxt,               " Absence Type Text
           gv_title1   TYPE sylisel,          " Report title
           gv_year(4)  TYPE c,                " Year
           gv_mon(2)   TYPE c,                " Month
           gv_hrs    TYPE abwtg,              " Hours
           gv_date   TYPE sydatum,            " Date
           gv_date1  TYPE sydatum,            " Date
           gv_dial.                           " Color flag
    Declaration of Constants
    CONSTANTS :
      c_x      TYPE c VALUE 'X',               " Sign
      c_1      TYPE persg   VALUE '1',         " Emp Group
      c_pernr(8) TYPE n VALUE '00000000',      " Pernr
      c_moabw  TYPE moabw   VALUE '01',        " Per SA Grouping
      c_mozko  TYPE mozko   VALUE '01',        " Per SA Grouping
      c_mopgk  TYPE mopgk   VALUE '1',         " Emp SGrp Grouping
      c_endda  TYPE sydatum VALUE '99991231',  " End Date
      c_val1(2) TYPE c VALUE '31',             " Date Type
      c_val2(2) TYPE c VALUE '12',             " Date Type
      c_val    LIKE p0041-dar01 VALUE '01',    " Date Type
      c_date1  LIKE sy-datum VALUE '18000101'. " Date
         I N T E R N A L  T A B L E S  D E C L A R A T I O N S
    DATA: i_2001 TYPE STANDARD TABLE OF s_2001 WITH HEADER LINE,
          i_2006 TYPE STANDARD TABLE OF s_2006 WITH HEADER LINE,
          i_rep1 TYPE STANDARD TABLE OF s_2006 WITH HEADER LINE,
          i_rep  TYPE STANDARD TABLE OF s_rep WITH HEADER LINE.
                     S E L E C T I O N  S C R E E N
    SELECTION-SCREEN BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
    SELECT-OPTIONS: s_ktart FOR t556a-ktart,  " Absence Quota Type
                    s_awart FOR t554s-subty.  " Absence Type
    SELECTION-SCREEN END OF BLOCK b1.
    SELECTION-SCREEN BEGIN OF BLOCK b2 WITH FRAME TITLE text-002.
    SELECTION-SCREEN BEGIN OF LINE.
    SELECTION-SCREEN COMMENT 1(33) text-003.   " Quota & Absence
    PARAMETERS: p_qa RADIOBUTTON GROUP rb1.
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN BEGIN OF LINE.
    SELECTION-SCREEN COMMENT 1(33) text-004.   " Quota
    PARAMETERS: p_q RADIOBUTTON GROUP rb1.
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN BEGIN OF LINE.
    SELECTION-SCREEN COMMENT 1(33) text-005.   " Absence
    PARAMETERS: p_a RADIOBUTTON GROUP rb1.
    SELECTION-SCREEN END OF LINE.SELECTION-SCREEN END OF BLOCK b2.
                      I N I T I A L I Z A T I O N                        *
    INITIALIZATION.
                  A T  S E L E C T I O N - S C R E E N                   *
    AT SELECTION-SCREEN.
    Validate the screen fields
      PERFORM validate_screen.
                   S T A R T - O F - S E L E C T I O N                   *
    START-OF-SELECTION.
    Selection of Period
      PERFORM get_period.
    Get PERNR from LDB
    GET pernr.
      IF p0000-stat2 <> '0'.
    Get the data from PA0001,PA2001, PA2006
        PERFORM get_pers_data.
      ENDIF.
                   T O P - O F - P A G E                                 *
    TOP-OF-PAGE.
    Header of the List
      PERFORM header.
                   E N D - O F - P A G E                                 *
    Footer
    END-OF-PAGE.
      IF p_qa = c_x.
    Display both Absence and Quota Data
        WRITE /1(188) sy-uline.
      ELSEIF p_q = c_x.
    Display only Quota Data
        WRITE /1(114) sy-uline.
      ELSEIF p_a = c_x.
    Display only Absence Data
        WRITE /1(125) sy-uline.
      ENDIF.
                   E N D - O F - S E L E C T I O N                       *
    END-OF-SELECTION.
    Combine the Absence and Quota Data
      PERFORM append_data.
      IF p_qa = c_x.
    Display both Absence and Quota Data
        PERFORM display_qa_data.
      ELSEIF p_q = c_x.
    Display only Quota Data
        PERFORM display_q_data.
      ELSEIF p_a = c_x.
    Display only Absence Data
        PERFORM display_a_data.
      ENDIF.
    *&      Form  validate_screen
    Validation of Selection Screen fields
    FORM validate_screen .
    Validation of Personnel Number
      CLEAR pa0003.
      IF NOT pnppernr[] IS INITIAL.
        SELECT pernr
        FROM pa0003 UP TO 1 ROWS
          INTO pa0003-pernr
          WHERE pernr IN pnppernr.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Personnel Number'(006).
        ENDIF.
      ENDIF.
    Validation of Employee Status
      CLEAR t529u.
      IF NOT pnpstat2[] IS INITIAL.
        SELECT SINGLE statv
          INTO t529u-statv
          FROM t529u
          WHERE statv IN pnpstat2 AND
                statn = '2' AND
                sprsl = sy-langu.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Invalid Employee Status'(007).
        ENDIF.
      ENDIF.
    Validation of Personnel Area
      CLEAR t500p.
      IF NOT pnpwerks[] IS INITIAL.
        SELECT persa
        FROM t500p UP TO 1 ROWS
          INTO t500p-persa
          WHERE persa IN pnpwerks.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Personnel Area'(008).
        ENDIF.
      ENDIF.
    Validation of Personnel Sub Area
      CLEAR t001p.
      IF NOT pnpbtrtl[] IS INITIAL.
        SELECT btrtl
        FROM t001p UP TO 1 ROWS
          INTO t001p-btrtl
          WHERE btrtl IN pnpbtrtl.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Personnel Sub Area'(009).
        ENDIF.
      ENDIF.
    Validation of Employee Group
      CLEAR t501.
      IF NOT pnppersg[] IS INITIAL.
        SELECT persg
        FROM t501 UP TO 1 ROWS
          INTO t501-persg
          WHERE persg IN pnppersg.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Employee Group'(010).
        ENDIF.
      ENDIF.
    Validation of Employee Sub Group
      CLEAR t503k.
      IF NOT pnppersk[] IS INITIAL.
        SELECT persk
        FROM t503k UP TO 1 ROWS
          INTO t503k-persk
          WHERE persk IN pnppersk.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Employee Sub Group'(011).
        ENDIF.
      ENDIF.
    Validation of Payroll Area
      CLEAR t549a.
      IF NOT pnpabkrs[] IS INITIAL.
        SELECT abkrs
        FROM t549a UP TO 1 ROWS
          INTO t549a-abkrs
          WHERE abkrs IN pnpabkrs.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Employee Payroll Area'(026).
        ENDIF.
      ENDIF.
    Validation of Absence Type
      CLEAR t554s.
      IF NOT s_awart[] IS INITIAL.
        SELECT subty
        FROM t554s UP TO 1 ROWS
          INTO t554s-subty
          WHERE subty IN s_awart AND
                moabw EQ c_moabw AND
                endda EQ c_endda.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Employee Absence Type'(012).
        ENDIF.
      ENDIF.
    Validation of Absence Quota Type
      CLEAR t556a.
      IF NOT s_ktart[] IS INITIAL.
        SELECT ktart
        FROM t556a UP TO 1 ROWS
          INTO t556a-ktart
          WHERE ktart IN s_ktart AND
                mopgk EQ c_mopgk AND
                mozko EQ c_mozko AND
                endda EQ c_endda.
        ENDSELECT.
        IF sy-subrc <> 0.
          MESSAGE e999 WITH 'Incorrect Employee Quota Type'(013).
        ENDIF.
      ENDIF.
    ENDFORM.                  "validate_screen
    *&      Form  get_period
    Get the Correct Period based on Selection screen selection
    FORM get_period.
      CLEAR: gv_year,gv_mon, gv_date, gv_date1.
      gv_year = sy-datum+0(4).
      gv_mon  = sy-datum+4(2).
      IF pnptimr1 = c_x.      " Current Date
        pnpbegda = sy-datum.
        pnpendda = sy-datum.
      ELSEIF pnptimr2 = c_x.  " Current Month
        CONCATENATE gv_year gv_mon c_val INTO gv_date.
        CONCATENATE gv_year gv_mon c_val1 INTO gv_date1.
        pnpbegda = gv_date.
        pnpendda = gv_date1.
      ELSEIF pnptimr3 = c_x.  " Current Year
        CONCATENATE gv_year c_val c_val INTO gv_date.
        CONCATENATE gv_year c_val2 c_val1 INTO gv_date1.
        pnpbegda = gv_date.
        pnpendda = gv_date1.
      ELSEIF pnptimr4 = c_x.  " Upto Today
        pnpbegda = c_date1.
        pnpendda = sy-datum.
      ELSEIF pnptimr5 = c_x.  " From Today
        pnpbegda = sy-datum.
        pnpendda = c_endda.
      ELSE.
        IF ( pnpbegda IS INITIAL AND pnpendda IS INITIAL ).
          pnpbegda = c_date1.
          pnpendda = c_endda.
        ELSEIF pnpbegda IS INITIAL AND NOT pnpendda IS INITIAL.
          pnpbegda = c_date1.
          pnpendda = pnpendda.
        ELSEIF NOT ( pnpbegda IS INITIAL AND pnpendda IS INITIAL ).
          pnpbegda = pnpbegda.
          pnpendda = pnpendda.
        ENDIF.
      ENDIF.
    ENDFORM.              "get_period
    *&      Form  get_pers_data
    Get the Absence and Quota Data from PA0001,PA2001,PA2006
    FORM get_pers_data.
      DATA: lv_year1(4) TYPE n,
            lv_year2(4) TYPE n,
            lv_date1 TYPE sydatum,
            lv_date2 TYPE sydatum,
            lv_anzhb TYPE ptm_quonum.   " Last Year Balance days
    Get data from Respective Infotypes
      rp_provide_from_last p0001 space pnpbegda pnpendda.
    Absence Data
      LOOP AT p2001 WHERE pernr = pernr-pernr AND
                          begda GE pnpbegda   AND
                          endda LE pnpendda.
        IF p2001-awart IN s_awart.
          i_2001-pernr    = pernr-pernr.
          i_2001-subty    = p2001-subty.
          i_2001-awart    = p2001-awart.
          i_2001-abrtg    = p2001-abrtg.
          i_2001-begda    = p2001-begda.
          i_2001-endda    = p2001-endda.
          READ TABLE p0001 WITH KEY pernr = p2001-pernr.
          i_2001-ename    = p0001-ename.
    Get the Absence Type Text
          CLEAR gv_atext.
          SELECT SINGLE atext INTO gv_atext FROM t554t
                  WHERE sprsl = sy-langu AND
                        moabw = c_moabw  AND
                        awart = p2001-awart.
          IF sy-subrc = 0.
            i_2001-atext = gv_atext.
          ENDIF.
          APPEND i_2001.
          CLEAR i_2001.
        ENDIF.
      ENDLOOP.
    Quota Data
      LOOP AT p2006 WHERE pernr = pernr-pernr AND
                          begda GE pnpbegda   AND
                          endda LE pnpendda.
        IF p2006-ktart IN s_ktart.
          i_2006-pernr    = pernr-pernr.
          i_2006-subty    = p2006-subty.
          i_2006-begda    = p2006-begda.
          i_2006-endda    = p2006-endda.
          i_2006-year     = p2006-endda+0(4).
          i_2006-ktart    = p2006-ktart.
          i_2006-anzhl    = p2006-anzhl.
          i_2006-kverb    = p2006-kverb.
          i_2006-anzhb    = p2006-anzhl - p2006-kverb.
          READ TABLE p0001 WITH KEY pernr = p2001-pernr.
          i_2006-ename    = p0001-ename.
    Get the Quota Type Text
          CLEAR gv_ktext.
          SELECT SINGLE ktext INTO gv_ktext FROM t556b
                  WHERE sprsl = sy-langu AND
                        mopgk = c_mopgk  AND
                        mozko = c_mozko  AND
                        ktart = p2006-ktart.
          IF sy-subrc = 0.
            i_2006-ktext = gv_ktext.
          ENDIF.
          APPEND i_2006.
          CLEAR i_2006.
        ENDIF.
      ENDLOOP.
    For Vacation Quota (80) get the Balance of the Last Year and
    add to the Current Year Quota
      LOOP AT i_2006.
        IF i_2006-ktart = '80'.
          lv_year1 = i_2006-endda+0(4).
          lv_year2 = lv_year1 - 1.
          CONCATENATE lv_year2 '01' '01' INTO lv_date1.
          CONCATENATE lv_year2 '12' '31' INTO lv_date2.
          LOOP AT p2006 WHERE pernr = i_2006-pernr AND
                              begda GE lv_date1    AND
                              endda LE lv_date2    AND
                              ktart = '80'.
            lv_anzhb = p2006-anzhl - p2006-kverb.
            i_rep1-pernr = i_2006-pernr.
            i_rep1-ktext = i_2006-ktext.
            i_rep1-anzhl = p2006-anzhl.
            i_rep1-kverb = p2006-kverb.
            i_rep1-ename = i_2006-ename.
            i_rep1-begda = p2006-begda.
            i_rep1-endda = p2006-endda.
            i_rep1-anzhb = lv_anzhb.
            i_rep1-ktart = '80'.
            i_rep1-year = lv_year2.
            APPEND i_rep1.
            CLEAR: i_rep1.
          ENDLOOP.
        ENDIF.
        CLEAR: lv_year1, lv_year2,
               lv_date1, lv_date2,lv_anzhb.
      ENDLOOP.
      SORT i_rep1 BY pernr ktart.
    ENDFORM.          "get_pers_data
    *&      Form  append_data
    Put the Absence and Quota Data into one Report Int Table
    FORM append_data.
      CLEAR:   i_rep.
      REFRESH: i_rep.
      SORT i_2001 BY pernr awart.
      SORT i_2006 BY pernr ktart year.
    Move I_REP1 data into i_2006
      LOOP AT i_rep1.
        MOVE-CORRESPONDING i_rep1 TO i_2006.
        APPEND i_2006.
        CLEAR  i_2006.
      ENDLOOP.
    Move the Absence and Quota Data into a final Int Table
      LOOP AT i_2006.
        i_rep-pernr = i_2006-pernr.
        i_rep-ename = i_2006-ename.
        i_rep-ktart = i_2006-ktart.
        i_rep-anzhl = i_2006-anzhl.
        i_rep-kverb = i_2006-kverb.
        i_rep-ktext = i_2006-ktext.
        i_rep-anzhb = i_2006-anzhb.
        i_rep-year  = i_2006-year.
        CLEAR i_2001.
        CASE i_2006-ktart.
          WHEN '81'.
            PERFORM get_2001 USING i_2006-pernr '1000' i_2006-year.
          WHEN '50'.
            PERFORM get_2001 USING i_2006-pernr '1002' i_2006-year.
          WHEN '80'.
            PERFORM get_2001 USING i_2006-pernr '1001' i_2006-year.
          WHEN '56'.
            PERFORM get_2001 USING i_2006-pernr '1003' i_2006-year.
          WHEN '51'.
            PERFORM get_2001 USING i_2006-pernr '1004' i_2006-year.
          WHEN '52'.
            PERFORM get_2001 USING i_2006-pernr '1005' i_2006-year.
          WHEN '54'.
            PERFORM get_2001 USING i_2006-pernr '1006' i_2006-year.
          WHEN '53'.
            PERFORM get_2001 USING i_2006-pernr '1007' i_2006-year.
          WHEN '55'.
            PERFORM get_2001 USING i_2006-pernr '1008' i_2006-year.
          WHEN '57'.
            PERFORM get_2001 USING i_2006-pernr '1009' i_2006-year.
          WHEN '90'.
            PERFORM get_2001 USING i_2006-pernr '2000' i_2006-year.
          WHEN '58'.
            PERFORM get_2001 USING i_2006-pernr '2001' i_2006-year.
          WHEN '59'.
            PERFORM get_2001 USING i_2006-pernr '2002' i_2006-year.
          WHEN '91'.
            PERFORM get_2001 USING i_2006-pernr '2003' i_2006-year.
        ENDCASE.
        IF sy-subrc <> 0.
          APPEND i_rep.
        ENDIF.
        CLEAR i_rep.
      ENDLOOP.
      SORT i_rep BY pernr ktart year.
      DELETE i_rep WHERE pernr = ' '.
    ENDFORM.              " append_data
    *&      Form  display_qa_data
    Display the Absence and Quota Data
    FORM display_qa_data.
      DATA: lv_flag,                   " New Flag
            lv_tot2 TYPE ptm_quonum.   " Absence Balance days
      IF i_rep[] IS INITIAL.
        MESSAGE i000 WITH 'No Data found'(014).
      ELSE.
        LOOP AT i_rep.
    toggle color
          PERFORM toggle_color.
          IF lv_flag <> space.
            NEW-LINE.
          ENDIF.
          AT NEW pernr.
            READ TABLE i_rep INDEX sy-tabix.
            WRITE:/1 sy-vline,2(8) i_rep-pernr,
              10 sy-vline,11(40)   i_rep-ename.
          ENDAT.
          AT NEW ktart.
            READ TABLE i_rep INDEX sy-tabix.
            WRITE: 1 sy-vline, 10 sy-vline,
              51 sy-vline,52(25)   i_rep-ktext.
          ENDAT.
          AT NEW year.
            READ TABLE i_rep INDEX sy-tabix.
            WRITE: 1 sy-vline, 10 sy-vline,
                  51 sy-vline,
                  77 sy-vline, 78(4)  i_rep-year,
                  82 sy-vline, 83(11) i_rep-anzhl,
                  94 sy-vline, 95(25) i_rep-atext,
                 120 sy-vline,133 sy-vline,
                 144 sy-vline,
                 155 sy-vline,156(13)  i_rep-anzhb,
                 169 sy-vline.
          lv_tot2 = lv_tot2 + i_rep-anzhb.
          ENDAT.
          WRITE: 1 sy-vline,  10 sy-vline,
                51 sy-vline,  77 sy-vline,
                82 sy-vline,  94 sy-vline,
               120 sy-vline,121(12)  i_rep-abrtg NO-ZERO,
               133 sy-vline,134(10)  i_rep-begda NO-ZERO,
               144 sy-vline,145(10)  i_rep-endda NO-ZERO,
               155 sy-vline,169 sy-vline.
          NEW-LINE.
          AT END OF pernr.
            WRITE  : /1(169) sy-uline.
            SUM.
            FORMAT COLOR 3.
            WRITE:/1 sy-vline,   10 sy-vline,
                  51 sy-vline,   77 sy-vline,
                  82 sy-vline,   94 sy-vline,
                 120 sy-vline,121(12) i_rep-abrtg,
                 133 sy-vline,144 sy-vline,
                 155 sy-vline, 156(13) lv_tot2,
                 169 sy-vline.
            FORMAT COLOR OFF.
            WRITE  : /1(169) sy-uline.
            CLEAR lv_tot2.
          ENDAT.
        ENDLOOP.
      ENDIF.
    ENDFORM.              " display_qa_data
    *&      Form  display_q_data
    Display only the Quota Data
    FORM display_q_data.
      DATA: lv_flag.               " New Flag
      SORT i_2006 BY pernr ktart year.
      IF i_2006[] IS INITIAL.
        MESSAGE i000 WITH 'No Data found'(014).
      ELSE.
        LOOP AT i_2006.
    Toggle Color
          PERFORM toggle_color.
          IF lv_flag <> space.
            NEW-LINE.
          ENDIF.
          AT NEW pernr.
            READ TABLE i_2006 INDEX sy-tabix.
            WRITE: /1 sy-vline,  2(8)  i_2006-pernr,
                   10 sy-vline,11(40)  i_2006-ename.
          ENDAT.
          AT NEW ktart.
            READ TABLE i_2006 INDEX sy-tabix.
            WRITE: 1 sy-vline,10 sy-vline,
                  51 sy-vline,52(25)  i_2006-ktext.
          ENDAT.
          AT NEW year.
            READ TABLE i_2006 INDEX sy-tabix.
            WRITE: 1 sy-vline,  10 sy-vline,
                  51 sy-vline,
                  77 sy-vline,78(4)  i_2006-year,
                  82 sy-vline,83(11) i_2006-anzhl,
                  94 sy-vline,95(13) i_2006-anzhb,
                 108 sy-vline.
            NEW-LINE.
          ENDAT.
          AT END OF pernr.
            WRITE  : /1(108) sy-uline.
            SUM.
            FORMAT COLOR 3.
            WRITE: /1 sy-vline, 10 sy-vline,
                   51 sy-vline,
                   77 sy-vline, 82 sy-vline,
                   94 sy-vline, 95(13) i_2006-anzhb,
                  108 sy-vline.
            FORMAT COLOR OFF.
            WRITE  : /1(108) sy-uline.
          ENDAT.
        ENDLOOP.
      ENDIF.
    ENDFORM.              " display_q_data
    *&      Form  display_a_data
    Display Only the Absence Quota
    FORM display_a_data.
      DATA: lv_flag.               " New Flag
      SORT i_2001 BY pernr awart.
      IF i_2001[] IS INITIAL.
        MESSAGE i000 WITH 'No Data found'(014).
      ELSE.
        LOOP AT i_2001.
    Toggle Color
          PERFORM toggle_color.
          IF lv_flag <> space.
            NEW-LINE.
          ENDIF.
          AT NEW pernr.
            READ TABLE i_2001 INDEX sy-tabix.
            WRITE: /1 sy-vline, 2(10) i_2001-pernr,
                   10 sy-vline,11(40) i_2001-ename.
          ENDAT.
          WRITE:  1 sy-vline, 10 sy-vline,
                  51 sy-vline,52(25) i_2001-atext,
                  77 sy-vline,78(12) i_2001-abrtg,
                  90 sy-vline,91(10) i_2001-begda,
                 101 sy-vline,102(10) i_2001-endda,
                 112 sy-vline.
          NEW-LINE.
          AT END OF pernr.
            WRITE  : /1(112) sy-uline.
            SUM.
            FORMAT COLOR 3.
            WRITE: /1 sy-vline, 10 sy-vline,
                   51 sy-vline,
                   77 sy-vline,78(12) i_2001-abrtg,
                   90 sy-vline,101 sy-vline,
                  112 sy-vline.
            FORMAT COLOR OFF.
            WRITE  : /1(112) sy-uline.
          ENDAT.
        ENDLOOP.
      ENDIF.
    ENDFORM.              " display_a_data
    *&      Form  header
    Write the Report Header
    FORM header .
      data : lv_pers type pbtxt,
             lv_orgn type orgtx.
      gv_title1 = sy-title.              " Set List Header
      IF p_qa = c_x.
        NEW-PAGE LINE-SIZE 193.
      ELSEIF p_a = c_x.
        NEW-PAGE LINE-SIZE 125.
      ELSEIF p_q = c_x.
        NEW-PAGE LINE-SIZE 119.
      ENDIF.
    Standard header
      FORMAT RESET.
      CALL FUNCTION 'Z_STANDARD_HEADER'
        EXPORTING
          title1 = gv_title1.
    Get the Personal Area and Org.Unit Texts
      clear : lv_pers,lv_orgn.
      select single name1 into lv_pers
        from t500p where persa = pnpwerks-low.
      select single orgtx into lv_orgn
        from t527x where sprsl = sy-langu and
                         orgeh = pnporgeh-low and
                         endda = c_endda.
      if not lv_pers is initial.
        write : /2 'Personal Area:'(017), 17(25) lv_pers color 7.
      endif.
      if not lv_orgn is initial.
        write : /2 'Organization Unit:'(021), 20(25) lv_orgn color 3.
      endif.
      IF p_qa = c_x.
        FORMAT COLOR COL_HEADING.
        WRITE  : /1(169) sy-uline.
        WRITE:/1 sy-vline,2(8)    'Emp.No'(015) CENTERED,
              10 sy-vline,11(40)  'Employee Name'(016) CENTERED,
              51 sy-vline,52(25)  'Quota Description'(018) CENTERED,
              77 sy-vline,78(4)   'Year'(027),
              82 sy-vline,83(11)  'Entitlement'(019),
              94 sy-vline,95(25)  'Absence Description'(022) CENTERED,
             120 sy-vline,121(12) 'Absence days'(023),
             133 sy-vline,134(10) 'From Date'(024),
             144 sy-vline,145(10) 'To Date'(025),
             155 sy-vline,156(13) 'Quota Balance'(020),
             169 sy-vline.
        WRITE  : /1(169) sy-uline.
      ELSEIF p_q = c_x.
        FORMAT COLOR COL_HEADING.
        WRITE  : /1(108) sy-uline.
        WRITE:/1 sy-vline,2(8)   'Emp.No'(015) CENTERED,
              10 sy-vline,11(40) 'Employee Name'(016) CENTERED,
              51 sy-vline,52(25) 'Quota Description'(018) CENTERED,
              77 sy-vline,78(4)  'Year'(027),
              82 sy-vline,83(11) 'Entitlement'(019),
              94 sy-vline,95(13) 'Quota Balance'(020),
             108 sy-vline.
        WRITE  : /1(108) sy-uline.
      ELSEIF p_a = c_x.
        FORMAT COLOR COL_HEADING.
        WRITE  : /1(112) sy-uline.
        WRITE:/1 sy-vline,2(8)    'Emp.No'(015) CENTERED,
              10 sy-vline,11(40)  'Employee Name'(016) CENTERED,
              51 sy-vline,52(25)  'Absence Description'(022) CENTERED,
              77 sy-vline,78(12)  'Absence days'(023),
              90 sy-vline,91(10)  'From Date'(024),
             101 sy-vline,102(10) 'To Date'(025),
             112 sy-vline.
        WRITE  : /1(112) sy-uline.
      ENDIF.
    ENDFORM.                    " header
    *&      Form  toggle_color
    This routine alters the color of the records in the list
    FORM toggle_color.
      IF gv_dial = space.
        FORMAT COLOR COL_NORMAL INTENSIFIED OFF.
        gv_dial = c_x.
      ELSE.
        FORMAT COLOR 1 INTENSIFIED OFF.
        CLEAR gv_dial.
      ENDIF.
    ENDFORM.                    " toggle_color
    *&      Form  get_2001
    Get the ABsence type for each Quota type
    FORM get_2001 USING p_pernr TYPE persno
                        p_value TYPE awart
                        p_year.
      LOOP AT i_2001 WHERE pernr = p_pernr AND
                           awart = p_value AND
                           endda+0(4) = p_year.
        i_rep-awart = i_2001-awart.
        i_rep-abrtg = i_2001-abrtg.
        i_rep-atext = i_2001-atext.
        i_rep-begda = i_2001-begda.
        i_rep-endda = i_2001-endda.
        APPEND i_rep.
      ENDLOOP.
    ENDFORM.                                                    " get_2001
    Reward points for useful answers.
    Regards
    Anji

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when uploading sales data from a DSO to a cube. I am using BI 7.0 and I have uploaded all sales-document-level data to my DSO. Then I use a transformation rule to calculate the sales value in the DTP to the cube. The cube holds customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from the DSO to the cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use the active table (without archive) of the DSO to get the data, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect. I am getting very large values, which is impossible.
    Can someone please help me?
    Shanka

    Hi,
    Are you sure that the cube has customer-wise aggregated data? It will always aggregate the key figure values for the same set of characteristics.
    Did you check the other key figures as well? Are they also inflated, or is the problem with this key figure only?
    During the data load the records may be aggregated first for the same characteristic values, and only then does the multiplication happen. If that is the case, you may have to multiply the values before they are aggregated, i.e. while they are still in the data package, which can be achieved through a start routine.
    But first verify whether other key figures have the same issue.
    Thanks
    Ajeet
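    A minimal start-routine sketch of the approach described above might look like the one below. The generated type name _ty_s_SC_1 is the usual source-structure type of a BW 7.x transformation, and the field names NET_PRICE, DLV_QTY and SALES_VAL are assumptions for illustration only.
    " Multiply per record in the start routine, before the cube aggregates
    " rows that share the same characteristic values.
    FIELD-SYMBOLS <ls_src> TYPE _ty_s_SC_1.
    LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
      " Document-level sales value; later summation then adds up correct line values
      <ls_src>-sales_val = <ls_src>-net_price * <ls_src>-dlv_qty.
    ENDLOOP.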

  • Unable to extract the data from ECC 6.0 to PSA

    Hello,
    I'm trying to extract data from ECC 6.0, DataSource 2LIS_11_VAHDR, into BI 7.0.
    When I try to run a full load into the PSA, I get the following error message:
    Error message: "DataSource 2LIS_11_VAHDR must be activated"
    Actually, the DataSource is already active; when I look at it with transaction LBWE, it is active.
    In BI, when I right-click the DataSource (2LIS_11_VAHDR) and select "Manage", the system throws the error message below:
    "Invalid DataStore object name /BIC/B0000043: Reason: No valid entry in table RSTS"
    If anybody has faced this error message, please advise what I'm doing wrong.
    Thanks in advance

    On the ECC 6.0 side:
    Delete the setup tables.
    Fill the setup tables.
    Schedule the job.
    I can see the data using RSA3 (2LIS_11_VAHDR): 1000 records.
    On BI 7.0 (Support Package 15):
    Replicate the DataSource in Production in the background.
    Migrate the DataSource from 3.5 to 7.0 in Development.
    I didn't migrate from 3.5 to 7.0 in Production; it is not allowing it.
    When I try to schedule the InfoPackage, it gives the error message "Data Source is not active".
    I'm sure this problem is related to the DataSource 3.5 to 7.0 conversion in Production. In Development there is no problem because I converted the DataSource from 3.5 to 7.0 manually.
    Thanks

  • Problem while processing TRANSACTION data from DSO to CUBE

    Hi Guru's,
    we are facing a problem while processing transaction data from a DSO to a cube. The data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (give the BI request name): it should give you the details about the request. If it is active, make sure that the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID, etc. from SM37) and see in SM66 whether the job is running or not. See whether it is accessing/updating some tables or not doing anything at all.
    If it is running and you are able to see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the Details tab. It may be in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, such as reading or inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records has increased in the cube.
    Thanks,
    JituK

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses "Delta Update". First the DSO was loaded with 138,300 records successfully, and then I activated the DSO. But when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into the data target, there has to be an initialization request in the data target with the same selection criteria.
    So first do the initialization and then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Not able to extract performance data from .ETL file using xperf commands. getting error "Events were lost in this trace. Data may be unreliable ..."

    Not able to extract  performance data from .ETL file using xperf commands.
    Xperf Commands:
    xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
    I get the following error after executing the above command:
    "33288636 Events were lost in this trace. Data may be unreliable.
    This is usually caused by insufficient disk bandwidth for ETW logging.
    Please try increasing the minimum and maximum number of buffers and/or the buffer size. Doubling these values would be a good first attempt.
    Please note, though, that this action increases the amount of memory reserved for ETW buffers, increasing memory pressure on your scenario.
    See "xperf -help start" for the associated command line options."
    I changed the page file size, but that did not work for me.
    Does anyone have an idea how to solve this problem and extract the ETL file data?

    I want to mention one point here: I have 4 machines in total, and on 3 of them the above commands work properly. Only one machine has this problem.
    Hi,
    I suggest that you try using xperf to collect the trace ETL file again and see whether it can be extracted on this computer:
    Refer to following articles:
    start
    http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
    Using Xperf to take a Trace (updated)
    http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
    Kate Li
    TechNet Community Support

  • Request: Problem in Extraction of data from CRM to BW

    Hi Gurus
    I have problems extracting data from CRM to BW. I thought you are the ones who can solve them:
    1) When I check the data for the DataSource in RSA3, I can see all records as well as the complete data, but when I extract and check the data in the cube, I can see all records yet not the complete data, i.e. some data is missing.
    2) Another problem: in a report I have two characteristics (DATS), start date and end date, and I am trying to get the number of days from start date to end date.
    I appreciate you in advance. If you find any good documentation, please send it to [email protected]
    With regards
    Nagamani.

    Hi Krishna,
    yes, we did an enhancement to this DataSource. In production it is taking 27 hours to load the data.
    The extraction from the source system (CRM) alone takes nearly 24 hours, as per the job log:
    08/05/2010 11:53:08 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,849 records
    08/05/2010 12:02:02 Result of customer enhancement: 10,849 records
    08/05/2010 12:02:03 PSA=0 USING & STARTING SAPI SCHEDULER
    08/05/2010 12:02:03 Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
    08/05/2010 12:02:06 IDOC: Info IDoc 2, IDoc No. 1576298, Duration 00:00:01
    08/05/2010 12:02:06 IDoc: Start = 08/05/2010 10:26:37, End = 08/05/2010 10:26:38
    08/05/2010 13:02:38 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,958 records
    08/05/2010 13:11:48 Result of customer enhancement: 10,958 records
    08/05/2010 13:11:52 Asynchronous send of data package 2 in task 0003 (1 parallel tasks)
    08/05/2010 13:11:53 tRFC: Data Package = 1, TID = 0AA00D0301984C5AEE8E44DB, Duration = 00:16:
    08/05/2010 13:11:53 tRFC: Start = 08/05/2010 12:02:19, End = 08/05/2010 12:18:27
    08/05/2010 14:30:13 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 11,296 records
    08/05/2010 14:39:53 Result of customer enhancement: 11,296 records
    Like this, it is taking 24 hours in the extraction itself.
    Edited by: kotha123 on Aug 10, 2010 12:08 PM
    Edited by: kotha123 on Aug 10, 2010 12:09 PM

  • How to extract the data from R/3?

    Gurus,
    I need to extract data from R/3 to BI. Below I give the table name followed by the field name and field description.
    I am using BI 7.0 and ECC 6.0
    I have the following doubts:
    1. Based on the information below, which extraction method do I need to use (generic data extraction?)?
    2. I tried to create a view, but I could not find joins between all the tables; I found only a few joins. How do I join the other tables?
    3. Without joining these tables, how can I collect the data?
    4. Do I need to create multiple tables?
    SAP Table   SAP Field   Field Description
    VBAK        VBELN       Sales Document
    VBAP        WERKS       Plant
    VBAK        AUART       Doc Type
    VBAP        PSTYV       Item Category
    VBAP        MATNR       Material
    VBAP        ARKTX       Description
    MATNR       GROES
    VBAP        ZIEME       Target Quantity UOM
    VBAP        WAVWR       Cost
    VBAK        ERNAM       Name of the person who created the object
    VBAKA       KUNNR       Customer Number 1
    VBAP        BSTNK       Customer Purchase Order No
    VBEP        BANFN       Purchase Requisition Number
    QMFE        FEGRP       Cause Code (Code Group Problem)
    QMFE        FECOD       Damage Code (Problem or Damage Code)
    QMEL        QMNUM       Notification No
    EQUI        EQUNR       Equipment Number
    QMEL        QMART       Notification Type
    QMELI       ERDAT       Date on Which Record Was Created
    ============================================================================================
    Gurus, I will assign full points.
    Thanks

    Hi,
    I would use the standard extractor 2lis_11_vaitm and enhance it for the following fields:
    VBEP    BANFN    Purchase Requisition Number
    QMFE    FEGRP    Cause Code (Code Group Problem)
    QMFE    FECOD    Damage Code (Problem or Damage Code)
    QMEL    QMNUM    Notification No
    EQUI    EQUNR    Equipment Number
    QMEL    QMART    Notification Type
    QMELI   ERDAT    Date on Which Record Was Created
    regards
    Siggi
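    As a rough illustration of what such an enhancement could look like, here is a hedged sketch of the transaction-data user exit (CMOD exit EXIT_SAPLRSAP_001, include ZXRSAU01) filling one appended field. The append field name ZZBANFN and the idea of reading VBEP per item are assumptions for illustration, not a verified design.
    " Include ZXRSAU01 - sketch only. Assumes the extract structure MC11VA0ITM
    " was extended with an append field ZZBANFN (purchase requisition number).
    DATA: ls_vaitm TYPE mc11va0itm,
          lv_banfn TYPE vbep-banfn.
    CASE i_datasource.
      WHEN '2LIS_11_VAITM'.
        LOOP AT c_t_data INTO ls_vaitm.
          " Read the first schedule line of the sales order item
          SELECT banfn FROM vbep UP TO 1 ROWS
            INTO lv_banfn
            WHERE vbeln = ls_vaitm-vbeln
              AND posnr = ls_vaitm-posnr.
          ENDSELECT.
          IF sy-subrc = 0.
            ls_vaitm-zzbanfn = lv_banfn.
            MODIFY c_t_data FROM ls_vaitm.
          ENDIF.
          CLEAR lv_banfn.
        ENDLOOP.
    ENDCASE.
    The remaining QM/EQUI fields would follow the same pattern; for better performance such lookups are usually done once per data package rather than once per record.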
