How to fetch the current data record

Hi All,
I have a requirement in WebI to fetch the latest updated record for a particular employee.
In my report I have the columns:
Employee  Emp ID  Nationality  Date        Department
Empx      23156   USA          20-03-2014  ELE
Empx      23156   USA          26-01-2014  DVX
Empx      23156   USA          19-12-2013  INV
Empx      23156   USA          15-11-2013  ME
An employee usually keeps changing departments from time to time. Upon execution of the report, I am getting the results shown above.
But the client's requirement is to keep only one record per employee, and that record must be the latest one.
The client is not bothered about which departments the employee belonged to before; he is only interested in the department the employee currently belongs to, and wants only that latest record.
So, considering the scenario above, the client wants only the single most recently updated record for that employee in the report, i.e.
Employee  Emp ID  Nationality  Date        Department
Empx      23156   USA          20-03-2014  ELE
Please note that the report data comes from a BEx query and I am working on BO 4.1.
Could anyone please let me know what could be done for this requirement?

Hi Cris,
Follow the steps below:
1.     Create one variable as below:
        =[Date] Where( [Date] = Max([Date]) In([Emp ID]))
2.     Now select your table, go to Analysis > Filter, click "Add filter", and select the variable created above. Set the filter condition to "Is not null".
You will get the desired data.
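For comparison only: the same "latest record per employee" logic, expressed at the database level, would look like the sketch below. This is purely illustrative, since this report is fed by a BEx query rather than direct SQL, and the table and column names here are hypothetical.
        -- Hypothetical table and column names, for illustration only
        SELECT employee, emp_id, nationality, assignment_date, department
        FROM  (SELECT h.*,
                      ROW_NUMBER() OVER (PARTITION BY emp_id
                                         ORDER BY assignment_date DESC) AS rn
               FROM   emp_department_history h)
        WHERE  rn = 1;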
Regards
Rakesh

Similar Messages

  • How to fetch the data records in summary format based on previous day

    drop table T1;
    create table T1(Class, Fees_Collected, Submit_Date) as select
    'MBA', 100000, '7/30/2012' from dual union all select
    'Btech', 20000, '7/10/2012' from DUAL union all select
    'MBA', 45000, '8/1/2012' from dual union all select
    'Btech', 55550, '7/31/2012' from DUAL union all select
    'BBA', 250660, '7/30/2012' from dual union all select
    'MBBS', 44556000, '7/31/2012' from DUAL union all select
    'BDS', 420050, '8/1/2012' from DUAL union all select
    'BBA', 30450, '7/30/2012' from DUAL union all select
    'MBBS', 120450, '7/31/2012' from DUAL union all select
    'BDS', 45950, '7/30/2012' from DUAL union all select
    'MBA', 252450, '8/1/2012' from DUAL;
    My requirement is to fetch the records in summary format, displaying the data with the following columns -
    Class |Prev Day Traded Value |Prev Day % of Total |Prev Day % MBA
    Note - Previous day definition (business day) = calendar days - (weekends + US holidays)
    Kindly help me; I want to keep it generic so that it runs without specifying hard-coded dates for the previous day and returns the result set.
    All of your help and time is highly appreciated.

    You mean business days I guess ?
    I use a function (I had to write it myself): next_business_day(p_start_date in date, p_days_count in number) return date
    where I loop forward/backward from p_start_date, according to the sign of p_days_count, for the required number of steps, skipping weekends and holidays. The absolute value of p_days_count is mostly under 30 (at least around here); for the rest, Oracle's add_months gets it done.
    Regards
    Etbin
    Edited by: Etbin on 6.8.2012 17:55
    Year to date: your_date between trunc(sysdate,'year') and sysdate - is that it? http://en.wikipedia.org/wiki/Year-to-date
    Edited by: Etbin on 6.8.2012 18:27
    NOT TESTED!
    function next_business_date(p_start_date in date, p_count_days in number) return date is
      steps  number := abs(p_count_days);
      retval date   := p_start_date;
    begin
      if p_start_date is null or p_count_days is null then
        return to_date(null);
      end if;
      while steps > 0
      loop
        /* step one calendar day in the requested direction */
        retval := retval + sign(p_count_days);
        /* skip weekends and (yet to be implemented) holidays */
        while to_char(retval, 'dy', 'nls_date_language=english') in ('sat', 'sun') /* or is_holiday(retval) */
        loop
          retval := retval + sign(p_count_days);
        end loop;
        /* retval now contains a business day */
        steps := steps - 1;
      end loop;
      return retval;
    end;
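    A minimal usage sketch against the T1 test data above, assuming the function has been created and that Submit_Date parses with a 'mm/dd/yyyy' mask; the percentage columns would still need to be added, e.g. with RATIO_TO_REPORT:
    select   class, sum(fees_collected) as prev_day_traded_value
    from     t1
    where    to_date(submit_date, 'mm/dd/yyyy') = next_business_date(trunc(sysdate), -1)
    group by class;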

  • Find closest date before current date

    Hi,
    How can I get the closest date record before the current date record, preferably using workflows?
    Detailed:
    For example, a record is created with status "Open". On click of a button, a program (workflow or script) should be invoked to fetch the most recent records with status "Open".
    That means if a new record is created today and there are already 3 records with the same status "Open" (two created yesterday and one created 2 days before), then our program should fetch only the two records created yesterday and not the one created two days before.
    Thanks
    Sambit

    The simplest solution I can offer is:
    Add a search spec on the BC for Status="Open" and add a Sort Spec on the BC for "Created Date (DESC)". When you query the BC, just focus on the first record which should be the most recent record in "Open" status.
    Hope it helps!
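    For illustration only, the same "sort descending, take the first record" idea expressed in plain SQL (hypothetical table and column names; the answer above works through the Siebel business component rather than direct SQL, and FETCH FIRST requires Oracle 12c or later):
    SELECT *
    FROM   service_requests
    WHERE  status = 'Open'
    ORDER BY created_date DESC
    FETCH FIRST 1 ROW ONLY;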

  • IDOC Data Records

    I have the IDoc number with me. How do I fetch the data records from it?

    If you want to read the IDoc records in your program, use the following function modules.
    Sample code:
      CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_READ'
        EXPORTING
          document_number = p_docnum
        IMPORTING
          idoc_control    = f_idoc_control
        EXCEPTIONS
          OTHERS          = 01.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4
                RAISING error_opening_idoc.
      ENDIF.
    * Read IDOC data segments
      CALL FUNCTION 'EDI_SEGMENTS_GET_ALL'
        EXPORTING
          document_number = p_docnum
        TABLES
          idoc_containers = t_idoc_data
        EXCEPTIONS
          OTHERS          = 01.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4
                RAISING error_reading_idoc.
      ENDIF.
    * Close IDOC
      CALL FUNCTION 'EDI_DOCUMENT_CLOSE_READ'
        EXPORTING
          document_number = p_docnum
        IMPORTING
          idoc_control    = f_idoc_control
        EXCEPTIONS
          OTHERS          = 01.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4
                RAISING error_closing_idoc.
      ENDIF.

  • Although my iPod shows the number of steps, calories, and time, it no longer saves them to the history. Also, when I open the history it takes me to May 2010 instead of the current date. Any ideas on how to get it to start recording the history again?

    Try:                                               
    - iOS: Not responding or does not turn on           
    - Also try DFU mode after trying recovery mode
    How to put iPod touch / iPhone into DFU mode « Karthik's scribblings
    - If not successful and you can't fully turn the iOS device off, let the battery fully drain. After charging for at least an hour, try the above again.
    - Try another cable                     
    - Try on another computer                                                       
    - If still not successful that usually indicates a hardware problem and an appointment at the Genius Bar of an Apple store is in order.
    Apple Retail Store - Genius Bar
    The missing apps could be due to Restrictions settings that can hide those apps. If the backup was made with those restrictions set, then the Restrictions are also restored.
    Thus, if you get it to work, restore to factory settings/set up as a new iPod, not from the backup.
    You can redownload most iTunes purchases by:        
      Downloading past purchases from the App Store, iBookstore, and iTunes Store

  • Best Approach to Compare Current Data to New Data in a Record.

    Which is the best approach in comparing current data to new data for every single record before making updates?
    What I have is a table with 5,000 records that needs to be updated every week with data from a data feed, but I don't want to update every single record week after week. I want to update only the records with different data.
    I can think of these options:
    1) two cursors (one for current data and one for new)
    2) one cursor for new data and one varray for current data
    3) one cursor for new data and a select into for current data.
    Thanks for your help.

    I don't recommend it over merge, but in theory you could use a checksum (OWA_OPT_LOCK.checksum()) on the rows and if they differ then copy the new row into the destination table. Or you might be able to use rowscn to see if things have changed.
    Like I said, I don't know that I'd take either approach over merge, but they are options.
    Gaff
    Edited by: Gaff on Feb 11, 2009 3:25 PM
    Actually, rowscn between 2 tables may not be an option. I know you can turn a rowscn into a time value, so if that rowscn is derived from anything but the column values for the row, that would foil the approach.
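    A minimal MERGE sketch for the "update only the rows that actually changed" case (table, key, and column names are hypothetical; adapt the join key and the change test to your feed, and note that the <> comparisons assume non-null columns - use NVL/DECODE if NULLs are possible):
    MERGE INTO target_table t
    USING staging_feed s
    ON (t.record_id = s.record_id)
    WHEN MATCHED THEN UPDATE
      SET t.col_a = s.col_a,
          t.col_b = s.col_b
      WHERE t.col_a <> s.col_a
         OR t.col_b <> s.col_b
    WHEN NOT MATCHED THEN
      INSERT (record_id, col_a, col_b)
      VALUES (s.record_id, s.col_a, s.col_b);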

  • Duplicate data record error while reloading a DTP

    I have a DTP created to load a text datasource, and it is included in the process chain. I have set "delta mode" so that it picks up additional records if they are available.
    But I notice that when there are no additional records (e.g. the earlier load had 269 records and the current load has the same 269 records), I am getting the error "Duplicate data record" and the status is set to "Red". I do not want to see an error in this case, since no new records were found, and I want the status to be green if there are no new records.
    Could you please suggest what settings will do that.
    Regards
    Raj

    A Delta DTP will fetch only unloaded requests (requests which do not exist in the target), not additional records.
    Is the text datasource delta enabled ?
    Do you have Infopackage of Update type Delta setup?
    Did you run a Full DTP before Delta DTP?
    Assuming Full Infopackage has loaded 269 records to PSA and the same will be loaded ( if no additional records in source system ) again.
    Req 1 - 269 - Yesterday
    Req 2 - 269  - Today
    Full DTP ran yesterday will load 269 records to target.
    Delta DTP ran today will load Req1 & Req 2 to target ( master data will be overwritten ) - Reason Delta DTP is acting like Init with data transfer.
    You can start off with a Delta DTP instead of a Full one; if a Full DTP is run before a Delta DTP, make sure you delete the requests loaded by the Full DTP.
    This can be ignored here, as this is master data which will be overwritten.
    To get rid of the error, just check "Handle Duplicate Records" in the Update tab of the DTP.

  • Position of the data record in the table control

    Hi all,
    I am working on a module pool that has a table control which fetches data from a transparent table.
    Suppose there is data in the Z-table and the table control is showing the data records.
    I want two text boxes beside the table control: one showing the index of the top row of the record set, and the other showing the index of the last row appearing on that page of the table control.
    If I press Page Down, I should get new values in both text boxes.
    Please help me get an idea of which values I should use to show the indexes.
    thanks
    ekta

    Thanks for your help.
    I have used a value 'n', which is the number of rows on one page of the table control.
    So when I page down, it shows me the next values, i.e. the indexes currently on the table control's next page.
    Anyway, thanks a lot.

  • Historic and Current data for Master data bearing objects

    Hi All,
    We are trying to implement type 2 dimensions for all the master-data-bearing characteristics where we require historic and current data to be available for reporting. The master data can have a number of attributes, and all of them can be time dependent. We are not getting any 'datefrom' or 'dateto' from the source system.
    For example, the table below shows data entering BI on different days:
    MasterID  ATTR1  ATTR2   Day of entering BI
    123506    Y      REWAR   day 1
    123506    N      REWAR   day 4
    123506    Y      ADJUST  day 4
    123506    N      ADJUST  day n
    The field 'day of entry into BI' is only for your understanding; we do not get any date fields from the source. SID is the field we are generating for uniqueness. It is a counter. EFF_DATE would be the current date for all the data. EXP_DATE would be 31.12.9999 until the attributes change. SID and MasterID together would be the key.
    On day 1 the following data enters BI,
    day 1
    SID MasterID ATTR1 ATTR2 EFF_DATE EXP_DATE
    1 123506 Y REWAR 2/10/2009 12/31/9999
    On day 4, 2 data records enter with same PID,
    SID MasterID ATTR1 ATTR2 EFF_DATE EXP_DATE
    2 123506 N REWAR 2/13/2009 12/31/9999
    3 123506 Y ADJUST 2/13/2009 12/31/9999
    the EXP_DATE of the record of day 1 needs to be changed to current date.
    Also there are two records entering, so latest record would have EXP_DATE as 31.12.9999. And the EXP_DATE of the first record on day 4 should change to the current date.
    so the following changes should happen,
    CHANGE
    SID MasterIDATTR1 ATTR2 EFF_DATE EXP_DATE
    1 123506 Y REWAR 2/10/2009 2/13/2009
    CHANGE
    SID MasterID ATTR1 ATTR2 EFF_DATE EXP_DATE
    3 123506 Y ADJUST 2/13/2009 2/22/2009
    On day n, one data record enters with same PID,
    SID MasterID ATTR1 ATTR2 EFF_DATE EXP_DATE
    4 123506 N ADJUST 2/22/2009 12/31/9999
    The change is ,
    CHANGE
    SID MasterID ATTR1 ATTR2 EFF_DATE EXP_DATE
    3 123506 Y ADJUST 2/13/2009 2/22/2009
    The data expected in the P-table is as below, on day n or after day n, until any other record enters for this MasterID:
    1 123506 Y REWAR 2/10/2009 2/13/2009
    2 123506 N REWAR 2/13/2009 2/13/2009
    3 123506 Y ADJUST 2/13/2009 2/22/2009
    4 123506 N ADJUST 2/22/2009 12/31/9999
    Has anyone worked on type 2 dimensions earlier? Or any ideas to implement this logic would be appreciated.
    Regards,
    Sudeepti

    Compound the MasterID with the effective date and the other attributes as superior objects;
    you will then get the P-table as:
    SID  EFF_DATE   EXP_DATE    MasterID  ATTR1  ATTR2
    1    2/10/2009  2/13/2009   123506    Y      REWAR
    2    2/13/2009  2/13/2009   123506    N      REWAR
    3    2/13/2009  2/22/2009   123506    Y      ADJUST
    4    2/22/2009  12/31/9999  123506    N      ADJUST
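    If the same type 2 logic were implemented against plain relational tables, the "close out the previous version" step might look roughly like the sketch below. This is only an illustration; the table, sequence, and bind names are hypothetical, and in BW the equivalent logic would live in the transformation rather than in hand-written SQL.
    -- close the currently open version when a new one arrives for the same MasterID
    UPDATE master_history h
    SET    h.exp_date = TRUNC(SYSDATE)
    WHERE  h.master_id = :new_master_id
    AND    h.exp_date  = DATE '9999-12-31';
    -- insert the new current version
    INSERT INTO master_history (sid, master_id, attr1, attr2, eff_date, exp_date)
    VALUES (seq_sid.NEXTVAL, :new_master_id, :new_attr1, :new_attr2,
            TRUNC(SYSDATE), DATE '9999-12-31');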

  • Not able to fetch the data by Virtual Cube

    Hi Experts,
    My requirement is to fetch data from the source system (from a database table) by using a Virtual Cube.
    What I have done: I have created the Virtual Cube and the corresponding function module in the source system.
    That function module works fine if the database table in the source system is small. But if the database table contains a huge amount of data (millions of records), I am not able to fetch the data.
    Below is the code I have incorporated in my function module.
    DATA:
      l_th_mapping TYPE CL_RSDRV_EXTERNAL_IPROV_SRV=>TN_TH_IOBJ_FLD_MAPPING,
      l_s_map      TYPE CL_RSDRV_EXTERNAL_IPROV_SRV=>TN_S_IOBJ_FLD_MAPPING.
    l_s_map-iobjnm = '0PARTNER'.
    l_s_map-fldnm  = 'PARTNER'.
    INSERT l_s_map INTO TABLE l_th_mapping.
    CREATE OBJECT l_r_srv
      EXPORTING
        i_tablnm              = '/SAPSLL/V_BLBP'
        i_th_iobj_fld_mapping = l_th_mapping.
    l_r_srv->open_cursor(
      i_t_characteristics = characteristics[]
      i_t_keyfigures      = keyfigures[]
      i_t_selection       = selection[] ).
    l_r_srv->fetch_pack_data(
      IMPORTING
        e_t_data = data[] ).
    return-type = 'S'.
    In the above function module, the internal table L_TH_MAPPING contains the InfoObjects of the Virtual Cube and the corresponding fields of the underlying database table.
    The problem I am facing is that in the method FETCH_PACK_DATA the program initially tries to fetch all the records from the database table into the internal table. If the database table is very large, this logic does not work.
    So could you please help me with how to handle these kinds of issues?

  • FI line items up to the current date

    Hi all,
    I read the SAP note 485958 "BCT-FI:Extracting FI line items up to the current date" and I have some doubts here:
    1. After I've set parameter BWFIOVERLA to 'X' and extract the delta at 6 PM, will FI data that was changed before 6 PM, for example a GL entry posted at 5 PM, go into my delta extraction?
    2. What does BWFITIMBOR mean with its default value of 02:00:00?
    3. Is this note applicable if my R3 PI release is at 2003_1_46C?
    Please advise.
    Thank you.
    Celeste

    Hi Celeste,
       Pls go through this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/af/16533bbb15b762e10000000a114084/content.htm
    Excerpts from this link regarding your doubt:
    BWFIOVERLA: The logic of this parameter prevents records with the previous day as their CPU date from still being in the update at the time of extraction.
    If X is set for this parameter, selection is made up to the previous day if the time limit is not reached.
    If <space> is set for this parameter, selection is made up to the day before the previous day if the time limit is not reached.
    BWFITIMBOR: This parameter designates 02.00 (2 A.M.) as the time limit for extraction. If this limit is not reached, the system uses a security interval of one day to determine the To-value of the selection. If extraction is started before 02:00 therefore, the selection takes place only up to the day before the extraction run.
    Hope it helps you!!!!
    Regards,
    Amit

  • Error when Fetching CATs Data to BI

    Hi,
    I have created a datasource in the R/3 system (source system) for CATS data and am extracting data using a function module. Using transaction RSA3, it fetches all the desired records in the R/3 system.
    In the BI system, when extracting full data and monitoring the request, it throws errors:
    Error 7 when sending an IDoc
    Error when opening an RFC connection    2
    Error when opening an RFC connection
    Errors in source system
    Function Module for fetching CATS Data
    FUNCTION ZLBTIME_GET_CATSREC.
    *"*"Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZLSCATS OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * Example: DataSource for table SFLIGHT
    *  TABLES: SFLIGHT.
    * Auxiliary Selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.
    * Maximum number of lines for DB table
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * counter
              S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * cursor
              S_CURSOR TYPE CURSOR.
    * Select ranges
    **  RANGES: L_R_CARRID  FOR SFLIGHT-CARRID,
    **          L_R_CONNID  FOR SFLIGHT-CONNID.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZLTIME_DS_3'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
    * this is a typical log call. Please write every error message like this
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE   "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
    * Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR    = I_REQUNR.
        S_S_IF-DSOURCE = I_DSOURCE.
        S_S_IF-MAXSIZE   = I_MAXSIZE.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.
    * Fill range tables BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
    **      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CARRID'.
    **        MOVE-CORRESPONDING L_S_SELECT TO L_R_CARRID.
    **        APPEND L_R_CARRID.
    **      ENDLOOP.
    **      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CONNID'.
    **        MOVE-CORRESPONDING L_S_SELECT TO L_R_CONNID.
    **        APPEND L_R_CONNID.
    **      ENDLOOP.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
    **      OPEN CURSOR WITH HOLD S_CURSOR FOR
    **      SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
    **                               WHERE CARRID  IN L_R_CARRID AND
    **                                     CONNID  IN L_R_CONNID.
            OPEN CURSOR WITH HOLD s_cursor FOR
    *        SELECT * FROM zlvctmbw.
            SELECT a~counter a~pernr a~workdate a~skostl a~lstar a~rproj a~awart a~kokrs a~meinh a~tcurr a~price
                   a~unit a~bukrs a~ersda a~erstm a~ernam a~laeda a~laetm a~status a~refcounter a~reason a~belnr
                   a~catshours a~act1 a~act2 a~task a~lang1 a~narr1 a~lang2 a~narr2 a~cmnt a~clst a~plchl
                   b~ltxa1 b~transfer b~hrkostl b~hrlstar b~hrcostasg b~statkeyfig b~catsquantity b~bemot
                   c~pspnr c~pspid
                   d~kunnr
            FROM   catsdb AS a
            LEFT OUTER JOIN catsco AS b ON a~counter = b~counter
            JOIN   prps AS p ON p~pspnr = a~rproj
            JOIN   proj AS c ON c~pspnr = p~psphi
            JOIN   zltproj AS d ON d~pspnr = c~pspnr.
        ENDIF.                             "First data package ?
    * Fetch records into interface table.
    *   named E_T_'Name of extract structure'.
        FETCH NEXT CURSOR S_CURSOR
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE E_T_DATA
                   PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    When fetching data using a view in the source system, it works fine.
    When testing the fetch of SFLIGHT data using function module ZRSAX_BIW_GET_DATA_SIMPLE
    (a copy of RSAX_BIW_GET_DATA_SIMPLE), it also works fine.
    Please suggest the cause of the error.
    Regards
    Vishal

    Hello Praveen,
    The connection is fine, as I tried fetching SFLIGHT data using the standard template function module, which works fine, and in transaction SM58 I do not see any data.
    Regards
    Vishal

  • Fixed report period based on current date

    I want to create a report containing records over a fixed period without using a date parameter. As an example, I would like the report to continually draw on data that is no more than, say, ten years old relative to the current date.
    E.g. current date (xx/xx/2008) to (current date - 10 years), i.e. xx/xx/1998; then next year the formula would automatically adjust to current date (xx/xx/2009) to (current date - 10 years), i.e. xx/xx/1999.
    I am only a novice when it comes to designing reports so please bear that in mind.

    Hi,
    Create a RecordSelectionFormula and subtract 10 years from the current date with the DateAdd formula.
    E.g.
    //DateAdd (intervalType, nIntervals, startDateTime)
    {Table.Date} > DateAdd ("yyyy", -10, CurrentDate)
    The DateAdd function has more interval types, so you can subtract months, days, hours as well. Check the CR help file and search for DateAdd for details.
    Cheers,
    Fritz
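    If the same restriction were pushed down to the database instead of the record selection formula, the equivalent SQL condition would look roughly like this (illustrative only; the table and column names are hypothetical):
    SELECT *
    FROM   orders
    WHERE  order_date >= ADD_MONTHS(TRUNC(SYSDATE), -120)  -- 120 months = 10 years back
    AND    order_date <= SYSDATE;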

  • Performance Issue - Fetching latest date from a507 table

    Hi All,
    I am fetching data from table A507 for a material and batch combination. I want to fetch the latest record based on the value of field DATBI. I have written the code as follows, but the SELECT query is taking a long time. I don't want to write any condition on the DATBI field in the WHERE clause, because I have already tried that option.
    SELECT kschl
               matnr
               charg
               datbi
               knumh
        FROM a507
        INTO TABLE it_a507
        FOR ALL ENTRIES IN lit_mch1
        WHERE kschl = 'ZMRP'
        AND   matnr = lit_mch1-matnr
        AND   charg = lit_mch1-charg.
    SORT it_a507 BY kschl matnr charg datbi DESCENDING.
      DELETE ADJACENT DUPLICATES FROM it_a507 COMPARING kschl matnr charg.

    Hi,
    These kinds of tables store large volumes of data. Thus, while making a SELECT on them, it is important to use as many primary key fields as possible in the WHERE condition. Here you can try specifying KAPPL, since it is specific to a requirement; if it is for purchasing, use 'M' and try.
    IF NOT lit_mch1[] IS INITIAL.
      SELECT kschl
             matnr
             charg
             datbi
             knumh
        FROM a507
        INTO TABLE it_a507
        FOR ALL ENTRIES IN lit_mch1
        WHERE kappl = 'M'
        AND   kschl = 'ZMRP'
        AND   matnr = lit_mch1-matnr
        AND   charg = lit_mch1-charg.
    ENDIF.
    SORT it_a507 BY kschl matnr charg datbi DESCENDING.
    DELETE ADJACENT DUPLICATES FROM it_a507 COMPARING kschl matnr charg.
    This should considerably increase the performance
    Regards,
    Vik

  • How to restrict number of Data Records from Source system?

    Hi,
    How can I restrict the number of data records from the R/3 source system that are being loaded into BI? For example, I have 1000 source data records but only wish to transfer the first 100. How can I achieve this? Is there some option in the DataSource definition or the InfoPackage definition?
    Pls help,
    SD

    Hi SD,
    You can surely restrict the number of records. The best and simplest way is to check which characteristics are present in the selection screen of the InfoPackage, and to check in R/3 which characteristics, if given a selection, would fetch the desired number of records. Use that as the selection in the InfoPackage.
    Regards,
    Pankaj
