Setup data for a particular period.

Hi All,
We are loading the last 5 years of data to BW. Now I want to fill the setup tables for a particular period for an LO DataSource. Where can I make the time settings so that I load data only for that particular period?
Regards
Ajay.

Hi,
It depends on the application you are referring to.
The transactions below fill the setup tables for the different applications, and for some of them a date field is available for selection. Please refer to the list below; you can also check it directly in the system using the transaction.
Tcode: Application number: Date field available
OLI1BW: Application 03 (material movements): Posting Date
OLI3BW: Application 02: Document date
OLI4BW: Application 04: Created on date
OLI7BW: Application 11: No date selection possible
OLI8BW: Application 12: No date selection possible
OLI9BW: Application 13: No date selection possible
Hope this helps,
Regards,
Shilpa

Similar Messages

  • Value contract maintenance orders (service orders) for particular periods

    Hi Experts,
    My client wants value contract maintenance orders (service orders) for particular periods. How can I map this scenario in SAP MM?
    Thanks in advance,
    Chandhu

    Thanks for your reply.
    I have already tried a standard value contract.
    I entered service number 1000 in the value contract, but I am not getting the service number when creating a purchase order with reference to the value contract.
    Thanks
    Chandhu

  • Program to collect user-transaction execution data for a past period

    Hi Experts
    We are developing a program to collect user-transaction execution data for a past period (3 months to 1 year). We are getting the required user and tcode output, but we are unable to find the exact execution count per tcode (executed by the respective user).
    We have the following count components available in SAPWL_WORKLOAD_GET_STATISTIC / SWNC_COLLECTOR_GET_AGGREGATES:
    COUNT, DCOUNT (dialog count for the transaction), UCOUNT, BCOUNT, ECOUNT, SCOUNT, LUW_COUNT.
    Of these, LUW_COUNT (Logical Units of Work) comes closest to the number of times a transaction was executed, but even that is not the exact count, as one transaction code can have more than one LUW.
    Does anyone have an idea how to get the right count?
    Thanks
    Vani

    If the audit log is enabled in SM19 with the transaction criterion set to 'all', you can get the history of the transactions executed by the users. However, you may need more space on the audit file system if this is enabled for all users. You could archive the audit files into ZIP files every three months and unzip them whenever you need them.
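    For what it's worth, a minimal sketch of that quarterly archiving idea; the directory and file pattern below are placeholders, since the audit file location and naming depend on your profile parameters (e.g. DIR_AUDIT / rsau/local/file):
    cd /path/to/audit/files                    # placeholder: wherever your audit files are written
    zip -m audit_archive_2011_Q1.zip audit_*   # -m moves the files into the ZIP, freeing space on the file system
    unzip audit_archive_2011_Q1.zip            # extract again whenever you need to analyse them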

  • Generate Posting Date for Payroll Periods

    Hi all,
    What is the relevance of "Generate Posting Date for Payroll Periods" ?
    table T549S
    Rx

    The posting date is required after the payroll run has been completed and exited, when we have to post the payroll results to accounting; it determines on which date we post to accounting after the payroll run. The source is 01 (payday) and the target is 04 (posting date). For example, if payday is the 30th of every month, then the posting date is X days after that, which is why you have to generate posting dates.
    Mohan

  • Flexible Upload of Data for several periods/years

    Hey Colleagues,
    My client wants me to develop new functionality to upload data for several periods/years. The function they are currently using, and using correctly, is the flexible upload.
    As we have already developed several custom tasks and enhancements/modifications, we try not to modify the SAP standard any further or to post directly (a back-door solution).
    But:
    It is no problem to start the flexible upload for different selections and to provide the values in an Excel sheet. But I have not been able to avoid the popup for selecting the Excel file. It would be no problem if, for example, batch input worked here, or if we posted directly using standard functionality (a posting method with data change), but in this case we are trying to use the flexible upload.
    Has anyone experienced a similar problem? Can we use the flexible upload and suppress the popup for selecting the file WITHOUT an enhancement?
    Thanks in advance for your help,
    Yavuz

    Hi Yavuz,
    Yes this can work for every period in one year - use the multiperiod processing option, available since EHP2 (BCS 6.02).
    The multiperiod processing requires you to define the consolidation cycle and start period, so you could define it to start with p1 and include all 12+ periods of the year.
    NB: the multiperiod processing is initiated from the selection screen - right-click and select "run for remaining periods of the year".
    You can run either a task or a task group for the rest of the consolidation cycle, but data collection with flexible upload MUST have the presentation server settings (already mentioned above) or the data collection task will not work.
    So you make the configuration settings,
    then save your multi-period file on the presentation server,
    then go to the data collection (flexible upload) task in p1 and "run for remaining periods".

  • Export data for specific period through Data Pump

    Hi,
    I have a specific requirement to take dumps of some tables for a specific time period, e.g. the last 10 days, from 01-JAN-11 to 10-JAN-11. How can I accomplish this? From the documentation, what I read is that we can export data by setting the FLASHBACK_SCN or FLASHBACK_TIME parameter in the expdp command, but that is a point-in-time export, not an export for a specific time range.
    Please guide me on how I can export between specific times, e.g. between 1-JAN and 10-JAN.
    Regards,
    Abbasi

    "export between the specific time. like between 1-JAN to 10-JAN"
    You need to clarify your requirements. Data is always "at a point in time". I can see data as at noon of 01-Jan. I can see data as at noon of 10-Jan. What would I mean by data "between" 01-Jan and 10-Jan?
    Say the table has 5 rows on 01-Jan :
    ID    VALUES
    1      ABC
    2      DEF
    3      TRG
    4      MXY
    5      DEW
    2 Rows "6-GGG" and "7-FRD" were inserted on 02-Jan.
    2 Rows "2" and "3" were updated from "DEF" and "TRG" to "RTU" and "GTR" on 03-Jan.
    1 Row "5-DEW" was deleted on 09-Jan.
    2 Rows "8-TFE" and "9-DZN" were insereted on 09-Jan.
    Can you tell me what is the "data between 01-Jan and 10-Jan" ?
    (the above example actually happens to have an incrementing key column "ID". Your table might not even have such an identifier column at all !)
    Hemant K Chitale
    Edited by: Hemant K Chitale on Jan 10, 2011 5:23 PM
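    For what it's worth, this is usually handled with one point-in-time export per date of interest. A minimal sketch using FLASHBACK_TIME in a parameter file; the directory object, dump file name and table below are placeholders, not taken from this thread:
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=emp_asof_10jan.dmp
    TABLES=scott.emp
    FLASHBACK_TIME="TO_TIMESTAMP('10-JAN-2011 12:00:00','DD-MON-YYYY HH24:MI:SS')"
    # save the four lines above as exp_asof_10jan.par, then run:
    expdp system PARFILE=exp_asof_10jan.par
    Running the same export again with FLASHBACK_TIME pointing at 01-JAN-2011 gives the second snapshot; anything "between" the two dates can then only be derived by comparing the two dumps.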

  • SQL queries for retrieving setup data for functional modules

    Hi,
    Can anyone give me the SQL queries for retrieving the setup data for the functional modules (GL, AP, AR, FA, and CM) from the database?


  • How to restrict data for a particular month?

    Hi Experts,
    How can I retrieve data for a particular month without a where condition?

    Hi,
    You have a Month object in your Webi document, right?
    The user will be given the opportunity to select his/her required [Month].
    Create a variable as =Userresponse("Enter value for Month"), with the same text as you are giving in the prompt text, and name it [UMonth].
    Go to the Analysis tab > Filter > Add filter > [Month]=[UMonth].
    This report will then always run for the month the user requires.

  • How to see data for a particular date in an alert log file

    Hi Experts,
    I would like to know how I can see data for a particular date in alert_db.log in a Unix environment. I'm using Oracle 9i on Unix.
    Right now I'm using tail -500 alert_db.log > alert.txt and then viewing the whole thing. Is there an easier way to look at a particular date or time?
    Thanks
    Shaan

    Hi Jaffar,
    Here I have to pass the exact date and time. Is there any way to see records for, let's say, Nov 23 2007? Because when I used this:
    tail -500 alert_sid.log | grep " Nov 23 2007" > alert_date.txt
    it's not working. Here is the sample log file:
    Mon Nov 26 21:42:43 2007
    Thread 1 advanced to log sequence 138
    Current log# 3 seq# 138 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Mon Nov 26 21:42:43 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 137
    Mon Nov 26 21:42:43 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 137
    ARC1: Unable to archive log 1 thread 1 sequence 137
    Log actively being archived by another process
    Mon Nov 26 21:42:43 2007
    ARCH: Beginning to archive log 1 thread 1 sequence 137
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_137
    .dbf'
    ARCH: Completed archiving log 1 thread 1 sequence 137
    Mon Nov 26 21:42:44 2007
    Thread 1 advanced to log sequence 139
    Current log# 2 seq# 139 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Mon Nov 26 21:42:44 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 138
    ARC0: Beginning to archive log 3 thread 1 sequence 138
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_138
    .dbf'
    Mon Nov 26 21:42:44 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 138
    ARCH: Unable to archive log 3 thread 1 sequence 138
    Log actively being archived by another process
    Mon Nov 26 21:42:45 2007
    ARC0: Completed archiving log 3 thread 1 sequence 138
    Mon Nov 26 21:45:12 2007
    Starting control autobackup
    Mon Nov 26 21:45:56 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0033'
    handle 'c-2861328927-20071126-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Tue Nov 27 21:23:50 2007
    Starting control autobackup
    Tue Nov 27 21:30:49 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0280'
    handle 'c-2861328927-20071127-00'
    Tue Nov 27 21:30:57 2007
    ARC1: Evaluating archive log 2 thread 1 sequence 139
    ARC1: Beginning to archive log 2 thread 1 sequence 139
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_139
    .dbf'
    Tue Nov 27 21:30:57 2007
    Thread 1 advanced to log sequence 140
    Current log# 1 seq# 140 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
    Tue Nov 27 21:30:57 2007
    ARCH: Evaluating archive log 2 thread 1 sequence 139
    ARCH: Unable to archive log 2 thread 1 sequence 139
    Log actively being archived by another process
    Tue Nov 27 21:30:58 2007
    ARC1: Completed archiving log 2 thread 1 sequence 139
    Tue Nov 27 21:30:58 2007
    Thread 1 advanced to log sequence 141
    Current log# 3 seq# 141 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Tue Nov 27 21:30:58 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 140
    ARCH: Beginning to archive log 1 thread 1 sequence 140
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_140
    .dbf'
    Tue Nov 27 21:30:58 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 140
    ARC1: Unable to archive log 1 thread 1 sequence 140
    Log actively being archived by another process
    Tue Nov 27 21:30:58 2007
    ARCH: Completed archiving log 1 thread 1 sequence 140
    Tue Nov 27 21:33:16 2007
    Starting control autobackup
    Tue Nov 27 21:34:29 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0205'
    handle 'c-2861328927-20071127-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Wed Nov 28 21:43:31 2007
    Starting control autobackup
    Wed Nov 28 21:43:59 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0202'
    handle 'c-2861328927-20071128-00'
    Wed Nov 28 21:44:08 2007
    Thread 1 advanced to log sequence 142
    Current log# 2 seq# 142 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Wed Nov 28 21:44:08 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 141
    ARCH: Beginning to archive log 3 thread 1 sequence 141
    Wed Nov 28 21:44:08 2007
    ARC1: Evaluating archive log 3 thread 1 sequence 141
    ARC1: Unable to archive log 3 thread 1 sequence 141
    Log actively being archived by another process
    Wed Nov 28 21:44:08 2007
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_141
    .dbf'
    Wed Nov 28 21:44:08 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 141
    ARC0: Unable to archive log 3 thread 1 sequence 141
    Log actively being archived by another process
    Wed Nov 28 21:44:08 2007
    ARCH: Completed archiving log 3 thread 1 sequence 141
    Wed Nov 28 21:44:09 2007
    Thread 1 advanced to log sequence 143
    Current log# 1 seq# 143 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
    Wed Nov 28 21:44:09 2007
    ARCH: Evaluating archive log 2 thread 1 sequence 142
    ARCH: Beginning to archive log 2 thread 1 sequence 142
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_142
    .dbf'
    Wed Nov 28 21:44:09 2007
    ARC0: Evaluating archive log 2 thread 1 sequence 142
    ARC0: Unable to archive log 2 thread 1 sequence 142
    Log actively being archived by another process
    Wed Nov 28 21:44:09 2007
    ARCH: Completed archiving log 2 thread 1 sequence 142
    Wed Nov 28 21:44:36 2007
    Starting control autobackup
    Wed Nov 28 21:45:00 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0202'
    handle 'c-2861328927-20071128-01'
    Clearing standby activation ID 2873610446 (0xab47d0ce)
    The primary database controlfile was created using the
    'MAXLOGFILES 5' clause.
    The resulting standby controlfile will not have enough
    available logfile entries to support an adequate number
    of standby redo logfiles. Consider re-creating the
    primary controlfile using 'MAXLOGFILES 8' (or larger).
    Use the following SQL commands on the standby database to create
    standby redo logfiles that match the primary database:
    ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
    ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
    Thu Nov 29 21:36:44 2007
    Starting control autobackup
    Thu Nov 29 21:42:53 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0206'
    handle 'c-2861328927-20071129-00'
    Thu Nov 29 21:43:01 2007
    Thread 1 advanced to log sequence 144
    Current log# 3 seq# 144 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
    Thu Nov 29 21:43:01 2007
    ARCH: Evaluating archive log 1 thread 1 sequence 143
    ARCH: Beginning to archive log 1 thread 1 sequence 143
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_143
    .dbf'
    Thu Nov 29 21:43:01 2007
    ARC1: Evaluating archive log 1 thread 1 sequence 143
    ARC1: Unable to archive log 1 thread 1 sequence 143
    Log actively being archived by another process
    Thu Nov 29 21:43:02 2007
    ARCH: Completed archiving log 1 thread 1 sequence 143
    Thu Nov 29 21:43:03 2007
    Thread 1 advanced to log sequence 145
    Current log# 2 seq# 145 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
    Thu Nov 29 21:43:03 2007
    ARCH: Evaluating archive log 3 thread 1 sequence 144
    ARCH: Beginning to archive log 3 thread 1 sequence 144
    Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_144
    .dbf'
    Thu Nov 29 21:43:03 2007
    ARC0: Evaluating archive log 3 thread 1 sequence 144
    ARC0: Unable to archive log 3 thread 1 sequence 144
    Log actively being archived by another process
    Thu Nov 29 21:43:03 2007
    ARCH: Completed archiving log 3 thread 1 sequence 144
    Thu Nov 29 21:49:00 2007
    Starting control autobackup
    Thu Nov 29 21:50:14 2007
    Control autobackup written to SBT_TAPE device
    comment 'API Version 2.0,MMS Version 5.0.0.0',
    media 'WP0280'
    handle 'c-2861328927-20071129-01'
    Thanks
    Shaan
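    For what it's worth, the grep above fails because the time sits between the day and the year in every timestamp line (e.g. "Mon Nov 26 21:42:43 2007"), so the literal pattern " Nov 23 2007" can never match. A hedged sketch against the format shown in the sample (the file name is taken from the thread; adjust the date as needed):
    # match the day and the year separately; this returns only the timestamp lines
    grep "Nov 23 .* 2007" alert_sid.log > alert_date.txt
    # to also keep the log lines that follow each matching timestamp, toggle a flag on
    # every timestamp line and keep printing while the current entry belongs to Nov 23 2007
    awk '/^(Mon|Tue|Wed|Thu|Fri|Sat|Sun) /{p = /Nov 23 .* 2007/} p' alert_sid.log > alert_date.txt
    Piping through tail first (as in the original command) still works if you only want the most recent part of the file.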

  • How to extract data for two particular members of the same dimension.

    As per the requirement, I need to export data for certain members of a dimension. Let's say we need data for two account members, A and B, which are in the Account dimension but are not direct children. I also need the data for all the available years. Please suggest how my DATAEXPORT command should look.
    When I use an AND statement it does not work accordingly. Say I fix on the years 2007 and 2009, but the output file comes out for 2009 and 2010.
    Something else happens when I fix on OPEX_31 and OPEX_32: the values come out not only for OPEX_31 and OPEX_32 but for many more accounts too.
    Here is my DATAEXPORT statement for your reference:
    SET DATAEXPORTOPTIONS
    {
    DataExportLevel "ALL";
    DataExportColFormat ON;
    DataExportDimHeader ON;
    DataExportOverwriteFile ON;
    };
    FIX("LC","Total_Year","ESB1","2009","SIERRA","COSTCENTER_NA","CELLULAR_NA","OPEX_31",
    "January","February","March","April","May","June","July","August","September","October","November","December");
    DATAEXPORT "File" "     " "D:\exports\feb.txt";
    ENDFIX;
    I need data for OPEX_31 and OPEX_32 for all the available years, from 2001 to 2025.
    Please suggest what modifications are needed to get the desired result.
    Thanks in advance

    Hi,
    There are a few different options you can use for fixing on the months and years,
    e.g. FIX(January:December)
    or FIX(@CHILDREN(YearTotal)) <- depends on what the parent of the months is.
    The same goes for the years:
    FIX(2009:2025)
    or
    FIX(@CHILDREN(Year))
    If your period dimension is dense you can always use it as the column header, e.g. DataExportColHeader "Period", and then fix on the accounts you require.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How to lock the data for a few periods in the input layout

    Dear All,
    I have an input-enabled layout in which I display the data based on a period range (for example, 01.2007 to 12.2007).
    Now, based on the current period, I should disable input readiness for the periods earlier than my current period.
    In my example, if my current period is 06.2007, input readiness should be disabled for the periods 01.2007 to 05.2007.
    How can I achieve this?
    Regards,
    Srini.

    Hi Srinivas,
    You can use a user exit variable for the disabled one (the period one), based on the current period.
    You can read about it here:
    http://help.sap.com/saphelp_nw04/helpdata/en/1d/ca10d858c2e949ba4a152c44f8128a/frameset.htm
    Hopefully it helps you a lot.
    Regards,
    Niel
    thanks for the points you choose to assign.

  • Result Analysis values for a particular period/year

    Hi Experts
    We are in a customer project; the tenure of the project is around 2 to 3 years. The company follows accounting standard AS-7, under which we need to recognize revenues in the same period in which the cost is incurred. To achieve that, we have planned the cost and revenue periodically for the years 2011-2012 and 2012-2013. But when I run RA (KKA2), the results show the planned revenue and cost as the sum of all the years, so the POC is based on the overall planned cost and not on an annual (period) basis.
    Our requirement is that when I run RA for a given period of a particular year, RA should show the planned cost and revenue for that particular period/year, and the rest of the values (POC, WIP, reserves, etc.) should be calculated accordingly.
    Please suggest how to approach this.
    Thanks in advance.

    I think it is possible to meet your requirement through the valuation method settings.
    Please read the SAP Help on RA.
    Follow the path below to access the RA help:
    SAP Help Portal >> Financials >> Controlling >> Product Cost Controlling >> Cost Object Controlling >> Product Cost by Sales Order >> Period-End Closing in Product Cost by Sales Order: Scenario >> Result Analysis

  • MCTC report not visible for a particular period.

    Hi.
    I am unable to see data in transaction MCTC.
    Figures are visible for the month of November.
    Figures are not visible for the month of December.
    The MCTC report itself doesn't fetch any data for the month of January.
    Does any configuration need to be maintained?
    Both the customer and the material are marked with a statistics group.
    Reg.
    Amol

    Hi Amol,
    Do as suggested above: go to User settings > Parameters and, in the selection proposal, make sure the "With current period" checkbox is ticked. Also make sure that the material has a material statistics group maintained in Sales Org. 2. Please let me know if you need any more information on this.
    Regards,
    Ram Pedarla

  • Quarter End date for this period

    Hi
    I have the current date. How can I get the quarter end date for this date?
    For example:
    for Jan 23 2005, the quarter end date is Mar 31 2005.

    needs testing
    import java.util.Calendar;
    import java.text.SimpleDateFormat;
    class Testing
    {
      public Testing()
      {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy/MM/dd");
        Calendar cal = Calendar.getInstance();
        // move into the first month of the next quarter...
        cal.add(Calendar.MONTH, 3 - (cal.get(Calendar.MONTH) % 3));
        // ...then day 0 rolls back to the last day of the previous month, the quarter end
        cal.set(Calendar.DAY_OF_MONTH, 0);
        System.out.println(sdf.format(cal.getTime()));
        // same check with a fixed date: 15 Nov 2005 (months are 0-based) should print 2005/12/31
        cal.set(2005, 10, 15);
        cal.add(Calendar.MONTH, 3 - (cal.get(Calendar.MONTH) % 3));
        cal.set(Calendar.DAY_OF_MONTH, 0);
        System.out.println(sdf.format(cal.getTime()));
      }
      public static void main(String[] args) { new Testing(); }
    }

  • Carry forward data for next period

    Hi,
    Let's say A is transferring an asset to B with a profit, so there will be
    an unrealized profit that has to be eliminated in group reporting. Let's say I manage to eliminate this in BCS,
    so the BCS retained earnings will be reduced by this unrealized profit, right?
    When the balance is carried forward to the next period, the BCS brought-forward retained profit is the profit after we eliminate the unrealized profit.
    When we load from the data stream again, the brought-forward retained earnings in R/3 will still include the unrealized profit, right?
    How do we get the unrealized profit to be eliminated permanently?
    Edited by: shahrul yusof on Feb 25, 2008 4:50 AM

    Well, Shahrul,
    The balance carryforward is described rather well here:
    http://help.sap.com/saphelp_sem60ep1/helpdata/en/53/01ad3c99c90455e10000000a114084/frameset.htm
    Remember that all data loads (apart from manual entries) happen at posting level PL = 00. Eliminations are done at PL = 20 or 30.
    And during the balance carryforward, the data that becomes the new opening balance goes to period 000.
    What's bothering you?
