Time-dependent master data reorganization process seems to do nothing

Hi,
Info:
I created a process chain just to reorganize time-dependent master data for one InfoObject (0UCINSTALLA). In the process variant I chose to process only the time-dependent attributes, not texts or time-independent tables.
The InfoObject is flagged for the change run (it has some entries in the M version).
The InfoObject has the following properties:
/BI0/QUCINSTALLA = 9 393 539 records (4 162 in M version); the same goes for the Y-table. Those tables have just a few (3-4) fields.
The InfoObject 0UCINSTALLA exists in a few aggregates, but none of the time-dependent attributes do.
The system has these components:
SAP_ABA     700     0013
SAP_BASIS     700     0013
PI_BASIS     2005_1_700     0013
ST-PI     2005_1_700     0005
SAP_BW     700     0015
BI_CONT     703     0005
Running on Linux / Oracle 10.2.0.2.0
Problem:
The background job is started via the process chain, but when checking the
process it seems to do nothing. I ran it for 3 days over the weekend; it doesn't time out or produce any ABAP dumps or entries in SM21.
Checking the statement for the process in ST04 gives me:
SELECT
FROM
  "/BI0/QUCINSTALLA" T_00
WHERE
  EXISTS ( SELECT T_100 . "UCINSTALLA" FROM "/BI0/QUCINSTALLA" T_100
  WHERE T_100 . "UCINSTALLA" = T_00 . "UCINSTALLA"
GROUP BY
  T_100 . "UCINSTALLA" , T_100 . "OBJVERS" , T_100 . "CHANGED" ,
  T_100 . "IND_CODE" , T_100 . "UCBILL_CLA" , T_100 . "UCRATE_CAT" ,
  T_100 . "IND_NUMSYS" , T_100 . "IND_SEC"
HAVING
  COUNT(*) >= :A0 )
ORDER BY
  T_00 . "UCINSTALLA" , T_00 . "OBJVERS" , T_00 . "DATETO"
The Explain gives me:
SELECT STATEMENT ( Estimated Costs = 65 933 456 , Estimated #Rows = 0 )
       7 SORT ORDER BY
         ( Estim. Costs = 65 933 456 , Estim. #Rows = 4 )
         Estim. CPU-Costs = 101 899 203 438 143 Estim. IO-Costs = 56 497 939
           6 FILTER
             Filter Predicates
               1 TABLE ACCESS FULL /BI0/QUCINSTALLA
                 ( Estim. Costs = 20 562 , Estim. #Rows = 9 413 200 )
                 Estim. CPU-Costs = 3 825 943 517 Estim. IO-Costs = 20 208
               5 FILTER
                 Filter Predicates
                   4 HASH GROUP BY
                     ( Estim. Costs = 7 , Estim. #Rows = 1 )
                     Estim. CPU-Costs = 10 845 274 Estim. IO-Costs = 6
                       3 TABLE ACCESS BY INDEX ROWID /BI0/QUCINSTALLA
                         ( Estim. Costs = 6 , Estim. #Rows = 4 )
                         Estim. CPU-Costs = 45 379 Estim. IO-Costs = 6
                           2 INDEX RANGE SCAN /BI0/QUCINSTALLA~0
                             ( Estim. Costs = 3 , Estim. #Rows = 4 )
                             Search Columns: 1
                             Estim. CPU-Costs = 22 364 Estim. IO-Costs = 3
                             Access Predicates
Does anybody have an idea what the problem could be? I have no idea how long a reorganization of this number of records should take, but the process seems to do nothing.
Thanks,
Johan

A little update:
I could see that the background process (which seemed to do nothing) was running an Oracle process, and on the DB server this was no. 1 in top CPU. So something is happening after all. The DB server only has 2 CPUs. Is there any way to run this job with more than one process?

Similar Messages

  • Master data Reorganization Process for a single InfoObject

    Hallo Experts,
    maybe someone else has had this problem before us?
    SAP BW production: master data reorganization process for a single (self-made) InfoObject; other InfoObjects worked fine.
    SAP BW 7.0 SP20 or 21
    This InfoObject has daily attributes and the Q-table currently has 180 million entries. We started the reorganization with 250 million entries.
    Because of the daily loading, we cancelled the batch process in the evening and started it again in the morning. This worked fine for about 5 days; we eliminated 80 million entries.
    Our target is to end up with 80 million entries.
    Whenever we run this feature now - in a process chain - no entries are deleted even after 30 hours of running, and there are no counting statistics in SM50 for:
         Sequential Read (Sequentielles Lesen)
         Insert                       
         Update
         Delete
    During the five days before, we saw millions of entries in these statistics.
    We have checked the Q-table: there are a number of records that have the same attribute values across several records, and the date-from/date-to values of these records are consecutive.
    RSRV is fine, and we checked note 1234411 "Master data reorganization - Performance improvement".
    No other notes have helped us.
    Please give us an idea for a solution.
    Thanks
    Santra

    Hi Santra,
    Could you please provide the solution for this?
    Thanks
    Vinod

  • Question: how to have 2 different key dates to show time-dependent master data

    Hello All -
    I would kindly like to ask for your suggestions on this challenge.
    In our finance reports, e.g. the AR report, we have 2 year-to-date key figures: 1) current year to date and 2) previous year to date. We want the report to show the time-dependent master data navigation attribute based on these 2 year-to-date periods.
    I am thinking of having 2 key date selection parameters in the report, e.g. 1) current key date and 2) previous key date.
    The report will then show COLOR as per key date: if the posting belongs to 'Previous YTD', COLOR will be shown based on 'Previous Key Date', and if the posting belongs to 'Current YTD', COLOR will be shown based on 'Current Key Date'.
    The questions are:
    1) What customization is needed if we want to have 2 key dates in a query or report? If you have done this before, could you please share your experience?
    2) If you think there is another way, feel free to share it.
    Many thanks, I really appreciate your suggestions & advice.
    Cheers,

    Hello Cornelia -
    Thanks for your advice:
    1) Use an InfoSet for a temporal join between master data and transaction data.
    Comment:
    We are using a MultiCube for reporting. Are you saying that we should include the InfoSet in the MultiCube?
    2) Model the time-dependent master data as an InfoObject in the transaction data (cube).
    Comment:
    Thanks for this, I will analyse whether we should go with this option.
    By the way, I am also thinking of using a virtual characteristic, but the performance will be expensive, as with an InfoSet.
    Feel free to comment.
    Is there any other back-end option we can use, e.g. a BAdI exit?
    Appreciate your feedback,
    Thanks,

  • Activate time-dep. master data (0EMPLOYEE)

    Dear experts!
    I have loaded a lot of master data, and after some fruitless change runs the data technically seemed to be activated. One of the major challenges here is that we are talking about a lot of records (8 million). However, the Q- and Y-tables look like this:
    EMPLOYEE OBJVERS DATETO DATEFROM CHANGED
    14005099             A       2003.11.30 1000.01.01
    14005099             A       2003.12.31 2003.12.01      
    14005099             M       2004.01.31 2004.01.01 I    
    14005099             M       2004.02.29 2004.02.01 I    
    14005099             M       2004.03.31 2004.03.01 I    
    14005099             M       2004.04.30 2004.04.01 I    
    14005099             M       9999.12.31 2004.05.01 I    
    Thus, the data is actually still not active in BI until the M/I records have been converted into A/(blank) ones (regarding OBJVERS/CHANGED). Nevertheless, the DWH Workbench (RSATTR) says the InfoObject is active...
    How can I finally activate the InfoObject (we are still on BW 3.1)?
    Thanks
    Frank

    Alright folks,
    thanks for all the tips. In the end I found no other option than to activate the Q- and Y-tables my own way. Everything works fine.
    Frank
    REPORT  ZHR_UPDATE_MD.
    * set the 2005 M/I records (OBJVERS/CHANGED) to A/(blank) directly in the Q-table
    UPDATE /bi0/qemployee
    SET
      objvers = 'A'
      changed = ''
    WHERE
      objvers = 'M' AND
      changed = 'I' AND
      datefrom >= '20050101' AND
      datefrom <= '20051231'.
    COMMIT WORK.
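    (A hypothetical sketch, not from Frank's original post: since he mentions activating both the Q- and the Y-table, the analogous update for the time-dependent SID table might look like the lines below, mirroring the Q-table logic above. Table and field names assume the standard Y-table layout with OBJVERS, CHANGED and DATEFROM.)
    * Hypothetical counterpart for the Y-table /BI0/YEMPLOYEE (2005 records)
    UPDATE /bi0/yemployee
    SET
      objvers = 'A'
      changed = ''
    WHERE
      objvers = 'M' AND
      changed = 'I' AND
      datefrom >= '20050101' AND
      datefrom <= '20051231'.
    COMMIT WORK.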

  • Reorganization of time dependent master data

    I am using the reorganization of time-dependent master data and it does not merge overlapping periods as suggested by SAP. Any suggestions, please?
    http://help.sap.com/saphelp_nw04s/helpdata/en/c7/b05e3cc3a1a62ce10000000a114084/frameset.htm
    Best Regards
    Vikash

    Hi,
    you have used the process variant "reorganization of time dependent master data" in a process chain and it doesn't work?
    I don't think so.
    1. It works perfectly, so you can reduce the data in the master data table.
    2. You don't have overlapping periods - you have consecutive (one-after-another) time intervals. Adjacent intervals are only merged when all attribute values are identical.
    => Please check the attributes again. I think there is a change in one attribute.
    Sven

  • Master Data Delta Process?

    How do you update the master data on a regular basis?
    I did full updates for those as per the BP document. Should I have done deltas for those or should I just schedule the initial load to run at regular intervals?
    I will never have much master data; still, it seems silly to do a full load every time.

    Hi Stuart,
    You could go either way...first you would have to see if the extractor supports the delta loading process. Another thing you need to look into is the frequency of load, which would be guided by the frequency of change to the master data in the source system.
    We have deltas running for some major objects like customer, material, opportunities etc. A lot of them we manage with a weekly or fortnightly full load.
    Hope this helps...

  • Time-dependent master data in the cube and query

    Hello,
    I have a time-dependent master data InfoObject with two time-dependent attributes (one of them is a key figure). If I add this InfoObject to the cube, which time period's SID will be considered during the load? I assume it only matters during the load; if I add the KF to the query, it gets its value based on the SID in the cube... right?
    Thanks,
    vamsi.

    If it is a time-dependent master data object, then whenever you reload the master data any changed attribute values overwrite the old ones, so the master data table always holds the new value. The InfoCube itself only stores the SID of the master data InfoObject; at query execution that SID is used to look up the master data table, and whatever value is there at that moment is displayed.
    This is my experience.
    Thanks,
    Rajendra.A

  • Time Dependency Master Data Load

    Hi:
    This is my first time working with time-dependent master data. I need help!
    I first deleted the master data and cleaned up the PSA for material sales.
    Then I turned on the time-dependency flag on the InfoObject for material sales, saved it, and activated the object.
    Then I have problems loading the material sales attribute data in the PSA. The error is: every material record has "invalid 'to' date '00/00/0000'" and "invalid 'from' date '00/00/0000'".
    Did I do something wrong in this process? What is the correct process for working with time-dependent master data and loading the data?
    Thank you!!

    Hi,
    After turning on time dependence, you get an extra key field in the Q master data table, 'date to', plus 'date from' as a new field.
    These two fields need to be mapped to the 'date to' and 'date from' fields of the R/3 source as well.
    If there is no source field for these two, then you need to make sure the values for these fields are derived in some other way (see the sketch below).
    Just check whether you are getting any fields like that from R/3; right now I think you have left the mappings for these fields blank.
    Thanks
    Ajeet
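    Purely as an illustration (this sketch is not part of the reply above): if the R/3 source really delivers no validity dates, one minimal option is to return constant dates from the transfer/update rule routines for the two fields, so that every record covers the full time axis. RESULT is the standard return field of such a routine; the constants are assumptions, not taken from the thread.
    * Hypothetical routine body for the 'date to' rule - open-ended validity:
      RESULT = '99991231'.
    * Hypothetical routine body for the 'date from' rule - start of the time axis:
      RESULT = '10000101'.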

  • Loading time-dependent master data using update rules/transformations

    Hi
    I am trying to load time-dependent master data into an InfoObject. It seems that I get an error message about duplicate records if I use a transformation or update rule. Does this only work with direct update?

    In the DTP you have the option to ignore duplicate records ("Handle Duplicate Record Keys").
    Just select that and then load the data.

  • Upload time-dependent master data

    Hi Experts,
    I want to upload time-dependent master data on a monthly basis.
    a) What is the correct way to do so? I use an InfoPackage with full upload from the source system and a data transfer process in delta mode from PSA to the InfoObject.
    b) In case I need to repeat an upload for a previous month due to a changed data basis - what is the process for doing so?
    Best regards and greetings from Germany,
    Thorsten

    Hi,
    the flow is the same as with time-independent InfoObjects.
    In the case of time-dependent data, the most important thing is the data coming from the DataSource, since the key of the InfoObject's master data table always contains the 'valid to' date.
    So whenever you load the data, just make sure you are not loading records with overlapping validity; the valid-from/valid-to ranges must not overlap or the load will fail (e.g. 01.01.2014-30.06.2014 followed by 01.07.2014-31.12.9999 is fine, whereas 01.01.2014-31.12.2014 together with 01.06.2014-31.12.9999 overlaps).
    Also, as you asked, it works normally with delta if the source system generates the delta properly.
    If you want to repeat the load for a previous month because of changes, you just need to schedule the delta. Just make sure that your DataSource is delta-enabled and brings only the changed data from the source.
    Thanks
    Ajeet

  • 0CALDAY for time dependent master data lookup unknown when migrated to 7.0

    I am in the process of migrating a number of InfoProviders from the 3.x Business Content to the new methodology for BI 2004s.
    When I try to create a transformation from the update rules for 0PA_C01, all of the rules that use a master data lookup into 0EMPLOYEE give an error such as "Rule 41 (target field: 0PERSON group: Standard Group): Time char. 0CALDAY in rule unknown".
    How do I fix the transformation rule that is generated from the update rule for these time-dependent master data attributes?

    Hi Mark,
    have a look at http://www.service.sap.com/. I guess you need to implement some corrections or newer support packages.
    kind regards
    Siggi
    PS: take a look: https://websmp104.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=941525&_NLANG=EN

  • Data not uploading in Time dependent Master data Infoobject

    Hello All,
    I have a master data InfoObject for HR entity and have to load data from the PSA into that InfoObject.
    The HR entity InfoObject already has some data, like below:
    HR Entity   Version   Date from    Date To
    x           A         01.07.2013   31.12.9999
    x           A         19.04.2013   30.06.2013
    x           A         01.09.2012   18.04.2013
    x           A         01.01.2012   31.08.2012
    x           A         01.01.1000   31.12.2011
    Now the data in PSA is as follows:
    HR Entity   Start Date   End Date
    X           01.01.2012   18.12.2013
    Once I loaded this data into the InfoObject, I cannot see this value, which is the latest value for this HR entity.
    Can somebody please explain how the data gets loaded into a time-dependent master data InfoObject and why this entry is not getting loaded into the InfoObject.
    Regards
    RK

    Hi,
    did you activate the master data after your load?
    You can also check the version 'M' records and see if your record is there.
    Did the load finish green?
    The problem is that your entry overlaps all existing time intervals, which can't be deleted or merged as there may be dependent transactional data. You first have to delete the transactional data for this entity.
    Then you can delete the time-dependent data and reload it from your PSA.
    BW will then build the correct time intervals.
    The easiest option is to change the time interval in the PSA; see the example below:
    At the moment the time interval is not accepted. But you can add time intervals before 31.12.2011 and after 01.07.2013; the system will then create the remaining time intervals. E.g. if your new record is:
    HR Entity   Start Date   End Date
    X           01.08.2013   18.12.2013
    Result will be:
    HR Entity   Version   Date from    Date To
    x           A         19.12.2013   31.12.9999
    x           A         01.08.2013   18.12.2013
    x           A         01.07.2013   31.07.2013
    Regards, Jürgen

  • What is time dependent master data?

    Can anybody explain time-dependent master data to me in detail, with an example?
    Thanks in advance.
    Sharat.

    hi,
    the master data (attribute) value changes with respect to time.
    Say Salesregion is a characteristic that has Sales rep as a time-dependent master data attribute:
    salesregion   date from    date to      sales rep
    sr001         20/10/2007   20/12/2007   Ram
    sr002         21/12/2007   12/05/2008   Ram
    In the above example, Ram is assigned to two sales regions during different date ranges.
    These types of attributes are time-dependent.
    Usually the time period is defined within the date range 01/01/1000 to 31/12/9999.
    Ramesh

  • Modify code to pull the time dependent master data

    I fully understand the suggestion below for the requirement to add the time-dependent attribute company code.
    Thanks for the help, but please tell me whether there is a way I can modify the ABAP code so that the user enters the date (or key date from/to) on which the master data for company code should be pulled. How should I proceed - should I create a variable on 0DOC_DATE, and how do I modify the code? Please help. I have opened another question with the same description as above to assign points.
    thanks
    soniya
    The literal within <..> is supposed to be replaced by the actual field name (as I didn't know the fields). In this case, I am changing your code for costcenter/company-code.
    data: wa like /bi0/qcostcenter.
    * read the cost center attribute record that is valid on the key date (active version)
    select single * from /bi0/qcostcenter into wa
      where costcenter = comm_structure-costcenter
        and objvers   = 'A'
        and datefrom le comm_structure-<keydatefield>
        and dateto   ge comm_structure-<keydatefield>.
    if sy-subrc = 0.
      result = wa-comp_code.
    endif.
    abort = 0.
    You can use this code for the update rule of company code. You have to replace '<keydatefield>' with the name of a field that contains the date on which the company code is to be derived. If there is a date in your comm_structure (e.g. AEDAT) which you can use, you can specify that field in place of this literal (i.e. use comm_structure-aedat instead of comm_structure-<keydatefield>). If you have no such field and you wish to use the current date for getting the company code from the time-dependent master data, you can use sy-datum (i.e. replace comm_structure-<keydatefield> with sy-datum); a concrete variant is sketched below.
    And it should work.
    The 'master data attribute' option is one of the options when you create update rule (one of the radio button options).
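    For illustration only (this concrete variant is not spelled out in the reply above): the same lookup with sy-datum substituted for the key date field, i.e. the company code valid on the current date is read.
    data: wa like /bi0/qcostcenter.
    * read the cost center attribute record valid today (active version)
    select single * from /bi0/qcostcenter into wa
      where costcenter = comm_structure-costcenter
        and objvers   = 'A'
        and datefrom le sy-datum
        and dateto   ge sy-datum.
    if sy-subrc = 0.
      result = wa-comp_code.
    endif.
    abort = 0.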

    That is what the code is doing anyway.
    If your transaction data in the cube doesn't have a date, how does it know whether it is February data or March data?
    If it has a date or month field, you should modify and use this code to derive the company code based on that date instead of the system date.
    Other than that minor variation, it already does what you are looking for.

  • SAP FC: time-dependent master data

    Hi All,
    I'm implementing the new SAP RDS for Solvency II on SAP FC.
    I have an important question: is it possible to have time-dependent master data?
    For example:
    The period of the reporting ID is 06.2014.
    I upload a record with a fund with rating AAA (and period 06.2014).
    I create another reporting ID with period 09.2014.
    I update the master data for the same fund, which now has a new rating (BBB).
    I upload a record for the period 07.2014 with the same fund.
    When I execute the report for the period 06.2014 I can't see the old rating (AAA), only the latest one.
    Is it possible to keep the history of master data changes?
    Thank you in advance,
    Matteo

    Hi Matteo,
    In response to your question about keeping the history of master data changes, you can enable
    trace reports, which help the user track any object modifications that have occurred.
    Trace reports list any time a tracked object is created, modified, or deleted, and mention who performed the action.
    This option can be enabled in the Desktop General Options as follows:
    Select Tools > General Options.
    Select the Trace Reports tab.
    Activate Enable the trace report function.
    Click Select views to select the views whose operations you want to log.
    Deactivate the views whose operations you do not want to log.
    Click OK.
    Once this function is activated, all operations for the module objects you selected are logged in reports.
    Hope this is helpful.
    Kindest regards,
    Siwar
