Loading Transfer Rules details into Info Cube/ODS - VVVVV urgent

Hi Guys,
I have a requirement to load all the active and inactive transfer rules into a cube, so that I can prepare a report showing which active and inactive transfer rules exist in our BW system.
Please help me out in this regard.
Thanks in advance
Peter B

Hi Varaprasad,
OK, can we make use of that table for reporting?
Or
I can create a cube based on that table, but I don't know how to load that table's data into the cube.
Please let me know ASAP.
Thanks in advance
Peter B
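A possible starting point is to read the activation status straight from the BW metadata tables and expose it via a generic DataSource. The sketch below is only a rough illustration, not a confirmed solution: the table RSTS and the fields TRANSTRU, OBJVERS and OBJSTAT are assumptions for a 3.x transfer structure header, so verify the exact table and field names in SE11 for your release first.
* Sketch only: list transfer structures / transfer rules by version and
* activation status. Table and field names (RSTS, TRANSTRU, OBJVERS,
* OBJSTAT) are assumptions - check them in SE11 before using this.
REPORT zlist_transfer_rules.

TYPES: BEGIN OF ty_ts,
         transtru TYPE c LENGTH 30,   " transfer structure name
         objvers  TYPE c LENGTH 1,    " A = active, M = modified
         objstat  TYPE c LENGTH 3,    " ACT / INA
       END OF ty_ts.

DATA: lt_ts TYPE STANDARD TABLE OF ty_ts,
      ls_ts TYPE ty_ts.

SELECT transtru objvers objstat
  FROM rsts
  INTO TABLE lt_ts
  WHERE objvers IN ('A', 'M').

LOOP AT lt_ts INTO ls_ts.
  WRITE: / ls_ts-transtru, ls_ts-objvers, ls_ts-objstat.
ENDLOOP.
Once the right table (or a view on it) is confirmed, a generic DataSource created in transaction RSO2 on that table/view is the usual way to stage such metadata into an ODS or cube for reporting.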

Similar Messages

  • I need the format for data in an Excel file to load into an InfoCube and on to a planning area.

    Hi gurus,
    I need the format for the data in an Excel file to load into an InfoCube and from there to a planning area.
    Can you tell me what I should maintain in the header?
    My understanding is that it looks something like:
    plant,location,customer,product,history qty,calendar
    100,delhi,suresh,nokia,250,2011211
    Can you explain whether this is right or wrong, and send me details about the Excel file format?
    babu

    Hi Babu,
    The file format should match the structure you want to upload: the column sequence of the file should be the same as the communication structure.
    For example:
    Initial columns with the characteristics (e.g. plant, location, customer, product),
    then the date column (check the date format, e.g. calendar day),
    and the last columns with the key figures (e.g. history qty).
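    A minimal sample layout in that order (the values and the YYYYMMDD date format are only illustrative; the date format must match what your InfoObject/transfer structure expects):
    PLANT,LOCATION,CUSTOMER,PRODUCT,CALDAY,HISTORY_QTY
    100,DELHI,SURESH,NOKIA,20110211,250
    100,DELHI,SURESH,SAMSUNG,20110211,310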
    Hope this helps.
    Regards,
    Nawanit

  • Data is not loaded into info cube

    Hi All,
    I have a custom-defined InfoCube with a full load that runs every month and usually takes barely an hour to finish.
    But this month the data came into the PSA successfully, and from there the load to the data target stopped at the update rules; none of the records are getting added to the InfoCube. We haven't made any changes to the update rules either.
    Can anybody let me know what might be the reason behind this? Thanks.
    Regards,
    Ashok
    Message was edited by: Ashok kaipu

    Hi Ashok,
    You can do the following:
    1. In the Monitor Status tab, turn the request Red.
    2. In the Details tab right click this data package and choose Manual Update.
    3. After the processing is done, the datapackage will be green but the overall request will still be red.
    4. In the Monitor Status tab, turn the request back to original status (this will make it green)
    Hope this helps...

  • Flat File: no data load into Info Cube

    Hi there,
    I am trying to load a flat file. When I simulate the upload it works well, but no data is loaded into my InfoCube. When I try to define a query, no data is available.
    Can someone provide me with a solution for this problem?
    With rgds
    Oktay Demir

    Hi Oktay,
    in addition to A.H.P.'s remarks, check whether
    - the data is posted not only into the PSA but also into the data target,
    - the update rules are active,
    - the monitor status in cube administration is green,
    - the request is available for reporting in cube administration.
    Cheers
    Sven

  • Abap insert into info cube results in ORA 14400

    Hi friends,
    In principle the situation is as follows:
    we have two cubes which are identical in design, in the number of InfoObjects, etc.
    Now we have the following code:
    * Then copy from the source tables
    * (facts last, because of the foreign-key relationship to the dimensions)
      CHECK P_KOP IS NOT INITIAL.
      COMMIT WORK.
    * Package dimension
      SELECT * FROM /BIC/DZHW_CUBE1P.
        CHECK /BIC/DZHW_CUBE1P-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1P TO /BIC/DZHW_CUBE2P.
        INSERT /BIC/DZHW_CUBE2P.
      ENDSELECT.
    * Time dimension
      SELECT * FROM /BIC/DZHW_CUBE1T.
        CHECK /BIC/DZHW_CUBE1T-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1T TO /BIC/DZHW_CUBE2T.
        INSERT /BIC/DZHW_CUBE2T.
      ENDSELECT.
    * Unit dimension
      SELECT * FROM /BIC/DZHW_CUBE1U.
        CHECK /BIC/DZHW_CUBE1U-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE1U TO /BIC/DZHW_CUBE2U.
        INSERT /BIC/DZHW_CUBE2U.
      ENDSELECT.
    * First user-defined dimension
      SELECT * FROM /BIC/DZHW_CUBE11.
        CHECK /BIC/DZHW_CUBE11-DIMID <> 0.
        MOVE-CORRESPONDING /BIC/DZHW_CUBE11 TO /BIC/DZHW_CUBE21.
        INSERT /BIC/DZHW_CUBE21.
      ENDSELECT.
      COMMIT WORK.
    * F fact table (this is where the ORA-14400 occurs)
      SELECT * FROM /BIC/FZHW_CUBE1.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2P = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1P.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2T = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1T.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE2U = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE1U.
        /BIC/FZHW_CUBE2-KEY_ZHW_CUBE21 = /BIC/FZHW_CUBE1-KEY_ZHW_CUBE11.
        /BIC/FZHW_CUBE2-/BIC/ZGEHALT   = /BIC/FZHW_CUBE1-/BIC/ZGEHALT.
        INSERT /BIC/FZHW_CUBE2.
      ENDSELECT.
    The problem is the INSERT statement. When the program reaches this step, we get a dump with the following message:
    Database error text: "ORA-14400: inserted partition key does not map to any partition"
    Now I assume, since this cube can normally be loaded with data, that it is not possible to store data in an InfoCube directly using a simple INSERT statement. Are there any appropriate function modules to condense data in a request and load it into an InfoCube?
    Kind Regards.
    Gideon.
    Message was edited by: Gideon Lenz

    Hi,
    we had a similar error and solved it with OSS note 339896. Although 2.0B is mentioned, it is applicable to our 3.0 as well. Please take a look, besides the note Vinod mentioned (note 509660 is also referenced there for the F fact table):
    339896
    Symptom
    During the parallel upload into InfoCubes, ORACLE error ORA14400 might occur in BW 2.0B.
    BW2.0B and BW2.1C both originate from the same BW technology basis. Thus 2.0B is a synonym for both releases.
    Other terms
    Partitioning, ODS, PSA, parallel loading, ORA14400
    Reason and Prerequisites
    The error can either occur during the insert into the "PSA table" ( /BIC/B00....) or during the insert into the "F-fact table" ( "/BI*/F<INFOCUBE>" ).
    If the error occurs when writing to the F-fact table, please refer to Note 509660.
    If the error occurs when writing to partitioned PSA tables, an inconsistency exists between administration table "RSTSODS" and the partitions in the database.
    Solution
    1.) As of Patch 22, CHECK Transaction "RSRV" allows to check the consistency of PSA tables and repair them, if required. (If no name is specified, the CHECK is carried out for all PSA tables.)
    2.) Patch < 22: Please check whether function module "RSDDCVER_PSA_PARTITION" exists in your system.
    - If yes: Start it for the corresponding PSA table using i_repair = 'X'. Then, the inconsistencies should be eliminated.
    - If not: The inconsistencies have to be repaired manually!
    Among all partitions of the PSA table, determine the partition with the highest "HIGH VALUE" and compare it to the entry in the PARTNO field of table "RSTSODS":
    ==> Transaction SE16 ---> 'RSTSODS' ---> filter on ODSNAME_TECH with the PSA table name.
    Error ORA-14400 occurs if the entry in the RSTSODS table is higher than the highest partition. To solve the problem, release table RSTSODS in the SAP DD so that you can change it via transaction SE16, then change the entry in the PARTNO field to the value of the highest 'HIGH VALUE'. If the PSA table is empty, enter the value '2' in the PARTNO field of table "RSTSODS".
    Note that the inconsistency may exist in the SAP buffer only: table RSTSODS is "buffered completely". Before you change the table manually, submit the command "/$tab rstsods" in the OK code, which invalidates the table in the buffer. If the problem continues to exist, change the entry and invalidate the buffer again by submitting "/$tab rstsods".
    509660
    Symptom
    BW 2.0B and BW 2.1C are based on the same BW technology; 2.0B is therefore a synonym for both releases.
    The ORACLE error ORA-14400 can occur in BW 2.0B/2.1C when writing to InfoCubes.
    Other terms
    Partitioning, parallel loading, F fact table, ORA00054, ORA14400, ORA14074, ORA02149
    Reason and Prerequisites
    The error can also occur during writing to PSA tables or when ODS objects are activated; refer to note 339896 in this case. As of 2.0B, the F fact table in a BW/ORACLE environment is range-partitioned according to the package dimension. A new partition is created when a request is written to the F fact table. If creation of the partition is not successful, ORACLE error 14400 is issued ("inserted partition key is beyond highest legal partition key"). There are several causes for this:
    1. During loading, there are jobs running (such as ANALYZE TABLE ... or CREATE INDEX ...) which lock the table in the ORACLE catalog and take a very long time, so creating the partition on the database takes a very long time; in this case the ORA-00054 error (resource busy) is issued.
    2. During parallel loading, an error occurs because an optimistic locking concept is implemented in the update.
    Adding a partition is linked to writing the package dimension entry. If the loading process terminates in such a way that the dimension entry was written but the partition was not created, the second loading process terminates with ORACLE error ORA-14400.
    Solution
    As of patch BW 2.0B 15 / BW 2.1C 7, the creation of a new partition is protected by an SAP lock. If a lock is already present, the program tries to get the lock 100 more times and then continues. In the case of parallel loading processes, this optimistic lock approach repeatedly resulted in the error. For this reason, as of BW 2.0B patch 21 / BW 2.1C patch 13, the parameter _wait is set to True in the "RSTMPLWI" template when the enqueue function is called, and the loop counter is set to 500, so that the loaders wait longer before continuing.
    First select the test "Unused entries in the dimensions of an InfoCube" for the corresponding cube via transaction RSRV ==> InfoCube Data, and press "Eliminate error". The entry is then deleted in the dimension; during the next loading process, a new entry is generated and the partition is created.
    The correction instructions for BW 2.0B patches 15 - 20 / BW 2.1C patches 7 - 12 are attached. In this way, the programs are regenerated from the corrected template.
    If the error persists in spite of this change, check whether the F fact table is being analyzed or whether indexes are being generated on the F fact table at the same time as the load process.
    The program RSTMPLWI is an ABAP template from which the update programs are generated. ==> For this reason, automatic installation is not possible and you cannot run a syntax check!
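    As a small aid when following note 339896 above: the BW-side counter can be read with a couple of lines of ABAP. This is only a sketch; the PSA table name is made up, and ODSNAME_TECH / PARTNO are the fields the note itself refers to. The database-side HIGH VALUE of the partitions still has to be checked on Oracle (for example via the dictionary view USER_TAB_PARTITIONS in DB02 or SQL*Plus).
    * Sketch: read the partition counter BW keeps for a PSA table so it can
    * be compared with the highest partition HIGH VALUE on the database.
    * '/BIC/B0000123000' is a made-up PSA table name.
      DATA: ls_rstsods TYPE rstsods.
      SELECT SINGLE * FROM rstsods
        INTO ls_rstsods
        WHERE odsname_tech = '/BIC/B0000123000'.
      IF sy-subrc = 0.
        WRITE: / 'PARTNO in RSTSODS:', ls_rstsods-partno.
      ELSE.
        WRITE: / 'No RSTSODS entry found for this PSA table.'.
      ENDIF.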

  • Sap BW, Info Cube/ ODS

    Hi Gurus,
    I am having a small problem with loading into a cube from two ODS objects.
    I am able to see only today's requests (two, one from each ODS), not the previous requests.
    The load is a full load from the two ODS objects,
    and no selection conditions have been given.
    What could be the reason that I see only the current (today's) requests? Even if I enter an earlier date, the previous requests are not shown.
    Could you please sort out my problem? It is urgent...
    Points would be rewarded...
    cheers.
    Surya

    Hi,
       If you go to the Manage tab, you will see the request display with a "from date of update" --- "to" filter. Enter the dates you want, and you will see the requests.
    thanks
    subbarao

  • Initial Loading of User Details into OIM 11g from Peoplesoft

    Hi,
    I want to pull both employee and contractor user details into OIM from PeopleSoft. Which connector should I use?
    PSFT User Management or PSFT Employee Reconciliation?
    Initially the users don't exist in OIM; I am looking for something like an initial load.
    Regards,
    Ashok

    Hi Kevin,
    In the earlier version of the PS ER connector (Oracle® Identity Manager Connector Guide for PeopleSoft Employee Reconciliation, Release 9.1.0, Section 3, Extending the Functionality of the Connector), there is guidance on how to extend the connector to pull additional attributes from PeopleSoft by using PeopleCode.
    But in the Oracle® Identity Manager Connector Guide for PeopleSoft Employee Reconciliation, Release 11.1.1, I am not able to find this information anymore,
    i.e. how to pull additional attributes from PeopleSoft such as department, division and location, which are custom attributes in PS.
    Please help!
    Many thanks

  • Steps to create Data loading from Flat File to Info Cube in BI

    Hi,
     I am very new to BW and need some help. When I try to create an InfoSource, I get a pop-up window asking whether to create it for transaction data or master data.
     After creating the InfoSource, I don't know how to assign it to the source system (which I created).
     When I open the context menu of the InfoSource, I don't have the option to assign the DataSource.
     And one more thing: when I create the InfoCube, I can't understand how to proceed.
     Please, could someone help me with how to map the fields to the source system?
    Regds
    Dave.

    Hi,
    For flat file upload, first you need to create the source system.
    Then you need to create the InfoSource based on the format of the flat file you are going to upload (or vice versa, depending on your requirements).
    Once your InfoSource is ready, right-click on it and select "Assign DataSource". Here you can assign your flat file DataSource. Then create an InfoPackage and, in the InfoPackage, give the path from which the file is to be uploaded to BW.
    Also look at the link below for the procedure on flat file upload:
    http://help.sap.com/saphelp_nw04s/helpdata/en/8e/dbe92341c84242be2c7d3917f1c197/frameset.htm
    Cheers,
    Kedar

  • Error while loading data from DSO to info cube

    Hi All,
    When I run the DTP from the DSO to the cube, I get the following errors:
    1.  Data package processing terminated
         Message no. RSBK229
    2.  'Processed with Errors'
         Message no. RSBK257
    3. Error while updating to target 0FIGL_C10 (type INFOCUBE)
        Message no. RSBK241.
         These are the errors I am getting. I am new to BI, so how can I solve them?
         All the transformations are active with no errors.
    Thanks in Advance
    Regards,
    Mrigesh.
    Edited by: montz2006 on Dec 7, 2009 8:45 AM

    Hi All,
    Now I deleted the data from the DSO and the DataSource and ran the DTP again.
    This time I got data from the DataSource into the DSO, and when I run the DTP between the DSO and the cube I do not get an error.
    But when I go to the cube and check the data, the request has arrived but no data has been transferred.
    The request status is green in the cube, but the data is not coming in.
    Regards,
    Mrigesh.

  • Adding a new key figure to Info Cube

    Can I add a new key figure to an existing InfoCube into which data is already loaded?
    Assume that in the InfoCube we have the following:
    Stu Id -- Characteristic
    Maths , Physics , Chemistry -- Key figures
    These InfoObjects already exist, and transfer rules are already available for them in the InfoSource.
    Could anyone let me know how to add one more key figure (total), which is the sum of the three marks, to the InfoCube and populate data into it?
    I have tried the following steps, but could not get it to work:
    1). Create a new key figure InfoObject (total).
    2). Add it to the communication structure, pull it into the transfer rules of the existing InfoSource, and activate them.
    3). Add the new key figure to the InfoCube.
    4). Open the update rules for the InfoCube and change the update type from "No update" to "Addition" for the new key figure (total).
       When I perform step 4, a red symbol appears beside the key figure in the update rules and I'm not able to activate them.
    Any help on how to add key figures to update rules and transfer rules is highly appreciated, and points would be assigned.
    Thanks

    1. Add the new key figure to the InfoCube.
    2. Go to the update rules, go to the update type of that key figure, select "Formula" as the update method, create a new formula, and there add up your three key figures --> OK.
    3. Activate the update rules and the InfoProvider, and check that everything is active.
    By generating an export DataSource, you can populate the new key figure with historical data as well; afterwards you have to delete the old historical data requests.
    Or you can create the formula in the Query Designer, as Srinivas suggested; that would be the better option.
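    If you prefer a routine to the formula builder, the body of a 3.x update routine for the new key figure could look like the sketch below (the InfoObject names ZMATHS, ZPHYSICS and ZCHEM are hypothetical):
    * 3.x update routine for the new "total" key figure.
    * COMM_STRUCTURE, RESULT, RETURNCODE and ABORT are provided by the
    * generated routine frame; the /BIC/... names here are made up.
      RESULT = COMM_STRUCTURE-/bic/zmaths
             + COMM_STRUCTURE-/bic/zphysics
             + COMM_STRUCTURE-/bic/zchem.
    * 0 = no error; leave ABORT initial so the data package continues.
      RETURNCODE = 0.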

  • Loading InfoObject master data into Cube

    Hi Guys,
    I have a question. I created an InfoObject as a data target and loaded data into it. I then created a cube and want to include this InfoObject in the cube. Now, how do I load data into this cube from the InfoObject? The InfoObject doesn't have any option to select data targets, since it is master data.
    Data is being loaded into the InfoObject, but when I check the cube there is no data.
    How do I load this master data into the cube without update rules?
    Thanks in advance.

    Hi,
    I think you need to load some value into the transaction data so that you can report on it. But how are you going to decide which row gets which value from the master data? In general, the value should come from the transaction data, or you could assign a constant value to it.
    Do you want to use an attribute of the InfoObject in the report? If you have a key value for it coming from the transaction data, you can use the rule type "read from master data".
    It depends on your requirement. Since you want to restrict the values of the cube that go into the report, you should include this object in the cube. In any case, if you want to restrict the transaction data in the cube by either the value of the object or by its attributes, you will have to add this object to the cube.
    As you said, you can load master data and use it in a report, but to actually use it you will have to include it in the cube. Also check whether this object is already an attribute of some other object in the cube; in that case you can make it a navigation attribute and then use it for restrictions in the cube. Only in that case do you not need to add it to the cube; if it is not an attribute of any object, then you must add it to the cube.
    You have to get the value of your InfoObject into the cube from the DataSource, even if you only want to use its attributes. Adding it won't affect the transaction data: the transaction data loaded into the cube stays separate from the master data, and only the SIDs are matched during loading.
    You should load the master data into the 0RECONTRACT InfoObject separately with the master data source, and make the respective objects contract type and contract category navigational attributes there. You should also add 0RECONTRACT to the cube and update it from the transaction data source, as you have said it can be updated. Then just do the normal restriction at the report level.
    You can make the display attribute navigational, first in the master data object and then in the cube, and then use it to restrict in the cube.
    You can use Category as an InfoObject in your cube, and while loading data for it you can use the rule type "read from master data". It will take the key of the InfoObject and update the attribute value in your cube.
    http://help.sap.com/saphelp_nw04s/helpdata/en/e5/f913426908ca7ee10000000a1550b0/content.htm
    You can also restrict by attribute value: in the restriction of your InfoObject in the Query Designer, select the bottom-left option "Display Other Values"; there you can select your attribute and its value for the restriction as well.
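    If the rule type "read from master data" is not available in your scenario, a routine can read the master data (P) table directly. The snippet below is only a sketch: ZCONTRACT, its attribute ZCATEGORY and the table /BIC/PZCONTRACT are made-up names, and the value in lv_contract would come from the routine's source structure (COMM_STRUCTURE-... in 3.x update rules, SOURCE_FIELDS-... in a 7.x field routine).
    * Sketch: read a time-independent attribute from the master data (P)
    * table of a characteristic. All /BIC/... names are hypothetical.
      DATA: lv_contract(10) TYPE c,
            lv_category(4)  TYPE c.
      SELECT SINGLE /bic/zcategory
        FROM /bic/pzcontract
        INTO lv_category
        WHERE /bic/zcontract = lv_contract
          AND objvers        = 'A'.        " active master data version
      IF sy-subrc <> 0.
        CLEAR lv_category.                 " no master data found
      ENDIF.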
    Regards,
    Hareesh

  • Load to Info cube - Error

    I am trying to load from a DSO to an InfoCube. The records don't have errors, but the process always gets terminated and gives the following message:
    Error while updating to target ZPU_C01 (type INFOCUBE)
    Processing Terminated
    It is not able to create an error DTP either, since there are no erroneous records.
    Any suggestions?

    Hi,
    Have you debugged the request? In the DTP monitor, click on "Debugging" at the top.
    How is system performance? Check for short dumps in ST22 and the system log in SM21.
    Regards,
    Debjani

  • Loading in Info Cube takes huge time

    Hi,
    We are loading transactional data in the info cube ZFI_C01  from a DSO ZFI_O05
    We loaded around 1.8 million records (1,776,444) through a request into the DSO ZFI_O05; it took around 13 minutes including activation of that request. (Note that the flag "SIDs Generation upon Activation" was checked, so SIDs were generated for all the characteristics in the DSO during activation.)
    When we loaded the same request to the cube ZFI_C01, the request took around 3 hours to finish.
    I ran the RSRV checks for the InfoCube ZFI_C01 to find out the ratio between the fact table and the dimension tables
    (go to RSRV -> All Elementary Tests -> Database -> Database Information about InfoProvider Tables -> enter the cube name and check the log; it gives the ratio between the fact table and the dimension tables).
    I got the following results for the two of the dimensions that are involved in the cube:
    Table /BIC/DZFI_C013 has 1564356 entries; size corresponds to 86% of the InfoCube
    Table /BIC/DZFI_C012 has 1649990 entries; size corresponds to 91% of the InfoCube
    When I checked the properties of both these dimensions, the checkboxes "High Cardinality" and "Line Item Dimension" were unchecked.
    I cannot check the "Line Item Dimension" checkbox, as both of these dimensions contain more than one characteristic.
    Shall I check the "High Cardinality" checkbox and retry the load, since the ratio of the dimension table to the fact table size is more than 20%?
    But I am a bit unclear what impact it will have on reporting.

    Hi there,
    Check whether you have any routines with code (start routine, end routine, etc.) that could hurt load performance.
    Also check in SM50, while the load to the InfoCube is running, whether there are many reads on table NRIV and/or on some specific dimension tables. If so, find out which number range objects belong to those dimension tables, cancel the running load, go to transaction SNRO and set a buffer size of 500 for those number range objects, then repeat the load.
    See if that helps,
    Diogo.

  • Error regarding data load into Essbase cube for Measures using ODI

    Hi Experts,
    I am able to load metadata for dimensions into the Essbase cube using ODI, but when we try the same for loading data for measures we encounter the following errors:
    Time,Item,Location,Quantity,Price,Error_Reason
    '07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    Can anyone look into this and reply quickly, as it is an urgent requirement?
    Regards,
    Rohan

    We are having a similar problem. We're using the IKM SQL to Hyperion Essbase (DATA) knowledge module. We are mapping the actual data to the field called 'Data' in the model. But it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is, um, data; not a dimension member. Has anyone else encountered this?
    Sabrina

  • Error in Uploading the data in Info Cube

    Hi,
    We are implementing SCM 5.0 APO and ECC 6.0 in the UK/I client.
    At the moment we are facing a problem uploading an Excel file into an InfoCube, which was working perfectly earlier.
    Through analysis we found that the data flows through the PSA, but when we try to push that data from the PSA manually, the system gives the following error message:
    Runtime Errors: UNCAUGHT_EXCEPTION
    Exception: CX_RSR_X_MESSAGE
    What happened?
    The exception 'CX_RSR_X_MESSAGE' was raised, but it was not caught anywhere along the call hierarchy.
    Since exceptions represent error situations and this error was not adequately responded to, the running ABAP program 'SAPLRRMS' has to be terminated.
    Error analysis
    An exception occurred which is explained in detail below.
    The exception, which is assigned to class 'CX_RSR_X_MESSAGE', was not caught and therefore caused a runtime error.
    The reason for the exception is: No text available for this exception.
    Missing Handling of Application Exception
    Program: RSABW_START_NEW
    Trigger Location of Exception
    Program: SAPLRRMS
    Include: LRRMSU13
    Row: 78
    Module type: (FUNCTION)
    Module Name: RRMS_X_MESSAGE
    Could somebody give advice on this? We were able to upload the data perfectly before.
    Thanks
    Regards
    Jignesh.

    Hi Jignesh,
                      Is this the first time you are loading into the cube? Is it a fresh installation?
    The error message doesn't help much, but I would try a couple of things:
    1. Replicate the DataSources in the source system, activate the transfer and update rules, and then try to load again.
    2. Try loading only up to the PSA and then simulate the update to the cube. If you are using BI 7.0, just run the InfoPackage; it will load to the PSA, and the cube is not updated unless you run the DTP.
    3. This can also be a GUI issue. Make sure that you have all the GUI files installed. Is it a dump with the message "CX_RSR_X_MESSAGE"?
    Check Note 763203 - Termination during InfoCube realignment.
    Hope this helps. Please don't hesitate to ask more questions.
