Data Processing in CUBE & DSO

Hi Experts,
Please update me in detail, if possible with examples, on key fields and dimensions,
how data records are processed in a DSO and a cube,
and what 0RECORDMODE is in a DSO.
I tried to search but wasn't able to find what I was looking for.

Dear User,
KeyFields: essentially like primary keys. If an incoming record has the same key field combination as an existing one, the corresponding data field values are overwritten.
Dimension: a collection of LOGICALLY RELATED InfoObjects; an angle from which the data in the fact table is viewed.
0RECORDMODE: used to maintain the change history. It carries the record's image type (for example after image ' ', before image 'X', new 'N', additive 'A', delete 'D', reverse 'R'), which is how delta records are identified.
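For illustration, here is a minimal ABAP sketch (plain demonstration code, not BW runtime logic) of the difference between the two behaviours: an internal table with a unique key mimics the DSO's overwrite of data fields, while COLLECT mimics the additive update of a cube's key figures.

REPORT zdemo_overwrite_vs_add.

TYPES: BEGIN OF ty_rec,
         doc_no TYPE c LENGTH 10, " acts like a DSO key field
         amount TYPE i,           " acts like a key figure
       END OF ty_rec.

DATA: lt_dso  TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY doc_no,
      lt_cube TYPE STANDARD TABLE OF ty_rec,
      ls_rec  TYPE ty_rec.

" First record for document 4711
ls_rec-doc_no = '4711'.
ls_rec-amount = 100.
INSERT ls_rec INTO TABLE lt_dso.  " DSO: new key combination -> inserted
COLLECT ls_rec INTO lt_cube.      " cube: new key combination -> inserted

" Second record with the SAME key combination
ls_rec-amount = 250.
MODIFY TABLE lt_dso FROM ls_rec.  " DSO: data field overwritten -> amount 250
COLLECT ls_rec INTO lt_cube.      " cube: key figure added up   -> amount 350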
Regards,
Ram.

Similar Messages

  • Problem while processing TRANSACTION data from DSO to CUBE

    Hi Gurus,
    We are facing a problem while processing transaction data from a DSO to a cube: the data packets are processing and updating very slowly. Please help me with this.
    Thanks and regards,
    Sridhar

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (give the BI request name); it should show the details of the request. If it is active, make sure the job log is being updated at frequent intervals.
    2) SM66: get the job details (server name, PID, etc. from SM37) and check whether the job is running. See if it is accessing/updating some tables or not doing anything at all. If it is running and you can see it active in SM66, you can wait for some time to let it finish.
    3) RSMO: see what is available in the details tab. It may be stuck in the update rules.
    4) ST22: check whether any short dump has occurred.
    You can also try SM50 / SM51 to see what is happening at the system level, such as reading from or inserting into tables.
    If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased.
    Thanks,
    JituK

  • Data Extraction failure from DSO to Cube

    Hello Experts,
    The data load to the cube (zfiar_c03) from the DSO (zfiar_03) using a full update is failing with the error message "Incomplete update due to errors in single records". We deleted the failed request from the InfoCube and reloaded the same request from the PSA after correcting or removing the # values, but it failed again with the same error.
    When we checked the previous successful requests loaded into the cube from the DSO, they also contain # values, yet those loads were successful.
    Our BW reporting is seriously impacted by this failure because collections, receivables and other business-critical reports are based on this cube.
    Request you all to help.
    thanks,
    Sapta

    Hi Sapta Jyothi,
    I am not sure what the actual problem is, but from prior experience I feel this is related to the process chain configuration, provided you are using process chains.
    If you are using a process chain, please check whether any other step in the chain has failed. If that is the case, then even after the failure due to invalid entries is fixed, the QM status of the load to the cube will not turn green until the process chain status turns green or yellow. Check the details tab of the InfoPackage load run to see whether the process chain log step is red. If so, you will have to take action to get the other steps completed successfully. The process chain should also be modelled so that when a load to a cube fails, successive steps such as the creation of cube indices are not triggered.
    Thanks ,
    Nithin.

  • Not able to delete Cube & DSO data

    Hi Team,
    I am trying to delete data requests from the upper and lower data providers (cube and DSO respectively).
    First I tried to delete the cube data using the context menu option Delete Data. It generated a "Time Exceeded" dump, so I manually selected the request in the cube's Manage screen and deleted it.
    Later, when I tried deleting the data of the DSO (which updates the above cube) via the same context menu option, I got a similar dump, so I manually tried deleting the topmost request, and it fails with the following log:
    Delete running: DataStore object FIGLO02B, from 304,053 to 304,053
    Delete is scheduled; Selection conditions were substituted
    FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_REQUDEL_ODSO; row 000396
    Request '304,054'; DTA 'FIGLO02B'; action 'D'; with dialog ''
    Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    No records found for request ODSR_AKJPD3YPNDJMTLCTHHM0Z01VX in change log /BIC/B0000111000 of DataStore FIGLO02B
    Deletion of request REQU_F1RX9B16C7MNKCHC4AYPLI0LP from data target FIGLO02B failed
    Deletion of request REQU_F1RX9B16C7MNKCHC4AYPLI0LP from data target FIGLO02B failed
    No records found for request ODSR_AKJPD3YPNDJMTLCTHHM0Z01VX in change log /BIC/B0000111000 of DataStore FIGLO02B
    I then noticed that the cube data is still not deleted. Also, the table /BIC/B0000111000 is empty.
    Please suggest how I should delete the data from the DSO and the cube, as I want to perform a fresh init.

    Hi
    Try deleting the data requests from the tables directly:
    Active data: /BIC/A<ODS_NAME>00
    New data: /BIC/A<ODS_NAME>10 in BW 2.x, /BIC/A<ODS_NAME>40 in BW 3.x
    Change log: search table RSTSODS for the string "8<ODS_NAME>" in field ODSNAME; the name of the change log is in field ODSNAME_TECH (see the lookup sketch below).
    Other ODS-related tables that may help:
    RSODSACT          Activation of ODS M records
    RSODSACTREQ       Activation table of M requests
    RSODSO_REQUCPY    Assignment of requests to template
    RSODSO_ROLLBACK   Table for temporary storage
    RSODSO_RUNTIME    Results of runtime measurement
    RSODSREQTSTMP     Sorting of ODS requests
    RSODSSETTINGS     Settings for an ODS
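    If you prefer to look up the change log programmatically, here is a small sketch based on the RSTSODS fields named above (VERSION handling simplified; FIGLO02B is the DSO from your log):

    " Sketch: find the technical name of the change log of DSO FIGLO02B
    " via table RSTSODS (ODSNAME = '8' + DSO name, as described above).
    DATA lv_chglog TYPE rstsods-odsname_tech.

    SELECT SINGLE odsname_tech
      FROM rstsods
      INTO lv_chglog
      WHERE odsname = '8FIGLO02B'.

    IF sy-subrc = 0.
      WRITE: / 'Change log table:', lv_chglog.
    ELSE.
      WRITE: / 'No change log entry found.'.
    ENDIF.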
    Hope it helps.

  • Do Not Check Uniqueness of Data in Write Optimised DSO

    Hello,
    I am working with a write-optimized DSO that already holds a billion records. The flag 'Do Not Check Uniqueness of Data' is checked in its settings (so it definitely does not check the uniqueness of data). I am thinking of removing this flag and activating the DSO again, because removing it provides the option of a request ID input in LISTCUBE on the DSO; without that input, LISTCUBE never comes back with results (and I have to analyze aggregations).
    I tried removing the flag and activating a DSO with 17 million records in the production system, which took 5 minutes (index creation). So the maths says that for a billion records the activation transport will take around 6 hours when moving to production.
    Questions:
    How does this flag check the uniqueness of a record? Will it check the active table or the index?
    To what extent will the DTP slow down subsequent data loads?
    Are there any other factors/risks/precautions to be considered?
    Let me know if the questions are not clear or further inputs are required from my side.
    Thanks.
    Agasti

    Hi,
    Please go through this blog:
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
    As far as your questions are concerned, I hope the blog above answers most of them; if it doesn't, please read the thread mentioned below:
    Use of setting "Do Not Check Uniqueness of Data" for Write Optimized DSO
    Regards
    Raj

  • Data load to Cube

    Hi All,
    I am new to SAP BI. I am trying to load data from a DataSource to a DSO and then from the DSO to a cube.
    The data arrives in the DSO, but when I run the DTP from the DSO to the cube, the data does not reach the cube.
    There is no error status message while executing the DTP, and there are no filters either.
    I deleted the data from the DataSource and ran the process again; the data again comes into the DSO, but not into the cube.
    Can anybody help me with this?
    Thanks in Advance,
    Regards,
    Ramesh

    Hi,
    Try activating the data in the DSO: go to the Manage screen of your DSO, select the request and click Activate. Wait for the request to be activated and then start the DTP for the cube.
    Hope this helps.
    Thanks,
    Nitant

  • How to create a process chain with DSO and InfoCube at the same time

    Hi friends,
    I have a flat file on the application server that I need to load into a standard DSO; I have created an InfoPackage and a DTP for the DSO.
    I have a standard InfoCube to be loaded from the above DSO (the DSO is the source). I want to schedule this whole process in a process chain. How can I do this?
    Could anyone also explain the options in RSPC, how they work and where exactly they are used? That would be very helpful. Please give me guidance, thank you.
    thanks,
    sree

    First make sure you have already created the InfoPackage to load the data from the server to the DSO and the DTP to load data from the DSO to the cube. Then:
    1) Go to transaction RSPC.
    2) Choose the Create Process Chain option and provide an appropriate technical name and description.
    3) Drag the "Execute InfoPackage" process type from the left-hand pane and choose the InfoPackage that loads the data from the server to the DSO (you can find it using the F4 help in the selected process type).
    4) Drag the "Data Transfer Process" process type to load the data from the flat file DataSource above to the DSO.
    5) Drag the "Activate DataStore Request" process type and select the DSO into which you load the data with the above DTP.
    6) Drag the "Data Transfer Process" process type to load the data from the DSO to the cube.
    7) Save, activate and schedule the process chain at an appropriate time.
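    Once the chain is activated, you can also start it programmatically, which is handy for testing. A hedged sketch, assuming the standard API function module RSPC_API_CHAIN_START is available in your release (ZMY_CHAIN is a placeholder chain name):

    " Sketch: start an activated process chain from ABAP.
    DATA lv_logid TYPE rspc_logid.

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZMY_CHAIN'   " placeholder: your chain's technical name
      IMPORTING
        e_logid = lv_logid
      EXCEPTIONS
        failed  = 1
        OTHERS  = 2.
    IF sy-subrc = 0.
      WRITE: / 'Chain started, log ID:', lv_logid.
    ENDIF.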

  • Error while loading data into the cube

    Hi,
    I loaded data into the PSA, and when I load the data to the cube through a Data Transfer Process I get an error (red status).
    Through "Manage" I can see the request in red. How do I find out the exact error, and what could be the possible reasons for it?
    Also, can someone explain the Data Transfer Process (outside a process chain)?
    Regards,
    Sam

    Hi Sam,
    After you load the data through the DTP (after clicking the Execute button), go to the monitor screen and press the Refresh button; you will find the logs there. Alternatively, on the request screen, next to the request number, there is a log icon you can click.
    What a DTP is: the Data Transfer Process is used to transfer data from the PSA to a data target, i.e. to load data into data targets or InfoProviders such as a DSO or a cube.
    Check this link for DTP:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    In your case, the problem may be with date formats, special characters, and the like.
    Regards,
    @JAY

  • Logical Data Partitioning of an InfoCube

    Hi,
    We have legacy system data in BW, about 25 million records spanning 6 years, organized by fiscal year/period.
    Currently the data is held in 6 DSOs, and it is a one-time load to the cube; data loads won't happen on a daily basis because the legacy system is decommissioned.
    How do we logically partition the InfoCube?
    Thanks

    Hi Choudary,
    One way is as follows: convert the update process (for 0VENDOR) to flexible update. For this you will need to make 0VENDOR an InfoProvider (RSD1 > Master data tab). Then create update rules from the 0VENDOR_ATTR InfoSource to the 0VENDOR InfoObject. Since you now have the 0VENDOR_ATTR InfoSource, you can also create update rules from this InfoSource to your cube and load the data.
    Hope this helps...

  • Where is the subsequent-processing option for a cube or a DTP loading to the cube?

    Hi guys,
    I'm looking for the subsequent-processing option in the DTP or in the cube, so that an event is triggered when a request (data) is loaded into the cube.
    Thank you!
    Loed

    Hi Loed,
    I assume you want to trigger an event once data is loaded into a cube/DSO, and another load should start after this event completes.
    For this, first create an event in transaction SM64 (ZTEST in my case) and use the same event in the request tab of the cube: click on "Subsequent Processing" and enter the event name there. Once a data load into this cube completes, the ZTEST event will be triggered on its own.
    Use the same event to trigger a process chain: in the process chain's start variant, use the ZTEST event.
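    To test whether the chain really reacts to the event, you can raise ZTEST manually in SM64 or from ABAP. A minimal sketch using the standard function module BP_EVENT_RAISE (exception names from memory, so treat them as an assumption):

    " Raise background event ZTEST (same effect as triggering it in SM64).
    CALL FUNCTION 'BP_EVENT_RAISE'
      EXPORTING
        eventid                = 'ZTEST'
      EXCEPTIONS
        bad_eventid            = 1
        eventid_does_not_exist = 2
        eventid_missing        = 3
        raise_failed           = 4
        OTHERS                 = 5.
    IF sy-subrc <> 0.
      WRITE: / 'Event not raised, sy-subrc =', sy-subrc.
    ENDIF.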
    Regards
    Shabnam

  • Load ODS data into multiple cubes

    I have an ODS that holds global data. I need to load the ODS data into multiple cubes based on region and year. Delta is set up between the ODS and the cubes. How do we go about doing this in 3.x and in 7.0, and is there any difference in the procedure between the two versions?
    Regards,
    Ram.

    Hi Ram,
    In BI 7.0 you need to create separate transformations from your DataStore object (DSO) to each of the cubes. For each transformation you then create a Data Transfer Process (DTP). In the DTP you can select data based on region and year.
    DTP data selection:
    1. Open the DTP.
    2. Go to the Extraction tab.
    3. There is a "Filter" button next to the extraction mode.
    4. Using this Filter button you can define data selections on different fields.
    In BW 3.x you need to create update rules instead of transformations. The following link should give you more details:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3f/0e503c3c0d563de10000000a114084/frameset.htm
    Regards, Uday

  • Data getting added in DSO

    Hi,
    I have a DSO that receives data from a cube. In the transformation (cube -> DSO) the rule type is "Direct Assignment", aggregation is "Overwrite", and there is no start or end routine.
    Cube data:
    Planning Area  Plant  Country  Customer Group  Child Material  Parent Material  Quantity
    xxx            yy     z        ABC             100             10               250
    xxx            yy     z        ABC             100             20               150
    xxx            yy     z        ABC             100             30                50
    DSO data (Planning Area, Plant, Country, Customer Group and Child Material are key fields; Quantity is a data field):
    Planning Area  Plant  Country  Customer Group  Child Material  Quantity
    xxx            yy     z        ABC             100             450
    Ideally, when the data flows from the cube to the DSO, the DSO data should be overwritten when the key fields are the same, but in my case the quantity is getting added up. I don't understand why; please guide me.

    Hi,
    If you need separate records in the DSO as in your example (without adding across parent materials), then load two different plant values into the InfoCube and see the result: for example, load Plant '100' and Plant '200' into your InfoCube first and then load these two records into your ODS. The records will not get aggregated, and you will find two records in your ODS. But if you want only the last record, you may need to write a routine in your update rules. See also the sketch below.
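    To see the two behaviours side by side with the numbers from this thread, here is a plain ABAP illustration (demonstration code only, not BW runtime logic): with overwrite only the last record per key survives (quantity 50), while summation yields the 450 you observed.

    TYPES: BEGIN OF ty_rec,
             child_mat TYPE c LENGTH 10, " the collapsing DSO key
             quantity  TYPE i,           " the data field
           END OF ty_rec.

    DATA: lt_overwrite TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY child_mat,
          lt_summed    TYPE STANDARD TABLE OF ty_rec,
          ls_rec       TYPE ty_rec.

    ls_rec-child_mat = '100'.
    " The three cube records (250, 150, 50) collapse onto ONE DSO key.
    DO 3 TIMES.
      CASE sy-index.
        WHEN 1. ls_rec-quantity = 250.
        WHEN 2. ls_rec-quantity = 150.
        WHEN 3. ls_rec-quantity = 50.
      ENDCASE.
      " Overwrite semantics: the last record wins -> quantity 50.
      INSERT ls_rec INTO TABLE lt_overwrite.
      IF sy-subrc <> 0.
        MODIFY TABLE lt_overwrite FROM ls_rec.
      ENDIF.
      " Summation semantics: quantities are added -> quantity 450.
      COLLECT ls_rec INTO lt_summed.
    ENDDO.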
    Hope it clarifies your doubt.
    Thanks.

  • Moving data to archive cube

    Hi Experts,
    We are extracting data from R/3 to BW. We keep 2 years of data in DSO1 and move it into Cube1 (R/3 -> DSO1 -> Cube1); we have the 2007 and 2008 data in the DSO and the cube. We extracted the 1999 to 2006 data into a history DSO from R/3 and sent it into an InfoCube (Cube2). The flow is as follows:
    Current data (2007 and 2008): R/3 -> DSO1 -> Cube1 (deltas are running)
    History data (1996 to 2006): R/3 -> DSO2 -> Cube2
    Now I want to move the 2007 data into the history objects (history DSO and cube).
    I have two options to get this job done:
    1. Move selective data from DSO1 to DSO2, and from DSO2 to Cube2.
    2. Move selective data from Cube1 to Cube2. In that case I can't see item-wise data; I can only see aggregated data in the cube.
    Is there any better approach than these two options? If not, which of the two would be best?
    Once I move the data into the history cube, I need to delete the data in the current DSO and current cube by selective deletion. If I delete the data in the DSO and cube, is there any impact on the current delta load?
    I need to do this every year because we want to keep only 2 years of data in the current InfoCube. Could anyone throw some light on the above issue?
    Thanks,
    Rani.

    Hi Rani,
    Your first question: is there a better approach than the two options, and if not, which of the two is best?
    Answer: First, one thing to clarify: in an InfoCube the data will not get aggregated until you aggregate the cube, so you can see the data item-wise in the cube as long as you don't aggregate it.
    Anyway, I think the best option is to follow the flow: move selective data from DSO1 to DSO2, and from DSO2 to Cube2.
    Secondly, you asked whether selective deletion in the current DSO and cube has any impact on the current delta load.
    Answer: Selective deletion is not going to affect the delta loads, so you can do the selective deletion. A delta load does get affected if someone deletes a delta request without first setting the QM status to red: then the init flag will not be set back and data will be lost.
    Hope this helps.
    Regards,
    Debjnai...

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have the below setup in our system:
    1. A planning BEx query with which the user makes certain entries and writes data back to the planning cube.
    2. An actual reporting cube which gets its data from the planning cube above.
    Now, what we want to do is automate the data load from the planning cube to the reporting cube.
    This involves two things:
    1. Changing the setting "Change real-time load behaviour" of the planning cube to Planning.
    2. Triggering the DTP which loads data from the planning cube to the reporting cube.
    We want to automate the above two steps.
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's Manage screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain, with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is triggering, and even if it does, I am not sure whether it will start the process chain automatically. Any ideas, please?

    Hi,
    Try doing the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    A sketch for automating the load-behaviour switch follows below.
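    For step 3 of the question (switching the real-time cube between planning and loading), a hedged sketch: it assumes the classic function modules RSAPO_SWITCH_TRANS_TO_BATCH and RSAPO_SWITCH_BATCH_TO_TRANS with an I_INFOCUBE parameter exist in your release, and ZPLANCUBE is a placeholder cube name.

    " Switch the real-time InfoCube to 'can be loaded' mode before the DTP runs.
    CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
      EXPORTING
        i_infocube = 'ZPLANCUBE'.   " placeholder: your planning cube

    " ... run the DTP here, e.g. as the next process chain step ...

    " Switch the cube back to planning mode afterwards.
    CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
      EXPORTING
        i_infocube = 'ZPLANCUBE'.

    Hope it helps.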

  • Deleting Data from SSAS CUBE

    Hello Everyone,
    I have a question. We developed a basic cube by building the dimensions and the fact from the source table (everything in the DSV; no physical dim and fact tables), and everything works well. The source table only ever holds the current day's data.
    My question: I need to delete a particular day's data from the cube without disturbing the existing data.
    E.g.: when I process the data on the 31st of March, the cube will have only the 31st. But when I process the 1st and then the 2nd, the cube should hold the 31st + 2nd data, and the data dated the 1st should be deleted from the cube. Whenever I process the data, I only need the month-end data to be kept along with the current day's data; everything else should be deleted from the cube.
    If I process the cube on the 1st of May, I should only have 31st March, 30th April and 1st May data.
    Hope the question is clear; please let me know.
    Any help/suggestions would be appreciated.
    Thanks In Advance.

    Hi BKomm,
    I guess the only way to handle this scenario is by using partitions:
    Create a partition for every last day of a month, plus one additional partition for the current date.
    The WHERE clause for the current-date partition should be something like this:
      WHERE Date = current_date AND Date <> last_day_of_current_month
    so that it does not duplicate data for the last day of the current month.
    Saurabh Kamath
    Hello Kamath, I was looking for something else in order to delete the existing data from the cube, but your approach is far better than making it complex; it did not occur to me. Thanks, I will implement it practically and check.
    Thanks Again.
