Datamart from a compressed cube

Hi All,
Is it possible to create an export DataSource on a cube that has been compressed, and then load the data into another cube?
I get the following error when I do this:
"No corresponding requests were found for request REQU_3YQUNVFWZQRR87ZPBRYDHE97C in the ODS/Cube"
Please suggest if you have any ideas.
Best Regards,
Sanjay

Hi Sanjay,
Yes, it is possible to create an export DataSource on a cube that is compressed.
Is it an error or just a warning?
Bye
Dinesh

Similar Messages

  • Rollup and Datamart concepts for compressed cube

    Hi all,
    I loaded the data into a cube and compressed it (collapse).
    I then have 2 questions:
    1. Can I use the rollup option now?
    2. Can I load the data into another cube from the compressed cube?
    Good answers will be rewarded.
    Regards
    Ram

    Hi Sri,
    You are right that we do rollup to fill the aggregates, and rollup works per request: a request is not available for reporting from the aggregates until it has been rolled up.
    When you compress, the request IDs are deleted and the data moves from the F table to the E table, so no request IDs remain at all; without a request there is nothing left to roll up. So do the rollup before compression.
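    For illustration, here is a minimal ABAP sketch of how to watch that effect yourself. The InfoCube name ZSALES is made up; the fact-table names simply follow the usual /BIC/F<cube> and /BIC/E<cube> pattern:
    REPORT z_check_compression.
    * Minimal sketch (assumed cube name ZSALES): compression moves rows
    * from the uncompressed F fact table into the compressed E fact
    * table, where the request (package dimension) key is collapsed to 0.
    DATA: lv_tab    TYPE tabname,
          lv_f_rows TYPE i,
          lv_e_rows TYPE i.

    lv_tab = '/BIC/FZSALES'.            " requests still visible here
    SELECT COUNT( * ) FROM (lv_tab) INTO lv_f_rows.

    lv_tab = '/BIC/EZSALES'.            " compressed, request ID = 0
    SELECT COUNT( * ) FROM (lv_tab) INTO lv_e_rows.

    WRITE: / 'Uncompressed rows (F table):', lv_f_rows,
           / 'Compressed rows (E table):  ', lv_e_rows.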
    Regards
    Khaja

  • Problem with 0RECORDMODE in datamart from ODS to Cube

    Hi all,
    I need to build a datamart from a cube to an ODS. When I create the update rules (in 3.5), it tells me "the InfoObject 0RECORDMODE does not exist in the InfoSource". That is fine, because the cube does not need 0RECORDMODE, but the strange part is that everything I need to build already exists elsewhere and works. I do not know how that consultant managed to do it. How can I do this? Help, guys.

    Hi,
    Just create the export DataSource for this InfoCube, then create the flow from cube to ODS. The delta from cube to ODS works on a request basis, so the ODS will be updated according to each new request loaded into the InfoCube.
    Thanks,
    Kamal

  • Deleting data from a compressed cube

    Hi Gurus,
    I have an InfoCube which is getting data from three different DSOs (DSO1, DSO2, DSO3),
    and I have aggregates on the cube; the whole process is automated in a process chain.
    Now I have to delete only the data in the cube that is loaded from DSO2.
    How can I achieve this?
    Thanks
    MSS:-)

    Hi,
    You can do selective deletion, but it depends on whether there is a field that differentiates the data from DSO1, DSO2, and DSO3;
    based on that field you can make the selective deletion (a rough sketch follows below).
    For dynamic selective deletion, go to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/4001f4b1-189f-2e10-bcb3-87bb0fb428ba
    Another option is to create an InfoSet/MultiProvider based on DSO1 and DSO3 only.
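    As a rough ABAP sketch only: RSDRD_SEL_DELETION is the function module commonly used behind selective deletion, but verify its interface in SE37 on your release. The cube name ZCUBE01 and the differentiating characteristic ZSRC_DSO = 'DSO2' are made-up examples:
    * Sketch: selectively delete the records identified as coming from
    * DSO2, assuming a characteristic ZSRC_DSO carries that information.
    DATA: l_thx_sel TYPE rsdrd_thx_sel,
          l_sx_sel  TYPE rsdrd_sx_sel,
          l_s_range TYPE rsdrd_s_range,
          l_t_msg   TYPE rs_t_msg.

    l_s_range-sign   = 'I'.
    l_s_range-option = 'EQ'.
    l_s_range-low    = 'DSO2'.
    APPEND l_s_range TO l_sx_sel-t_range.
    l_sx_sel-iobjnm = 'ZSRC_DSO'.
    INSERT l_sx_sel INTO TABLE l_thx_sel.

    CALL FUNCTION 'RSDRD_SEL_DELETION'
      EXPORTING
        i_datatarget = 'ZCUBE01'
        i_thx_sel    = l_thx_sel
      CHANGING
        c_t_msg      = l_t_msg.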
    Best Regards
    Obaid

  • DTP from Compressed cube

    HI Gurus
    I have a partially compressed cube: some of the requests are compressed and the rest are not.
    I am using this cube as a datamart source for a similar cube (used for backing up data).
    I am using delta with the "request by request" option in the DTP.
    After the data load, the data does not match: the target cube has higher figures.
    Is this because of using "request by request" (in the DTP), since I have some compressed requests in the source cube?
    Thanks
    Dheeraj

    Hi Dheeraj,
    If you are moving data from one cube to another, it is best to compress all requests first and then move the data to the other cube; that way you should not run into this problem.
    The mismatch you are seeing is due to using "request by request" (in the DTP): if you set this indicator, a DTP request only gets data from one request in the source.
    When it completes processing, the DTP request checks whether the source contains any further new requests; if it does, a new DTP request is automatically generated and processed.
    If you are not loading deltas, uncheck this indicator and load again.
    Regards
    SKBABU

  • Datamart works for Compressed data

    Hello
    We created a datamart for the billing cube and need to send the data to a new billing cube.
    My question: the old cube, for which the datamart is created, has some compressed records.
    So when I load the data from the datamart to the new cube, will I get the compressed records too?
    Do we need to load the compressed records differently?
    If anyone has faced this, please share a solution; it would be very helpful for me.
    Thanks in advance

    Hello,
    When you load data from one cube to another, all records will be passed: compressed and not compressed.
    Furthermore, if you are creating a new cube and need to load all the data from the old cube into it, I recommend compressing the old cube fully, updating the indexes, and creating statistics on it; the selection and fetching of data will then be faster.
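    For the index part, a hedged ABAP sketch: RSDU_INFOCUBE_INDEXES_REPAIR is the function module usually cited for rebuilding an InfoCube's secondary indexes, but check the name and interface in SE37 on your release first (ZBILLING is a made-up cube name):
    * Sketch only: rebuild the secondary indexes of an InfoCube after a
    * full compression. Verify the FM interface before relying on it.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = 'ZBILLING'.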
    Hope this helps...
    Olivier.

  • Semantic partitioning delta issue while loading data from DSO to cube (BW 7.3)

    Hi All,
    We have created the semantic partitioning with the help of a BAdI to perform the partitions. Everything looks good;
    the first time, I loaded the data with a full update.
    The second time, I initialized the data load and pulled the delta from the DSO to the cube. The DSO is standard, whereas the cube is partitioned with the help of semantic partitioning. What I can see is that only the records updated in the latest delta show up in the report; all the rest are ignored by the system.
    I tried compression, but it still did not work.
    Has someone faced this kind of issue?
    Thanks

    Yaseen & Soujanya,
    It is very hard to guess the problem with the amount of information you have provided.
    - What do you mean by "the cube is not being loaded"? Are no records extracted from the DSO? Is the load completing with 0 records?
    - How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
    - Is there data already in the InfoCube?
    - Is there change log data for DSO or did someone delete all the PSA data?
    Since there are so many reasons for the behavior you are witnessing, your best option is to approach your resident expert.
    Good luck.
    Sudhi Karkada

  • Backup from PA to cube in APO 5.0

    Hello friends,
    I have a small query: how do I load data from a planning area (PA) to a cube? These are the steps I created:
    1. Create the DataSource in the planning area screen
    2. Create the InfoPackage
    3. Create the InfoCube
    4. Create the transformation
    5. Create the DTP (data transfer process)
    But I am not able to get the data into the cube for backup purposes. If I am wrong, please correct me and give me your suggestions.

    Sivaram,
    Something I found in a BI thread:
    1. The data flow is from BW to APO, and from APO to BW.
    From BW to APO, to feed the planning area with characteristic value combinations, we have an InfoCube in APO to feed the data; as we know, APO has BW inside (all the transactions such as RSA1 are valid). The datamart scenario is used here: we create the InfoCube in APO, the update rules, and the InfoSource, and assign the DataSource from BW.
    For the data flow from APO to BW, to get the plan data entered by users, we generate a DataSource in APO from the planning area (compare with 'generate export DataSource' from an InfoCube), using transaction /SAPAPO/SDP_EXTR. The 'extractor technology' here works in practice just like any other DataSource.
    Take a look:
    http://help.sap.com/saphelp_scm50/helpdata/en/c9/199170f13711d4b2f20050da385632/frameset.htm
    2. As mentioned in #1, you can use transaction /SAPAPO/SDP_EXTR to handle planning-area DataSources in APO, or /SAPAPO/MSDP_ADMIN (planning area administration): choose your planning area and use the menu Extractor Tools (or Generate DataSources). Again, the datamart scenario is used here. After the DataSource is generated in APO, it is replicated in BW (APO acts as the source system); the remaining steps are the same as when extracting data from any other SAP source system.
    Simply put, what you need to practice is /SAPAPO/SDP_EXTR to 'generate the export DataSource'.
    Check:
    http://help.sap.com/saphelp_scm50/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm
    The following is a very useful link:
    http://help.sap.com/saphelp_scm50/helpdata/en/e0/9088392b385f6be10000000a11402f/frameset.htm
    I believe you need to replicate your DataSources after generating them. Have you done that?
    Hope this helps,
    Abhi

  • Datamart status is not visible for a compressed cube

    Hi Experts,
    We are loading data daily into the BI inventory cube 0IC_C03 with compression, but when I load the same data into the SEM server, the init flag is not set there; I also checked the datamart status in BI but could not find it.
    So please help me understand why the datamart symbol is not found in BI and why there is no init flag for the same data in SEM.
    I think this might be related to the cube compression... please give me a complete solution.
    Thanks
    Pinky....

    hello,
    Try a couple of checks:
    Observe the values of the fields BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG, etc. in the PSA and see whether they are null or not.
    If they are not null, the process keys (0PROCESSKEY and 0BWAPPLNM) are activated; otherwise activate them. Take a look at note 353042.
    When loading to another target, do a full update, followed by an init without data transfer, followed by deltas,
    OR
    a regular init followed by deltas,
    OR
    first do an init without data transfer and then
    a repair full load; this loads all the data present at that instant in 0IC_C03,
    and the delta loads you run afterwards will bring in the deltas.
    Don't do any compression before the delta, because you will not get any delta records afterwards: the compression combines all of the non-compressed requests, so you will lose the data in unloaded deltas.
    When you compress after loading 2LIS_03_BX, leave 'No Marker Update' NOT set, i.e. unticked/unchecked.
    Regards,
    Dhanya

  • Different aggregation operators for the measures in a compressed cube

    I am using OWB 10gR2 to create a cube and its dimensions (deployed into a 10gR2 database). Since the cube has 11 dimensions, I set all dimensions to sparse and the cube to compressed. The cube has 4 measures: two of them have SUM as the aggregation operator for the TIME dimension, while the other two should have AVERAGE (or FIRST). I have SUM for all other dimensions.
    After loading data into the cube for the first time, I realized that the aggregation over the TIME dimension was not always correct (although it sometimes was). It was really strange, because the aggregated values were either correct (for SUM and for AVERAGE) or merely "near" the correct result (e.g. the average of 145.279 and 145.281 came out as 145.282 instead of 145.280, or 122+44+16 came out as 180 instead of 182). For all other dimensions the aggregation was OK.
    Now I have the following questions:
    1. Is it possible to have different aggregations for different measures in the same COMPRESSED cube?
    2. Is it possible to have the AVERAGE or FIRST aggregation operator for measures in a COMPRESSED cube?
    For a 10gR1 database the answer would be NO, but for a 10gR2 database I do not know. I could not find the answer, neither in the Oracle documentation nor somewhere else.

    What I found in an Oracle presentation is that in 10gR2 the compressed cube enhancements support all aggregation methods except weighted methods (first, last, minimum, maximum, and so on). It is from September 2005, so maybe something has changed since then.
    Regarding your question about the results, I think it is caused by the fact that the calculations are made on doubles and compression is then applied, so maybe a little precision is lost :(. I am really curious whether it is indeed due to numeric (precision loss) issues.

  • AVG aggregation in compressed cube

    We are using OWB10gR2 and have designed a compressed cube with eight dimensions, one of them is the TIME dimension with standard year and week hierarchy. The (single) measure is not summed over the time but we have to calculate the average over time.
    What the cube calculates is sometimes correct, but not always. I tried a bunch of things (having the time dimension with only one hierarchy, partitioning the cube or not, ...) to solve the problem, but without result. OLAP 10gR2 should be able to calculate an average in a compressed cube, so I assume the code OWB generates is not correct.
    Has anybody ever designed a compressed cube with an aggregation different from SUM? Is this possible using OWB?

    Hi,
    Even if you compress the cube, you can still delete data from it by selective deletion, based on your dates/characteristics, etc. See my blog:
    https://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Using Selective Deletion in Process Chains:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/using%20selective%20deletion%20in%20process%20chains.pdf
    Thanks
    Reddy

  • Updating data from ODS to cube

    hi all,
    I have 8 activated requests in an ODS. Now I want to load data into a cube from this ODS.
    Is it possible to update the requests one by one, so that I have the same number of requests in the cube as in the ODS?
    Thanks in adv.

    Hi
    If you want to do it like that, you have to create an InfoPackage which updates data from the ODS to the cube. Go to the datamart ODS in your InfoSources, create an InfoPackage, and you can give the request ID in the selection and load the requests one by one.
    Assign points if useful
    Thanks
    N Ganesh

  • Issue when uploading Sales data from DSO to Cube.

    Dear All,
    I have an issue when I am uploading sales data from a DSO to a cube. I am using BI 7.0 and have uploaded all sales-document-level data to my DSO. Then I use a transformation rule to calculate the sales value during the DTP to the cube. The cube holds customer-wise aggregated data.
    In the DSO I have NetPrice (KF) and Delivered_QTY (KF). I do a simple multiplication routine in the transformation from DSO to cube:
    RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY.
    At the moment I use the active table (without archive) of the DSO to get the data, since this is my first load.
    The issue is that the figure (sales value) in the cube is incorrect: I am getting very large values, which is impossible.
    Can someone please help me.
    Shanka

    Hi,
    Are you sure that the cube has customer-wise aggregated data? It will always aggregate the values of the key figures for the same set of characteristics.
    Did you check the other key figures as well: are they also inflated, or is the problem with this key figure only?
    During the data load, the records may be aggregated first for the same characteristic values, with the multiplication happening afterwards. If that is the case, you may have to multiply the values while the records are still in the data package and then let them aggregate; this can be achieved through a start routine (see the sketch below).
    But first verify whether the other key figures show the same issue.
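    A minimal sketch of such a start routine in a BI 7.0 transformation, assuming the field names NET_PRICE and DLV_QTY from the post and a made-up value field SALES_VAL that would have to exist in the source structure:
    METHOD start_routine.
    * Sketch only: compute the sales value per document record in the
    * start routine, before any aggregation can combine records that
    * share the same characteristic values. Field names are assumptions.
      FIELD-SYMBOLS <source_fields> TYPE _ty_s_sc_1.
      LOOP AT source_package ASSIGNING <source_fields>.
        <source_fields>-sales_val = <source_fields>-net_price
                                  * <source_fields>-dlv_qty.
      ENDLOOP.
    ENDMETHOD.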
    Thanks
    Ajeet

  • Delta records not updating from DSO to CUBE in BI 7

    Hi Experts,
    Delta records are not updating from the DSO to the cube:
    in the DSO the key figure value shows '0', but in the cube the same record shows '-1'.
    I checked the change log table of the DSO; it has 5 records:
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M -  -1
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M -   0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL -   0
    ODSR_4LIF02ZV32F1M85DXHUCSH0DL -   1
    ODSR_4LH8CXKUJPW2JDS0LC775N4MH -   0
    but the active data table has one record with value 0.
    How can I correct the delta load?
    Regards,
    Jai

    Hi,
    I think initially the value was 0 (ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0, new image in changelog) and this got loaded to the cube.
    Then the value got changed to 1 (ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0, before image & ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1, after image). Now this record updates the cube with value 1. The cube has 2 records, one with 0 value and the other with 1.
    The value got changed again to 0 (ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - (-1), before image &
    ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0, after image). Now these records get aggregated and update the cube with (-1).
    The cube has 3 records, with values 0, 1, and -1; the effective total is 0, which is correct.
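    For illustration, the arithmetic of those change-log images can be replayed in a few lines of ABAP (a sketch; the values are taken from the post):
    * Sketch: the five additive delta images from the change log, in
    * load order, must sum to the active-table value of 0.
    DATA: lt_delta TYPE STANDARD TABLE OF i,
          lv_image TYPE i,
          lv_total TYPE i.

    lv_image = 0.  APPEND lv_image TO lt_delta. " initial new image
    lv_image = 0.  APPEND lv_image TO lt_delta. " before image, 0 -> 1
    lv_image = 1.  APPEND lv_image TO lt_delta. " after image,  0 -> 1
    lv_image = -1. APPEND lv_image TO lt_delta. " before image, 1 -> 0
    lv_image = 0.  APPEND lv_image TO lt_delta. " after image,  1 -> 0

    LOOP AT lt_delta INTO lv_image.
      lv_total = lv_total + lv_image.
    ENDLOOP.
    WRITE: / 'Effective total in the cube:', lv_total. " 0, as in the DSO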
    Is this not what you see in the cube? Were the earlier requests deleted from the cube?

  • New field added to cube, delta DTP from DSO to cube is failing

    Dear all,
    The scenario in BI 7.0 is:
    DataSource --delta InfoPackages--> DSO --delta DTP--> Cube.
    Data is loaded daily using a process chain.
    We added a new field to the cube; the transformation from DSO to cube is active, and the transport was successful.
    Now the delta from DSO to cube is failing.
    Error is: Dereferencing of the NULL reference,
    Error while extracting from source <DSO name>
    Inconsistent input parameter (parameter: Fieldname, value DATAPAKID)
    My conclusion: the system is unable to load the delta due to the new field and wants us to initialize again (am I right?).
    Is my only choice to delete the data from the cube and perform the init DTP again, or is there another way?
    Thanks in advance!
    Regards,
    Akshay Harshe

    Hi Durgesh / Murli,
    Thanks for the quick response.
    @Durgesh: we have mapped an existing DSO field to a new field in the cube, so yes, in the DTP I can see the field in the filter. So I have to do a re-init.
    @Murli: everything is active.
    Actually there are further complications, as the cube has many more sources, so I wanted to avoid selective deletion.
    Regards,
    Akshay
