Cube to Cube Data Transfer (Request by Request)

Dear All,
1. We have a cube, for example X (a transactional cube).
The requirement is:
a. Create a new cube, for example Y (a copy of the above cube).
b. Data in cube X should be transferred to cube Y (see the note below).
Important note: Data should be transferred from the source cube (X) to the target cube (Y) request by request. For example, if I have 100 requests in cube X, I should get all 100 requests into cube Y.
How is this possible?
Thanks in advance,
Best Regards,
Rama


Similar Messages

  • "get all new data request by request" after compressing source Cube

    Hi
    I need to transfer data from one InfoCube to another using the delta option request by request.
    I tried this when the data in the source InfoCube was not compressed, and it worked.
    Afterwards some requests were compressed, and since then the request-by-request delta transfers all the information to the target InfoCube in only one request.
    Do you know if this is normal behavior?
    Thanks in advance

    Hi
    The purpose of compression is to delete all the requests in the F table of your cube and move the data to the E table. After compression you no longer have request-by-request data.
    This is the reason you are getting all the data in a single request.
    'Get data request by request' only works if you do not compress the data in your cube (a small illustration of this follows below).
    If you want to know more about compression, check the link below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh.
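    To make the point concrete, here is a minimal Python sketch (illustrative only, not SAP code; the table layout and field names are invented) of why extraction can no longer split the data by request once it has been moved to the compressed E table:

# Hypothetical, simplified model of an InfoCube's F (request-level) and
# E (compressed) fact tables, only to illustrate the explanation above.
f_table = [
    {"request": "REQ_001", "material": "M1", "amount": 100},
    {"request": "REQ_002", "material": "M1", "amount": 50},
    {"request": "REQ_003", "material": "M2", "amount": 70},
]
e_table = {}  # compressed data: the request ID is gone

def compress(f_rows, e_rows):
    """Move rows from the F table to the E table, dropping the request ID."""
    for row in f_rows:
        key = row["material"]
        e_rows[key] = e_rows.get(key, 0) + row["amount"]
    f_rows.clear()  # requests are deleted from the F table

def extract_request_by_request(f_rows, e_rows):
    """A request-by-request extraction can only split data that still has a request ID."""
    per_request = {}
    for row in f_rows:
        per_request.setdefault(row["request"], []).append(row)
    packets = list(per_request.values())
    if e_rows:  # compressed data has lost its request IDs ...
        packets.append([{"request": None, "material": m, "amount": a}
                        for m, a in e_rows.items()])  # ... so it arrives as one block
    return packets

print(len(extract_request_by_request(f_table, e_table)))  # 3 packets before compression
compress(f_table, e_table)
print(len(extract_request_by_request(f_table, e_table)))  # 1 packet afterwards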

  • 'Get All New Data Request by Request' option not working between DSO and Cube

    Hi BIs,
    Could anyone please tell me why the options 'Get One Request Only' and 'Get All New Data Request by Request' are not working in a DTP between a standard DSO and an InfoCube?
    Scenario:
    I loaded the data year by year, FY 2000 to FY 2009, via InfoPackage into a write-optimized DSO (10 requests), then loaded it request by request into a standard DSO and activated each request. I selected the option 'Get All New Data Request by Request' in the DTPs, and it works fine between the write-optimized DSO and the standard DSO, but not between the standard DSO and the cube. When I execute that DTP, everything is pulled from the standard DSO into the cube as a single request (10 requests become one request).
    Regards,
    Sari.

    Hi,
    Which of the options below is your DTP extraction set to? It should be Change Log, assuming you are not deleting the change log data.
    Delta Init. Extraction from...
    - Active Table (with Archive)
    - Active Table (without Archive)
    - Archive (Full Extraction Only)
    - Change Log
    Also, if you want to enable deltas, please do not delete the change log. That could create issues with further updates from the DSO.
    Hope that helps.
    Regards
    Mr Kapadia
    *Assigning points is the way to say thanks*

  • Data Transfer Process and Delete Overlapping Requests

    Hi All,
    We are on BW 7.0 (NetWeaver 2004s). We are using the new data transfer process (DTP) and transformation. We want to use the ability to delete overlapping requests from a cube in a process chain. So let's say we have a full load from an R/3 system with fiscal year 2007 in the selection, using an InfoPackage. It gets loaded to the PSA. From there we execute the data transfer process and load it to the cube. We then execute the 'delete overlapping requests' functionality. My question is: will the DTP know that the InfoPackage selection was 2007, so that it only deletes requests with selections of 2007 and not 2006 from the cube? Basically, is the DTP aware of the selections that were made in the InfoPackage?
    Thanks,
    Scott

    Hi Everyone,
    Figured it out: on a data transfer process you can filter the selection criteria. Go to the extraction tab of the DTP and click on the filter icon. Enter your selection conditions for pulling from the PSA; these selection conditions will then be used to delete the overlapping requests from the cube (see the small sketch after this reply).
    Thanks
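    For readers new to the feature, here is a rough Python sketch (purely illustrative; names and structures are invented) of the 'delete overlapping requests' behavior described above. A newly loaded request removes only earlier requests whose selection conditions overlap with its own, so a 2007 load does not touch a 2006 request:

# Hypothetical model of 'delete overlapping requests': each request carries the
# selection it was loaded with (here just a fiscal year), and a new request
# deletes only earlier requests whose selection overlaps with its own.
cube_requests = [
    {"request": "REQ_2006", "selection": {"fiscal_year": "2006"}},
    {"request": "REQ_2007_OLD", "selection": {"fiscal_year": "2007"}},
]

def overlaps(sel_a, sel_b):
    """Two selections overlap if they agree on every field they both restrict."""
    shared = set(sel_a) & set(sel_b)
    return all(sel_a[field] == sel_b[field] for field in shared)

def load_with_delete_overlapping(requests, new_request):
    """Drop every existing request whose selection overlaps the new one, then add it."""
    kept = [r for r in requests if not overlaps(r["selection"], new_request["selection"])]
    kept.append(new_request)
    return kept

new_req = {"request": "REQ_2007_NEW", "selection": {"fiscal_year": "2007"}}
print(load_with_delete_overlapping(cube_requests, new_req))
# REQ_2006 is kept, REQ_2007_OLD is replaced by REQ_2007_NEW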

  • How to load request by request from DSO to Cube

    Hi All,
    I am loading data from a DSO to a cube. In the DSO each request has around 50 lakh (5 million) records. When I execute the DTP into the cube, after some time I get a timeout error (Basis has set the limit to 25 minutes). I also tried the DTP option 'Get All New Data Request by Request', but with that option the data still does not come through properly; again I get the timeout error.
    If anyone knows any other option, please let me know.
    Thanks for your help in advance
    Regards
    Sathiya

    Hi,
    You can split the load using filter criteria on the key fields of the DSO.
    Selecting fiscal periods in the DTP filter may also help.
    Hope this helps,
    Regards,
    Anil

  • Data recognition during transfer of data from cube to cube.

    Dear All,
    I have two cubes, where the second cube pulls data from the first. The first cube has data up to March 2007 and the second has pulled data up to February 2007. Due to a certain incident, the ticks under "Data Mart Status of Request" in the first cube have disappeared for all pulled data. How do I make these ticks reappear in the first cube, so that I can later pull the delta between the first cube and the second cube into the second cube?
    Yours truly,
    Ratish

    Hi Ratish,
    It seems that, due to a structural change, the initialization pointer has been disturbed in the system. In this scenario:
    1. Delete the data from the second cube (the one that fetches the delta from the first cube).
    2. Delete the initialization option for the data mart (export DataSource) generated by the first cube.
    3. Right-click on the first cube and select the option 'Update data into data target'.
    4. Select 'Initialize update'. From then on, the system should be able to deliver the delta to the second cube.
    Hope that helps.
    Regards
    Kapadia
    ***Assigning points is the way to say thanks in SDN.***

  • Diff between init with data transfer and repair full request

    Hi,
    I have observed that even in the new flow we do an init without data transfer and then a repair full request.
    If I do an init with data transfer, can't I achieve the same result?
    I want to know why we need to do it this way. Is there any advantage to doing an init without data transfer plus a repair full request?
    Please suggest.

    Hi Venkat,
    A repair full request is loaded in cases where you get erroneous records, or where there are missing records that you want to load. In such cases a repair full request for the selected records is the most efficient way to correct the data, since you are not required to load the entire data set again, and you achieve the desired result without disturbing the delta. (You are not required to do an init without data transfer after a repair full; people just do it as a precaution.)
    However, repair full requests should only be loaded into InfoProviders with update mode 'overwrite'. If the InfoProvider is additive, it is very probable that you will double the values of the key figures, because the same rows are added twice. So if your InfoProvider is additive, you will need to delete the entire data set and do an 'init with data transfer' to avoid corrupting the data. (Note: you can do a repair full request for an additive InfoProvider in case of lost records, or if you can first delete the erroneous records with a selective deletion. But you have to be careful with the selections, lest you inadvertently load other records than required and corrupt the data. See the small sketch after this reply for the difference between the two update modes.)
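    To make the overwrite-versus-additive point concrete, here is a minimal Python sketch (invented data, not SAP code) of what happens when the same records are loaded a second time, as a repair full effectively does:

# Hypothetical illustration: reloading the same record into an 'overwrite' provider
# is harmless, while reloading it into an 'additive' provider doubles the key figure.
def load(target, rows, update_mode):
    for key, amount in rows:
        if update_mode == "overwrite":      # DSO-style: the last value wins
            target[key] = amount
        elif update_mode == "additive":     # cube-style: values are summed up
            target[key] = target.get(key, 0) + amount
    return target

rows = [("DOC_100", 1000), ("DOC_101", 250)]

dso = load(load({}, rows, "overwrite"), rows, "overwrite")   # repair full on top of existing data
cube = load(load({}, rows, "additive"), rows, "additive")

print(dso)   # {'DOC_100': 1000, 'DOC_101': 250}  -> values unchanged
print(cube)  # {'DOC_100': 2000, 'DOC_101': 500}  -> key figures doubled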

  • Cube to cube data transfer and delta status

    Hi Gurus
    I need to do partitioning on a cube.
    I have a cube ZAA_C01 which is updated by two DataSources, ZDS1 and ZDS2 (delta update). I copied ZAA_C01, created a new cube ZAA_C02, and transferred the data to it, but the data shows up as a single request in the new cube. After partitioning I reloaded ZAA_C01, and now I see only a single request in that cube as well.
    Do I lose the delta in this case, or am I missing a step? I did this activity in Development and need to do it in Production.
    Regards
    Shashi

    Hi Shashidhar,
    Partitioning splits a large cube into smaller parts; we do it for performance reasons.
    There are two types of partitioning:
    1. Logical partitioning
    2. Physical partitioning
    Physical partitioning is done at the database level, and logical partitioning is done at the data target level (e.g. at the InfoCube level).
    Cube partitioning by the time characteristics 0CALMONTH or 0FISCPER is physical partitioning, so the cube needs at least one of these time characteristics before you can partition it.
    Logical partitioning means you partition your cube by year or month, i.e. you split the data into different cubes and create a MultiProvider on top of them (see the small illustration after this reply).
    Steps:
    Select your cube and double-click it -> in the Extras menu choose DB Performance -> then choose the Partitioning submenu, and specify how many partitions you want.
    Prerequisites:
    You can only partition a dataset using one of the two partitioning criteria 'calendar month' (0CALMONTH) or 'fiscal year/period' (0FISCPER). At least one of the two InfoObjects must be contained in the InfoProvider.
    http://help.sap.com/saphelp_nw04s/helpdata/en/8c/131e3b9f10b904e10000000a114084/frameset.htm
    Thanks,
    Kiran.
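    Purely as an illustration of the two concepts (not SAP code; rows and names are invented): physical partitioning keeps one cube but stores its fact table in ranges of a time characteristic at the database level, while logical partitioning keeps separate cubes per year and unions them through a MultiProvider-like view:

from collections import defaultdict

# Hypothetical rows of a single cube, keyed by 0CALMONTH-style values.
rows = [("200801", 10), ("200805", 20), ("200901", 30), ("200905", 40)]

# Physical partitioning: still one cube, but the database splits the fact table
# into one partition per range of the time characteristic (here: per year).
physical_partitions = defaultdict(list)
for calmonth, amount in rows:
    physical_partitions[calmonth[:4]].append((calmonth, amount))

# Logical partitioning: one separate cube per year, plus a MultiProvider
# that presents their union to queries.
cube_2008 = [r for r in rows if r[0].startswith("2008")]
cube_2009 = [r for r in rows if r[0].startswith("2009")]
multiprovider = cube_2008 + cube_2009

print(dict(physical_partitions))  # {'2008': [...], '2009': [...]}
print(multiprovider)              # all rows, served from the year-wise cubes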

  • Cube to Cube - Data Question...any help

    The Actuals cube feeds a Planning cube using the Gen Exp DS (export DataSource) functionality.
    In the update rules all possible mappings are made.
    After loading Actuals, the Planning cube has data like:
    Employee / Request ID / Amount
    123 / APOXXX / 1000$
    100 / APOXXX / 1993$
    100 / DTPXXX / 1992$
    102 / APOXXX / 1800$
    Data in the Actuals cube:
    Employee / Request ID / Amount
    100 / DTPXYZ / 1992$
    119 / DTPXYZ / 1900$
    120 / DTPXYZ / 1962$
    After loading, the Planning cube contains Actuals data only for those employees that already have planned costs, like employee 100 in the example above; no actuals were found for employees 119 and 120 in the Planning cube (some actuals data is not going into the Planning cube)!
    There is no selection criterion in the InfoPackage.
    Comments, SDNers?

    Can you please explain your problem in more detail? That will help us think it through.
    Regards,
    Mahendra

  • How to load data from one Infocube to another request by request using DTP

    Hi All,
    I have a scenario where we maintain a backup InfoCube B for InfoCube A. The user loads data from a flat file many times a day, in different requests. We need to maintain the backup of InfoCube A on a weekly basis, i.e. the data is refreshed with a delta update from cube A to cube B. In some situations the user deletes some of the requests in InfoCube A after use. Will those deletions be reflected in backup cube B when the data refresh from cube A to cube B is performed? (This functionality is similar to reconstruction in BW 3.5.) We are running BI 7.0 SP 9.
    Can anyone answer this ASAP?
    Many Thanks,
    Ravi

    You cannot pick individual requests to load: a DTP with "Get Data By Request" loads request by request on a first-in, first-out basis. You can run some pseudo/fake DTPs if you don't want to load the data from a particular request.
    If the user deletes a request from cube A before it is transferred, it won't be loaded into cube B; but if it has already been loaded into cube B and the user deletes the request from cube A afterwards, you have to delete that request from cube B yourself. In order to monitor this request by request, run the DTP with "Get Data By Request" set (see the small sketch after this reply).
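    A tiny Python sketch (illustrative only, not SAP code) of the first-in, first-out behavior described above: each DTP run picks up the oldest request that has not yet been transferred, and a request deleted from the source before transfer simply never arrives in the target:

# Hypothetical model: source requests are transferred strictly oldest-first,
# one request per DTP execution ('Get Data By Request').
source_requests = ["REQ_01", "REQ_02", "REQ_03", "REQ_04"]  # cube A, oldest first
transferred = []                                            # cube B

def run_dtp_get_by_request(source, target):
    """Transfer the oldest not-yet-transferred request, if there is one."""
    pending = [r for r in source if r not in target]
    if pending:
        target.append(pending[0])   # FIFO: always the oldest pending request

source_requests.remove("REQ_02")    # user deletes a request from cube A before transfer

for _ in range(3):                  # three DTP runs
    run_dtp_get_by_request(source_requests, transferred)

print(transferred)                  # ['REQ_01', 'REQ_03', 'REQ_04'] - REQ_02 never arrives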

  • Cube Partition and Date

    Hi,
    I have a cube which has data from 2000 till now. The data from 2000 to 2007 is very small, but for 2008 and up to May 2009 we have a huge amount of data.
    So we decided to partition the cube on Fiscal Year/Period: 12 partitions for 2008, 5 for January till May 2009, 1 for 2007 and below, and 1 for June 2009 onwards.
    Now my question is: how do we specify the date range?
    Can anyone please tell me?
    Regards

    Hi AS,
    I suggest you partition up to October or December 2009, so that you won't have to repartition again right away (this reduces the administrative effort).
    For the partitioning you describe, the settings would be:
    Fiscal year/period: 001/2008 to 005/2009.
    Maximum number of partitions: 19 (12 periods of 2008 + 5 periods of 2009, plus one partition for everything before the range and one for everything after it).
    Check the Features subtopic in this link for further details:
    [Partitioning example|http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm]
    Hope it helps,
    Best regards,
    Sunmit.

  • Data mart cube to cube copy records are not matching in target cube

    Hi Experts,
    I need help with the questions below on a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes: cube A1, sourced from the R/3 system (delta update), and cube B1, sourced from cube A1 (full update). The two cubes are connected through update rules with a one-to-one mapping and no routines. Basis did a copy of the back-end R/3 system from the Production to the Quality server approximately two months ago.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from cube A1) I am not getting the full volume of data; I get only a small fraction, although the load shows a successful status in the monitor.
    We tried giving conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same small volume of data.
    To check whether this happens only for this particular cube, we tried other cubes that are sourced through the Myself system, and they also get only partial data rather than the full data.
    For example: for an employee with 1000 available records, the system extracts only some 200 records, seemingly at random.
    Any quick reply would be very helpful. Thanks.

    Hi Venkat,
    Did you do any selective deletions in cube A1?
    First reconcile the data between cube 1 and cube 2: match the totals of cube 1 against cube 2.
    Thanks,
    Vijay.

  • How to tune performance of a cube with multiple date dimension?

    Hi,
    I have a cube with a measure. For a turn-time report I take the difference between two dates and then the average, maximum and minimum of that date difference. The graph takes a long time to load. I am using Telerik report controls.
    Is there any way to tune the performance of a cube with multiple date dimensions? What are the key rules and best practices for a cube to perform well?
    Thanks,
    Amit

    Hi amit2015,
    According to your description, you want to improve the performance of an SSAS cube with multiple date dimensions, right?
    In Analysis Services there are many ways to improve the performance of a cube. In this scenario I suggest you keep only one date dimension, and include only the columns that are required for your calculation. Please refer to "dimension design" in the link below:
    http://www.mssqltips.com/sqlservertip/2567/ssas--best-practices-and-performance-optimization--part-3-of-4/
    If you have any questions, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • How to add a new field in the cube and load data

    Hi,
    The requirement is:
    We have a ZLOGISTICS cube, and its DataSource has the field REFDCONR (reference document number). We have to create a new field in the cube, load data into it, and bring this new field into the report as well.
    Can anyone help me with the step-by-step process? How do we get the data into BW and into the report?

    Hi,
    So you need this new field to contain data for the old records as well?
    1. If you are on BI 7.0 and the logic or data for the new field lives in the same dimension, you can use remodeling to fill it, i.e. if you want to load it from the master data of another InfoObject in the same dimension.
    2. If condition 1 does not apply to you:
    First add the new field, then create a backup cube (both cubes containing the new field) and do a full update of all the data from the original cube into the backup. The new field will still be empty in both cubes.
    Then create update rules from the backup cube back to the original cube with direct mappings for everything, and add logic in the start routine of the update rules (modifying the data package) that looks up the data in the DSO you normally load from (a small sketch of that lookup follows after this reply).
    To do this, both cubes have to act as DataSources (right-click on the cube -> Additional Functions -> I think it is 'Extract DataSource').
    Hope it helps. Regards, Federico
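    As a rough illustration of the start-routine idea above (Python instead of ABAP, with invented names and a made-up DSO): every record in the data package is enriched with the new field, looked up from a DSO-like table, before it is written to the target cube:

# Hypothetical sketch: fill the new REFDCONR-style field for every record in the
# data package by looking it up from a DSO, keyed here by document number.
dso_lookup = {"DOC_1": "REF_A", "DOC_2": "REF_B"}   # invented DSO contents

data_package = [
    {"doc_number": "DOC_1", "amount": 100, "refdconr": ""},
    {"doc_number": "DOC_2", "amount": 250, "refdconr": ""},
]

def start_routine(package, lookup):
    """Enrich each record with the new field before it reaches the target cube."""
    for record in package:
        record["refdconr"] = lookup.get(record["doc_number"], "")
    return package

print(start_routine(data_package, dso_lookup))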

  • Webservice in BI which takes XML as input and gives PDF as output (Siebel expects an XSD from BI)

    Is it possible to create a web service in BI which takes XML as input and gives PDF as output, with the additional requirement that Siebel expects an XSD from BI so that it can send the data in the format BI requests? Siebel wants to send the data as XML to BI, but we are not sure about BI's capability to provide a WSDL with an embedded XSD (the input is hierarchical).

    Hi All,
    I was able to fulfil the above requirement. Now I am stuck at the point below and need your help.
    Is there any way to UPDATE the XML file attached to a Data Definition (XML Publisher > Data Definition) using a standard package, a procedure call, or maybe an API from the back end? I am creating an XML file dynamically and I want to attach it to its Data Definition programmatically using SQL.
    Please let me know if there is any Oracle functionality to do this.
    If not, please let me know the standard directories on the application/database server where the XML files attached to Data Definitions are stored,
    e.g. /$APPL_TOP/ar/1.0/sql or something similar.
    Regards,
    Swapnil K.
