Load to a cube

Can I carry out an init load to a cube, then delete the contents of this load, and still have a successful delta mechanism? Thanks

Do an init without data transfer and you can be sure that the delta will work well.
Hope it helps.
Regards
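
To illustrate why this works, here is a minimal sketch in plain Python (not SAP code; the names are illustrative): the init only sets the delta pointer in the source system, so deleting the cube contents afterwards has no effect on what the next delta brings.

source_changes = []          # change log in the source system
delta_pointer = None         # set by the init request

def init_without_data_transfer():
    # Marks the starting point for deltas; transfers no records.
    global delta_pointer
    delta_pointer = len(source_changes)
    return []                # nothing is loaded into the cube

def delta_load():
    # Returns only the changes recorded after the pointer.
    global delta_pointer
    new = source_changes[delta_pointer:]
    delta_pointer = len(source_changes)
    return new

cube = init_without_data_transfer()
source_changes.extend(["doc1", "doc2"])
cube += delta_load()         # picks up doc1 and doc2
cube.clear()                 # deleting the cube contents ...
source_changes.append("doc3")
cube += delta_load()         # ... does not break the delta
print(cube)                  # ['doc3']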

Similar Messages

  • Multiple loads to a cube?

    hello BW Experts,
    Can we do multiple loads to a cube?
    Thanks,
    BWer

    If it is a FULL load, there would definitely be duplication of data from the ODS. You should consider dropping requests in a FULL load scenario, or implementing DELTA.
    In your update rules for the cube, do you have the option Addition or No Update checked?
    If it is No Update, the report might have a parameter that considers only the most current request. This is possible using a REQUEST ID variable like "Most Current Data", which is an SAP Exit. In that case, if the cube is compressed, all the data would report as duplicated.
    It is always desirable that the cube contain non-duplicated data for efficient processing.
    Hope it helps.
    Regards
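
    As a rough illustration of the duplication mechanics, here is a minimal sketch in plain Python (an assumed simplified model, not SAP code): key figures set to Addition pile up across repeated full loads, while an overwrite-style target such as an ODS stays stable.

    from collections import defaultdict

    cube = defaultdict(float)            # additive, like an InfoCube
    ods = {}                             # overwrite, like an ODS object

    full_load = [(("01", "M1"), 10.0), (("01", "M2"), 20.0)]

    for _ in range(2):                   # the same full load run twice
        for key, qty in full_load:
            cube[key] += qty             # Addition: values pile up
            ods[key] = qty               # Overwrite: values stay correct

    print(dict(cube))   # {('01', 'M1'): 20.0, ('01', 'M2'): 40.0} -- duplicated
    print(ods)          # {('01', 'M1'): 10.0, ('01', 'M2'): 20.0} -- stable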

  • Loading data from Cube to Planning area

    Hi,
    If I am loading data from a cube to a planning area using transaction TSCUBE,
    does the system load data into the planning area only for the combinations that exist in the cube, or does it load data for all CVCs?
    For example,
    I have my CVC as Plant, Material, Customer
    If there are 4 CVCs in the POS that were previously generated as
    Plant--Material--Customer
    01--M1--C1
    01--M2--C3
    01--M2--C2
    01--M4--C5
    If the cube has data like this:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    (it does not have the last combination). If I then use transaction TSCUBE to load data into the planning area from this cube,
    is the data loaded as
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    i.e. only for the 3 combinations that exist in the cube, with nothing loaded for the last one,
    OR
    is the data loaded as
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M2--C3--20
    01--M2--C2--5
    01--M4--C5--0
    i.e. load all 4 combinations and send 0 where the cube does not have the combination?
    Hope I am clear on this question.
    Thanks.
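
    For what it is worth, the two semantics being asked about can be stated as a small sketch in plain Python (illustrative only, not the actual TSCUBE logic):

    cvcs = [("01", "M1", "C1"), ("01", "M2", "C3"),
            ("01", "M2", "C2"), ("01", "M4", "C5")]
    cube = {("01", "M1", "C1"): 10, ("01", "M2", "C3"): 20,
            ("01", "M2", "C2"): 5}

    # Variant A: load only the combinations present in the cube
    planning_area_a = {cvc: cube[cvc] for cvc in cvcs if cvc in cube}

    # Variant B: load every CVC, sending 0 where the cube has no data
    planning_area_b = {cvc: cube.get(cvc, 0) for cvc in cvcs}

    print(len(planning_area_a))   # 3 combinations
    print(len(planning_area_b))   # 4 combinations, ('01', 'M4', 'C5') -> 0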

    Thanks a lot Vinod, Srinivas and Harish. The reason I am asking is that we have a scenario where this situation arises.
    We initially get data from R/3 to BW to APO like this:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    Later, when the customer changes or is bought out by somebody, C1 is changed to C2. Sometimes, when the business does not initially know who the customer is, they just put in C1 as a dummy and replace it with C2 after some time. The new incoming record is then as follows:
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    BW can identify changes in transaction data but not in master data. What I mean is: when the Qty. changes from 10 to 20, the system can identify that in deltas.
    If the customer (master data) changes from C1 to C2, the system thinks it is a new record altogether, so if I use delta loads I get the following:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    01--M1--C2--10
    If I am looking at Plant and Material Level, my data is doubled.
    So we are planning to do a full load that works like this:
    1. Initial data like this:
    Plant--Material--Customer--Qty.
    01--M1--C1--10
    The CVC is created and the planning area has Qty. 10.
    2. Then we delete the contents of the cube and do a full load into the cube with the changed customer:
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    This time a new CVC is created, and another 10 is loaded into the planning area.
    If the system loads all CVCs, it would send
    Plant--Material--Customer--Qty.
    01--M1--C1--0
    01--M1--C2--10
    If the system loads only the combinations in the cube, it loads
    Plant--Material--Customer--Qty.
    01--M1--C2--10
    But the planning area already has another 10 for customer C1, duplicating the values.
    We are in trouble in the second case.
    We had to go for this solution instead of realignment, as our business has no way of knowing that C1 was replaced by C2.
    Hope I am clear.
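
    A minimal sketch in plain Python of the doubling effect described above (an assumed simplified model): the delta is keyed on the full characteristic combination, so C1 -> C2 looks like a brand-new record and the old C1 value is never reversed.

    from collections import defaultdict

    planning_data = {}

    def delta_update(plant, material, customer, qty):
        # keyed on the full characteristic combination
        planning_data[(plant, material, customer)] = qty

    delta_update("01", "M1", "C1", 10)
    # the customer master data changes: C1 becomes C2
    delta_update("01", "M1", "C2", 10)

    by_plant_material = defaultdict(int)
    for (plant, material, _cust), qty in planning_data.items():
        by_plant_material[(plant, material)] += qty

    print(dict(by_plant_material))   # {('01', 'M1'): 20} -- doubled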

  • Getting error while scheduling load from one cube to another

    Hi,
    I have created one cube, which is loaded from two data sources, and the load into the cube was successful. I have created a copy cube and scheduled the load into the copy cube from the base cube, where I am getting error messages saying "New work area started", "Terminated the system", "Core_dump" and "Rolled-out session terminated", plus many short dumps.
    I have even changed the data packet size from 20000 to 5000 and increased the dialog processes from 2 to 5, but I am still getting the same kind of messages.
    Can someone please help me out with this issue?

    Hi,
    Which version are you on?
    If you are on 7.0, have you checked the DTP?
    Just log off from your server, log on again, and try the load once more.
    Let us know the status.
    Reg
    Pra

  • Getting an error while trying to load data into cube

    Hello,
    I am trying to load data into a cube but it gives me the following error:
    fiscal year variant not expected.
    Can someone tell me what could be done about the fiscal year variant?
    Cheers
    Jim.

    Hi,
    Maybe you have not mapped anything to the fiscal year variant, or you have not included the fiscal year variant in the cube but have used the fiscal period in it.
    You must load the fiscal year variant if you want to use the fiscal period.
    Map the fiscal year variant to the constant K4, or from the source if present, and then check the result.
    Generally it is K4, except for FI-CO cubes.
    In the transfer rules, select the field '0FISCVARNT' and click 'Tp'.
    Enter K4 as the constant, then save and activate.
    I am sure this will fix the issue.
    Regards,
    Ray
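
    The mapping logic the fix implements could be sketched like this in plain Python (illustrative only, not a real transfer rule; the source field name is an assumption): if the source supplies no fiscal year variant, derive it as the constant 'K4' so that the fiscal period can be interpreted.

    def map_record(source_record, constant_variant="K4"):
        target = dict(source_record)
        # 0FISCVARNT: from the source if present, otherwise the constant
        target["0FISCVARNT"] = source_record.get("FISCVARNT") or constant_variant
        return target

    print(map_record({"FISCPER": "2011001"}))
    # {'FISCPER': '2011001', '0FISCVARNT': 'K4'}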

  • Getting an error in PSA while trying to load data into a cube

    Dear Friends,
    I am trying to load data into a cube and I am getting an error in the PSA, i.e.:
         Value 'Boroplus Anticream 19Gm Regular Plain ' (hex. '426F726F706C757320416E7469637265616D203139476D2052') of characteristic .
    please guide me.
    Thanks in Advance.
    Best Regards
    Rafeeq

    Value 'Boroplus Anticream 19Gm Regular Plain ' (hex. '426F726F706C757320416E7469637265616D203139476D2052') of characteristic .
    Your data is in lowercase. Either change the data to uppercase in the source,
    or delete the request in the cube and edit the PSA data from lowercase to uppercase, if there are only a few records,
    or check the 'Lowercase letters' setting on the InfoObject.
    Regards
    Manga
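
    The PSA clean-up option amounts to something like this sketch in plain Python (illustrative; the field name is an assumption): force the offending characteristic value to uppercase before it reaches the InfoObject.

    def clean_record(record, char_field="MATERIAL_DESC"):
        record = dict(record)
        record[char_field] = record[char_field].strip().upper()
        return record

    bad = {"MATERIAL_DESC": "Boroplus Anticream 19Gm Regular Plain "}
    print(clean_record(bad))
    # {'MATERIAL_DESC': 'BOROPLUS ANTICREAM 19GM REGULAR PLAIN'}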

  • I need the format for Excel file data to load into an InfoCube and on to a planning area

    Hi gurus,
    I need the format for Excel file data to load into an InfoCube and on to a planning area.
    Can you tell me what I should maintain in the header?
    What I have so far is like this:
    plant,location,customer,product,history qty,calendar
    100,delhi,suresh,nokia,250,2011211
    Can you explain whether this is right or wrong, and describe the Excel file format?
    babu

    Hi Babu,
    The file format should match what you want to upload: the column sequence of the file must be the same as the communication structure.
    Like this:
    first the columns with characteristics (e.g. plant, location, customer, product),
    then the date column (check the date format) (e.g. calendar),
    and last the columns with key figures (e.g. history qty).
    Hope this helps.
    Regards,
    Nawanit
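
    A minimal sample of what such a file could look like, assuming illustrative column names and a YYYYMMDD date format (match both to your actual communication structure):

    PLANT,LOCATION,CUSTOMER,PRODUCT,CALDAY,HISTORY_QTY
    100,DELHI,SURESH,NOKIA,20110211,250
    100,DELHI,SURESH,NOKIA,20110212,270

    Characteristics come first, the date column next, and the key figure columns last, exactly in the order of the communication structure.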

  • Inventory data load from Inventory Cube to another Copy Cube

    Hello Experts,
    I am trying to load inventory data from the inventory cube (say YCINV) to a copy cube, say YCOPY_CINV (a copy of the inventory cube), but the results appear inconsistent when I compare the reports on these 2 cubes. I am trying to populate a field in the copy cube so that I can populate the same data back to the original cube with the new field in it; I am doing this reload back and forth for historical data purposes only.
    I have seen a lot of posts on how to run the setups for inventory data, but my case does not require setup runs.
    Does note 1426533 solve the issue of loading from one cube to another? We are on SAP BI 7.01 with SP 06, but the note specifies SP 07.
    I have tried note 375098 to see if it works, but I do not see the options mentioned from the step "Using DTP (BW 7.x)" in BI to apply this note.
    Please advise whether to implement note 1426533, or whether there is another way to load inventory data from one cube to the other.
    Regards,
    JB

    Hi Luis,
    Thanks for your reply,
    I do not see any setting like initial stock in the DTP except "initial non-cumulative for non-cumulative". I did try the option "initial non-cumulative for non-cumulative", but the results still do not match the inventory cube data. I also do not see the check box for the marker in the copy cube (under the roll-up tab). Please let me know whether we can really implement this solution, i.e. copying from the inventory cube to the copy cube and then re-loading it back to the inventory cube for historical data. Currently, I am comparing the queries on these 2 cubes; if the data matches I can go ahead and implement it in production, otherwise it would not be wise to do so.
    Regards,
    JB

  • Reg: error while loading data into cube

    Hi,
    When I am loading data into the cube I am getting the following error:
    Record 1: fiscal year variant not expected.
    Error 4 in the update.

    Recently, I experienced the following error:
    fiscal year variant K4 not expected.
    The following process fixed my issue:
    In the transfer rules, select the field '0FISCVARNT' and click 'Tp'.
    Enter K4 as the constant, then save and activate.
    Also check the following:
    In R/3 (SE16), make sure K4 exists in table T009.
    In BW, select the source system --> Transfer Global Settings.

  • Frequently occurring errors in loading data to a cube

    Hi SAP gurus,
    Can you give me the frequently occurring errors in loading data to a cube?
    Giri

    Hi Giri,
    There are thousands of errors that can occur in any kind of environment. Some of them are listed here:
    1. SID missing
    2. No alpha-conforming values found
    3. Replicate DataSource error
    4. BCD_OVERFLOW error
    5. Update rule inactive
    6. Duplicate master data found error
    7. RFC connection lost
    8. Invalid characters while loading
    9. ALEREMOTE user is locked
    10. Lowercase letters not allowed
    11. While loading the data, a message like 'Record ...': the field mentioned in the error message is not mapped to any InfoObject in the transfer rules
    12. Object locked
    13. "Non-updated IDocs found in source system"
    14. While loading master data, one of the data packages gets a red-light error message
    15. Extraction job aborted in R/3
    16. Repeat of last delta not possible
    17. DataSource not replicated
    18. DataSource/transfer structure not active
    19. IDoc or tRFC error
    Please find these links to help you:
    production support
    Re: BW   production support
    https://forums.sdn.sap.com/click.jspa?searchID=1844533&messageID=1842076
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
    /people/valery.silaev/blog/2006/10/09/loading-please-wait
    Help on "Remedy Tickets resolution"
    Re: What is caller 01?
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    https://forums.sdn.sap.com/click.jspa?searchID=678788&messageID=1842076
    Production Support
    Production support issues
    check Siggi's weblog for common errors in data loading
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Regards,
    Sreedhar

  • To load a non-cumulative cube, initialize the opening balance in R/3. Explain?

    To load a non-cumulative cube I have to initialize the opening balance in R/3 (S278). Please elaborate: what does S278 mean, and is it an LIS DataSource?
    Thank you,
    York

    Dear all,
    I believe the DataSource 0EC_PCA_1 does not consider the opening balances. Can any experts help me solve this problem?
    Regards,
    M.M

  • How to delete a previously loaded request from a cube

    Hi All,
    I am loading a standard cube from R/3 without PSA through a DTP.
    1) How can data be deleted automatically from the cube without using process chains? (Previously we had a facility in the InfoPackage to delete old loaded data.) In BI 7, how can the same be done automatically?
    2) How can the DTP be scheduled to run on a daily basis (without a process chain)? I am not able to schedule it via RSRDA.
    What might be the reason?
    Regards
      KK

    Chivukula,
    I'm curious as to why you aren't using process chains. There's now a "delete overlapping request" process type within RSPC.
    You are right though, the DTP does not offer a facility to delete the old loaded data; this feature is not in BI 7.0.
    As for scheduling a DTP to run daily, there's no way I know of to do it without RSPC. Yes, I understand that RSPC can get very messy, but just create a separate holding area, say DAILY DTPS, and put your DTPs in there.
    Cheers,
    Pom
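
    Conceptually, the "delete overlapping request" process type does something like this sketch in plain Python (an assumed simplified model, not the RSPC implementation): after a new full load, older requests in the cube covering the same selection are dropped, so repeated full loads never stack up.

    requests = [
        {"id": 1, "selection": ("2011", "PLANT01")},
        {"id": 2, "selection": ("2011", "PLANT02")},
    ]

    def load_full_request(new_request):
        # keep only requests whose selection does not overlap the new one
        requests[:] = [r for r in requests
                       if r["selection"] != new_request["selection"]]
        requests.append(new_request)

    load_full_request({"id": 3, "selection": ("2011", "PLANT01")})
    print([r["id"] for r in requests])   # [2, 3] -- request 1 was dropped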

  • Loading time to cube

    hello all,
    I am facing a unique problem.
    When loading to the ODS the load completes successfully, but activating the load fails; the monitor still shows yellow and I cannot make out where the error is occurring. Also, when I load to the cube from the ODS it seems to get stuck when inserting into /BI0/SMATERIAL, and I am not sure what needs to be done. I ran RSRV on the InfoCube and checked the SIDs.
    Any thoughts are greatly appreciated
    thanks
    amit

    Hi all,
    I checked the status tab of the monitor and it does not show any errors, but the load keeps failing when I try to activate the data. I went to transaction ST22 and saw a runtime error:
    DBIF_RSQL_INVALID_RSQL.
    I also ran RSRV for 0MATERIAL and it reports the following error: 1 value from the S table does not exist in the P table. There is no option to correct the error in RSRV.
    Any help is greatly appreciated.
    thanks
    amit

  • Loading ODS to Cube Creates two records

    I am using one ODS to load data into Cube 1 and Cube 2.
    Cube 1 is a delta load and Cube 2 is a full load. The design is that Cube 1 will have all the daily updates from R/3 and Cube 2 will be loaded with a snapshot of the ODS data at midnight. When the snapshot is loaded into Cube 2, an update rule changes the characteristic InfoObject "Snap-Shot date" to the current date. So Cube 2 will contain all the nightly snapshots with different dates.
    The initial load of Cube 1 runs fine and it loads 1488 records. When I run Cube 2's full load it adds 2976 records (double). When I look at the Cube 2 data I see one record with a Snap-shot date that is blank and one with the current date in it.
    I have to click on a key figure that has "Update Type = Addition" to get to the characteristic update rule for "Snap-shot date". Is the fact that the key figure is additive causing the creation of two records?
    Regards,
    Mike...

    Yes, that was my problem: I didn't have the update rule applied to all the key figures. When I put in the update rule and saved it, I said "No" to the pop-up "Apply to all key figures".
    I just re-edited the update rule and this time clicked "Yes" to apply it to all key figures. The load is working fine now...
    Thanks,
    Mike...
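
    The splitting effect can be sketched in plain Python (an assumed simplified model of update rules, not BW code): key figures whose rule derives the snap-shot date land on one record, key figures without the rule land on a record where the date is blank, so every source record becomes two cube records.

    from collections import defaultdict

    SNAPSHOT_DATE = "20240101"                # hypothetical current date
    rule_applied = {"QTY1": SNAPSHOT_DATE,    # rule applied to this key figure
                    "QTY2": ""}               # rule NOT applied -> blank date

    source = {"material": "M1", "QTY1": 5.0, "QTY2": 7.0}

    cube = defaultdict(lambda: defaultdict(float))
    for key_figure, snap_date in rule_applied.items():
        key = (source["material"], snap_date)  # differing dates -> two keys
        cube[key][key_figure] += source[key_figure]

    print(len(cube), "records")                # 2 records instead of 1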

  • Performance Tuning Data Load for ASO cube

    Hi,
    Can anyone help with how to fine-tune a data load on an ASO cube?
    We have an ASO cube that loads around 110 million records from a total of 20 data files.
    18 of the data files have 4 million records each and the last two have around 18 million records each.
    On average, loading 4 million records took 130 seconds.
    Each data file has 157 data columns representing the period dimension.
    With a BSO cube, sorting the data file normally helps, but with ASO it does not seem to have any impact. Any suggestions on how to improve data load performance for the ASO cube?
    Thanks,
    Lian

    Yes TimG, it sure looks identical - except for the last BSO reference.
    Well, never mind, as long as those that count remember where the words come from.
    To the original poster and to 960127 (come on, create a profile already, will you?):
    The sort order WILL matter IF you are using a compression dimension. In this case the compression dimension acts just like a BSO dense dimension: if you load part of it in one record, then when the next record comes along it has to be added to the already existing part. The ASO "load buffer" is really a file named <dbname.dat> that is built in your temp tablespace.
    The most recent x records that fit in the ASO cache are retained in the cache, so if a record is still there it will not have to be reread from the disk drive. So you could (instead of sorting) create an ASO cache as large as your final .dat file; then the record would still be available.
    BUT WAIT BEFORE YOU GO RAISING YOUR ASO CACHE. All operating systems use memory-mapped I/O, so even if a page is not in the cache it will likely still be in "Standby" memory (the dark blue memory as seen in Resource Monitor); this continues until the system runs out of "Free" memory (light blue in Resource Monitor).
    So in conclusion: if your system still has Free memory, there is no need (in a data load) to increase your ASO cache. And if you are out of Free memory, then all you will do by increasing the ASO cache during a data load is slow down the other applications running on your system - so don't do it.
    Also, if you have enough memory that the entire data file fits in Standby + Free memory, then don't bother to sort it first; if you do not, then sort it.
    Of course, you have 20 data files, so I hope that you do not have compression members spread out amongst these files!
    Finally, you did not say whether you are using parallel load threads. If you need to have 20 files, read up on parallel load buffers and parallel load scripts; that will make it faster.
    But if you do not really need 20 files and just broke them up to load in parallel, then create one single file and raise your DLTHREADSPREPARE and DLTHREADSWRITE settings. These will help even if you do go parallel, and really help if you don't but still keep 20 separate files.
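
    As a rough illustration of the compression-dimension point, here is a small sketch in plain Python (a conceptual model only, not Essbase internals): it counts how often a load would revisit an already-written compression-dimension member for unsorted versus sorted input.

    import random

    def count_revisits(records):
        # a revisit: hitting a member that was already written and then left
        seen_and_left = set()
        current = None
        revisits = 0
        for member, _value in records:
            if member != current:
                if current is not None:
                    seen_and_left.add(current)
                if member in seen_and_left:
                    revisits += 1
                current = member
        return revisits

    data = [(random.randrange(1000), 1.0) for _ in range(100_000)]
    print("unsorted:", count_revisits(data))          # many revisits
    print("sorted:  ", count_revisits(sorted(data)))  # zero revisits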
