Issue in loading ODS to CUBE (URGENT)

Hi All,
We are facing an issue in loading the AR, AP and G/L ODS to the Cube.
The steps we performed are:
1) A repair full request was done in the ODS; we got around 35 lakh records.
2) A full load was done to the Cube, and we got 35 lakh records.
3) A delta load was done to the ODS; we got 189 records.
4) A delta load was done to the Cube, and we got 1 lakh records.
This is where we couldn't figure out what went wrong.
1 lakh records to the Cube is not possible, is it?
Did we go wrong anywhere?
Could anyone guide us on what has to be done?
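A quick plausibility check we have been running in SE38 while investigating (a sketch only; ZGLODS and ZGLCUBE are assumed technical names for our ODS and cube, substitute your own): compare the ODS active-table count with the cube fact-table counts, since after a repair full plus a full update the totals should roughly line up.

    * Sketch: count the ODS active table (/BIC/A<ODS>00) and the
    * cube fact tables (/BIC/F<cube> and /BIC/E<cube>).
    * ZGLODS / ZGLCUBE are assumed names - replace with your own.
    REPORT z_count_check.
    DATA: lv_ods_tab TYPE tabname VALUE '/BIC/AZGLODS00',
          lv_f_tab   TYPE tabname VALUE '/BIC/FZGLCUBE',
          lv_e_tab   TYPE tabname VALUE '/BIC/EZGLCUBE',
          lv_ods     TYPE i,
          lv_f       TYPE i,
          lv_e       TYPE i.
    SELECT COUNT( * ) FROM (lv_ods_tab) INTO lv_ods.  " active ODS data
    SELECT COUNT( * ) FROM (lv_f_tab)   INTO lv_f.    " uncompressed requests
    SELECT COUNT( * ) FROM (lv_e_tab)   INTO lv_e.    " compressed requests
    WRITE: / 'ODS active table :', lv_ods,
           / 'Cube F/E tables  :', lv_f, lv_e.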
Kind Regards,
Shanbagavalli.S

Hi,
Thanks for the response.
The delta is already done in both the Cube and the ODS.
No, we didn't do an initialization. We did a repair full in the ODS and a full update to the Cube.
What we have done is:
We planned to do a selective deletion from the G/L cube for Q2 2007.
Did a repair full request to the ODS.
From the ODS to the Cube, a full load (as we had deleted the data from the cube).
After doing this, the full load pulled the data for Saturday, Sunday, Monday and Tuesday (the past four days).
But the problem is with the delta: it is still pulling only the same 189 records that were pulled in yesterday evening's data load.
Can you tell me the methods to load those days' records, as well as the delta records, to the Cube?
Thanks in advance.
shanba
Message was edited by:
        shanbagavalli shivasankaran

Similar Messages

  • Activating ODS to CUBE (urgent)

    Hello Gurus,
    I'm running a process chain which loads data to an ODS, activates the ODS, and loads data from the ODS to a CUBE.
    My problem is that while doing the further processing from the ODS to the CUBE I'm getting the following errors:
    1. Error 8 when starting the extraction program
    2. Error in source system
    3. Error occurred when updating data from the ODS
    Please throw some light on this; it's urgent.
    Regards
    Sandy

    Hi,
    Don't create it manually from SU01; the best way is to go to SPRO --> SAP Business Information Warehouse --> Automated Processes --> Create User for Background Process.
    The system prompts with BWREMOTE. Just give a password and save it; it will automatically pull in the necessary profiles.
    'Error in source system' appeared for me when the extraction job of one of my extractors was cancelled in R/3; the error is then received in BW. You can also receive this error if the connection between the systems is lost, so check the source systems and their connections to rule out a network problem.
    Check the RFC connection once.
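    If you want to test the destination programmatically as well, here is a minimal sketch (the destination name 'BWCLNT100' is an assumption; take the real name from SM59):

        * Sketch: ping an RFC destination, the same check as
        * SM59 -> Connection Test. 'BWCLNT100' is an assumed name.
        REPORT z_rfc_check.
        DATA lv_msg(255) TYPE c.
        CALL FUNCTION 'RFC_PING'
          DESTINATION 'BWCLNT100'
          EXCEPTIONS
            system_failure        = 1 MESSAGE lv_msg
            communication_failure = 2 MESSAGE lv_msg.
        IF sy-subrc <> 0.
          WRITE: / 'Connection check failed:', lv_msg.
        ELSE.
          WRITE: / 'Destination is reachable.'.
        ENDIF.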
    Check whether any short dumps were created (for the details).
    If possible, give some more details of your problem.
    There may be no connection between the source system and the BW system.
    Please check this in the RSA1 source system tree: on the source system, open the context menu and choose Check Connection.
    If the transfer structure of this object is not active, activate it in the BW system by replicating the DataSource.
    Check in RSRV the program that updates data for this ODS, and correct it if you find errors:
    Go to RSRV, select the test for the ODS updating program, give the ODS name and execute. If you get an error, you have the option to correct it right there.
    Hareesh

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through a process chain with 2 delta loads and 1 full load from ODS to Cube in BW 3.5. I am in the development process.
    My loading process is:
    Start - 2 Delta Loads - 1 Full Load - ODS Activation - Delete Index - Further Update - Delete Overlapping Requests from InfoCube - Create Index.
    My question is:
    When I load for the first time I get some data; for the next load I should get zero records, as there is no new data, but I am getting the same number of records again. Maybe it is taking the data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you got all the changed records, but after those two deltas you again have a full load step, which will pick up the whole of the data all over again.
    The reasons you may be getting the same number of records are:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time. While initializing the deltas you might have chosen "Initialization without data transfer", so when the deltas ran for the first time they picked up the whole of the data. Running a full load after that will pick up the same number of records too.
    If the two deltas you are talking about run one after another, then I'd say you got the data because of some changes. Since you are loading from a single ODS to a cube, both your delta and your full load will pick up the same data "for the first time" during data marting, because they have the same DataSource (the ODS).
    Hopefully this will serve your purpose and expedite things.
    Thax & Regards
    Vaibhave Sharma
    Edited by: Vaibhave Sharma on Sep 3, 2008 10:28 PM

  • Loading ODS to Cube Creates two records

    I am using one ODS to load data into Cube 1 and Cube 2.
    Cube 1 is a delta load and Cube 2 is a full load. The design is that Cube 1 will have all the daily updates from R/3, and Cube 2 will be loaded with a snapshot of the ODS data at midnight. When the snapshot is loaded into Cube 2, an update rule changes the characteristic InfoObject "Snap-Shot Date" to the current date. So Cube 2 will contain all the nightly snapshots with different dates.
    The initial load of Cube 1 runs fine and loads 1488 records. When I run Cube 2's full load, it adds 2976 records (double). When I look at the Cube 2 data I see one record with a blank snap-shot date and one with the current date in it.
    I have to click on a key figure that has "Update Type = Addition" to get to the characteristic update rule for "Snap-Shot Date". Is the fact that the key figure is additive causing the creation of two records?
    Regards,
    Mike...

    Yes, that was my problem: I didn't have the update rule applied to all the key figures. When I put in the update rule and saved it, I said "No" to the pop-up "Apply to all key figures".
    I just re-edited the update rule and this time clicked "Yes" to apply it to all key figures. The load is working fine now.
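    For reference, the snap-shot routine itself is tiny. Here is a sketch of the BW 3.x update rule routine body for the snap-shot date characteristic (the InfoObject name is assumed; only the lines between the generated *$*$ begin/end markers are written by hand):

        * Update routine body for the "Snap-Shot Date" characteristic:
        * stamp every record of the nightly full load with today's date.
          RESULT = sy-datum.
        * RETURNCODE <> 0 skips the record; ABORT <> 0 cancels the package.
          RETURNCODE = 0.
          ABORT = 0.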
    Thanks,
    Mike...

  • Issue with loading data into cube with duplicate members in different dimensions

    I have a cube that allows duplicate members only in two dimensions (Period1 and Period2).
    The outline saves with no errors, but when I try to load data I get errors like:
    "Member 2012-12 is a duplicate member in the outline
    A6179398-68BD-7843-E0C2-B5EE81808D0B    01011    cd00    st01    2905110000    EK    fo0000    NNNN-NN-NN    2012-12    cust00000$$$    1"
    These dimensions represent two similar periods of time. Am I supposed to change the member names and aliases slightly in one of them (e.g. yyyy1-mm --> yyyy1-mm1)?
    Users wouldn't like it...
    Period1
      yyyy1
        yyyy1/q1
          yyyy1-mm1
          yyyy1-mm2
          yyyy1-mm3   
        yyyy1/q2
    Period2
      yyyy1
        yyyy1/q1
          yyyy1-mm1
            yyyy1-mm1-dd1 
            yyyy1-mm1-dd2
          yyyy1-mm2
          yyyy1-mm3   
        yyyy1/q2
    Thanks

    You may have to use fully qualified names, something like:
    [Period1].[2012-12]
    [Period2].[2012-12]
    You can refer to the Essbase admin guide chapter "Creating and Working With Duplicate Member Outlines".
    Regards,
    Sunil

  • Essbase Studio Performance Issue: Data load into BSO cube

    Hello,
    Having successfully built my outline by member load through Essbase Studio, I tried to load data into my application, again with Studio. However, I was never able to complete the data load because it takes forever. Each time I tried to work with Studio in streaming mode (hoping to increase the query speed), the load got terminated due to the following error: Socket read timed out.
    In the Studio properties file I typed in oracle.jdbc.ReadTimeout=1000000000, but the result has not changed. Even if it had worked, I am also not sure the streaming mode would provide a much faster alternative to the non-streaming mode.
    What I'd like to know is which Essbase settings I can change (either on the Essbase or the Studio server) in order to speed up my data load. I am loading into a block storage database with 3 dense, 8 sparse and 2 attribute dimensions. I filtered some dimensions and tried to load data to see exactly how long it takes to create a certain number of blocks. With the ODBC setting in Essbase Studio, it took me 2.15 hours to load data into my application where only 153 blocks were created with a block size of 24B. Assuming that in my real application the number of blocks created is going to be at least 1000 times more than this, I need to make some changes to the settings. I am transferring the data from an Oracle database, with 5 tables joined to a fact table (a view) from the same data source. All the cache settings in Essbase are at their defaults. Would changing the cache settings, buffer size or multiple threads help to increase the performance? Or what would you suggest I should do?
    Thank you very much.

    Hello user13695196,
    (sorry, I no longer remember my system number here)
    Before attempting any optimisation in the Essbase (and Studio) environment, you should definitely make sure that your source data query performs well on the Oracle DB.
    I would recommend:
    1. Create in your DB source schema a view from your SQL statement (the one behind your data load rule).
    2. Query against this view with any GUI (SQL Developer, TOAD, etc.) to fetch all rows, and measure the time it takes to complete. Also count the number of rows returned, for your information and for future comparison of results.
    If your query runs longer than you think is acceptable, then:
    a) check the DB statistics,
    b) check and/or consider creating indexes,
    c) if you are unsure, kindly ask your DBA for help. Usually they can help you very fast.
    (Don't be shy; a DBA is a human being like you and me :-) )
    Only when your SQL runs fast enough (for you, or your DBA says it is the best you can achieve) at the database should you move your effort over to Essbase.
    One hint in addition:
    We often had problems when using views for data loads (not only performance, but also other strange behavior). That's the reason I prefer to build directly on (persistent) tables.
    Just keep in mind: if nothing helps, create a table from your view and then query your data from this table for your Essbase data load. Normally, however, this should be your last option.
    Best Regards
    (also to you, Torben :-) )
    Andre
    Edited by: andreml on Mar 17, 2012 4:31 AM

  • Inventory data load from Inventory Cube to another Copy Cube

    Hello Experts,
    I am trying to load inventory data from the inventory cube (say YCINV) to a copy cube, say YCOPY_CINV (a copy of the inventory cube), but the results appear inconsistent when I compare the reports on these two cubes. I am trying to populate a field in the copy cube so that I can load the same data back into the original cube with the new field in it; I am doing this reload back and forth for historical data purposes only.
    I have seen a lot of posts on how to run the setups for inventory data, but my case does not require set-up runs.
    Does note 1426533 solve the issue of loading from one cube to another? We are on SAP BI 7.01 with SP 06, but the note specifies SP 07.
    I have tried note 375098 to see if it works, but in BI I do not see the options mentioned from the step "Using DTP (BW 7.x)" onwards.
    Please advise on whether to go with implementing note 1426533, or whether there is another way to load inventory data from one cube to the other.
    Regards,
    JB

    Hi Luis,
    Thanks for your reply,
    I do not see any setting like "initial stock" in the DTP, only "initial non-cumulative for non-cumulative". I did try that option, but the results still do not match the inventory cube data. I also do not see the checkbox for the marker in the copy cube (under the roll-up tab). Please let me know if we can really implement this solution, i.e. copying from the inventory cube to the copy cube and then reloading it back to the inventory cube for historical data. Currently I am comparing the queries on these two cubes; if the data matches then I can go ahead and implement it in production, otherwise it would not be wise to do so.
    Regards,
    JB

  • Reporting (ODS to cube data)

    hi friends,
    I have 3 records which I loaded from a flat file to an ODS, and then loaded from the ODS to a cube.
    Now I have added 1 record to the flat file. First I sent the delta to the ODS, then on to the cube. So the cube has the first 3 records plus 1 delta record.
    If I go to RRMX, I need to get all the records (3 plus the 1 delta, 4 records in total). Can I get 4 records in reporting or not?
    Thanking you,
    suneel.

    I am not really sure I understand your needs. Do you want to aggregate all the records together, or just show them in a report? If the latter suits you better, you can include the request ID in the structure of your query and then hide it in the properties.
    If you just want to aggregate all the records together, the BW process will do that for you.
    Regards,
    Cyril.

  • ODS to cube data load (very urgent)

    Hello all,
    I have started the init load from ODS to cube (data packet size 1,200,000) and it is taking a hell of a long time; there are 85 data packets. Can someone suggest how I can speed up the data load?
    regards, sanjeev

    hi
    1) Stop the load first. Check the job log and ST22 for any short dumps.
    2) Go to the generated export InfoSource (the one whose name starts with 8).
    3) Create a new InfoPackage with the option "PSA and then data targets".
    4) It will take a little time, but it won't fail.
    5) This will process the load package by package.
    hope it helps
    regards
    AK

  • ODS to Cube Loading

    Hi All,
    I'm loading data from an ODS to a cube and I'm getting the following error:
    Error 18 in the update
    Please throw me some ideas on how to resolve this issue.

    hi jack,
    check the following links for error 18 in the update (the second one covers transaction RSKC, which maintains the list of permitted characters; invalid characters in the load are a frequent cause of update errors):
    URGENT : ERROR 18 while updating the Infocube from ODS....
    Re: What is the importance of transaction RSKC?
    regards,
    raghu.

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
    We have a scenario where the data flow is:
    R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
    The cube has an additional field called "monthly version", and since it is a history cube it is supposed to hold snapshots of all the data in the current cube for each month.
    We are facing the problem that the data for the current month is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests tab I can see only 1 red request, and that one with 0 records.
    However, in the Cube -> Manage -> Reconstruction tab I can see 2 red requests with the current month's date. Could these red requests be the reason for the data mismatch between the ODS and the cube?
    Please guide me on how I can solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
    The load to the cube is a delta and runs daily.
    The load to the ODS is a full load on a daily basis.
    Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
    Thanks
    annie

  • ODS to CUBE loading - taking too much time

    Hi Experts,
    I am loading data from R/3 (4.7) to BW (3.5).
    I am loading with the option "PSA and then Data Target (ODS)".
    I have selection criteria in the InfoPackage for the load from the standard DataSource to the ODS.
    It takes me 20 minutes to load 300K records.
    But from the ODS to the InfoCube (update method: Data Targets Only), it is taking 8 hours.
    The data packet size in the InfoPackage is 20,000 (the same for the ODS and the InfoCube).
    I have also tried changing the data packet size, tried a full load and a load with initialization, and tried scheduling it as a background job.
    I do not have any selection criteria in the InfoPackage from the ODS to the cube.
    Please let me know how I can decrease this loading time from the ODS to the InfoCube.

    Hi,
    To improve the data load performance:
    1. If they are full loads, then see if you can make them delta loads.
    2. Check if there are complex routines/transformations being performed in any layer. In that case, see if you can optimize that code with the help of an ABAPer (a sketch of a typical optimization follows below).
    3. Ensure that you are following the standard procedures in the chain, like deleting indexes/secondary indexes before loading, etc.
    4. Check whether the system processes are free when this load is running.
    5. Try making the load as parallel as possible if it is currently happening serially. Remove the PSA step if it is not needed.
    6. When the load does not get processed due to a huge volume of data, or a large number of records per data packet, try the following:
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data from the PSA to the targets.
    In this way you can overcome this issue.
    Also ensure proper data packet sizing, number range buffering and PSA partition size, and keep the upload sequence, i.e. always load master data first, perform the change run, and only then run the transaction data loads.
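    To illustrate point 2, here is a sketch of a very common routine optimization (BW 3.x start-routine style; MARA and the DATA_PACKAGE field names MATERIAL and MATL_GROUP are placeholders for whatever lookup your routine performs). The idea is to replace a SELECT per record inside the loop with one array fetch and sorted-table reads:

        * Fetch the lookup data once, outside the record loop.
          TYPES: BEGIN OF ty_lookup,
                   matnr TYPE mara-matnr,
                   matkl TYPE mara-matkl,
                 END OF ty_lookup.
          DATA: lt_lookup TYPE SORTED TABLE OF ty_lookup
                          WITH UNIQUE KEY matnr,
                ls_lookup TYPE ty_lookup.

          IF NOT data_package[] IS INITIAL.
            SELECT matnr matkl FROM mara
                   INTO TABLE lt_lookup
                   FOR ALL ENTRIES IN data_package
                   WHERE matnr = data_package-material.
          ENDIF.

        * Then read from memory per record instead of hitting the DB.
          LOOP AT data_package.
            READ TABLE lt_lookup INTO ls_lookup
                 WITH TABLE KEY matnr = data_package-material.
            IF sy-subrc = 0.
              data_package-matl_group = ls_lookup-matkl.
            ENDIF.
            MODIFY data_package.
          ENDLOOP.

    The gain comes from eliminating the per-record database round trips; the same pattern applies in transfer and update routines.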
    Check this doc on BW data load performance optimization:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    BI Performance Tuning
    FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI & Business Objects Roadmap
    Thanks,
    JituK

  • Delta Load failure from ODS to Cube

    Experts,
    After the CRM upgrade, I had to reinitialize the delta process. For this I reloaded the ODS, first with an init delta without data transfer and then with a delta load for the new data created in CRM. That worked fine. After this, when I loaded the cube with an init delta without data transfer, there were no issues either. But the delta load from the ODS to the cube doesn't work. Does anybody have any suggestions, please?
    Thanks,
    Nimesh
    The following error is observed in the status for the delta load from ODS to Cube:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packets or info packets are missing in BW, but there were - as far as can be seen - no processing errors in the source system. It is therefore probable that an error arose in the data transfer.
    With the analysis, an attempt was made to read the ALE outbox of the source system, which led to an error.
    It is possible that no connection exists to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection of the source system for errors, and check the authorizations and profiles of the remote user in both the BW and source systems.
    Check the ALE outbox of the source system for IDocs that have not been updated.

    Hi,
    As far as I understand, you have successful deltas loading to the ODS, and you want to update the entire ODS data to the cube, followed by daily deltas.
    If this is the case:
    Make sure that active update rules exist between the ODS and the cube.
    First go to your ODS, right-click, and select the option "Update ODS data in data target".
    Select Init update; it will take you to the init InfoPackage, where you select init with data transfer (init with data transfer will bring all the records from the ODS to the cube).
    Once the init is completed, you can schedule the deltas to run regularly from the ODS to the cube by following the same steps.
    Hope this helps
    Praveen
    Message was edited by:
            Praveen Vujjini

  • Load from Staging ODS to Cube

    We have a staging ODS which has been loading to a cube via a delta mechanism. By mistake, someone carried out a full load, and we have deleted that full request. Is the delta mechanism corrupt, and how can we collect the requests which were supposed to be updated (there are no selection options in the InfoPackage)? Thanks

    I faced a similar issue recently. I would:
    1) Delete the full load request from the cube.
    2) Uncheck the 'data mart status' of that request in the ODS.
    3) Delete that request from the ODS.
    4) Create a new InfoPackage on the ODS InfoSource (8ODSXXXX) with the 'init without data transfer' option
        - before scheduling the InfoPackage, delete the existing initialization in the scheduler (Scheduler -> Initialization Options for Source System).
    5) Execute the package.
    Now every time a new load comes into the ODS you will be all set.
    Hope this helps,
    Justin

  • Data loading from ODS to CUBE

    Hi All,
    I have loaded data from an ODS to a cube. Now I have a requirement to add some fields to the standard cube, so for testing purposes I created a copy of the original cube and created a transformation. Now when I try to load data from the ODS, it says no more data is available, while the data is clearly there in the ODS.
    What should I do now? I don't want to delete the data from the original cube. Is there any other way to load the data through the transformation?
    Regards,
    Komik Shah

    Hi,
    Check the DTP of the old cube and see whether it is a delta DTP. If yes, check whether either of the following is ticked:
    1) Get delta only once
    2) Get data by request.
    If 1 is ticked, the delta won't come for the second cube, as it says to get the delta only once, and that delta has already gone to one of the cubes.
    Generally both should be unticked, but this can vary as per requirements.
    Now for your new DTP, I don't think it will allow you to change it to FULL.
    If it does allow you to select FULL, select it, choose extraction from the active table, try the load, and see.
    regds,
    Shashank
