Loading ODS to Cube Creates two records

I am using one ODS to load data into Cube 1 and Cube 2.
Cube 1 is a delta load and Cube 2 is a full load. The design is that Cube 1 will receive all the daily updates from R/3, while Cube 2 will be loaded with a snapshot of the ODS data at midnight. When the snapshot is loaded into Cube 2, an update rule sets the characteristic InfoObject "Snap-Shot Date" to the current date, so Cube 2 will contain all the nightly snapshots, each with a different date.
The initial load of Cube 1 runs fine and loads 1488 records. When I run Cube 2's full load it adds 2976 records (double). When I look at the Cube 2 data I see one record with a blank snapshot date and one with the current date in it.
To get to the characteristic update rule for "Snap-Shot Date" I have to click on a key figure that has "Update Type = Addition". Is the fact that the key figure is additive causing the creation of two records?
Regards,
Mike...

Yes, that was my problem: I didn't have the update rule applied to all the key figures. When I created the update rule and saved it, I answered "No" to the pop-up "Apply to all key figures".
I just re-edited the update rule and this time I clicked "Yes" to apply it to all key figures. The load is working fine now...
Thanks,
Mike...
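
A note for readers hitting the same symptom: in a BW 3.x update rule, the characteristic routine that stamps the snapshot date only has to assign the system date to RESULT, but the rule must be applied to every key figure (the "Apply to all key figures" prompt Mike mentions). The routine below is a rough sketch of what the generated form looks like; the routine name, the snapshot-date target and the communication-structure typing are placeholders, not taken from this thread.

*---------------------------------------------------------------------*
* Sketch of a BW 3.x characteristic update routine that stamps the
* nightly snapshot date. The FORM signature is normally generated by
* the system; names and typing here are illustrative only.
*---------------------------------------------------------------------*
FORM compute_snapshot_date
  TABLES   monitor        STRUCTURE rsmonitor   " monitor messages
  USING    comm_structure TYPE any              " generated comm. structure
           record_no      LIKE sy-tabix
  CHANGING result         TYPE d                " snapshot date characteristic
           returncode     LIKE sy-subrc
           abort          LIKE sy-subrc.

* Stamp every record of the nightly full load with today's date
  result     = sy-datum.
* If RETURNCODE <> 0 the result is not updated; ABORT <> 0 cancels the package
  returncode = 0.
  abort      = 0.
ENDFORM.

The decisive part of the fix is answering "Yes" to "Apply to all key figures": if only some key figures derive the snapshot date, the cube receives one record with the date filled and one with it blank, which is exactly the doubling described above.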

Similar Messages

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in BW 3.5. I am still in the development phase.
    My loading process is:
    Start - 2 Delta Loads - 1 Full Load - ODS Activation - Delete Index - Further Update - Delete overlapping requests from infocube - Creating Index.
    My question is:
    When I load for the first time I get some data; for the next load I should get zero records, as there is no new data, but instead I get the same number of records again. Maybe it is taking the data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you get all the changed records, but the full load step that follows those two deltas picks up the whole of the data all over again.
    More specifically, you get the same number of records because:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", then the first delta run picks up the whole of the data, and a full load run after that picks up the same number of records again.
    If the two deltas you mention run one after another, then the second one only brings data if something has changed. Since you are loading from a single ODS to a cube, both your delta and your full load will pick up the same data the first time during data marting, because they share the same DataSource (the ODS).
    Hopefully this serves your purpose and helps expedite things.
    Thanks & Regards
    Vaibhave Sharma

  • Inventory Issue: Movement type 905 creating two records in BI.

    Hello Experts,
    We have a problem in the inventory scenario implemented in our system. After completion of the month-end load into the snapshot cube, we are getting a quantity mismatch between BI and ECC values. I found out that records with movement type 905 are causing the problem.
    We have only one record in ECC, while there are two records in BI: one has the reversal indicator set and the other does not. The quantity is the same in both cases. The record with the reversal indicator has process key 100, while the record without the reversal indicator has no process key at all. In the further update from ODS ZMMDS104 to ICSNAP1, the update rule adds or subtracts the quantity on the basis of the process key, as per the standard inventory scenario implementation (see the sketch after this post).
    So am I right that this movement type is creating the problem?
    Has anyone faced this issue before?
    And if this is the issue, how can it be resolved?
    Please advise as soon as possible, as this has become a high-priority issue for the client.
    Thanks in Advance
    Regards,
    Yash Raj.
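
    For context, this is roughly what the process-key-based quantity logic in such an update rule looks like (the sketch referenced above). It is illustrative only: the process-key values, the communication-structure fields and the quantity field are assumptions, not taken from ZMMDS104/ICSNAP1; in a real system the assignments come from the standard inventory content.

        " Sketch of a key-figure routine that posts a quantity as a receipt
        " or an issue depending on the process key. Field names and the
        " concrete process-key values are placeholders.
        CASE comm_structure-processkey.
          WHEN '100' OR '101'.                   " assumed "receipt" keys
            result = comm_structure-quantity.
          WHEN '102' OR '103'.                   " assumed "issue" keys
            result = - comm_structure-quantity.
          WHEN OTHERS.
            " No process key assigned: the result is not updated for this
            " record, which is how a BI record without a process key (as
            " described above) can make BI and ECC quantities diverge.
            returncode = 4.                      " result not updated for this record
        ENDCASE.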

    Hi Aduri,
    Thanks for the quick reply.
    I am new to the inventory area. Can you please guide me on how to check the details of your advice in the system? You said "Recheck the Reversal Indicated Qty because if it has reverse indicator it should come from ECC or from ODS Check the Same at Change Log Table to trace the Changes."
    What do you mean by "or from ODS"? Do you mean that the ODS could create this entry? Also, this ODS ZMMDS104 is loaded daily by delta from 2LIS_03_BF, but the record in question relates to period 12/2008 and we are now in period 1/2009 (I am trying to reconcile the data for the last period), so I guess there is no chance of finding this entry in the change log.
    Also, "Try to see at DS field for the Process Key/Movement type with Reverse Indicator At ECC."
    How do I check that in the DataSource?
    Also, "If you don't have the Reverse postings at ECC without process Key then you can report on the same to the Functional Team/Client. But before that make sure that you have reverse postings without process key assigned to it."
    I checked in the ECC system (RSA3): there is only one record for that material document; it has a positive quantity and movement type 905 with movement indicator L.
    Waiting for your response.
    Thanks once again 
    Regards,
    Yash Raj

  • Issue in loading ODS to CUBE (URGENT)

    Hi All,
    We are facing an issue in loading the AR, AP and G/L ODS to the cube.
    The steps we did are:
    1) Ran a repair full request into the ODS and got around 35 lakh (3.5 million) records.
    2) Ran a full load into the cube and got 35 lakh records.
    3) Ran a delta load into the ODS and got 189 records.
    4) Ran a delta load into the cube and got 1 lakh (100,000) records.
    This is where we couldn't figure out what went wrong.
    1 lakh records to the cube should not be possible, should it?
    Did we go wrong anywhere?
    Could anyone guide us on what has to be done?
    Kind Regards,
    Shanbagavalli.S

    Hi,
    Thanks for the response.
    Delta is already set up for the cube and the ODS.
    No, we didn't do an initialization. We did a repair full into the ODS and a full update to the cube.
    What we have done is:
    We planned to do a selective deletion from the G/L cube for Q2 2007.
    Ran a repair full request into the ODS.
    Ran a full load from ODS to cube (as we had deleted the data from the cube).
    The problem is that after doing this it pulled data for Saturday, Sunday, Monday and Tuesday (the past five days).
    But the problem is with the delta: it is still pulling only the same 189 records that were pulled in yesterday evening's data load.
    Can you tell me what the methods are to load those days' records, as well as the delta records, to the cube?
    Thanks in advance.
    shanba

  • Reporting (ODS to cube data)

    hi friends,
    I have 3 records. I loaded them from a flat file to an ODS, and then loaded the ODS to a cube.
    Now I added 1 record in the flat file. First I sent the delta to the ODS, then on to the cube. So the cube has the first 3 records plus 1 delta record.
    If I go to RRMX, I need to see all the records (3 plus 1 delta, 4 records in total). Can I get all 4 records in reporting or not?
    Thanking u
    suneel.

    I am not really sure I understand your needs. Do you want to aggregate all the records together, or just show them in a report? If it is the latter, you can include the request ID in the structure of your query and then hide it in the properties.
    If you just want to aggregate the records together, BW will do that for you automatically.
    Regards,
    Cyril.

  • ODS to Cube Zero-Record Load

    Hi,
    A daily load from ODS to cube is scheduled, but the source may or may not have records. Every time the source has zero records, the load only gets as far as the ODS and the status of the ODS request remains yellow.
    I manually change it to green and activate the request so that the process continues. Is there a way to automate this so that the ODS is activated even though there are zero records in the load?
    Help would be appreciated

    Hi
    You can do this:
    RSMO > select your request > Settings > Evaluation of Requests (Traffic Lights) >
    "If no data is available in the system, the request..." --> select the radio button for status "Successful".
    Hope it helps.
    Thanks
    Teja

  • How to store data coming from different ODS to a cube as a single record

    Hi All,
    We have the following scenario:
    The same contract values are uploaded from 3 ODS to a cube. In the cube this information is stored as 3 different records.
    Is there any option so that values with the same contract number and the same fiscal period are stored as a single record in the cube?
    Thanks in advance.
    Regards,
    Shradda.

    Hi Shradda,
    On the performance side, consider the points below:
    1. Indexes on the cube (delete indexes before the load and recreate them after the load).
    2. Aggregate design (decisions on base aggregates, roll-up hierarchy, BW statistics, etc.).
    3. Partitioning of the InfoCube (basically the decision on the number of partitions).
    4. Data package size (always try to have larger data packages so that pre-aggregation reduces the number of data records).
    Best is service.sap.com; on the BI page you will find material on performance that will help you.
    To assign points, on the left of the screen you will get radio buttons for assigning points to each person who has responded.
    Regards,
    Vijay

  • Loading ODS - Data record exists in duplicate within loaded data

    BI Experts,
    I am attempting to load an ODS with the Unique Data Records flag turned ON. The flat file I am loading is a crosswalk with four fields; the first 3 fields are being used as key fields in order to make the records unique. I have had this issue before, but gave up in frustration and added an ascending counter field to simply create a unique key. This time I would like to solve the problem if possible.
    The errors come back referring to two data rows that are duplicate:
    Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
    Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
    And below here are the two records that the error message refers to:
    3     338     3902301480     19C*     *     J1JD     
    3     339     3902301510     19C*     *     J1Q5     
    As you can see, the combination of my three key fields should not be creating a duplicate: (3902301480, 19C(asterisk), (asterisk)) and (3902301510, 19C(asterisk), (asterisk)). I wrote (asterisk) instead of the *'s because they turn the text bold!
    Is there something off with the numbering of the data records? Am I looking in the wrong place? I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this - any and all help greatly appreciated!

    Thank you for the response Sabuj....
    I was about to answer your questions, but I wanted to try one more thing, and it actually worked. I simply moved the MOST unique key field to the TOP of my key field list. It was at the bottom before.
    FYI for other people with this issue -
    Apparently the ORDER of your key fields is important when trying to avoid creating duplicate records.
    I am using four data fields, and was using three of them as the key fields. No combination of all three produces a duplicate; however, when BW finds that the first two key fields match, sometimes it apparently doesn't consider the third one, which would make the row unique. By simply changing the order of my key fields I was able to stop getting the duplicate row errors...
    Lesson - if you KNOW that your records are unique and you are STILL getting duplicate errors, try changing the ORDER of your key fields.

  • Can I do Parallel Full loads from ODS to Cube?

    Hi,
    Usually I do one full update load from ODS to cube. To speed up the process, can I do parallel full update loads from ODS to cube?
    Please advise.
    Thanks, Vijay

    Assuming that the only connection we are talking about is between a single ODS and a single cube:
    I think the only time you could speed anything up is in a full drop-and-reload scenario. You could create multiple InfoPackages based on selections and execute them simultaneously.
    If the update is a delta, there is really no way to do it.
    How many records are we talking about? Is there logic in the update rules?

  • Data load error from ODS to Cube?

    Guru's,
    Here is the status of the data load; please advise.
    We are loading data from ODS to cube as a full load. The load runs through a process chain every half an hour; each time, the earlier load is deleted and the next load is taken into the cube. Two days ago the load went wrong. When I tried to see the cause of the failure in the monitor details, I could see "Update missing" and "Processing missing". In the process chain log view I got a message stating that an invalid characteristic value had appeared. But when I look in the PSA, all of the data packets are green. If there were an invalid characteristic value, the PSA should show a red status, but it does not. Please guide me on how I can solve this.
    Points will definitely be awarded for proper answers.
    Thanks in advance.
    vasu.

    Hi Vasuvasu
    Once the records have been edited and the data reloaded, the PSA won't show them in red any more.
    Hope it helps!
    Regards
    KISHORE M REDDY
    **Winners Don't Do Different things, They Do things Differently...!**
    > I don't think this is a disk space issue, because after
    > every full load we are deleting the data.
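
    A side note on the underlying error: "invalid characteristic value" failures usually mean a value contains characters outside the permitted set maintained in transaction RSKC (or lower-case letters). Besides fixing the value in the PSA, a common workaround is to clean the field in a transfer or update routine. The fragment below is only a sketch of that idea; the allowed character set and the blank-out approach are assumptions to adapt to your own rules, and it presumes the target field is a character-type field.

        " Sketch: blank out characters that are not in the permitted set.
        " The allowed set is illustrative; it should mirror what is
        " maintained in RSKC on your system.
        CONSTANTS lc_allowed(60) TYPE c VALUE
          ' 0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ_-./,:()'.

        " BW expects upper-case values for most characteristics
        TRANSLATE result TO UPPER CASE.

        " CN is true while RESULT still contains a character outside
        " LC_ALLOWED; SY-FDPOS holds the offset of the first offender.
        WHILE result CN lc_allowed.
          result+sy-fdpos(1) = ' '.
        ENDWHILE.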

  • Delta Load failure from ODS to Cube

    Experts,
    After a CRM upgrade, I had to reinitialize the delta process. For this, I first reloaded the ODS with an init delta without data transfer and then ran a delta load with the new data created in CRM. This worked well so far. After that, when I tried to load the cube with an init delta without data transfer, there were no issues. But the delta load from ODS to cube doesn't work. Does anybody have any suggestions, please?
    Thanks,
    Nimesh
    The following error is observed in the status of the delta load from ODS to cube.
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data Packets or Info Packets are missing in BW, but there were - as far as can be seen - no processing errors in the source system. It is
    therefore probable that an error arose in the data transfer.
    During the analysis, an attempt was made to read the ALE outbox of the source system, which led to an error.
    It is possible that no connection exists to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection of the source system for errors and check the
    authorizations and profiles of the remote user in both the BW and
    source systems.
    Check the ALE outbox of the source system for IDocs that have not been

    Hi,
    As far as I understood, you have deltas loading successfully to the ODS and you want to update the entire ODS data to the cube, followed by daily deltas.
    If this is the case:
    Make sure that active update rules exist between the ODS and the cube.
    First go to your ODS, right-click and select the option 'Update ODS data in data target'.
    Select init update; it will take you to the init InfoPackage, where you select init with data transfer (init with data transfer will bring all the records from the ODS to the cube).
    Once the init is completed, you can schedule regular delta loads from ODS to cube by following the same steps.
    Hope this helps
    Praveen

  • Error while loading from ODS to CUBE

    Hi guys,
    I am loading data from the source system to an ODS and from the ODS to a cube. The data came successfully from the source system to the ODS, but while loading from the ODS to the cube it shows an error at data packet 13. In the cube, I loaded the data by using “Update ODS data to data target”; at that point it shows two options, FULL UPDATE and INIT UPDATE. I selected FULL UPDATE, and in the “Processing” tab only one option is enabled, i.e. “only data target”. I then ran the load. Now, after getting the error, where can I correct it? There is no PSA option in monitoring.
    Otherwise, how can I change the option in the Processing tab of the InfoPackage to include the PSA? I know that when we load data from one target to another, the only option available in the Processing tab is “only data target”. How can I change that option and how can I correct the error?
    Thanks
    Rajesh

    Hi,
    I solved my problem as follows:
    Go to the monitor of the cube load and select the option "read everything manually" -> it then shows another screen for correcting the records -> correct all the records and load the data again.
    Thanks
    Rajesh

  • Load ODS data into multiple cubes

    I have an ODS that has global data. I need to load the ODS data to multiple cubes based on region and year. The delta is set up between the ODS and the cubes. How do we go about doing this in 3.x and in 7.0, and is there any difference in the procedure between these two versions?
    Regards,
    Ram.

    Hi Ram.
    In BI 7.0, you need to create separate transformations from your DataStore object (DSO) to each of the cubes. For each transformation you then have to create a Data Transfer Process (DTP). In the DTP you can select data based on Region and Year.
    DTP data selection
    1. Open the DTP
    2. Goto the Extraction Tab
    3. There will be a "Filter" button opposite to the Extraction mode.
    4. By using this Filter button you can do data selection on different fields (fixed values, or a routine for dynamic selections; see the sketch after this reply).
    In BW 3.x you need to create update rules instead of transformations. The following link should give you more details
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3f/0e503c3c0d563de10000000a114084/frameset.htm
    Regards, Uday
    Assign points for helpful answers and get one point for yourself.
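
    If the region/year restriction needs to be dynamic (for example, always the current year) rather than fixed values, the DTP filter also lets you attach a routine to a field. The fragment below is a rough sketch of such a selection routine, following the usual SIGN/OPTION/LOW/HIGH range-table pattern that BW generates; the field name 'CALYEAR' and the variable names are assumptions for illustration.

        " Sketch of a DTP filter / selection routine that restricts the
        " load to the current calendar year. 'CALYEAR' is a placeholder
        " for the field you actually filter on.
        DATA l_idx LIKE sy-tabix.

        READ TABLE l_t_range WITH KEY fieldname = 'CALYEAR'.
        l_idx = sy-tabix.

        l_t_range-sign   = 'I'.
        l_t_range-option = 'EQ'.
        l_t_range-low    = sy-datum(4).        " current year, e.g. '2009'

        IF l_idx <> 0.
          MODIFY l_t_range INDEX l_idx.        " adjust the existing selection
        ELSE.
          l_t_range-fieldname = 'CALYEAR'.
          APPEND l_t_range.                    " or add a new selection line
        ENDIF.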

  • Data loading from ODS to CUBE

    Hi All,
    I have loaded data from an ODS to a cube. Now I have a requirement to add some fields to the standard cube, so for testing purposes I created a copy of the original cube and created a transformation. Now when I try to load data from the ODS it tells me no more data is available, while the data is already there in the ODS.
    What should I do now? I don't want to delete data from the original cube. Is there any other way to load the data through the transformation?
    Regards,
    Komik Shah

    Hi,
    Check the DTP of the old cube and see whether it is delta. If yes, then check whether either of the following is checked:
    1) Get delta only once
    2) Get data by request
    If 1 is checked, then the delta won't come for the second cube, as it says to get the delta only once and that delta is already in one of the cubes.
    Generally both should be unchecked, but this can vary as per requirements.
    Now, for your new DTP, I don't think it will allow you to change it to FULL.
    If it does allow you to select FULL, then select it and extract from the active table.
    Try to load and see.
    regds,
    Shashank

  • Error while loading data from write optimized ODS to cube

    Hi All,
    I am loading data from a write-optimized ODS to a cube.
    I have run Generate Export DataSource,
    scheduled the InfoPackage with one selection for a full load,
    and then it gave me the following error in Transfer IDocs & TRFC:
    Info IDOC 1: IDOC with errors added
    Info IDOC 2: IDOC with errors added
    Info IDOC 3: IDOC with errors added
    Info IDOC 4: IDOC with errors added
    Data package 1 arrived in BW; Processing: selected number does not agree with transferred number
    The processing step below that is green
    and shows an update of 4 new records for data package 1.
    Please provide inputs for the resolution
    Thanks & Regards,
    Rashmi.

    Please let me know what more details you need.
    If I press F1 for the error details I get the following message:
    Messages from source system
    see also Processing Steps Request
    These messages are sent by IDoc from the source system. Both the extractor itself as well as the service API can send messages. When errors occur, several messages are usually sent together.
    From the source system, there are several types of messages that can be differentiated by the so-called Info-IDoc-Status. The IDoc with status 2 plays a particular role here; it describes the number of records that have been extracted in a source system and sent to BI. The number of the records received in BI is checked against this information.
    Thanks & Regards,
    Rashmi.
