Record Check / Aggregation when loading to infocube

When pushing daily data to an InfoCube, summarized by week, I am getting inconsistent results in the cube.
For example, if in one week I have 5 separate records in the ODS with the same plant, variety, and week, but each on a different day, those 5 records should roll up to one record in the InfoCube by week.
In my record count, however, I notice the system generates packets of varying sizes during the update to the cube. My question is: if ODS records with the same keys are spread across the packets, could this result in an inaccurate update to the cube?
In the record check screen, the Converted --> Updated columns suggest to me that unless similar records exist in the same packet, there is a chance they will not be rolled up properly.
Any thoughts?
CB

I agree that compression will yield the correct results, but this does not seem to address the root concern, which is that data are not fully rolled up during the load.
I would not expect the individual packets to have an impact on the overall cube roll-up, but in our testing it appears this is the case.
Do you know whether the roll-up of data in a cube with similar characteristic values should be affected by the breakdown of data into packets?
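For what it's worth, the weekly roll-up is a purely additive aggregation, so the final totals should be independent of how the same records are split into packets once the cube is compressed. A small Python sketch (with made-up plant/variety/week records, not real ODS data) shows the grouped sums are identical however the records are partitioned:

```python
from collections import defaultdict

# Hypothetical daily ODS records: (plant, variety, week, quantity)
records = [
    ("P1", "V1", "2006.34", 10),
    ("P1", "V1", "2006.34", 5),
    ("P1", "V1", "2006.34", 7),
    ("P2", "V1", "2006.34", 3),
    ("P1", "V1", "2006.34", 2),
]

def roll_up(packets):
    """Aggregate by (plant, variety, week), adding packet by packet."""
    cube = defaultdict(int)
    for packet in packets:
        for plant, variety, week, qty in packet:
            cube[(plant, variety, week)] += qty
    return dict(cube)

# Same records, two different packet breakdowns
one_packet = roll_up([records])
split_packets = roll_up([records[:2], records[2:4], records[4:]])
assert one_packet == split_packets  # packet boundaries don't change the totals
```

Because addition is associative and commutative, any packet breakdown yields the same compressed totals; what differs per packet is only how many intermediate (uncompressed) rows exist before compression.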

Similar Messages

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this datasource are the same as the settings of 0COSTCENTER_ATTR. The problem is that when loading to BW it seems that validity (DATEFROM and DATETO) is not taken into account. If there is a cost center with several entries having different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR to have one datasource instead of two is not an option.
    I know that you can set ignore duplicates in the infopackage, but that is not a nice solution. 0COSTCENTER_ATTR can run without this!
    Is there a trick you know to tell the system that the date fields are also part of the key?
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
    Siggi - I don't have the error message described in the note.
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In PSA the records are marked red with the same message (MSG no 191).
    As you see the key does not contain the date when the record is valid. How do I add it? How is it working for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 or on the BW side?
    Thanks
    Peter

  • OBJECTS_NOT_CHARLIKE when loading to infocube from DTP

    Hello Guru's,
    I am trying to load a simple Excel spreadsheet into our BW system.
    I created the DataSource, converted the Excel file into the matching format and saved it as CSV, created a conversion to be able to load from the DataSource, then created the InfoPackage and finally created the DTP to load (it's a full load, not a delta package).
    When I start the InfoPackage it loads the Excel file to the PSA perfectly, but when I start the DTP I get an error that says
    OBJECTS_NOT_CHARLIKE
    I checked that all the characteristics and key figures have matches in the cube; I also used no routines in the conversion.
    I am waiting for your answers, thanks in advance for helping,
    Davut

    Here are the fields in the DataSource:
    No.  Field            Description           InfoObject   Type  Length  Dec.  Output
    1    /BIC/ASEM_POIT                         ASEM_POIT    CHAR  16      0     16
    2    /BIC/ACOMCODE                          ACOMCODE     CHAR  4       0     4
    3    /BIC/ABOLGE                            ABOLGE       DEC   3       0     2
    4    /BIC/AARMATOR                          AARMATOR     DEC   5       0     4
    5    FISCPER3         Posting period        0FISCPER3    NUMC  3       0     3
    6    FISCYEAR         Fiscal year           0FISCYEAR    NUMC  4       0     4
    7    /BIC/LHGCH013                          LHGCH013     CHAR  2       0     2
    8    /BIC/ASMIKTAR                          ASMIKTAR     QUAN  17      0     17
    9    VERSION          Version               0VERSION     CHAR  3       0     3
    10   UNIT             Unit of measure       0UNIT        UNIT  3       0     3
    11   FISCVARNT        Fiscal Year Variant   0FISCVARNT   CHAR  2       0
    And this is a sample line from our Excel file:
    AC01605018      AG03     8     4     1     2010     20     1406     4   ADT     K4
    I hope it is clearly visible.
    Thanks for cooperating again.

  • Check email when load up mail

    How do I set it so that when I load up Mail, it automatically checks my email? I seem to have to check for new email manually when I load up Mail. I mean, I can have it auto-check every 5 minutes, etc., but how about when I initially load the program?

    When you set Mail to auto-check every so often, Mail will automatically check for new mail when it starts up, provided it has not been launched since it last checked and that interval has passed.

  • Error /SAPAPO/TSM 041 when loading from InfoCube to SNP planning area

    I am using APO V5.1.
    I have a 'backup' InfoCube with characteristics 9ARNAME and 9AVERSION which I'm loading to an SNP planning area via trans /SAPAPO/TSCUBE. The InfoCube is itself populated from the 9ARE aggregate of the SNP planning area.
    But I get error /SAPAPO/TSM 041 when I run the load, suggesting the absence of a required characteristic or navigation attribute in the cube. Do I need 9ALOCNO, for instance?
    Can anyone advise me here?...

    Hi,
    I am not very sure about the 9ARE aggregate (haven't used it in backups), but RTSCUBE is used to copy time Series (TS) KF data from cube to planning area (SNP or DP).
    Are you trying to restore some time series data from your backup cube to the planning area? If yes, then do a mapping of characteristic from cube to planning area in RTSCUBE, and also map the TS KF between cube and planning area.
    If your KF is not a time series KF, then you can't copy it from cube to planning area. You could get data to cube for some reporting, otherwise I am not sure what use the backup is for you. For SNP, most of the data would be received from R/3, so there's not much point in having a backup.
    Hope this helps.
    Thanks - Pawan

  • Load from Infocube to an ODS.

    Hi All,
    I need some information on the logic when you are loading from an InfoCube to an ODS.
    I have 2 requests with 100,000 records each in an InfoCube. All the keys (dimensions/characteristics) in the InfoCube match. The key figures in the two requests are different. When I execute a full load to the ODS, I see only 100,000 records loaded into
    the ODS, and the key figures have been aggregated.
    I verified the update rules for the key figure, and the "Update Type" is Overwrite.
    Please help me understand if this is the way it's supposed to work when loading from an InfoCube to an ODS.
    thanks in advance

    Hi Sandesh,
    As you're saying, this is how it works: "records are aggregated when data is extracted from the cube, even before the DSO update rules come into play".
    So in this case, I will never be able to overwrite the key figures even though I have the update type set to "Overwrite".
    Example - I have 2 requests in the cube:
                 Mat        Plant       KF1
    Req1     1000        07           10
    Req2     1000        07             5
    I loaded this into the ODS.
    Then I have another request in the cube:
    Req3      1000      07        1
    Later, when I load this Req3 into the ODS, how would it show up? If it's aggregated in the cube before it's loaded into the ODS, then it will never use the overwrite functionality I have selected in the update rules.
    Please help me understand.
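    The behavior described above can be mimicked in a few lines (a hypothetical Python sketch, not actual BW code; the dictionaries stand in for cube requests keyed by Material/Plant): requests are summed at extraction from the cube, and only then does the DSO's Overwrite update apply, so the Overwrite never sees the individual request values.

```python
from collections import defaultdict

def extract_from_cube(requests):
    """Cube extraction aggregates all extracted requests by characteristic key."""
    totals = defaultdict(int)
    for request in requests:
        for (mat, plant), kf1 in request.items():
            totals[(mat, plant)] += kf1
    return dict(totals)

def load_to_dso(dso, extracted):
    """DSO update with the key figure's update type set to Overwrite."""
    dso.update(extracted)  # last loaded value wins per key

dso = {}
# First load: Req1 and Req2 are summed during extraction (10 + 5 = 15)
load_to_dso(dso, extract_from_cube([{("1000", "07"): 10}, {("1000", "07"): 5}]))
# Later delta-style load: only Req3 is extracted, so Overwrite replaces 15 with 1
load_to_dso(dso, extract_from_cube([{("1000", "07"): 1}]))
```

    So Overwrite does apply on every load, but only to the already-aggregated extraction result, never to the individual request values inside one extraction.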

  • Number of records in a full load doesn't square with the number in an init with data transfer

    Hello to all BI/FI SDNers,
    FI AP and AR extractors like 0FI_AR_3 & 0FI_AP_3 pull 1785 records to the PSA when loaded full, but an init with data transfer for the same selection pulls only 1740 records to the PSA.
    I am very skeptical that even a delta after a repair full will bring all the records.
    What update methodologies are qualified for AP & AR?
    The OSS notes I found really don't answer my concern here... please comment, SDNers!
    Message was edited by:
            Jr Roberto

    Somehow it worked after redoing it!

  • "****Error from PSA****" error message when loading InfoCube

    Hi all,
    I am getting the following error message when loading transaction data to an InfoCube.
    ***Error from PSA***
    This is not MD or Hierarchies (like previous messages in the forum)
    Please advise.
    Thanks,
    Miki

    Hi Miki,
    Before loading data into the InfoCube, check the PSA table and look at the data in it.
    Then you may find the solution.
    But as Ronald said, this should not be the complete error statement.
    bye.

  • How to only update existing records when loading master data ?

    Hello experts, I need your lights one more time.
    Here is my need :
    I have created an infoobject (IO) which is a very simple version of 0material, let's call it Znewmat --> Znewmat has material type and trademark as attributes, those two fields are available in 2 different datasources :
    - 0MATERIAL_ATTR for material type (field MTART)
    - 0MAT_SALES_ATTR for trademark (field MVGR2)
    When loading my new IO from 0MATERIAL_ATTR I use a filter (at DTP level) to get only a few material types (I get something like 1000 records),
    here is my issue: when I load from 0MAT_SALES_ATTR, the field "material type" is not available to apply the same filter as for 0MATERIAL_ATTR. Existing records are updated with the trademark, but I also get 5000 records I don't need, and my master data is "polluted" with useless lines.
    And my question: is there a way, while performing the second load, to ONLY UPDATE EXISTING RECORDS AND NOT ADD ANY
    NEW RECORDS? (I didn't find anything in the main options of my DTP.)
    (I'd like to avoid the solution to update the 0MAT_SALES_ATTR datasource to add the missing field)
    Thanks in advance for any help, points will be distributed.
    Guillaume P.
    Still no idea ?

    in the start routine of the transformation from 0MAT_SALES_ATTR to ZNEWMAT, do the following (the original snippet was missing the ENDLOOP and deleted from the lookup table I_MAT instead of from SOURCE_PACKAGE; corrected below):
    * Fetch the materials already loaded by 0MATERIAL_ATTR
    SELECT material FROM /bic/pznewmat
      INTO TABLE i_mat
      FOR ALL ENTRIES IN source_package
      WHERE material = source_package-material.
    LOOP AT source_package.
      p_ind = sy-tabix.
      READ TABLE i_mat WITH KEY material = source_package-material.
      IF sy-subrc NE 0.
    *   Material was not loaded by 0MATERIAL_ATTR: drop it from this load
        DELETE source_package INDEX p_ind.
      ENDIF.
    ENDLOOP.
    this way you'll only update records that have previously been loaded by the 0MATERIAL_ATTR DataSource.
    loading sequence:
    first load ZNEWMAT from 0MATERIAL_ATTR, then activate ZNEWMAT, then load 0MAT_SALES_ATTR to ZNEWMAT.
    M.

  • ABAP short dump when loading data into Infocube

    Hello All,
    I am getting an ABAP short dump when loading data into an InfoCube. I tried to load the data from both the PSA and an ODS.
    I ran the compression of the InfoCube; no improvement, I am still getting the same error.
    Error Analysis:
    A RAISE statement in the program "SAPLRSDU_PART" raised the exception.
    Internal Notes:
    The termination occurred in the function "ab_i fune" of the SAP Basis system, specifically in line 2316 of the module "//bas/620/src/krn/runt/abfunc. C#18".
    The internal operation just processed is "FUNE".
    Advance thanks !!!
    Siv

    Hello Siv,
    try running the program SAP_DROP_EMPTY_FPARTITIONS to display the existing partitions. Then compare that to the number of uncompressed requests. Maybe there's a mismatch.
    SAP notes that might help: 385163, 590370
    Regards,
    Marc
    SAP NetWeaver RIG, US BI

  • Rejected records when loading

    hi!
    I'm using AWM to build a cube. When loading the measures from the SALES_FACT view in the Oracle global demo, the log shows:
    "13:05:12 Finished Load of Measures: UNITS from Cube UNITS_CUBE.CUBE. Processed 0 Records. Rejected 222589 Records."
    I'm sure the AWM user has the privilege to select from the SALES_FACT view. So why were the records rejected? I need help, thanks!
    Best regards

    Hi there,
    The only situation where records get rejected is when the relevant dimension values are not present and the fact record cannot be joined on one (or more) of its foreign key records.
    Have you maintained all of your dimensions first?
    Thanks,
    Stuart Bunby
    OLAP Blog: http://oracleOLAP.blogspot.com
    OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
    OLAP on OTN: http://www.oracle.com/technology/products/bi/olap/index.html
    DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html

  • Data load from InfoCube to InfoCube: 170 million records

    Hi All,
    I want to load the data from one InfoCube to another InfoCube. I have developed this in Development. But in production, if I start a full load, will it load, or will it show a timeout error? Because I have to load 17 crore (170 million) records from the old InfoCube to the new one.
    Please advise on any precautions I should take before starting the load from the old to the new InfoCube.
    Thanks in Advance
    Shivram

    You need not load the entire 170 million records in one go.
    Please do a selective load, i.e. based on Document Number, Document Date, Fiscal Period or Calendar Month - some characteristic like this whose values partition all the records.
    This will ensure that the data is loaded in small amounts.
    As said above, what you can do is create a process chain.
    Drop the indexes on the 2nd cube. Make multiple InfoPackages with different selections, placed one after the other in the process chain, loading the data into cube 2.
    Then rebuild your indexes after the loads are complete, i.e. after all 170 million records have been added to cube 2.
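    The selective-load idea above can be sketched as follows (a hypothetical Python illustration - the month range, chunk size, and helper names are invented, and real InfoPackage selections are of course maintained in BW, not generated in code): derive disjoint calendar-month intervals so that each InfoPackage loads a small, non-overlapping slice.

```python
def month_range(start, end):
    """Yield calendar months as YYYYMM strings, start..end inclusive."""
    y, m = divmod(start, 100)
    while y * 100 + m <= end:
        yield f"{y:04d}{m:02d}"
        m += 1
        if m > 12:
            y, m = y + 1, 1

def selections(start, end, months_per_package=12):
    """Split the range into disjoint (low, high) intervals, one per InfoPackage."""
    months = list(month_range(start, end))
    chunks = [months[i:i + months_per_package]
              for i in range(0, len(months), months_per_package)]
    return [(c[0], c[-1]) for c in chunks]

# Example: two 12-month slices covering 2010-2011
packages = selections(201001, 201112)
```

    Each resulting (from, to) pair would become the 0CALMONTH from/to selection of one InfoPackage in the process chain; because the intervals are disjoint and cover the whole range, the slices add up to the full data set.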

  • Records are missing when we run a delta load (FI-GL account items)

    Hi,
    Some records are missing when we run a delta load.
    We have a generic DataSource on the FAGLFLEXA table.
    We selected the field TIMESTAMP as the delta-relevant field.
    The time stamp is local.
    The upper limit is blank.
    The lower limit is 1000.
    We run this process chain every day.
    In the delta settings we selected "New status for changed records".
    Please give me any idea why records are missing when we run deltas.
    Thanks
    Naik

    Hi Anju,
    Please ensure that you follow the steps below while initializing application 13:
    1. All the users in the source system must be locked.
    2. Ensure that SMQ1 and RSA7 have no data for application 13. Delete the application 13 DataSource entries from RSA7.
    3. Delete and refill the setup tables for application 13.
    4. Run an init with data transfer load for the required DataSources of application 13.
    5. The deltas can follow from the next day.
    This will ensure that your deltas fetch correct data. Note that the delta will pick up the same conditions on which the init load was run.
    Please let me know if you need any more information.
    Regards,
    Pankaj

  • [svn:osmf:] 15114: Don't check policy file when loading SWFs.

    Revision: 15114
    Author:   [email protected]
    Date:     2010-03-29 12:24:58 -0700 (Mon, 29 Mar 2010)
    Log Message:
    Don't check policy file when loading SWFs.  Don't import local SWFs into current security domain.
    Modified Paths:
        osmf/trunk/framework/OSMF/org/osmf/elements/ImageLoader.as
        osmf/trunk/framework/OSMF/org/osmf/elements/SWFLoader.as
        osmf/trunk/framework/OSMF/org/osmf/elements/loaderClasses/LoaderUtils.as

    If you managed to get to the point where the applet container is created (the "gray square"), but the form never appears then you can assume one or more of the following has occurred:
    1. The JRE crashed after startup. Many times, but not always, if such a crash occurs it will leave a JRE dump file on the desktop. Its content may help to identify the cause.
    2. The Forms runtime crashed at startup. Many times, but not always, a Forms dump file will be created on the server. Its content may help to identify the cause.
    3. The Forms runtime was unable to start at all. This can occur on unix systems when/if there is a resource or permissions issue. One of the more common causes is if the file descriptor (nofiles) value is set too low.
    4. The applet is actually running, but has attempted to display a dialog box and is awaiting your acknowledgement, but the box was wrongfully sent to the background behind the browser. A similar issue was reported in one of the JRE 1.6.0_xx series, however I don't recall which one. Uninstall your current version and install the latest which is 1.6.0_27
    There are other possibilities, but these are most common.
    I would recommend the following:
    1. Uninstall any JRE older than 1.6.0_27. Reboot. Install 1.6.0_27
    2. Set networkRetries=5 in formsweb.cfg
    3. Set FORMS_TIMEOUT to 15 (default). Setting to a high value as you have is not recommended and is rarely necessary.
    4. Verify that the test form works. For example:
    http://machine:port/forms/frmservlet?form=test
    5. It appears that you are trying to use WU_TEST_106.fmx. Instead, download an updated version of this file (the name has also changed)
    http://www.oracle.com/ocom/groups/public/@otn/documents/webcontent/196249.zip
    6. Ensure that you have compiled webutil.pll into .plx. Do not use an old version of this file. The installation will include one. If not, check for it in an installation that also includes the Builders.

  • When loading Google Maps V3, a check box becomes hidden

    Hi,
    Here is an example where, when loading Google Maps V3, a check box becomes hidden,
    but only with the Safari browser (tested with Safari 5.0 on Windows XP).
    URL: http://jtrpovski.hostwebs.com/test99.html
    With other browsers (IE8, Opera 10.62, Chrome 6.0, FF 3.6.10) there is
    no problem.
    jtrpovski.

    Hi,
    With the latest release of Google Maps 3.2 from 11-Nov-2010 the problem is solved.
    JTrpovski.
