Inventory - 'No Marker Update' is set for delta loading, any impact?

Dear all,
For several days the daily delta on the inventory cube has been run with the 'No Marker Update' flag set.
What are the possible issues?
Do we have to reload the data, or can we just uncheck the flag again?
Thanks for your help
Best regards
Christophe
Edited by: Christophe Courbot on Sep 15, 2011 5:04 PM

Hi,
Please check the explanation below.
Marker Update is used to reduce the time needed to fetch non-cumulative key figures during reporting; it makes it easy to derive previous stock quantities. The marker is a point in time which marks an opening stock balance. Data up to the marker is compressed.
The No Marker Update concept arises when the target InfoCube contains a non-cumulative key figure. For example, take the Material Movements InfoCube 0IC_C03, where stock quantity is a non-cumulative key figure. Loading data into the cube involves two steps:
1) First, load the records for the opening stock balance (the stock present at the time of implementation). Compress this request with marker update ('No Marker Update' unchecked) so that the current stock quantity is stored in the marker. After that, when loading the historical movements (stock movements made before the cube was implemented), compress with 'No Marker Update' checked so that the marker is not updated: these historical movements are already contained in the opening stock that was loaded, so updating the marker again would count them twice.
2) After every successful delta load, compress the request with marker update ('No Marker Update' unchecked) so that the changes in stock quantity are reflected in the marker value. The marker is only updated for records that have been compressed; it is not updated for uncompressed requests. Hence every delta request should be compressed. (A simplified illustration of how the marker is used at query time follows the FAQs below.)
Check or uncheck the 'No Marker Update' option:
To compress a request with marker update => leave the 'No Marker Update' option unchecked.
To compress a request without marker update => check the 'No Marker Update' option.
Relevant FAQs:
1) The marker isn't relevant when no data is transferred (e.g. during a delta init without data transfer).
2) The marker update is just like a check point (it will give the snapshot of the stock on a particular date when it is updated).
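To make the marker (reference point) idea concrete, here is a small, purely hypothetical sketch in plain Python (not SAP code; the material, dates and quantities are invented) of how a non-cumulative stock figure can be derived from the marker value plus the movements posted after it:

    from datetime import date

    # Hypothetical figures for one material, purely for illustration (not SAP code)
    marker_stock = 120               # stock quantity held in the marker / reference point
    marker_date = date(2011, 9, 1)   # point in time the marker represents

    # Movements posted after the marker (e.g. from delta requests): (posting date, quantity change)
    movements = [
        (date(2011, 9, 5), -20),     # goods issue
        (date(2011, 9, 10), +50),    # goods receipt
    ]

    def stock_on(key_date):
        """Stock at key_date = marker value plus all movements between the marker and key_date."""
        change = sum(qty for d, qty in movements if marker_date < d <= key_date)
        return marker_stock + change

    print(stock_on(date(2011, 9, 12)))   # 150 -> stock after both movements
    print(stock_on(date(2011, 9, 6)))    # 100 -> stock as of an earlier key date

In this simplified model, updating the marker when delta requests are compressed keeps the reference value in line with the current stock, so fewer movement records have to be read at reporting time; that is the performance benefit described above.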
Reference information:
Note 643687 Compressing non-cumulative InfoCubes (BW-BCT-MM-BW)
Note 834829 Compression of BW InfoCubes without update of markers (BW-BEX-OT-DBIF-CON)
Note 745788 Non-cumulative mgmnt in BW: Verifying and correcting data (BW-BCT-MM-BW)
Note 586163 Composite Note on SAP R/3 Inventory Management in SAP BW (BW-BCT-MM-IM)
Thanks and regards

Similar Messages

  • SQL 2014 cluster installation on mounting point disks error: Updating permission setting for file

    Dear all,
    I am attempting to install a SQL 2014 failover cluster on a physical Windows 2012 R2 server which has mount point disks.
    At the end of setup I get the following error message.
    Could you please help me on this?
    Thanks
    The following error has occurred:
    Updating permission setting for file 'E:\Sysdata!System Volume Information\.....................................' failed. The file permission setting were supposed to be set to 'D:P(A;OICI;FA;;;BA)(A;OICI;FA;;;SY)(A;OICI;FA;;;CO)(A;OICI;FA;;;S-1-5-80-3880718306-3832830129-1677859214-2598158968-1052248003)'.
    Click 'Retry' to retry the failed action, or click 'Cancel' to cancel this action and continue setup.
    For help, click: go.microsoft.com/fwlink
    I am using an administrator account which has all the security rights needed for installation (checked in the secpol.msc MMC).

    Hi Marco_Ben_IT,
    Do not install SQL Server to the root directory of a mount point. Setup fails when the root of a mounted volume is chosen as the data file directory; you must always specify a subdirectory for all files. This has to do with how permissions are granted. If you must put files in the root of the mount point, you have to manage the ACLs/permissions manually. Create a subfolder under the root of the mount point and install there.
    More related information:
    Using Mount Points with SQL Server
    http://blogs.msdn.com/b/cindygross/archive/2011/07/05/using-mount-points-with-sql-server.aspx
    I’m glad to be of help to you!
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • How to make JOB CONTROL setting for delta extraction in Production system .

    Hello All,
    We are in the process of transporting our development to the production (PD) server in BI. We have transported the DataSources to ECC production and are now filling the setup table for SD Billing (BW). After loading the setup table data in full mode through an InfoPackage, we have to load the data in delta mode.
    How can we set the update mode (LBWE setting) in the PD server?
    Is authorisation for transaction LBWE required in the ECC PD server to set the update mode, or do we have to set all the job control parameters in the development system at the time the DataSource request is transported to ECC production? Will the times set in the job control of ECC development be reflected in ECC production?
    Can anybody tell me how to make the settings for delta extraction in the production system?
    Thanks ...

    Hi,
    How do you load the data in the development system?
    - You set the update mode in the InfoPackage; there you must choose "Initialization".
    - Did you fill the setup table in the production system without loading the data into BW? Then this step was for nothing. Filling the setup table and doing the init in BW must happen together, during a period in which no changes are made in your R/3 SD.
    Sven

  • ALE change Pointers for delta load

    The master data DataSource is 0CRM_BPSALESCL_ATTR, which in my source system uses change pointers for delta loading. When I run a delta load, the error is "ALE change pointers are not set up correctly". I went to the source system and used BD61 to activate change pointers, but it did not work. Then I created new business partners to make changes. I can find these new records in tables CDHDR and CDPOS, but there are no records in table BDCP. And my delta load still gets the error "ALE change pointers are not set up correctly".
    Can anybody give me advice?
    Thanks,
    Wenjie

    Hi, Ron,
    Thanks for your reply. I already assign points to you.
    I replicated the DataSource and re-activated the transfer structure, but it still doesn't work. One more question: do I run BD61 on the BI system or on the source system? I did it on the source system only. Is there any further setting that needs to be done on the source system? My error message says the setting on the source system is wrong.
    Thank you very much.
    wenjie

  • What is 'No marker update'? do opening balance have any relation with it?

    What is 'No Marker Update'? Does the opening balance have any relation to it? I know that with 'No Marker Update' the reference point is not updated, which means the opening balance is not rolled forward to the current date. Is that right? Please explain.
    York

    Hi
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/documents/a1-8-4/how%20to%20handle%20inventory%20management%20scenarios.pdf
    When you load historic non-cumulatives you need to set the 'No Marker Update' indicator if initialization has already taken place; otherwise your data will not match the stock on the R/3 side.
    Raja

  • Can we use both 0FI_AP_3 and 0FI_AP_4 for Delta Loads at the same time.....

    Hi Gurus:
    Currently my company uses 0FI_AP_3 for some A/P reporting. It has been heavily customized and uses delta loading. However, SAP recommends the use of 0FI_AP_4 for A/P data in delta loads. I was able to activate 0FI_AP_4 as well and do some full loads in the Dev/Test boxes. The question is whether I can use both extractors for delta loads at the same time. If there are any issues, what are they and how can I resolve them? Is the use of only one extractor recommended?
    Please let me know, as this impacts a lot of my development. Thanks!
    Best, ShruMaa
    PS: I had posted this in the "BI Extractors" forum but there has been no response. Hope to get some response here. Thanks!

    Hi,
    I would recommend using 0FI_AP_4 rather than both, for several reasons:
    1. DataSource 0FI_AP_4 replaces DataSource 0FI_AP_3 and still uses the same extraction structure. For more details refer to OSS note 410797.
    2. You can run 0FI_AP_4 independently of other FI DataSources such as 0FI_AR_4 and 0FI_GL_4, or even 0FI_GL_14. For more details refer to OSS note 551044.
    3. Map 0FI_AP_4 to DSO 0FIAP_O03 (or create a Z copy as per your requirement).
    4. Load the same into an InfoCube (0FIAP_C03).
    Hope this helps.
    Thanks.
    Nazeer

  • Short dump error for delta load to 0CFM_C10 via 0CFM_DELTA_POSITIONS flow

    hi all,
    I am getting a short dump error for the delta load to 0CFM_C10 via the 0CFM_DELTA_POSITIONS flow.
    I am not able to figure out the actual issue or how to solve it.
    Can anyone suggest a solution?
    Below are the details of the short dump:
    Short text
        Exception condition "UNKNOWN_TRANSCAT" raised.
    What happened?
        The current ABAP/4 program encountered an unexpected
        situation.
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact you SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        A RAISE statement in the program "CL_SERVICE_TRG================CP" raised the
         exception
        condition "UNKNOWN_TRANSCAT".
        Since the exception was not intercepted by a superior
        program, processing was terminated.
        Short description of exception condition:
        For detailed documentation of the exception condition, use
        Transaction SE37 (Function Library). You can take the called
        function module from the display of active calls.
    How to correct the error
        If the error occures in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "RAISE_EXCEPTION" " "
        "CL_SERVICE_TRG================CP" or "CL_SERVICE_TRG================CM003"
        "SORT_TRANSACTIONS"
        or
        "CL_SERVICE_TRG================CP" "UNKNOWN_TRANSCAT"
        or
        "SBIE0001 " "UNKNOWN_TRANSCAT"
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump)
           To save the description, choose "System->List->Save->Local File
        (Unconverted)".
        2. Corresponding system log
           Display the system log by calling transaction SM21.
           Restrict the time interval to 10 minutes before and five minutes
        after the short dump. Then choose "System->List->Save->Local File
        (Unconverted)".
        3. If the problem occurs in a problem of your own or a modified SAP
        program: The source code of the program
           In the editor, choose "Utilities->More
        Utilities->Upload/Download->Download".
        4. Details about the conditions under which the error occurred or which
        actions and input led to the error.

    Hi,
    It seems there may be a routine involved; check your routines, transformations, etc.
    Regards
    sivaraju

  • Error in activating init flag for delta load

    Hi everybody, I am trying to activate the init flag for a delta load. When I schedule the InfoPackage it takes too long to execute, and the status has remained yellow for a week. In the Details tab I found the error message "no data available in source system". Please help…

    Hi Vinod,
    When there is no data available in the source system, you will get a yellow flag for the delta init request. Check for the delta queue entry (RSA7) on the source system; the delta mechanism starts working once this entry is created. After that, schedule the delta InfoPackage.
    Also check on the source system whether there is any data available in the application tables related to this DataSource. Take the help of the functional consultants if required.
    Regards,
    Sreenivas.

  • Missing some master data for delta load( its very urgent please)

    Hi,
    I am working on master data for a delta load. The problem is that whenever records are changed (SO) in R/3 for the address text and city district name, the changes are not loaded into BW, even though some delta records come into BW every day.
    The data comes through BW staging into BW. Please help: what could be the reason, where can I find the detailed information, and where is the data being lost? Please see the example below.
    Address Number / Addr Ln 1 Txt / City Dstrct Nm
    9025750333 / # / #
    Help me, it's very urgent.

    Hi Sumanth,
    Check the delta queue and whether the V3 job is running correctly.
    Have a look at OSS note 728687,
    and also see the following threads:
    Deltas are not available in Delta que
    Delta Queues are not cleared in R/3
    No data in RSA7 for 2lis_03_bf : HELP
    Also check the data; it may be in 'modified' status rather than active.
    regards,
    supriya

  • Question for delta loading, why can not update existing record in cube

    Hi, Gurus,
    I load data from a flat file into an InfoCube with the delta loading option, and in the related DataSource I chose the 'additive delta' option.
    However, after the delta init and delta update, when I checked the cube content I found two records with the same characteristic values (like a duplicated key); my expectation was that the key figures would be summed for the same characteristics. Who can help me with that?
    By the way, when I check the delta queue, why is there no queue?

    Hi,
    When uploading from a flat file, if you are able to capture the changes to previously loaded records, then you can use the additive delta. This decreases the upload time, since only changed or new records are uploaded each time (see the small sketch below for how the additive values add up per key).
    If you are not able to capture the changes in the flat file, you have to do a full upload every time, and before that delete the contents of the InfoCube (assuming there is no ODS between the InfoCube and the flat file).
    With rgds,
    Anil Kumar Sharma. P
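    To illustrate the additive-delta behaviour described above, here is a small, hypothetical sketch in plain Python (invented material names and quantities, not SAP code) showing how records from an init load and a later additive delta load add up per characteristic key once they are aggregated:

        from collections import defaultdict

        # (characteristic key, key figure) pairs; the values are invented for the example
        init_load = [("MAT_A", 100), ("MAT_B", 40)]
        delta_load = [("MAT_A", 25)]          # additive delta: only the change is sent

        # both requests end up in the cube as separate rows
        fact_rows = init_load + delta_load

        totals = defaultdict(int)
        for material, qty in fact_rows:
            totals[material] += qty           # aggregation per characteristic

        print(dict(totals))                   # {'MAT_A': 125, 'MAT_B': 40}

    In this simplified picture, seeing two rows with the same characteristics in the cube content is expected: the rows belong to different requests, and the summed total only appears once they are aggregated, for example in a query or after compression.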

  • Selection Criteria for Delta loads triggered using MDM_CLNT_EXTR

    Hi ,
    Currently we are required to load data only from two specific account groups. Therefore, in the selection criteria in MDM_CLNT_EXTR we maintain these two account groups and extract the data.
    However, when setting up the delta load there does not seem to be an option to restrict the delta to a specific criterion. Therefore, in our case several records from account groups which are not required are passed on to the MDM system.
    Is there a way to address this requirement?
    1. There is an option to run the initial load variants at regular intervals. The problem with this is that we have several variants created, as the R/3 system does not allow extraction of more than 999 records at a time. Therefore, if we run all these jobs at the point in time when the delta records need to be captured, our network gets choked.
    2. Run the delta job; but without selection criteria it pulls out records which we don't require.
    Any suggestions on how this problem can be addressed are welcome ,
    Thanks in advance,
    Anita George

    Thanks Michael !
    Well, regarding the ALE settings, we did take a look in BD64; the filters are based on objects, so which of these is relevant for customer account groups? Probably in the future we will require all data; is it then enough to remove the filter?
    Can a filter be maintained at the XI level? This would be good, as only the required account groups would be passed on to MDM. Is there any documentation available on this?
    Regarding the Import Manager, we will have to check this functionality; as it is now, no matter how many times we update the map/rename/load/unload, the automated imports keep failing.
    Our R/3 system seems to have a problem extracting more than 999 records; we tried with different numbers of records and all those extraction jobs got cancelled. So currently we are running multiple jobs for the initial load, and even this is very time-consuming.
    Thanks & Regards,
    Anita George

  • Problems for Infopackage setting for Hierarchy Load

    Hello Masters,
    I have a problem editing the settings for a hierarchy InfoPackage. I was asked to change the processing mode of the InfoPackage to load directly into the InfoObject, instead of into the PSA and then into subsequent data targets.
    But I am unable to edit the setting because everything is greyed out. The only option selected is "Only PSA, and then Update in subsequent Data Targets".
    My question: is it a mandatory setting for a hierarchy InfoPackage to update into the PSA and then into the data targets?
    I was able to edit the settings for the attribute and text InfoPackages even though they are in process chains.
    Could anyone please tell me whether it is possible to edit the hierarchy InfoPackage, instead of creating a new one?
    Thank you
    Regards
    Raghu

    Hi,
    I have checked the transfer rules; the transfer method used is PSA. I have to change this InfoPackage to load directly into the InfoObject, because this process chain runs every four hours and it increases the PSA size enormously.
    Hence I was asked to change the InfoPackage setting, but it is greyed out.
    Thank you,
    Regards
    Raghu
    Edited by: raghuram alamuri on Jul 16, 2009 3:29 PM

  • Repair request for Delta load? Orders missing in Billing line Items

    Hi
    I am using BW 3.5. According to the users, some orders are missing in BW from the billing line item reports (2LIS_13_VDITM).
    The report is based directly on the cube; an ODS is not used. I planned to do a full repair request for this month's data so as to get the missing orders.
    Could you please advise me on the correct procedure? Delta loads run every day and today's delta is completed.
    Thanks and BR
    P.B

    While it's usually recommended to execute setups during "quiet time", it's not required, especially in this case. Since you're going to restrict the setup to extract only a range of billing documents representative of the dates of the missing data (or purely the identified missing documents), there is no need to wait for "quiet time", because you'll pick up the data either in your full repair extract or in the delta extract. Depending on how the target InfoProvider for this data is set up (cumulative vs. non-cumulative), you may have to look out for duplicate data.
    My recommendation would be to run your setup for only those documents that have been identified as missing. That way, you're limiting the amount of data that's required to be extracted to the setup table and then into BW, but also will remove any risk of duplicate data in your target InfoProviders.

  • Question for delta loading

    Hi, Gurus,
    I load data from a flat file into an InfoCube with the delta loading option, and in the related DataSource I chose the 'additive delta' option.
    However, after the delta init and delta update, when I checked the cube content I found two records with the same characteristic values (like a duplicated key); my expectation was that the key figures would be summed for the same characteristics. Who can help me with that?
    By the way, when I check the delta queue, why is there no queue?

    Ls,
    Please do not post the same question in both forums; please refer to the answers in your other post in the other forum.
    Arun

  • Why when I update Adobe it won't load any higher than 25 or 30%?

    Any time I try to install the new Adobe Flash Player update, the installer won't get any further than 25% or 30%.
    This annoys me a lot because now it is impossible for me to view any videos, and I think I may no longer have access to certain web sites because of it (e.g. my school's portal).
    This gets me very angry
    I just want to get it fixed.
    P.S.: I am a MacBook user, I do not know if that influences anything, but it may...
    So please help!!!

    Hello,
    This morning we resolved a problem that prevented Flash Player installations from completing for some of our Macintosh users.
    If you encountered this problem, please delete any previously downloaded Flash Player installer and either:
    Download the installer again from https://get.adobe.com/flashplayer, OR
    Download the stand-alone installer posted at the bottom of http://helpx.adobe.com/flash-player/kb/installation-problems-flash-player-mac.html, in the 'Still having problems' section.
    If the behaviour reproduces, please clear the browser's cache and try downloading again.
    Thank you.
    Maria
