When to set delta index flag for master data?

In transaction RSDDBIAMON2, the "Set Delta Flag" option shows:

Table Name     | Table Size | Delta Index
/BIC/FTSGCSGMC | 10,155,000 | check mark in box
/BIC/DTSGCSGM1 |  5,000,000 | check mark in box
/BI0/SVERSION  |        700 | check mark in box

When should I check the "Delta Index" column for fact, dimension, and master data tables? I believe I need to check it for the fact and dimension tables so that a delta BIA index is used during roll-up and users can see the newly loaded requests, but I am uncertain about when and why I should check this box for master data.
Thanks for your input.

Vitaliy,
Thanks for answering another of my questions. Yes, we load master data a couple of times a day, so based on your answer I should check the "Delta Index" box for the master data tables as well. Thanks for helping me understand this point.
If the below is our approach, the box in step 2 should be checked for all tables, correct?
1) Create/fill the BIA index for cube "A"
2) Check the "Delta Index" box for all of cube "A"'s tables, i.e. the fact, dimension, and master data tables
3) Run the process chain to roll up cube "A" daily
4) Run the process chain to merge the cube "A" indexes weekly
Many thanks,
Thao
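
For context on what the "Delta Index" flag does: when it is set, roll-up writes newly loaded requests into a small delta index instead of modifying the large main index, queries read both, and the periodic merge folds the delta index back into the main index. The sketch below is a minimal, purely conceptual illustration of that behaviour in Python; the class and method names are invented for illustration and are not SAP BWA APIs.

# Conceptual sketch of main index + delta index behaviour (not SAP code).
class AcceleratorIndex:
    def __init__(self):
        self.main = []    # large, optimized main index
        self.delta = []   # small delta index that absorbs newly rolled-up requests

    def roll_up(self, request_rows, delta_flag=True):
        # With the delta flag set, new requests land in the delta index,
        # so the large main index does not have to be rewritten on every roll-up.
        target = self.delta if delta_flag else self.main
        target.extend(request_rows)

    def query(self, predicate):
        # Queries always read both structures, so freshly rolled-up
        # requests are visible to users immediately.
        return [row for row in self.main + self.delta if predicate(row)]

    def merge(self):
        # The (e.g. weekly) merge folds the delta index back into the main index.
        self.main.extend(self.delta)
        self.delta.clear()


idx = AcceleratorIndex()
idx.roll_up([{"doc": 1, "amount": 100}])          # daily roll-up after loading a request
print(idx.query(lambda row: row["amount"] > 50))  # newly loaded data already visible
idx.merge()                                       # weekly merge

In terms of the four-step approach above, step 3 corresponds to roll_up() and step 4 to merge() in this sketch.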

Similar Messages

  • Will repeat delta work for master data delta loads?

    Hi Gurus,
    Will repeat delta work for master data delta loads?
    As I am new to SAP BI, can anyone give a detailed explanation of this?
    It would be great and helpful.
    Thanks and Regards,
    CM

    Hi CM,
    Yes, you can perform a repeat delta for master data as well, provided the DataSource supplying the data is delta-capable.
    You can do a repeat delta only if your previous request was erroneous, and don't forget to set that request to red status, so that the system treats it as if you are requesting the failed request again. If you don't set the request to red, you will miss the previous, erroneous request.
    When you manually set the request to red, the delta pointer moves back to the previous request in the delta queue and there is no chance of missing delta records.
    This process is the same for both transaction data and master data.
    Regards,
    Sunil

  • About delta loading for master data attribute

    Hi all,
    We have a master data attribute load that failed; it is a delta load. I have to fix this problem, but I have two questions:
    1. How can I find those delta requests? I need to delete all of the failed requests first, am I right? Master data is not like a cube or ODS where we can find the requests in Manage, so for master data how can we find them?
    2. Could you please let me know the detailed procedure to perform this delta load again?
    Thanks a lot

    Hi,
    1. How can I find those delta requests? I need to delete all of the failed requests first, am I right?
    For master data there is no need to delete the request from the target. Just set the status to red and repeat the load. The problem is that master data sometimes does not support repeat delta; if you repeat, the load will fail again with update mode R. In that case you have to re-init:
    1) Delete the init flag (in the InfoPackage scheduler, under the Scheduler menu, "Initialization Options for Source System").
    2) Init with data transfer (if the failed load picked up some records); otherwise init without data transfer (if the last delta failed having picked up 0 records).
    3) Then run the delta.
    2. Could you please let me know the detailed procedure to perform this delta load again?
    1) Set the QM status to red to set back the init pointer.
    2) Repeat the load.
    If the load fails again with update mode R:
    1) Delete the init flag.
    2) Init with data transfer (if the failed load picked up some records); otherwise init without data transfer.
    3) Then run the delta.
    Regards,
    Debjani

  • BW - Delta for Master data not loading

    Hello All,
    I am not able to load the delta for vendor and material master data.
    The initial load works fine, but when I load the delta it gives the errors
    "The extraction program does not support object 0VENDOR" and
    "ALE change pointers are not set up correctly".
    Do I need to activate the change pointers in BD61? I was not sure, so I am checking with all the experts.
    Regards
    Vanya

    Hi Ravi,
    We faced the same problem as you for 0MATERIAL_ATTR and we solved it with the following steps:
    1. Delete the previous delta init for the InfoObject.
    If you have access to the R/3 side you can follow these steps, or you can follow the BW steps:
    Go to RSA7 -> select and delete the init for the InfoObject.
    From the BW side:
    InfoPackage -> Init for Source System -> select & delete.
    2. Execute the init InfoPackage again.
    3. After successful completion, execute the delta InfoPackage.
    We got it resolved by doing these steps, and it may help with your problem as well.
    Thanks & Regards,
    Chandran Ganesan
    SAP Business Intelligence

  • Is there a setting at source system level for master data delta loading performance?

    Hi Viewers,
    Good Morning,
    I am loading master data from the source to the target and it is taking too much time. I would like to know how to increase the master data delta loading performance.
    I would also like to know whether any setting is required at the source system level or at the target system level.
    Please give your inputs.
    Thanks & Regards,
    Venkat Vanarasi.

    Venkat -
    Are you deleting the indexes of the data target before the delta load? If not, delete the indexes of the data target before the delta load and recreate them after the load is done. This procedure increases load performance. You can perform the whole procedure in a process chain.
    Anesh B
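    The drop-and-recreate pattern Anesh describes is not SAP-specific; the sketch below shows the same principle on a plain sqlite3 table (the table, column, and index names are made up for the example, not BW objects): secondary indexes are dropped before a large load so they are not maintained row by row, then rebuilt once afterwards.

# Generic illustration of "drop indexes before a large load, recreate after".
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact (doc INTEGER, material TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_fact_material ON fact (material)")

rows = [(i, "MAT%03d" % (i % 100), float(i)) for i in range(100000)]

conn.execute("DROP INDEX idx_fact_material")                       # 1) drop secondary index
conn.executemany("INSERT INTO fact VALUES (?, ?, ?)", rows)        # 2) bulk (delta) load
conn.execute("CREATE INDEX idx_fact_material ON fact (material)")  # 3) rebuild index once
conn.commit()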

  • Generic delta for master data

    Hi All,
    I am trying to create a generic delta for the master data "equipment"; the table in SAP is EQUI. I am choosing ALE delta, for which I have to provide a table name and a "changed doc. object". When I give EQUI as the table name and AEDAT (changed on) as the "changed doc. object", it throws an error.
    Also, when I tried a timestamp/calday delta, the delta does not work when changes are made, and I don't know why.
    Any idea what has to be given in these fields?
    Regards,
    Dola

    Hello,
    TCDOB is the table where you need to maintain entries for a generic delta.
    Please go to SE16, enter table name TCDOB,
    in the selection screen enter object EQUI, and check the number of entries.
    I guess you need to maintain the tables on which the change depends for the delta.
    I tried creating the same DataSource using ALE delta.
    It worked and saved without errors.
    I have the following entries in the TCDOB table:
    OBJECT          TABNAME
    EQUI            EQKT
    EQUI            EQUI
    EQUI            EQUZ
    EQUI            FLEET
    EQUI            ILOA
    Please tell me your count and entries
    so that we can discuss this further.
    Also, if you find that there are no entries, you can go to transaction SCDO and maintain entries for the change document object.
    Waiting for your feedback.
    Regards
    Nitin Bhatia

  • Deltas for master data

    Hi,
    Will you please let me know the procedure for setting up a master data delta? Is it the case that, after the init, the delta gets activated automatically, or do we need to set it up on the R/3 side (by checking the "delta enable" checkbox for a DataSource), or is it that only certain specific DataSources support delta?
    Please let me know the procedure, or if you have any docs or links, please share them.
    Thanks & Regards,
    Shashikanth.

    Shashi,
    also please check this thread:
    Re: Delta update for master data
    Hope this helps
    Thanks
    Sat

  • Setting deletion indicator/Flag for an operation in production order

    Dear All, Hi,
    How do I set a deletion indicator/flag for an operation in a production order? I don't want to delete the operation as such. Please help and suggest.
    Regards,
    Shaiz

    Hi Shaiz,
    I hope the requirement is just like the one you posted earlier about component deletion.
    In this case too, the behaviour of the order remains the same.
    So you can release the order as soon as it gets created; both of your requirements can then be fulfilled.
    Also, you haven't provided exact feedback on the solutions all of us offered on your last query.
    I request you to give detailed input on the same.
    Hope this helps you.
    SmanS

  • Set Mass Deletion Flag for production orders

    Hi,
    I am planning to set the mass deletion flag for production orders using program PPARCHP1.
    My users don't have access to SE38/SA38 and they can't run this program directly.
    I don't want my users to do this preprocessing using SARA or CO78, as that would give them access to the whole archiving process, and it only works in the background.
    I am thinking of creating a Z transaction code for this program and using it in dialog mode as well. Do you see any side effects with this approach?
    Let me know if creating a Z transaction code sounds like a good idea.

    The standard SAP procedure for setting up archiving is:
    - Run COAC in the background to set the deletion flag and deletion indicator. It should be set up with a variant and run periodically (e.g., monthly).
    - The system selects all orders with DLV status from the past xx days, depending on your variant, and sets the deletion flag.
    - It will also change the status from DLFL to DLT based on your residence time 1.
    - Orders with DLT status that have been in the system longer than residence time 2 are then archived.
    This should be done by the IT department or by background jobs, not by the users.
    That, in my opinion, is enough. However, if you really need the users to select themselves which orders to flag for deletion (why is that?), then you may want to explore an enhancement.
    For the sake of simplicity, I would recommend using a user status (a normal COHV can change that, as we know, so no new fancy screen is needed). Then develop the user exit in PPCO0002 - EXIT_SAPLCORE_001 to set the deletion indicator based on the user status. The rest of the procedure should remain the same.
    If you want to allow users to do every step and develop some Z* screen for them, they must have the authorization for archiving anyway; they just wouldn't have the t-codes COAC and SARA. In my opinion, that is overkill. Also, I don't think the users have enough skill to handle errors during archiving (e.g., a logical file error).
    Hope it helps.
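    Purely as an illustration of the residence-time logic described above (the statuses DLV, DLFL, DLT and the two residence times come from the post; the function, parameter names, and default day counts are invented), a per-order decision could be sketched like this:

# Sketch of the status/residence-time flow: DLV -> deletion flag (DLFL)
# -> deletion indicator (DLT) -> archive. Not an SAP API; day counts are examples.
from datetime import date

def next_archiving_step(status, status_since, today,
                        days_after_dlv=30, residence_time_1=60, residence_time_2=90):
    # Return the next action for a production order, or None if nothing is due yet.
    age_in_days = (today - status_since).days
    if status == "DLV" and age_in_days >= days_after_dlv:
        return "SET_DELETION_FLAG"       # DLV -> DLFL
    if status == "DLFL" and age_in_days >= residence_time_1:
        return "SET_DELETION_INDICATOR"  # DLFL -> DLT
    if status == "DLT" and age_in_days >= residence_time_2:
        return "ARCHIVE"
    return None

print(next_archiving_step("DLFL", date(2011, 1, 1), date(2011, 3, 10)))  # SET_DELETION_INDICATOR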

  • DTP for Master data

    Hi Guys,
    How do I extract master data and hierarchies from ECC into BI InfoObjects?
    Do we follow the same procedure for loading master/hierarchy data as for transaction data, i.e. load into the PSA and then create transformations and DTPs to load into the InfoObjects? If so, what is the best approach regarding full vs. delta loads?
    1. Is it OK to do a full load into the PSA (using an InfoPackage) and then always do a full load? (The process chains for master data would run nightly.)
    or
    2. Should we have two InfoPackages (full and delta) into the PSA and then multiple DTPs? (The process chains for master data would run nightly.)
    Regards.

    Indicator: Only Get Delta Once
    Source requests of a DTP for which this indicator is set are only transferred once, even if the DTP request is deleted in the target.
    Use
    If this indicator is set for a delta DTP, a snapshot scenario is built.
    A scenario of this type may be required if you always want an InfoProvider to contain the most up-to-date dataset for a query, but the DataSource on which it is based cannot deliver a delta (new, changed, or deleted data records) for technical reasons. For this type of DataSource, the current dataset for the required selection can only be transferred using a full update.
    In this case, a DataStore object cannot usually be used to determine the missing delta information (overwrite and creation of delta). If this is not logically possible because, for example, data is deleted in the source without reverse records being delivered, you can set this indicator and run a snapshot scenario. Only the most up-to-date request for the DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example). They are not transferred again during the DTP delta process; when the system determines the delta for a new DTP request, these earlier (source) requests are regarded as "already fetched".
    Setting this indicator ensures that the content of the InfoProvider is an exact representation of the source data.
    Dependencies
    Requests that need to be fetched appear with this indicator in the where-used list of the PSA request, even if they have been deleted. Instead of a traffic light, you see a delete indicator.
    Get Data by Request
    This indicator belongs to a DTP that gets data from a DataSource or InfoPackage in delta mode.
    Use
    If you set this indicator, a DTP request only gets data from an individual request in the source.
    This is necessary, for example, if the dataset in the source is too large to be transferred to the target in a single request.
    Dependencies
    If you set this indicator, note that there is a risk of a backlog: if the source receives new requests faster than the DTP can fetch them, the number of source requests to be transferred will increase steadily.
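    To make the snapshot behaviour of "Only Get Delta Once" more concrete, here is a small conceptual sketch (plain Python with invented names, not the DTP API): the previous request is removed from the target before the newest full request is loaded, and the removed request is marked as already fetched so the delta logic never transfers it again.

# Conceptual sketch of the snapshot ("Only Get Delta Once") scenario.
class SnapshotTarget:
    def __init__(self):
        self.requests = {}            # request id -> rows currently in the InfoProvider
        self.already_fetched = set()

    def load_snapshot(self, request_id, rows):
        # Earlier requests are deleted from the target before the new one arrives ...
        for old_id in list(self.requests):
            del self.requests[old_id]
            # ... and are treated as "already fetched" by later delta determination.
            self.already_fetched.add(old_id)
        self.requests[request_id] = rows

    def current_data(self):
        return [row for rows in self.requests.values() for row in rows]


target = SnapshotTarget()
target.load_snapshot("REQ1", [{"cust": "A", "amount": 10}])
target.load_snapshot("REQ2", [{"cust": "A", "amount": 7}])  # REQ1 removed, marked as fetched
print(target.current_data())     # only the newest snapshot remains
print(target.already_fetched)    # {'REQ1'}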

  • Error handling for master data with direct update

    Hi guys,
    For master data with flexible update, error handling can be defined in the InfoPackage, and if the load is performed via the PSA there are several options - that much is clear. But what about direct update?
    My specific question is: if an erroneous record (e.g. invalid characters) occurs in a master data load using direct update, this sets the request to red. But what does this mean for the other records of the request (which are correct)? Are they written to the master data tables, so that they are present once the master data is activated, or is nothing written to the master data tables if a single record is erroneous?
    Many thanks,
    / Christian

    Hi Christian -
    The difference between flexible update and direct update is that direct update does not have update rules; direct update still uses the PSA as usual, and you can do testing in the PSA.
    As for the second part: when you load master data and an error occurs, all the records for that request number get error status, so activation has no effect on them, i.e. no new records from the failed load will be available.
    Hope it helps
    regards
    Vikash

  • Effect of 'No update' for master data texts

    Hello all,
    I am using flexible update (update rules) for master data texts. I want to confirm the effect of "No update" in the following scenario.
    In the first set of update rules - from the first source system - I have mapped both short text and long text:
    Key | Short text | Long text
        | overwrite  | overwrite
    K1  | some SHTX  | some LGTX
    In the second set of update rules, I have selected "No update" for the long text:
    Key | Short text | Long text
        | overwrite  | NO UPDATE
    K1  | some SHTX  |
    My loading sequence is the first source system followed by the second source system.
    What should my final long text be, and why?
    Many thanks in advance!
    Regards
    Sanjyot

    Hi Surekha,
    Your long text will not get changed.
    Take the example you mentioned:
    Key | Short text | Long text
        | overwrite  | overwrite
    K1  | some SHTX  | some LGTX
    In the second set of update rules, you selected "No update" for the long text:
    Key | Short text | Long text
        | overwrite  | NO UPDATE
    K1  | some SHTX  |
    Because the second set of update rules says "No update", when a record with the same key is found in the master data, no change is made to the long text field, whereas the short text gets overwritten.
    Take some data as an example of the scenario above:
    data from source 1: 2000 Maggi Noodles
    data from source 2: 2000 Feasters
    When the data from the first source system arrives, the record will be 2000 / Maggi Noodles.
    When the data from the second source system arrives, since a record with the same key (2000) already exists, the system checks the other fields:
    the short text is overwritten with the value from the second system (according to the rules it is in overwrite mode),
    and the long text is not updated, as the rule says "No update".
    Hope this helps,
    Cheers,
    Srinath.
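    The per-field behaviour Srinath describes can be summarised in a few lines of plain Python; the function and rule names below are illustrative only and are not part of the actual BW update-rule framework:

# Sketch of "overwrite" vs. "No update" when merging a master data record.
def apply_update_rules(existing, incoming, rules):
    # Merge an incoming master data record into the existing one field by field.
    result = dict(existing)
    for field, rule in rules.items():
        if rule == "overwrite" and field in incoming:
            result[field] = incoming[field]
        # rule == "no_update": keep whatever is already in the master data
    return result

master = {"key": "2000", "short_text": "Maggi Noodles", "long_text": "some LGTX"}
from_source_2 = {"key": "2000", "short_text": "Feasters"}
rules_source_2 = {"short_text": "overwrite", "long_text": "no_update"}

print(apply_update_rules(master, from_source_2, rules_source_2))
# {'key': '2000', 'short_text': 'Feasters', 'long_text': 'some LGTX'}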

  • How can I make indexes on the master data table

    Hi Gurus,
    I have a query in which I have an InfoObject with many values, for example invoice number, and I filter on and need a lot of navigational attributes of this InfoObject. For this reason the query performance is really bad, but I don't know what to do. A friend advised me: "I would suggest some indexes on the master data table of your InfoObject for the navigational attributes you want to use."
    What else can you tell me? Should I put this InfoObject in a line item dimension, or just flag it as high cardinality? Help, guys.

    Hi Jorge,
    Look, line item dimension and high cardinality are related to each other: a characteristic with high cardinality is typically used as a line item dimension. But a dimension marked as a line item cannot subsequently include additional characteristics; this is only possible with normal dimensions. If you are very sure, then you can go for a line item dimension, no issues.
    You can also try to create an index.
    Check this:
    Re: Indexes on Master Data tables
    Regards,
    Debjani

  • DBconnect for master data

    Hi everybody,
    We are extracting customer master data from an iSeries system via DBConnect. The relevant InfoObject "customer" has attributes and texts.
    Generation of the DataSource and loading of data worked without problems for the InfoObject attributes (after generating the DataSource in the source system tree with the setting "master data attributes" for the DataSource type).
    Now I would like to extract the InfoObject texts from the same data source. The problem is that the system does not allow me to assign the source system to the same InfoSource "customer" a second time: "No DataSources for source system 'Iseries' available" (message RSAR383).
    This is although I generated the DataSource (below the source system tree) a second time with the setting "master data texts" for the DataSource type.
    While it is normal that I can connect a DataSource only once to an InfoSource for transaction data, I would expect to be able to use the DataSource twice for master data when differentiating between "texts" and "attributes".
    Does anybody have an idea on this topic?
    Thanks and best regards
    Marcus

    Hi Joe, Hi Uday,
    Thanks for the further clarification on the topic. The problem is now solved following your recommendations. From the source system we receive attributes and texts in one view. The corresponding DataSource was generated with type "texts". I had to manually enter the text fields 0TXTSH, 0TXTMD, and 0TXTLG in the InfoSource and had to maintain the mapping in the communication structure. At InfoProvider level it was possible to create two sets of update rules (type attribute and type text). Then, using one InfoPackage, the upload of attributes and texts worked fine.
    Thanks again for your support
    Best Regards
    Marcus
    I tried to reward the due points, but I receive the message "Rewarding the message failed".
    Please bear with me, I will figure that out and reward the points.

  • Error while creating Data Source for master data attributes

    Hi BI Experts,
    Well, it has been some time since I was involved in extraction in BI; I primarily handled reporting in my last assignments.
    I was trying extraction with flat files in SAP BI 7 (new to SAP BI 7 but very much familiar with BW 3.5) but failed during the master data attribute and text upload into an InfoObject (say IOSP_Mat).
    Here is the procedure I followed after creating the characteristic IOSP_Mat: I created a source system for the flat file, followed by a DataSource for master data attributes. I selected all the parameters correctly, i.e. CSV file format, data separator ",",
    and the other settings. Now when I try to look at the proposed data in the next tab using "Load example data", it does not show the desired result. The columns I have maintained in the flat file are MAT_NUMBER and MAT_NAME (with, say, 100 records in the file).
    The result is the same when I try to load the text data; the columns maintained there are
    LANGUAGE, MAT_NUMBER, Short Description (the same 100 records).
    I then used transaction RSA1OLD to upload the file using the 3.5 flow, and created an InfoSource for master data/texts/hierarchies for IOSP_Mat.
    Now, when trying to upload it using InfoPackages for master and text data, I observe that the data is not maintained in the characteristic IOSP_Mat.
    When I monitored the load, I found that the data had not even been uploaded to the PSA level.
    Can you BI experts tell me the reason for this?
    Thanks,
    Srijith

    Apologies to all of you for the late response;
    I was busy with some other activities.
    I don't remember the exact message, but I remember the data was not loaded even to the PSA level. I will try it again and post the exact message.
    Thanks again for your quick responses.
    Once again, sorry to all of you for my late response.
    Thanks,
    Sri
