Delta Capture

Hello everyone,
I would like to know whether two changes made to the same record are captured in BW by delta as two different records or as a single record for the same field.

Hi Rajesh,
When you delete a record that has already been pulled to BW, a record is created in R/3 that is the exact negative of the one in BW; it cancels the earlier record. If you then create the record again, another line is created for the newly created record.
Note, however, that when you check RSA3 you see only a single consolidated record. The mechanism explained above holds for delta only: in the RSA7 delta queue or the LBWQ outbound queue you will see 2 records, but in RSA3 you see only 1 consolidated record.
Please revert if there are any issues.
Regards,
Pramod
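The cancellation mechanism described above can be illustrated with a small, language-neutral sketch (Python is used purely for illustration; the record layout, key and amounts are invented):

```python
# Sketch of the delta-queue mechanism described above: deleting an
# already-extracted record writes an exact negative (cancellation) record,
# and re-creating it writes a new image. The RSA7/LBWQ queue keeps both
# rows, while applying them to BW reproduces the single current record
# that a full RSA3 extraction would show.

delta_queue = []

def delete_record(key, amount):
    # cancellation record: exact negative of the record already in BW
    delta_queue.append({"key": key, "amount": -amount})

def create_record(key, amount):
    delta_queue.append({"key": key, "amount": amount})

# order 4711 with amount 100 is already in BW; the user deletes it
# and creates it again with amount 120
bw_amount = 100
delete_record("4711", 100)
create_record("4711", 120)

print(len(delta_queue))    # 2 records sit in the delta queue

# applying both queue records additively yields the current single record
bw_amount += sum(rec["amount"] for rec in delta_queue)
print(bw_amount)           # 120, the one consolidated record RSA3 shows
```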

Similar Messages

  • Delta capturing methods - numeric pointer, calday, timestamp

    Hi experts,
    While discussing with seniors, I got one question:
    Normally, in generic extraction for transaction data, we capture delta by numeric pointer, time stamp or calendar day. Is there any other way to capture delta? Are there any tables which record deltas?
    In the case of generic extraction for master data, is there any other way to capture deltas other than numeric pointer, calendar day or time stamp? How do you do it step by step? Which tables record the delta in this case?
    Thanks and Regards

    Hello,
    If your datasource is created based on a view, you can use the generic delta (RSO2 -> Generic Delta; there you can see the options).
    If your datasource is based on a function module, you cannot use this option; it will not work. In the function module you have to handle the delta yourself in the FM code.
    Create a copy of the function module RSAX_BIW_GET_DATA_SIMPLE and handle the delta in that code.
    The table ROOSGENDLM shows you whether you have an init and all the deltas (time stamp); you can use this table.
    Please see the links below; they are very useful.
    To create the simple (full) FM, use:
    /people/p.renjithkumar/blog/2009/10/07/generic-datasource-creation-using-function-module
    https://wiki.sdn.sap.com/wiki/display/BI/Codeforgenericextractionusingfunctionmodule
    For managing the delta in the code, use this link:
    /people/arun.varadarajan/blog/2009/07/09/generic-data-extraction-in-sap-bw-decoded--part-2
    Hope this is helpful.
    Ramona
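    The pointer-based delta logic described above (read the last pointer, select only newer records, let the pointer advance) can be sketched language-neutrally. This is a Python illustration of the idea, not the ABAP FM itself; the table contents and field names are invented, and in the real system the pointer corresponds to the timestamp tracked in ROOSGENDLM:

```python
# Sketch of timestamp-based generic delta handling: the extractor keeps a
# pointer and each delta run selects only rows changed after that pointer.

source_table = [
    {"doc": "A", "changed_at": 100},
    {"doc": "B", "changed_at": 150},
    {"doc": "C", "changed_at": 220},
]

last_pointer = 0  # nothing extracted yet

def delta_extract(table, pointer):
    """Return rows newer than the pointer, plus the advanced pointer."""
    rows = [r for r in table if r["changed_at"] > pointer]
    new_pointer = max((r["changed_at"] for r in rows), default=pointer)
    return rows, new_pointer

# init run: everything is new
rows, last_pointer = delta_extract(source_table, last_pointer)
print(len(rows), last_pointer)   # 3 rows, pointer advances to 220

# a new change arrives; the next delta run picks up only that row
source_table.append({"doc": "B", "changed_at": 300})
rows, last_pointer = delta_extract(source_table, last_pointer)
print(len(rows), last_pointer)   # 1 row, pointer advances to 300
```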

  • No delta can be captured on generic datasource

    Dear experts,
    I created a new generic transaction datasource for a sales order list.
    The fields include:
    1 sales order number
    2 sales org
    3 sales person
    4 change date
    5 bill-to party
    6 ship-to party
    I chose field 1, sales order number, as the delta field with numeric pointer, and the upper limit is set to 1.
    Today (2009-03-18) I ran a delta init with data transfer on the BW side and got the full sales order list in PSA; I can also monitor its delta queue via RSA7, which shows the current status as 2009-03-17.
    Then I created a new sales order on the R/3 side and executed the first delta upload on the BW side, but no delta was captured in PSA; I repeated the delta and still got nothing.
    My question is why I cannot get the delta when I create a new sales order.
    Many thanks.
    Best regards,
    Steve

    Hi,
    Is your source system date different? Does that lead to a mismatch of delta records? Check your timestamp/system date again. You may also need a functional consultant to update the sales order.
    If the delta field is a numeric pointer, i.e. a generated record number like in the GLPCA table, then use the lower limit. Use a count of 10-100 and leave the upper limit blank. If the value 10 is used, the last 10 records will be loaded again. If a record is created while a load is running, those records may get lost; to prevent this situation, the lower limit can be used to back up the starting sequence number. This may result in some records being processed more than once; therefore, be sure this DataSource only feeds an ODS object.
    Hope it helps and is clear.
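    The lower-limit behaviour described above can be sketched as follows (a Python illustration under the stated assumption that the delta field is a generated record number; all values are invented):

```python
# Sketch of a numeric-pointer delta with a lower limit (safety interval).
# With lower limit N, each delta run re-reads the last N record numbers, so
# a record written while the previous load ran is not lost -- at the price
# of extracting some records twice, which is why the target should be an
# ODS/DSO that overwrites on key rather than an additive cube.

def delta_with_lower_limit(records, pointer, lower_limit):
    """records: generated record numbers already present in the table."""
    start = pointer - lower_limit  # back up the starting sequence number
    extracted = [r for r in records if r > start]
    new_pointer = max(records, default=pointer)
    return extracted, new_pointer

records = list(range(1, 101))   # record numbers 1..100
pointer = 100                   # everything up to 100 already loaded

# records 101..105 arrive; with lower limit 10 the delta re-reads 91..100
records += list(range(101, 106))
extracted, pointer = delta_with_lower_limit(records, pointer, 10)
print(len(extracted))           # 15: 5 genuinely new + 10 re-extracted
```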

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    Extraction, Flat File, Generic Data Source, Delta Initialization
    I have a couple of questions regarding data extraction.
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it two ways:
    Full Update -> Initialize Delta Process (without Data Transfer) -> Delta Update, or
    Initialize Delta Process (with Data Transfer) -> Delta Update
    Am I right? What is the most common method and why?
    5. Suppose you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue. You add the field in RSA6, then provide the info for ABAP to populate the new field (name of DataSource, extract structure, field added, name of the application table which contains the field). How does it work now, given that there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    THANKS

    Hi,
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, hence you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    When we don't find any standard extractor, we can go for a generic one (e.g. if I want sales information along with finance information in one DataSource, we generally don't get a standard one, hence we go for a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Timestamp (if the table has a time stamp field, it marks the last changed record in that table, hence it is easy to get the delta based on the time stamp)
    Calday (if the table doesn't have a timestamp field, look for a calendar-day field, so the delta is based on the date when documents were changed)
    Numeric pointer (if the table has neither of the above, we go for this option, where the delta is based on a continuously increasing numeric value)
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic DataSource is nothing but extracting data directly from the database table, without any interface between the applications/systems.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it two ways:
    Full Update -> Initialize Delta Process (without Data Transfer) -> Delta Update, or
    Initialize Delta Process (with Data Transfer) -> Delta Update
    Am I right? What is the most common method and why?
    Correct.
    5. Suppose you want to add a field to a DataSource after using it for 6 months, without re-initializing the delta queue. You add the field in RSA6, then provide the info for ABAP to populate the new field. How does it work now, given that there is no setup table? How does the delta queue know that it is going to receive data which has been expanded by one field?
    Once you add the new field to the structure (DS), you will get data for it from that date onwards, not historical data, so the setup table is not involved (delta records come from the delta queue, not from the setup table).
    If you want historical data for the new field, then you need setup table deletion and refill, etc.
    Hope it is clear.
    Regards,
    Satya

  • BW GENERIC EXTRACTION DELTA PROBLEM

    I have a problem extracting delta from R/3 to BW.
    Init data is loaded from R/3 to the BW system for the ZST_ORDER datasource.
    A repair full request with a selection parameter was created for August 1st 2007 to August 15th 2007.
    During the delta extraction, around 500,000 records are extracted from R/3 to BW.
    On the R/3 side there are only 5 entries available in the AUFK table;
    it should extract a delta of fewer than 35 entries.
    ZST_ORDER is a generic datasource. The delta field is DATUM; new status-changed records have value 2.
    Thanks in advance for your help. The code looks like:
    FUNCTION ZST_ORDER.
    ""Local Interface:
    *" IMPORTING
    *" REFERENCE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
    *" REFERENCE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *" REFERENCE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE DEFAULT 1000
    *" REFERENCE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *" REFERENCE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *" TABLES
    *" I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *" I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *" E_T_DATA STRUCTURE ZST_ORDEROPTIONAL
    *" EXCEPTIONS
    *" NO_MORE_DATA
    *" ERROR_PASSED_TO_MESS_HANDLER
    TABLES: AUFK, "Order master data
    TJ02T, "System status texts
    TJ30T, "Texts for user status
    JSTO. "Status object information
    * Data declaration
    DATA: L_DATE TYPE DATS,
    L_STATUS TYPE J_STATUS,
    L_LINES TYPE SY-TABIX,
    L_CHANGED(1) TYPE C.
    * Auxiliary selection criteria structure
    DATA: L_S_SELECT TYPE SRSC_S_SELECT.
    * Other data objects
    * Service order data
    DATA: BEGIN OF LT_AUFK OCCURS 0,
    AUFNR LIKE AUFK-AUFNR,
    AUART LIKE AUFK-AUART,
    ERDAT LIKE AUFK-ERDAT,
    AEDAT LIKE AUFK-AEDAT,
    STDAT LIKE AUFK-STDAT,
    AEZEIT LIKE AUFK-AEZEIT,
    ERFZEIT LIKE AUFK-ERFZEIT,
    IDAT1 LIKE AUFK-IDAT1,
    IDAT2 LIKE AUFK-IDAT2,
    IDAT3 LIKE AUFK-IDAT3,
    LOEKZ LIKE AUFK-LOEKZ,
    OBJNR LIKE AUFK-OBJNR,
    END OF LT_AUFK.
    * Individual object status
    DATA: BEGIN OF LT_JEST OCCURS 0,
    OBJNR LIKE JEST-OBJNR,
    STAT LIKE JEST-STAT,
    INACT LIKE JEST-INACT,
    CHGNR LIKE JEST-CHGNR,
    END OF LT_JEST.
    * Change documents for system/user statuses (table JEST)
    DATA: BEGIN OF LT_JCDS OCCURS 0,
    OBJNR LIKE JCDS-OBJNR,
    STAT LIKE JCDS-STAT,
    CHGNR LIKE JCDS-CHGNR,
    USNAM LIKE JCDS-USNAM,
    UDATE LIKE JCDS-UDATE,
    UTIME LIKE JCDS-UTIME,
    INACT LIKE JCDS-INACT,
    CHIND LIKE JCDS-CHIND,
    END OF LT_JCDS.
    DATA: BEGIN OF LT_JSTO OCCURS 0,
    OBJNR LIKE JSTO-OBJNR,
    STSMA LIKE JSTO-STSMA,
    END OF LT_JSTO.
    * Static fields
    STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * counter
    S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * cursor
    S_CURSOR TYPE CURSOR.
    * User-defined ranges
    RANGES: L_R_AUFNR FOR AUFK-AUFNR,
    L_R_ERDAT FOR AUFK-ERDAT.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
    IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Check DataSource validity
    CASE I_DSOURCE.
    WHEN 'ZST_ORDER'.
    WHEN OTHERS.
    RAISE ERROR_PASSED_TO_MESS_HANDLER.
    ENDCASE.
    * Copy selection criteria for future extractor calls (fetch calls)
    APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
    * Fill parameter buffer for data extraction calls
    S_S_IF-REQUNR = I_REQUNR.
    S_S_IF-DSOURCE = I_DSOURCE.
    S_S_IF-MAXSIZE = I_MAXSIZE.
    S_S_IF-INITFLAG = I_INITFLAG.
    APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
    ELSE.
    * First data package -> OPEN CURSOR
    IF S_COUNTER_DATAPAKID = 0.
    LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
    WHERE FIELDNM = 'AUFNR'.
    MOVE-CORRESPONDING L_S_SELECT TO L_R_AUFNR.
    APPEND L_R_AUFNR.
    ENDLOOP.
    LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
    WHERE FIELDNM = 'ERDAT'.
    MOVE-CORRESPONDING L_S_SELECT TO L_R_ERDAT.
    APPEND L_R_ERDAT.
    ENDLOOP.
    LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT
    WHERE FIELDNM = 'DATUM'.
    L_DATE = L_S_SELECT-LOW.
    ENDLOOP.
    OPEN CURSOR WITH HOLD S_CURSOR FOR
    SELECT AUFNR AUART ERDAT
    AEDAT OBJNR AEZEIT
    STDAT ERFZEIT IDAT1
    IDAT2 IDAT3 LOEKZ
    FROM AUFK
    WHERE AUFNR IN L_R_AUFNR AND
    ERDAT IN L_R_ERDAT.
    ENDIF.
    * Fetch records into interface table LT_AUFK
    FETCH NEXT CURSOR S_CURSOR
    APPENDING CORRESPONDING FIELDS OF TABLE LT_AUFK
    PACKAGE SIZE S_S_IF-MAXSIZE.
    IF SY-SUBRC <> 0.
    CLOSE CURSOR S_CURSOR.
    RAISE NO_MORE_DATA.
    ENDIF.
    * Determine the number of lines in table LT_AUFK
    L_LINES = LINES( LT_AUFK ).
    IF L_LINES IS NOT INITIAL.
    * Sort the internal table LT_AUFK
    SORT LT_AUFK BY OBJNR ASCENDING.
    * Select the records from JCDS for the entries in LT_AUFK
    SELECT OBJNR STAT CHGNR USNAM
    UDATE UTIME INACT CHIND
    INTO TABLE LT_JCDS
    FROM JCDS
    FOR ALL ENTRIES IN LT_AUFK
    WHERE OBJNR EQ LT_AUFK-OBJNR
    AND ( CHGNR EQ '001'
    OR UDATE >= L_DATE ).
    * Sort the internal table LT_JCDS
    SORT LT_JCDS BY OBJNR STAT CHGNR ASCENDING.
    * Select records from table JEST for the entries in LT_AUFK
    SELECT OBJNR STAT INACT CHGNR
    INTO TABLE LT_JEST
    FROM JEST
    FOR ALL ENTRIES IN LT_AUFK
    WHERE OBJNR = LT_AUFK-OBJNR.
    SELECT OBJNR STSMA
    INTO TABLE LT_JSTO
    FROM JSTO
    FOR ALL ENTRIES IN LT_AUFK
    WHERE OBJNR = LT_AUFK-OBJNR.
    SORT LT_JSTO BY OBJNR.
    ENDIF.
    LOOP AT LT_JEST.
    CLEAR: LT_AUFK,
    l_changed.
    CLEAR L_CHANGED.
    CLEAR LT_JSTO.
    READ TABLE LT_JSTO WITH KEY OBJNR = LT_JEST-OBJNR BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    E_T_DATA-STSMA = LT_JSTO-STSMA.
    ENDIF.
    * End
    * Read the data from LT_AUFK
    READ TABLE LT_AUFK WITH KEY OBJNR = LT_JEST-OBJNR BINARY SEARCH.
    E_T_DATA-AUFNR = LT_AUFK-AUFNR.
    E_T_DATA-AUART = LT_AUFK-AUART.
    E_T_DATA-ERDAT = LT_AUFK-ERDAT.
    E_T_DATA-AEDAT = LT_AUFK-AEDAT.
    E_T_DATA-AEZEIT = LT_AUFK-AEZEIT.
    E_T_DATA-ERFZEIT = LT_AUFK-ERFZEIT.
    E_T_DATA-IDAT1 = LT_AUFK-IDAT1.
    E_T_DATA-IDAT2 = LT_AUFK-IDAT2.
    E_T_DATA-IDAT3 = LT_AUFK-IDAT3.
    E_T_DATA-LOEKZ = LT_AUFK-LOEKZ.
    E_T_DATA-INACT = LT_JEST-INACT.
    E_T_DATA-CHGNR = LT_JCDS-CHGNR.
    e_t_data-datum = lt_aufk-erdat.
    READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
    STAT = LT_JEST-STAT
    CHGNR = '001'
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    E_T_DATA-UDATE = LT_JCDS-UDATE.
    E_T_DATA-AEDAT = LT_JCDS-UDATE.
    E_T_DATA-UTIME = LT_JCDS-UTIME.
    E_T_DATA-AEZEIT = LT_JCDS-UTIME.
    E_T_DATA-CHIND = LT_JCDS-CHIND.
    E_T_DATA-USNAM = LT_JCDS-USNAM.
    e_t_data-chgnr = lt_jcds-chgnr.
    IF LT_JCDS-UDATE GE L_DATE
    AND L_DATE IS NOT INITIAL.
    L_CHANGED = 'X'.
    ENDIF.
    IF LT_JEST-CHGNR NE '001'.
    CLEAR LT_JCDS.
    READ TABLE LT_JCDS WITH KEY OBJNR = LT_JEST-OBJNR
    STAT = LT_JEST-STAT
    CHGNR = LT_JEST-CHGNR
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    L_CHANGED = 'X'.
    E_T_DATA-AEDAT = LT_JCDS-UDATE.
    E_T_DATA-AEZEIT = LT_JCDS-UTIME.
    E_T_DATA-CHIND = LT_JCDS-CHIND.
    E_T_DATA-USNAM = LT_JCDS-USNAM.
    e_t_data-chgnr = lt_jcds-chgnr.
    ENDIF.
    ENDIF.
    IF LT_JEST-STAT(1) EQ 'I'.
    E_T_DATA-ISTAT = LT_JEST-STAT.
    ELSEIF LT_JEST-STAT(1) EQ 'E'.
    E_T_DATA-ESTAT = LT_JEST-STAT.
    ENDIF.
    IF L_CHANGED EQ 'X'
    AND L_DATE IS NOT INITIAL.
    APPEND E_T_DATA.
    ELSEIF L_DATE IS INITIAL.
    APPEND E_T_DATA.
    ENDIF.
    ENDIF.
    CLEAR: LT_AUFK,
    E_T_DATA.
    ENDLOOP.
    CLEAR LT_JEST.
    REFRESH LT_JEST.
    * next data package
    S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
    ENDIF. "Initialization mode or data extraction ?
    ENDFUNCTION.

    Hi,
    For that quantity, SALK3 is referring to a different table in the reference field of your table fields. Check that field and refer to the correct table for it in the table view.
    The table join has a problem there; I think you joined some other table on that field. Replace the field assignment or create a new view.
    Hope this helps.
    Regards,
    Harikrishna N

  • Unserialized delta

    Hi all,
    I read a weblog by Mr Roberto regarding the delta types, in particular the V3 unserialized delta.
    What does document sequence mean? It is stated there that the V3 unserialized delta doesn't consider sequence, and that for inventory the sequence is not important.
    Can anyone please explain what sequence is, and why sequence is not important in MM but is in SD or PP? It would be great if someone explained it with an example.
    Regards,
    Annie

    The following text is copied from one of the forums on SDN; the answer was given by Ajay Das. Sorry, I don't have the link to that thread, but hopefully this will throw some more light on the V3 update. Try to search for the thread; it will definitely help you understand the delta process (thanks to Ajay Das for his excellent post).
    The requirement for 'delta' capturing on the R/3 side is to be able to capture the delta 'exactly once, in order'.
    Take the example of the same PO item changing many times in quick succession.
    V1 (with its enqueue mechanism) ensures that the OLTP tables are updated consistently. The update table gets these update records, which may or may not end up in the correct sequence (as there is no locking) by the time they reach BW. 'Serialized V3' was meant to ensure the correct sequence of update records going from the update tables to the delta queue (and then to BW).
    Since update table records carry a timestamp, the V3 job can sequence these records correctly when it runs and thus achieve 'serialization'. However, there is a technical problem with this: the timestamp recorded in an update record is set by the application server where the user executed the transaction, and if there are multiple application servers there may be inconsistencies between their system times, which can cause incorrect serialization.
    Another problem lies in the fundamental design of the V3 process. The V3 job sequences the updates by timestamp and then processes the update records from the update table (to send them to the delta queue), but it does so for one language at a time (the update record also stores the user's logon language). Why this is done is not clear to me, but it is a basic design feature and cannot be subverted.
    This causes a potential issue if users log on in multiple languages: serialization may not happen correctly in such a case. Take a case where the first update (based on the earliest timestamp) to be processed is in language EN (for the same PO item). The V3 job will then process all the update records of language EN in chronological sequence before moving to the next language's records. If an update in another language (for the same PO item) happened between two EN-language updates, it will be processed later, after all EN updates, and thus arrive out of sequence. The weblog mentions this scenario.
    These two constraints remained for 'serialized V3', where 'serialization' could not be truly achieved. Hence newer plug-ins have discarded 'serialized V3' altogether, and you no longer have this option (if you are using a newer PI).
    If you use 'serialized V3', you have to be clear that the 'serialization' may not always work in the two scenarios above: a multi-language environment, and multiple application servers or updates to the same records within the same second (as the timestamp has granularity only up to the second).
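    The language problem described above can be made concrete with a small simulation (Python purely for illustration; the update records, timestamps and languages are invented):

```python
# Simulation of the serialized-V3 language problem: the V3 job sorts update
# records by timestamp but processes one logon language at a time, so an
# update made in another language between two EN updates is emitted after
# all EN updates, i.e. out of chronological sequence.

updates = [  # the same PO item changed three times in quick succession
    {"ts": 1, "lang": "EN", "change": "qty 10"},
    {"ts": 2, "lang": "DE", "change": "qty 12"},
    {"ts": 3, "lang": "EN", "change": "qty 15"},
]

def serialized_v3(update_table):
    # take languages in order of first occurrence, one language at a time
    langs = []
    for u in update_table:
        if u["lang"] not in langs:
            langs.append(u["lang"])
    out = []
    for lang in langs:
        out += sorted((u for u in update_table if u["lang"] == lang),
                      key=lambda u: u["ts"])
    return out

order = [u["ts"] for u in serialized_v3(updates)]
print(order)   # [1, 3, 2] instead of the true chronological sequence [1, 2, 3]
```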

  • Delta update for InfoCube: how does it work, and which mode should be selected?

    Hi all,
    How does delta update work for an InfoCube? The generic datasource delta update options are:
    1) time stamp
    2) calday
    3) numeric pointer
    Please explain briefly the delta update for an InfoCube, as well as delta init and delta update.
    Thanks,
    Gowda

    Hi,
    How delta update works for an InfoCube:
    When you update data to the cube from an ODS, the data is extracted from the change log, and the delta pointer is set on the source request (in the source you can check the option beside the request that says 'updated to target object'; it means the records have been updated to the target, so the same request is not updated again).
    In the case of PSA to cube, the PSA request likewise gets stamped as last updated.
    Regarding generic DataSources:
    When we don't find any standard extractor, we can go for a generic one (e.g. if I want sales information along with finance information in one data source, we generally don't get a standard one, hence we go for a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Timestamp (if the table has a time stamp field, it marks the last changed record in that table, hence it is easy to get the delta based on the time stamp)
    Calday (if the table doesn't have a timestamp field, look for a calendar-day field, so the delta is based on the date when documents were changed)
    Numeric pointer (if the table has neither of the above, we go for this option, where the delta is based on a continuously increasing numeric value)
    Re: Extraction, Flat File, Generic Data Source, Delta Initialization
    Regards,
    Satya

  • Regarding Delta Record

    Hi,
    Is a delta record captured for any field change (including key figures and characteristics) of a particular sales/delivery/billing document, or only on changes to specific fields of the document?
    If delta is captured only for specific field changes, where (in which tables) are these fields maintained, and how can I find this out?
    Points assured.
    Thanks,
    Debasish

    I meant a ZFIELD added to VBRP (billing document items), for instance. If you use a standard SD exit to populate this field, the change will be captured by the delta mechanism just before the system commits the transaction (this particular billing document change).
    Adding a ZFIELD to your extract structure will not suffice. Suppose you add a ZFIELD to your billing items extract structure, and assume your enhancement populates this field with a simple lookup to VBRK (the header of the billing document).
    In a few words, you want to capture a change in your billing document header and extract it with your billing item DataSource.
    This won't work: if a user changes the header of a billing document, only 2LIS_13_VDHDR will capture a delta. 2LIS_13_VDITM won't get any delta record, since no item has been changed. Note that enabling this feature would considerably complicate the extraction, in addition to bringing many more records; indeed, a before-image reversal and an after-image of the record (before and after the commit) would have to be extracted in order to avoid double records in BW.
    This kind of thing is better done in BW itself, by staging your data with DSOs or even through modelling.
    Hope this sheds some light.
    Olivier.

  • Using Bulk Scenario to Capture Data from Source for First time

    Hi,
    I need to set up ODI for a table between two databases. Generally, I follow the practice of importing an Oracle dump of the table into the target database for the first time, before the delta capture process starts. However, this time there is a space constraint on the source system and we cannot create the Oracle dump, as it would amount to 120-130 GB.
    Is it possible to import this one-time data into the target database via an ODI bulk scenario? Does ODI have any timeouts set on the database connection, as this will require a long-running session to copy across such a huge amount of data? Will it cause any timeouts at the database?
    Please suggest.
    Thanks.

    Hi All,
    Thanks a lot for your responses. Moving ahead and trying out different things, I figured out that EPMA/Planning had to be on the same machine as ERPI. After installing EPMA/Planning on the same machine, the ERPI target application registration page started displaying all the applications created using EPMA, and I also successfully registered my target Planning application with ERPI. But right after the resolution of this problem, another one appeared. While registering the target Planning application, EPMA was selected by default as the metadata load method, and I selected FDM as the data load method.
    After registering the application, I started adding the metadata rule as instructed in knowledge document 951369.1 (thanks user735469 for the reference), but when I clicked on the Add Dimension button, the next page gave an error stating 'There are no more dimensions remaining to be mapped under this application', and the dimensions drop-down under the target application area was empty.
    According to this link http://download.oracle.com/docs/cd/E12825_01/epm.111/readme/fdm_11113_readme.html , this error can be fixed by clicking the refresh button, but I couldn't find a refresh button anywhere on the page. I tried refreshing the browser, but in vain. I also downloaded and applied the latest ERPI patch, but still no positive results.
    Any idea why this is happening?
    Desperately waiting for your help.
    Thanks & Regards,
    Muhammad Jamshaid Nawaz

  • Infoset functionality in BI-HR

    Hi all,
    I have worked with InfoSets, but the functionality of an InfoSet with time-dependent objects is different. Can anyone share their experience with InfoSet scenarios in HR module implementations?
    0HR_PA_0 and 0HR_PA_1 don't support delta, so how can deltas be captured for these datasources? Can anyone suggest an approach to achieve a delta capture mechanism in a BI-HR implementation?
    What is the functionality of the key date in HR-BI? I know that the key date applies to time-dependent master data objects; I need some more input on this concept.
    Regards,
    rvc

    Since most HR reports require EMPLOYEE, HRPOSITION, JOB and ORGUNIT information, you need to create InfoSets combining a couple of these master data objects with the cube/DSO.
    • Inner join: a record can only be in the selected result set if there are entries in both joined tables.
    • Left outer join: if there is no corresponding record in the right table, the record is still part of the result set (fields belonging to the right table have initial values).
    • Temporal join: a join is called temporal if at least one member is time-dependent.
    If there are InfoSets with time-dependent master data, do not restrict the data by the fields Valid from (0DATEFROM) and Valid to (0DATETO). See also the example in the Data Models using InfoSets section of this document.
    When you implement Time Management and CATS, you need to set up time types, quota types, etc. in the source system. For PA and OM I do not think you need any configuration.
    Most HR datasources (except Time Management and CATS) don't support delta, so a daily load process chain would delete the complete data and do a full load.
    The key date contains the date for which time-dependent master data is selected. You determine the key date either in the query definition (query properties) or supply the value using a variable. The default value for the key date is the date on which the query is executed, i.e. <today>. If you select 01.01.1999, for example, the time-dependent data valid on 01.01.1999 is read.
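    The key-date selection can be sketched like this (Python for illustration; the validity intervals and org unit values are invented):

```python
from datetime import date

# Time-dependent master data: each record carries a validity interval
# (0DATEFROM / 0DATETO in BW terms). The key date selects the record
# that is valid on exactly that date.

employee_org = [
    {"orgunit": "SALES",   "date_from": date(1998, 1, 1),
     "date_to": date(1998, 12, 31)},
    {"orgunit": "FINANCE", "date_from": date(1999, 1, 1),
     "date_to": date(9999, 12, 31)},
]

def valid_on(records, key_date):
    """Return the records whose validity interval covers the key date."""
    return [r for r in records
            if r["date_from"] <= key_date <= r["date_to"]]

print(valid_on(employee_org, date(1999, 1, 1))[0]["orgunit"])   # FINANCE
print(valid_on(employee_org, date(1998, 6, 1))[0]["orgunit"])   # SALES
```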

  • Differences between the types of DSO's and DTP's

    Hi friends,
    1. Please explain the different types of DSOs and their differences.
    2. Please explain the different types of DTPs and their differences.
    Thanks in advance.
    Jose Reddy

    Hi,
    Please check:
    1. The different types of DSOs and their differences:
    There are three types of DSO in BI 7.0.
    Standard DSO
    It consists of three tables: activation queue, active table, and change log table. It is completely integrated into the staging process. The change log table means that all changes are also written there and are available as delta uploads for connected data targets.
    Write-optimized DSO
    It is intended for the warehouse level of the architecture and has the advantage of quicker loads. It has only an active data table (with delta capture capability); SIDs are not generated.
    Check this:
    http://www.biportal.org/Default.aspx?pageId=90410&mode=PostView&bmi=17541
    Direct update DSO
    It consists of the active data table only, which means it is not easily integrated into the staging process. SIDs are not generated. This DSO type is filled using APIs and can be read via BAPIs. It is generally used for external applications and the APD.
    Also check this:
    http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/content.htm
    2. The different types of DTPs and their differences:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/10339304-7ce5-2c10-f692-fbcd25915c36
    -Vikram
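    The change-log behaviour of the standard DSO, which is what makes it delta-capable for connected targets, can be sketched like this (Python for illustration; the record structure and values are invented):

```python
# Sketch of a standard DSO: activation overwrites the active table per key
# and writes before/after images to the change log; connected targets read
# the change log as their delta.

active, change_log = {}, []

def activate(key, value):
    if key in active:
        # before image reverses the old value; after image carries the new one
        change_log.append({"key": key, "recordmode": "X", "value": -active[key]})
        change_log.append({"key": key, "recordmode": "",  "value": value})
    else:
        change_log.append({"key": key, "recordmode": "N", "value": value})
    active[key] = value

activate("doc1", 100)   # new record -> one 'N' image
activate("doc1", 120)   # change -> before image -100 and after image +120

print(active["doc1"])                          # 120 in the active table
print([r["recordmode"] for r in change_log])   # ['N', 'X', '']
print(sum(r["value"] for r in change_log))     # 120: additive targets stay consistent
```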

  • Local currency showing higher value in BW compared to ERP - Sales report

    Hi,
    My sales report is showing a higher value in BW compared to ERP; e.g. the report shows 567,01 while ERP shows 566,94.
    Is there a way I can resolve this in the query?
    Thanks

    Hi Bhat,
    There may be many factors, such as:
    1) The comparison between the two, i.e. ERP and BW: are the factors the same?
    2) Have the deltas been captured properly?
    3) Have you missed any deltas on any day?
    4) Has any order been modified?
    5) I hope you have considered the one-day lag between BW and ERP.
    6) What is your 0RECORDMODE mapped to?
    7) Is it because of rounding off of values?
    Rgds
    SVU

  • Purchase order datasource ?

    Hello,
    Could you please let me know which datasource will support me in capturing purchase order changes?
    Do I have to enhance any datasource for it? Let me know if the AEDAT field needs to be added to any datasource, and how I can find this field in a table related to purchase orders.
    Thanks in advance.
    Regards
    There are several standard content DataSources for purchasing that you can activate and capture deltas with, without having to create a generic delta. They are all LO extractors, so the setup and delta capture are a little different. Here are the DataSources available:
    2LIS_02_HDR - Purchase Order Header
    2LIS_02_ITM - Purchase Order Item
    2LIS_02_SCL - Purchase Order Schedule Line
    2LIS_02_ACC - Purchase Order Account Assignment (available starting with ECC6)
    The activation, setup and scheduling of delta jobs can be done in transaction LBWE on your source system.

  • Validating data/Test  for dso

    Hello guys,
    I am writing a test document for a DSO and need to include test conditions and validation procedures.
    So what could I do to prove my DSO is working fine?
    1. Is it a valid test to run the extraction and check that the number of records in the DSO/PSA equals the source table, RSA3 or the extractor?
    2. And how do you verify a delta load? Would that be checking that the number of records loaded during the delta equals the number of new records in the source, or what else can I do to verify a delta load?
    I need this soon, so please help if you can; I will assign points right away.
    Thanks

    Hello Mark,
    1. You can check the number of records for the given selection in RSA3 in the source system.
    2. It's not easy to identify the new records, as delta captures both new and modified records, but if you still want to find them you can check the DSO change log table for the New record flag.
    Hope it helps.
    Thanks,
    Chandran

  • 2lis_03_um setup tables issue

    Hi experts,
    We have taken downtime to fill the setup tables for inventory:
    MCNB - 2LIS_03_BX
    OLI1BW - 2LIS_03_BF
    OLIZBW - 2LIS_03_UM
    We currently have the company codes 1100, 1200, 1300, 2000 and 2800 in ECC, but more company codes will be coming up in the next 2 months.
    My doubt is: if we fill the setup tables for the above company codes for material revaluation (OLIZBW), do we need to take downtime again to fill them for the upcoming company codes?
    Or, if we know the upcoming company codes, can we fill the setup tables for those company codes as well, even though there is no data yet? Once the new company codes go live, the deltas would then come into BW, instead of us taking downtime again to fill the setup tables for the new company codes.
    Please clarify these issues.
    Regards,
    venuscm

    Filling the setup tables won't be required for the upcoming company codes.
    Since delta captures new and changed records, the new records for the upcoming company codes will be captured in the delta loads; hence there is no need to fill the setup tables again for them.
    The only thing you have to do is either give no selection for company code during initialization, or include the upcoming company codes in the selection, if required, during initialization.
    --- Thanks...
