Datamart - Generate Datasource DELTA flag check

We are on BW 3.5 (source system) and are trying to send data across to a target system (SCM / BI 7.0).
When I perform "Generate Export DataSource" for a cube, the resulting DataSource "8*" is generated correctly, but with the delta flag checked (transaction RSA6) for that DataSource.
With this flag set, we are forced to run an init and then deltas from the target system to extract the data.
Questions:
1) How do we "uncheck" the delta flag setting? Is it possible?
2) If we cannot uncheck the flag, how do we run full loads from our target system?
Thanks in advance.
Hitesh

Hi,
Even if the DataSource is delta-enabled, that only means you can run delta loads for it; it does not mean you cannot run full loads.
To uncheck the delta flag for a DataSource, you would have to disable the corresponding delta field in table RODELTAM for that particular DataSource. This can be done with a custom ABAP program.
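Before touching anything, it can help to look at what the generated DataSource's delta setting actually is. The report below is only a read-only sketch: the table and field names used here (ROOSOURCE with OLTPSOURCE, OBJVERS and DELTA) are my assumption for the source-system DataSource header, and any program that actually changes such settings (e.g. via RODELTAM as suggested above) should be checked against SAP guidance first.
REPORT z_show_ds_delta.

" Assumption: the DataSource header lives in table ROOSOURCE with the fields
" OLTPSOURCE (DataSource name), OBJVERS (version) and DELTA (delta process).
PARAMETERS p_ds TYPE roosource-oltpsource OBLIGATORY.

DATA ls_src TYPE roosource.

" Read the active version of the DataSource header in the source system
SELECT SINGLE * FROM roosource
  INTO ls_src
  WHERE oltpsource = p_ds
    AND objvers    = 'A'.

IF sy-subrc = 0.
  WRITE: / 'DataSource   :', ls_src-oltpsource,
         / 'Delta process:', ls_src-delta.
ELSE.
  WRITE: / 'DataSource not found:', p_ds.
ENDIF.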
Hope it helps !
Regards,
Rose.

Similar Messages

  • RSA3 check generated datasource error, ODS data can't be updated into cube

    Hi Experts,
    I created an ODS (T_ODS01) and loaded data from a flat file into it. Of course, I activated the ODS after the load.
    Then I created an InfoCube (T_CUBE01) similar to the ODS.
    Next I want to update the data from ODS T_ODS01 into T_CUBE01.
    Following are my steps:
    1. Create update rules for the InfoCube T_CUBE01, entering the ODS T_ODS01 as the data source.
    2. Activate the update rules.
    3. Select 'Update ODS data in data target' in the context menu of ODS T_ODS01.
    I checked the status of this process in the monitor, but it failed and informed me that 'No IDoc generated in BW', with 0 records sent.
    Then I wanted to check the generated DataSource 8T_ODS01 in RSA3, but when I executed the check it reported a runtime error:
    Syntax error in program "SAPLXRSA ".
    What happened?
    ------------
    The following syntax error occurred in the program SAPLXRSA : "The type "RSR_T_RANGESID" is unknown." Error in ABAP application program.
    The current ABAP program "SAPLRSAP" had to be terminated because one of the statements could not be executed.
    This is probably due to an error in the ABAP program.
    Error analysis
    The following syntax error was found in the program SAPLXRSA : "The type "RSR_T_RANGESID" is unknown."
    Information on where termination occurred
    The termination occurred in the ABAP program "SAPLRSAP" in  "CALL_DATA_CUSTOMER_FUNCTION". The main program was "RSFHGEN2 ".
    The termination occurred in line 1170 of the source code of the (Include) program "LRSAPF06" (when calling the editor 1170).
    Source code extract
    001140   * Call Customer-Exit
    001150     message s299 with 'BEGIN EXIT' sy-uzeit.
    001160     clear sy-subrc.
    >     call customer-function '001'
    001180          exporting
    001190               i_datasource             = l_datasource
    001200               i_isource                = l_12b_source
    001210               i_updmode                = p_updmode
    001220          tables
    001230               i_t_select               = p_t_select
    001240               i_t_fields               = p_t_fields
    001250               c_t_data                 = p_t_data
    001260               c_t_messages             = l_t_messages
    001270          exceptions
    001280               rsap_customer_exit_error = 1
    001290               others                   = 2.
    001300     message s299 with 'END EXIT' sy-uzeit.
    Would you please guide me how to correct this error?
    Many thanks for your help.

    Hi Kenneth,
    this user exit is called every time you request transactional data, as you said, system-wide.
    In the exit you usually have a CASE on the DataSource name (CASE I_SOURCE ... WHEN ...) where you code your enhancement for a particular DataSource.
    Needless to say, if you enable the exit, even for testing, you have to put something in it; at a minimum:
    CASE I_SOURCE.
       WHEN 'your_data_source'.
    ENDCASE.
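    For completeness, a bare-bones skeleton of such an exit include might look as follows. The include name ZXRSAU01 is my assumption for the transaction-data part of the exit, and the parameter name i_datasource is taken from the CALL CUSTOMER-FUNCTION '001' interface shown in the dump above; treat this as a sketch, not the exact code of your system.
    " Include ZXRSAU01 (assumed name) - customer exit for transaction data
    CASE i_datasource.
      WHEN '8T_ODS01'.
        " Enhancement logic for this DataSource only, e.g. filling appended
        " fields of c_t_data in a LOOP ... ASSIGNING ... ENDLOOP.
      WHEN OTHERS.
        " Leave all other DataSources untouched.
    ENDCASE.
    Keeping the logic inside a WHEN branch for the specific DataSource is what prevents the exit from disturbing every other extraction in the system.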
    hope that sheds light
    Olivier

  • DataMarts, Generated Export DataSources, Connecting DS to Infosources

    I am trying to build a data mart architecture in which an ODS is connected, via its export DataSource, to the InfoSource of a reporting cube. After generating the export DataSource, I replicated it. Then I tried to connect the InfoSource with the DataSource, but the system did not display any generated DataSources (starting with 8*) when I pressed the "select DS" button in the InfoSource.
    I checked the DataSource with RSA6 and it exists in the BW Myself system. I also tried "insert lost nodes" in the InfoSource and "display generated objects" in RSA1.
    Does anyone have an idea what else the problem could be?
    Thanks for your answer.
    Cheers

    hello heiko,
    There is a setting that enables you to view generated objects. Go to RSA1->Settings Menu->Display generated objects->Show generated objects.

  • Unable to activate Transfer Structure for Generated Datasource in BW3.5

    Hi Experts,
    We have a scenario where data needs to be moved from DSO A to Cube B.
    We clicked Generate Export DataSource and created 8(ODS A).
    When we created an InfoPackage for it, it offered all fields in the selection screen, but we required only 3 characteristics.
    So we maintained the export DataSource and saved it. When we then tried to generate the DataSource again, it went to a short dump with the following messages:
    Short text of error message:
    Serious internal error:
    The transfer structure got deactivated and we are unable to activate it again, even with RS_TRANSTRU_ACTIVATE_ALL.
    Technical information about the message:
      Diagnosis
         A serious internal error occurred. It could not be corrected.
      Procedure
         The following information is available on this error:
         1.
         2.
         3.
         4. OSS note
            Check the OSS for corresponding notes and create a new problem message if necessary.
      Message class....... "RSAR"
      Number.............. 001
      Variable 1.......... " "
      Variable 2.......... " "
    Source Code Extract
    Line  SourceCde
      514         IF g_subrc NE 0.
      515           MESSAGE x001.
      516         ENDIF.
      517 *       Check new version neccessary
      518         l_curr_version = c_s_is_admin-odsversion.
      519
      520         IF l_new_version EQ rs_c_true.
      521           IF i_without_versioning EQ rs_c_true.
      522             MESSAGE e023 WITH c_s_is_admin-odsname_tech
      523                               c_s_is_admin-isource
      524                     RAISING no_psa_version_allowed.
      525 *         ELSEIF i_without_versioning EQ 'C' AND
      526 *                i_without_transport  EQ rs_c_true.
      527 *--         special handling for migration.
      528 *-          current system is target system no transport request
      529 *-          necessary if a new version is needed
      530           ELSEIF i_without_versioning EQ 'C'        AND
      531                  i_without_transport  EQ rs_c_false.
      532             RAISE inconsistency.
      533           ENDIF.
      534
      535           IF i_objtype = rsa_c_istype-data.
      536 *           check type of InfoSource first
      537             SELECT SINGLE issrctype FROM rsis
      538                            INTO l_isrctype
      539                           WHERE isource = c_s_is_admin-isource
      540                             AND objvers = rs_c_objvers-active.
      541             IF sy-subrc <> 0 OR l_isrctype = rsarc_c_issrctype-ods.
      542 *             if generated ODS InfoSource,
      543 *             no PSA versioning possible
    >>>>                MESSAGE x001.
      545             ENDIF.
      546           ENDIF.
      547           l_next_version = l_curr_version + 1.
      548           c_s_is_admin-odsversion = l_next_version.
      549           l_transtru_odsname+13(3) = l_next_version.
      550 *         Don't use old progname but create new one
      551           CLEAR l_progname_ods.
      552         ELSE.
      553           l_next_version = l_curr_version.
      554         ENDIF.
      555
      556       ENDIF.
      557 *     set transtru_names for IDoc
      558       c_s_is_admin-odsname       = i_transtru_name.
      559       c_s_is_admin-odsname_tech  = l_transtru_odsname.
      560     WHEN OTHERS.
      561   ENDCASE.
      562
    Edited by: Sachin Shankarrao Dehey on Oct 20, 2010 8:59 AM
    Edited by: Sachin Shankarrao Dehey on Oct 20, 2010 9:03 AM

    Hi,
    Delete the DataSource and recreate the export DataSource from scratch with the correct selections, then activate the transfer structure.
    Hope this helps,
    Regards,
    Laj

  • How to generate field on UCM check-in form???

    Hi,
    How do I generate a field on the UCM check-in form?
    I mean, in the check-in new document form I have an option-list metadata field (e.g. Type).
    When I select Person as the Type value, it should generate input-text fields that depend on it, for example:
    - Name (input text)
    - Address (input text)
    - Phone (input text)
    And when I click the check-in button it should save the input values to the Person table.
    Please help me
    Khadim

    Hi jiri.machotka
    I know this link, but what I want to do is dynamically generate a sub-form (in the check-in form) depending on the choice.
    For example, when my choice is Person it should generate a sub-form containing:
    Name (input with its label)
    Firstname
    Lastname
    Age
    Regards
    CABDiop

  • How to generate datasource for inforsource in BW 7

    Hi all:
         Could you please tell me how to generate a DataSource for an InfoSource in BW 7? It seems that the way
    is different from lower releases such as BW 3.x. Should I choose Transformation? And then how?
    Thank you very much!
    Edited by: jingying Sony on May 9, 2009 8:41 AM

    1) First, in BI 7.x the InfoSource is optional. You can create a transformation directly between the DataSource and the data target (e.g. a cube). If you still want to use an InfoSource, you will need two transformations: one acts as the transfer rules and the other as the update rules. This is a crude comparison between 3.x and 7.x.
    2) In 7.x the InfoPackage loads data only as far as the PSA. From the PSA to the data target you need a DTP. So make sure you create the complete flow from DataSource to data target and create both the InfoPackage and the DTP.

  • Red Traffic Light for Datasources Delta Queues

    Hi all,
    Our R/3 source system is connected to our BW system.
    Between the two systems we operated the standard logistics DataSources' delta queues, by filling the setup tables and performing an initial update.
    The delta queues were created and used for over a month (according to RSA7 they were all marked with a green traffic light).
    Now we have copied the R/3 source system to a new one.
    After doing so, all the delta queue traffic lights turned red.
    Can anyone provide a technical explanation/reason for this problem?
    Also, is there something we could do to "save" these delta queues, without needing to delete and recreate them?
    Thanks ahead,
    Shani Bashkin

    Hi Eddo,
    Thanks for your help.
    Yes, I'm using the same system name and system ID. The newly copied system has the same name and ID as the productive system.
    Also, it seems like the RFC connection to BW is lost.
    The question is why?
    Thanks,
    Shani Bashkin

  • Issue with SID's flag check on DSO

    Hello,
    I have a DSO with the SIDs flag checked. With this check on, when I run a query, the query dumps or doesn't show data.
    In RSRV, when I run a test on this DSO, I get the following errors (which don't get corrected after running "Correct error"):
    The SID values are missing for 1 specifications for characteristic ZCH_STDAT
    The SID values are missing for 1 specifications for characteristic ZCH_TGFDT
    The above two characteristics are referenced to 0DATE. Many records in the DSO have blank values from the source for these date objects, which is expected as they might not be populated all the time.
    We didn't have these issues before. It seems to be a data-related issue, but I am not able to pinpoint it, as the load and activation go smoothly.
    Would appreciate anyone's thoughts on this.
    Thanks
    Prakash

    The message is telling you that there are values in the DSO for InfoObjects ZCH_STDAT and ZCH_TGFDT which are not found in the master data of ZCH_STDAT and ZCH_TGFDT themselves.
    This can happen if you have chosen NOT to create master data as data is loaded to the InfoProvider.
    Ensure that all values found in the DSO are also found in the InfoObject, and everything will be fine.

  • Sales order with material having environmentally relevant flag checked

    Hi All,
    I have a requirement to develop an ALV report of sales order details where the material has the environmentally relevant flag checked during material creation.
    Problem: details of sales orders that don't have the material are also picked up.
    So I have used the following logic:
    1) data retrieval:
    select vbeln
           erdat
           auart
           kunnr
      from vbak
      into table it_vbak
      where erdat in s_erdat
        and auart in s_auart.
      if it_vbak is not initial.
        select vbeln
               matnr
          from vbap
          into table it_vbap
          FOR ALL ENTRIES IN it_vbak
          where vbeln = it_vbak-vbeln.
          if it_vbap is not initial.
            select matnr
                   kzumw
              from mara
              into table it_mara
              FOR ALL ENTRIES IN it_vbap
              WHERE matnr = it_vbap-matnr
                and kzumw = 'X'.
            endif.
        select vbeln
               bstkd
          from vbkd
          into table it_vbkd
          FOR ALL ENTRIES IN it_vbak
          where vbeln = it_vbak-vbeln.
         select vbeln
                parvw
                kunnr
           from vbpa
           into table it_vbpa
           FOR ALL ENTRIES IN it_vbak
           where parvw = 'WE'
             and vbeln = it_vbak-vbeln.
      endif.
            if it_vbpa is not initial.
            SELECT kunnr
                   adrnr
              from kna1
              into table it_kna1
              FOR ALL ENTRIES IN it_vbpa
              where kunnr = it_vbpa-kunnr.
              if sy-subrc eq 0.
                select addrnumber
                       name1
                       name2
                       street
                       city1
                       region
                       post_code1
                  from adrc
                  into TABLE it_adrc
                  FOR ALL ENTRIES IN it_kna1
                  where addrnumber = it_kna1-adrnr.
                   if it_adrc is NOT INITIAL.
                     select addrnumber
                            smtp_addr
                       from adr6
                       into TABLE it_adr6
                       FOR ALL ENTRIES IN it_kna1
                       where addrnumber = it_kna1-adrnr.
                  endif.
                    endif.
              endif.
    2) Reading the data:
    loop at it_vbak into wa_vbak.
    clear wa_final.
      read table it_vbap into wa_vbap with key vbeln = wa_vbak-vbeln.
      read table it_mara into wa_mara with key matnr = wa_vbap-matnr.
      read table it_vbkd into wa_vbkd with key vbeln = wa_vbak-vbeln.
      read table it_vbpa into wa_vbpa with key vbeln = wa_vbak-vbeln.
      read table it_kna1 into wa_kna1 with key kunnr = wa_vbpa-kunnr.
      read table it_adrc into wa_adrc with key addrnumber = wa_kna1-adrnr.
      read table it_adr6 into wa_adr6 with key addrnumber = wa_kna1-adrnr.
        wa_final-matnr = wa_mara-matnr.
        wa_final-bstkd = wa_vbkd-bstkd.
        wa_final-erdat = wa_vbak-erdat.
        wa_final-kunnr = wa_vbpa-kunnr.
        wa_final-name1 = wa_adrc-name1.
        wa_final-name2 = wa_adrc-name2.
        wa_final-street = wa_adrc-street.
        wa_final-city1 = wa_adrc-city1.
        wa_final-region = wa_adrc-region.
        wa_final-post_code1 = wa_adrc-post_code1.
        wa_final-smtp_addr = wa_adr6-smtp_addr.
        append wa_final to it_final.
        clear wa_final.
    endloop.
    Problem: details of sales orders that don't have the material are also picked up.
    So please suggest a solution for this problem.
    With Regards,
    S.Asha.

    You are not filtering your VBAK records after getting the data from the MARA table. Moreover, your consolidation logic is also wrong. You can refer to the following pseudocode:
    LOOP AT it_vbap.
      READ TABLE it_mara ...   " based on the material number in the line item
      IF sy-subrc = 0.
        " get the data from the other internal tables and pass it to wa_final
        APPEND wa_final TO it_final.
      ELSE.
        CONTINUE.
      ENDIF.
    ENDLOOP.
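    To make that concrete, here is a sketch of how the corrected loop could look, reusing the internal tables and work areas from the original post (declarations are omitted and assumed to match the original program):
    " Drive the loop from VBAP and keep a row only when the material
    " was selected from MARA (environmentally relevant flag KZUMW = 'X').
    LOOP AT it_vbap INTO wa_vbap.
      CLEAR wa_final.
      READ TABLE it_mara INTO wa_mara WITH KEY matnr = wa_vbap-matnr.
      IF sy-subrc <> 0.
        CONTINUE.   " skip items whose material is not flagged
      ENDIF.
      " Read the remaining tables exactly as in the original program
      READ TABLE it_vbak INTO wa_vbak WITH KEY vbeln = wa_vbap-vbeln.
      READ TABLE it_vbkd INTO wa_vbkd WITH KEY vbeln = wa_vbap-vbeln.
      READ TABLE it_vbpa INTO wa_vbpa WITH KEY vbeln = wa_vbap-vbeln.
      READ TABLE it_kna1 INTO wa_kna1 WITH KEY kunnr = wa_vbpa-kunnr.
      READ TABLE it_adrc INTO wa_adrc WITH KEY addrnumber = wa_kna1-adrnr.
      READ TABLE it_adr6 INTO wa_adr6 WITH KEY addrnumber = wa_kna1-adrnr.
      wa_final-matnr      = wa_mara-matnr.
      wa_final-bstkd      = wa_vbkd-bstkd.
      wa_final-erdat      = wa_vbak-erdat.
      wa_final-kunnr      = wa_vbpa-kunnr.
      wa_final-name1      = wa_adrc-name1.
      wa_final-name2      = wa_adrc-name2.
      wa_final-street     = wa_adrc-street.
      wa_final-city1      = wa_adrc-city1.
      wa_final-region     = wa_adrc-region.
      wa_final-post_code1 = wa_adrc-post_code1.
      wa_final-smtp_addr  = wa_adr6-smtp_addr.
      APPEND wa_final TO it_final.
    ENDLOOP.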

  • Master datasource delta problem.

    Hello,
    We are in the process of implementing the logistics module for our client. We are on BI 7 and using the BI 7 data flow for master and transaction data.
    While working with the master DataSources that support delta functionality, we have faced the following issue. All of these DataSources are based on ALE change pointers. Somehow we get the same number of records in every delta load; basically it brings the same records in the delta load every time.
    E.g. 0CUSTOMER_ATTR: when I extracted the data the first time in delta mode, it brought 1200 records, and since that day it has been bringing 1000+ records in every delta load. Even if I start the next delta one minute later, it brings that much data. I am very confident that not much is changing in ECC, but the extractor keeps reading the records from the change pointer table and does not update that table once the extraction is completed.
    When we dug into this issue, we realized that somehow the change pointers are not getting updated in the change pointer table. We haven't changed anything in the standard extractor, but it is still happening with all delta-based master DataSource extractors.
    Can you please help me solve this issue?
    Regards,

    Hi Maasum
    Not sure whether the below will help.. but give it a try:
    /people/simon.turnbull/blog/2010/04/08/bw-master-data-deltas-demystified
    Thanks
    Kalyan
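    If the blog does not resolve it, one quick sanity check is whether change pointers for the relevant message type are piling up unprocessed. The sketch below is only illustrative: the view BDCPV and its fields MESTYPE and PROCESS are my assumption for the standard ALE change-pointer data, and the actual message type used by your master data delta has to be looked up in your own system.
    REPORT z_check_change_pointers.

    " Placeholder - replace with the message type your delta DataSource uses
    PARAMETERS p_mestyp TYPE bdcpv-mestype OBLIGATORY.

    DATA: lv_open TYPE i,
          lv_done TYPE i.

    " Change pointers not yet flagged as processed for this message type
    SELECT COUNT( * ) INTO lv_open
      FROM bdcpv
      WHERE mestype = p_mestyp
        AND process = ' '.

    " Change pointers already flagged as processed
    SELECT COUNT( * ) INTO lv_done
      FROM bdcpv
      WHERE mestype = p_mestyp
        AND process = 'X'.

    WRITE: / 'Unprocessed change pointers:', lv_open,
           / 'Processed change pointers  :', lv_done.
    If the "processed" count never grows after a delta run, that would support the suspicion that the extraction is not flagging the pointers it has read.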

  • Generating DDL Deltas

    Is there an easy way to generate a delta DDL file after I've made changes to a schema?

    Hi John,
    I'm experiencing the same issue at this time. As a general workaround, I create a new model by importing the database schema as it is in my development environment (if you run DM under SVN it is even easier) and run the "compare / merge" function, which returns the delta DDL (ALTER/MODIFY instead of DROP/CREATE ...).
    I hope Oracle will soon introduce a direct way, but for now try this.
    Regards,
    Simon

  • Generic DataSource & Delta setup

    Hello Folks,
    I am trying to get my generic DataSource and delta concepts straight, and got more confused after reading a white paper titled "How to... create generic delta". I am on SAP R/3 4.6C and BW 3.0B.
    This white paper talks about creating a generic DataSource and delta using RSO2 in BW. It explains how to specify the delta-specific field and then select a delta type (AIE or ADD), which sets the "delta update" flag after generation.
    My question is
    1) Should we create a generic DataSource and delta from BW as suggested by this white paper, and not in SAP R/3 directly using RSO2 and RSA3? What are the pros and cons?
    I tried creating a generic DataSource in SAP R/3 using RSO2, since these transactions are available in SAP R/3. I noticed the "delta update" checkbox was always (or at least in the cases I tried) blank and protected. I also noticed a menu item under "DataSource" called "Setup delta" that was grayed out as well. So it looks like a delta can be set up in SAP R/3, but the system does not let me do it.
    My question is
    1) why can't I set up a delta?

    Hi Jay,
    First, you do not create a DataSource from BW. A DataSource is created in R/3. You replicate it to BW afterwards so that you can create/map it to transfer rules.
    DataSource creation is in R/3 only. You can of course set up BW as a source of data for other systems, but that is something else.
    A GENERIC DELTA is based only on TIMESTAMP, NUMERIC POINTER or CALENDAR DAY. The "Delta Update" checkbox is grayed out for generic DataSources because once you specify that your generic DataSource is delta-capable using any of the three options above, the checkbox is checked automatically.
    What "Setup delta" are you referring to that you cannot do?
    --Jkyle

  • 0FI_AR_4 Datasource, Delta

    Hi Experts,
    we are using the 0FI_AR_4 DataSource; it is delta-enabled, but the problem is we can run the delta only once a day.
    Can anyone please let me know how to change this so that I can run the delta more than once a day?
    Any document or link would be of great help.
    Thanks in advance.
    Ananth

    hi Ananth,
    take a look at Note 991429 - Minute Based extraction enhancement for 0FI_*_4 extractors
    https://websmp203.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=991429&_NLANG=E
    Symptom
    You would like to implement a 'minute based' extraction logic for the data sources 0FI_GL_4, 0FI_AR_4 and 0FI_AP_4.
    Currently the extraction logic allows only for an extraction once per day without overlap.
    Other terms
    general ledger  0FI_GL_4  0FI_AP_4  0FI_AR_4  extraction  performance
    Reason and Prerequisites
    1. There is a huge volume of data to be extracted on a daily basis from FI to BW and this requires a lot of time.
    2. You would like to extract the data at more frequent intervals during the day, like 3-4 times a day, without extracting all the data that you have already extracted that day.
    In situations where there is a huge volume of data to be extracted, a lot of time is taken up when extracting on a daily basis. Minute based extraction would enable the extraction to be split into convenient intervals and can be run multiple times during a day. By doing so, the amount of data in each extraction would be reduced and hence the extraction can be done more effectively. This should also reduce the risk of extractor failures caused because of huge data in the system.
    Solution
    Implement the relevant source code changes and follow the instructions in order to enable minute based extraction logic for the extraction of GL data. The applicable data sources are:
                            0FI_GL_4
                            0FI_AR_4
                            0FI_AP_4
    All changes below have to be implemented first in a standard test system. The new extractor logic must be tested very carefully before it can be used in a production environment. Test cases must include all relevant processes that would be used/carried in the normal course of extraction.
    Manual changes are to be carried out before the source code changes in the correction instructions of this note.
    1. Manual changes
    a) Add the following parameters to the table BWOM_SETTINGS
                             MANDT  OLTPSOURCE    PARAM_NAME          PARAM_VALUE
                             XXX                  BWFINEXT
                             XXX                  BWFINSAF            3600
                  Note: XXX refers to the specific client(like 300) under use/test.
                  This can be achieved using transaction 'SE16' for table
                             'BWOM_SETTINGS'
                              Menu --> Table Entry --> Create
                              --> Add the above two parameters one after another
    b) To the views BKPF_BSAD, BKPF_BSAK, BKPF_BSID, BKPF_BSIK
                           under the view fields add the below field,
                           View Field  Table    Field      Data Element  DType  Length
                           CPUTM       BKPF    CPUTM          CPUTM      TIMS   6
                           This can be achieved using transaction 'SE11' for views
                           BKPF_BSAD, BKPF_BSAK , BKPF_BSID , BKPF_BSIK (one after another)
                               --> Change --> View Fields
                               --> Add the above mentioned field with exact details
    c) For the table BWFI_AEDAT index-1  for extractors
                           add the field AETIM (apart from the existing MANDT, BUKRS, and AEDAT)
                           and activate this Non Unique index on all database systems (or at least on the database under use).
                           This can be achieved using transaction 'SE11' for table 'BWFI_AEDAT'
                               --> Display --> Indexes --> Index-1 For extractors
                               --> Change
                               --> Add the field AETIM to the last position (after AEDAT field )
                               --> Activate the index on database
    2. Implement the source code changes as in the note correction instructions.
    3. After implementing the source code changes via the SNOTE instructions, add the following parameters to the respective function modules and activate them.
    a) Function Module: BWFIT_GET_TIMESTAMPS
                        1. Export Parameter
                        a. Parameter Name  : E_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
                        2. Export Parameter
                        a. Parameter Name  : E_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
    b) Function Module: BWFIT_UPDATE_TIMESTAMPS
                        1. Import Parameter (add after I_DATE_HIGH)
                        a. Parameter Name  : I_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
                        2. Import Parameter (add after I_TIME_LOW)
                        a. Parameter Name  : I_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
    4. Working of minute based extraction logic:
    The minute-based extraction also considers the time when selecting the data (in addition to the date of the changed or new document, as in the earlier logic). The code is modified to consider the new flags in the BWOM_SETTINGS table (BWFINEXT and BWFINSAF); as long as these flags are not set as per the instructions, the earlier extraction logic remains in effect (although the code is modified to include the new logic).
    The safety interval now depends on the flag BWFINSAF (in seconds; default 3600 = 1 hour), which is meant to catch documents whose posting is delayed for any reason, for example by delayed update modules. There is also specific coding that posts an entry to BWFI_AEDAT with the details of documents that failed to post within the safety limit of 1 hour, so that they are at least extracted as changed documents if they were missed as new documents. If a large number of documents fail to post within the safety limit, the volume of BWFI_AEDAT grows correspondingly.
    The flag BWFINSAF can be set to a particular value depending on specific requirements (in seconds, but at least 3600 = 1 hour), e.g. 24 hours / 1 day = 24 * 3600 = 86400. With the new logic switched on (flag BWFINEXT = 'X'), the flags BWFIOVERLA, BWFISAFETY and BWFITIMBOR are ignored, while BWFILOWLIM and DELTIMEST work as before.
    As per the instructions above the index-1 for the extraction in table BWFI_AEDAT would include the field AETIM which would enable the new logic to extract faster as AETIM is also considered as per the new logic. This could be removed if the standard logic is restored back.
    With the new extractor logic implemented you can change back to the standard logic any day by switching off the flag BWFINEXT to ' ' from 'X' and extract as it was before. But ensure that there is no extraction running (for any of the extractors 0FI_*_4 extractors/datasources) while switching.
    As with the earlier logic, to re-extract the data from the previous run you can set LAST_TS in table BWOM2_TIMEST back to the previous extraction timestamp, provided no extraction is currently running for that particular extractor or DataSource.
    With the frequency of the extraction increased (say 3 times a day) the volume of the data being extracted with each extraction would decrease and hence extractor would take lesser time.
    You should optimize the interval of time for the extractor runs by testing the best suitable intervals for optimal performance. We would not be able to give a definite suggestion on this, as it would vary from system to system and would depend on the data volume in the system, number of postings done everyday and other variable factors.
    To turn on the new logic, BWFINEXT has to be set to 'X', and reset back to ' ' when reverting. This change has to be done only when no extractions are running, considering all the points above.
                  With the new minute based extraction logic switched ON,
    a) Ensure BWFI_AEDAT index-1 is enhanced with addition of AETIM and is active on the database.
    b) Ensure BWFINSAF is at least 3600 (1 hour) in BWOM_SETTINGS
    c) Optimum value of DELTIMEST is maintained as needed (recommended/default value is 60 )
    d) Proper testing (functional and performance) is performed in a standard test system whose settings and data match the production system, and the results are all positive, before the changes are moved to the production system.
    http://help.sap.com/saphelp_bw33/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
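    As a side note, the two BWOM_SETTINGS entries from step 1a could also be created with a few lines of ABAP instead of SE16. This is only a sketch based on the field names listed in the note above (OLTPSOURCE, PARAM_NAME, PARAM_VALUE); verify the table structure in your system before using anything like it.
    DATA ls_set TYPE bwom_settings.

    " BWFINEXT - switch for the minute-based extraction logic (blank = off)
    CLEAR ls_set.
    ls_set-param_name  = 'BWFINEXT'.
    ls_set-param_value = ''.
    INSERT bwom_settings FROM ls_set.

    " BWFINSAF - safety interval in seconds (default 3600 = 1 hour)
    CLEAR ls_set.
    ls_set-param_name  = 'BWFINSAF'.
    ls_set-param_value = '3600'.
    INSERT bwom_settings FROM ls_set.

    COMMIT WORK.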

  • APO-DP Datasource - Delta

    We have a requirement to enable delta for a DataSource that is based on a planning area; it seems SAP has deactivated this functionality for these kinds of DataSources.
    Has anyone tried to enable delta for a planning-area-based DataSource? If so, can you please share your experience?
    Thanks,
    Raman

    We are using full loads, as SAP said the delta would take more time than a full load.
    It might be that in the latest versions they have upgraded the delta functionality. Check the OSS notes/help.

  • Bw datasource activate- extractor check

    I have activated DataSource 2LIS_11_VAHDR (Sales Document Header Data) in t-code SBIW,
    and activated the LO Cockpit in t-code LBWE.
    Then I used the RSA3 extractor check, and it reported: 0 data records selected.
    image: http://www.itpub.net/thread-1162136-1-1.html
    Why is this? Please advise.
    There is no data in SAP ERP, and replicating the metadata has no effect in BW. Is my flow right?
    Many thanks.
    Regards,
    Steve

    You need to fill the setup tables first.
    Have a look at these links...
    [Setting up Delta Process for LO Extractors for First Time Using Queued Delta|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d019f683-eac1-2b10-40a6-cfe48796a4ca]
    [Logistic Cockpit Configuration|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/50326ace-bac2-2b10-76bb-bd1a20ed2b57]
    Regards,
    Ashish
    Edited by: Ashish Tewari on May 9, 2009 1:12 PM
