ODS Delta Process, DataSource and Delta Queue

Hi to all,
Loading data from an SAP source system into SAP BW 7.0 via a generic 3.x DataSource causes problems.
Here is a short description:
The data flows from a source table via a generic extractor into a target ODS in full update mode.
Update rules should move the data from the table structure into the ODS structure as follows:
Source table structure
CustKey1     Key2
13386        C23
13386        B14
13387        A13
13387        E25
ODS structure
CustKey1     col1     col2
13387        A13      E25
This works pretty well, as long as all records with the same CustKey1 are transferred in the same data package. The Data Browser (SE16) shows the situation in the ODS "New data" view (the data is not activated):
Request    Data_packet_number     Data_record_number      CustKey1
112            000003                  1.061              0000013386
112            000004                      1              0000013386
112            000004                      2              0000013387
There are two records for CustKey1 0000013386 with
data record 1.061 in data packet 000003   and
data record       1 in data packet 000004.
The above constellation causes errors in the ODS delta queue and in subsequent data processing.
I think one way to solve the problem might be to change the delta process of the DataSource.
The properties are:
- Record type: Full, delta, before, after images
- Delta type: Delta from delta queue or from extractor
- Serialization: No serialization, serialization of request, serialization of data packages
But how can I change these settings? Transactions RSA2, RSO2, RSO6 and RSO8 don't offer this.
What is the right delta process to use?
I have tried changing the delta process by generating several 3.x DataSources with the following delta processes (see table RODELTAM):
- " " (Full-upload)
- AIE
- ADD
Unfortunately with no effect.
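For reference, the properties listed above (record type, delta type, serialization) can at least be inspected per delta process in the check table RODELTAM. Below is a minimal, read-only ABAP sketch; it assumes that the key field of RODELTAM is named DELTA and only looks at the three delta processes tried above, so treat it as an illustration rather than a fix:
* Read-only sketch (assumption: the key field of RODELTAM is DELTA).
* Reads the properties of the delta processes tried above for inspection
* in the debugger or a WRITE list; it does not change any DataSource settings.
DATA lt_rodeltam TYPE STANDARD TABLE OF rodeltam.

SELECT * FROM rodeltam INTO TABLE lt_rodeltam
  WHERE delta IN (' ', 'AIE', 'ADD').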
Who can help?
Regards
Josef
Edited by: Josef L. on Mar 20, 2009 7:44 PM

hi venkat,
whenever you load a delta from an ODS to a cube, whatever data has changed in the ODS since the last delta will be updated. Hence, in your case, both req1 and req2 will be loaded to the cube.
Assign Points if useful
Ramesh

Similar Messages

  • Delta get once and delta request wise?

    hi friends,
    in what scenario do we use "Only Get Delta Once" and "Get Data by Request" at the DTP in BI 7.0? I have studied many threads, but I didn't understand. Is "Get Delta Once" something like a snapshot? What does it mean?
    Can you give me an example? That would be great.
    regards
    ss

    Indicator: Only Get Delta Once
    Source requests of a DTP for which this indicator is set are only transferred once, even if the DTP request is deleted in the target.
    Use
    If this indicator is set for a delta DTP, a snapshot scenario is built.
    A scenario of this type may be required if you always want an InfoProvider to contain the most up-to-date dataset for a query but the DataSource on which it is based cannot deliver a delta (new, changed or deleted data records) for technical reasons. For this type of DataSource, the current dataset for the required selection can only be transferred using a 'full update'.
    In this case, a DataStore object cannot usually be used to determine the missing delta information (overwrite and creation of delta). If this is not logically possible because, for example, data is deleted in the source without delivering reverse records, you can set this indicator and perform a snapshot scenario. Only the most up-to-date request for the DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example). They are not transferred again during the DTP delta process. When the system determines the delta when a new DTP is generated, these earlier (source) requests are seen as 'already fetched'.
    Setting this indicator ensures that the content of the InfoProvider is an exact representation of the source data.
    Dependencies
    Requests that need to be fetched appear with this indicator in the where-used list of the PSA request, even if they have been deleted. Instead of a traffic light you have a delete indicator.
    Get Data by Request
    This indicator belongs to a DTP that gets data from a DataSource or InfoPackage in delta mode.
    Use
    If you set this indicator, a DTP request only gets data from an individual request in the source.
    This is necessary, for example, if the dataset in the source is too large to be transferred to the target in a single request.
    Dependencies
    If you set this indicator, note that there is a risk of a backlog: if the source receives new requests faster than the DTP can fetch them, the number of source requests still to be transferred will increase steadily.
    Source : https://forums.sdn.sap.com/click.jspa?searchID=5462958&messageID=3895665
    Re: Diff between Only Get Delts Once and Get Data by Request
    Check this blog as well : /people/community.user/blog/2007/06/21/sap-netweaver-70-bi-data-transfer-process-with-147only-get-delta-once148

  • Compare creation of DataSources and JMS Queues: SAP vs. WebLogic/WebSphere

    I am used to creating JDBC DataSources and JMS Queues on WebLogic/WebSphere through their admin applications.
    Can someone compare/contrast that process with the one on SAP NetWeaver (either using NWA or the Visual Administrator)?
    Thanks

    Hi Parag,
    For the process of creating JDBC datasources and JMS resources @ SAP NetWeaver you can refer to the documents here on SDN and help.sap.com, and compare that process for yourself, thus not being influenced by others' biased or unbiased opinions.
    For NetWeaver 04 and 04s these would be:
    <a href="http://help.sap.com/saphelp_nw04/helpdata/en/b0/6e62f30cbe9e44977c78dbdc7a6b27/frameset.htm">JDBC Connector Service</a>
    <a href="http://help.sap.com/saphelp_nw04/helpdata/en/22/cf4e71c46cdb4da31153be96c5389f/frameset.htm">JMS Connector Service</a>
    For the <a href="https://www.sdn.sap.com/irj/sdn/javaee5">Java EE 5 Edition</a>:
    <a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7bb9751d-0e01-0010-febd-c3adce2c408c">Working with Database Tables, DataSources and JMS Resources</a>
    <a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/806e75a0-0e01-0010-2587-fc518de8ac1a">Administration Guide</a> -> section "Application Resources Management" (pages 89-104)
    Hope that helps!
    Your feedback/findings are very welcome!
    -Vladimir

  • Generic DataSources and Outbound Queue

    Dear Experts,
    I've got some generic DataSources (delta and full update) that work properly with the productive BW system.
    Now I'm starting to build up a development system (wrong order, I know) and want to load data using the mentioned DataSources from time to time to get relevant data for testing.
    The problem now is that the generic delta DataSources fail due to duplicate keys (when loading to the dev system), although no duplicate keys exist in the basic tables for these DataSources (or are even possible, due to the table key).
    I realized in SMQ1 that every time data is transferred to the productive BW system, the counter of the queue for the dev system is incremented. When I delete all the entries in SMQ1, all the updates for the dev system work fine.
    In one of the threads here I read that generic DataSources do not have outbound queues (Re: Generic Data Source issue).
    So is there any way to turn off the outbound queues for specific (generic) DataSources? I do not want to turn them off in general, as we use standard delta DataSources for Controlling.
    Or what is the "normal" way to transfer data to a dev system?
    (One more piece of information: the productive system is 3.5; the dev system is 7.1.)
    Thank you in advance
    Kind regards
    Philip
    Edited by: Philip Munz on Nov 17, 2009 8:50 AM

    ok,
    after some days of searching and reading I now understand my problem.
    For everyone who faces the same problems in understanding delta updates, qRFC, RSA7 and so on, please have a look at
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/40427814-376a-2c10-5589-bc1aaa6692c3&overridelayout=true
    again thanks for the help
    kind regards
    Philip

  • Query on delta process

    Hi BW Experts,
    For AP (Accounts Payable) and AR (Accounts Receivable) we can run a delta process to pick up delta records. How?
    Could anyone please let me know?
    Thanks

    FI extractors work with an after-image delta. Delta records are selected directly from the R/3 tables using a time stamp mechanism and are transferred directly to BW; nothing is written to the BW delta queue. Please go through the following regarding the FI DataSources 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4.
    0FI_GL_4 (G/L Accounts: Line Items): no redundant fields are transferred into BW. Only fields from the FI document tables (BKPF/BSEG) that are relevant to General Ledger Accounting (compare table BSIS) are transferred; no customer- or vendor-related fields.
    0FI_AP_4 (AP: Line Items) and 0FI_AR_4 (AR: Line Items): vendor/customer-related information (e.g. payment/dunning data).
    "Coupled", consistent "snapshot" of FI data in BW: the G/L account extraction determines the selection criteria (company code, fiscal period) and the upper time limit of all extracted FI line items. AP and AR extraction: no further selection criteria are necessary or possible. "Uncoupled" extraction is possible with Plug-In PI 2002.2, see OSS note 551044.
    0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 use an after-image delta with delta type "Extractor": delta records are selected directly from the R/3 tables using a time stamp mechanism and transferred directly to BW; no record is written to the BW delta queue. After-image delta: FI line items are transferred from the source system in their final state (= "after image"). This delta method is not suitable for direct InfoCube update; an ODS object is obligatory to determine the delta for the InfoCube update. The delta queue and the BW scheduler ensure correct serialization of the records (e.g. inserts must not overtake changes) and the distribution of delta records to multiple BW systems. The selection criteria of the delta-init upload are used to "couple" the DataSources logically.
    Time stamp mechanism: new FI documents are those posted in R/3 since the last line-item extraction; selection is based on the field BKPF-CPUDT.
    Hierarchy Extractor for Balance Sheet & P&L Structure
    Technical name: 0GLACCEXT_T011_HIER
    Technical data: type of DataSource: hierarchies; application component: FI-IO.
    Use: the extractor is used for loading hierarchies (balance sheet / P&L structures) for the characteristic (InfoObject) 0GLACCEXT.
    Fields of origin in the extract structure (field in extract structure / description / table of origin / field in table of origin):
    .INCLUDE     ROSHIENODE     RFDT     CLUSTD
    FIELDNM      RSFIELDNM      RFDT     CLUSTD
    GLACCEXT     GLACCEXT       RFDT     CLUSTD
    RSIGN        RR_RSIGN       RFDT     CLUSTD
    PLUMI        RR_PLUMI       RFDT     CLUSTD
    Features of the extractor: extractor FBIW_HIERARCHY_TRANSFER_TO, extraction structure DTFIGL_HIERNODE_1.
    Financial Accounting: Line Item Extraction Procedure
    General information about the line item extraction procedure: BW Release 3.1 makes consistent data extraction in the delta method possible for line items in General Ledger Accounting (FI-GL), selected subsidiary ledgers (Accounts Receivable FI-AR and Accounts Payable FI-AP) and tax reporting. The extraction procedure delivered with BW Release 2.0B, based on DataSources 0FI_AR_3 and 0FI_AP_3, can be replaced; this is described in note 0410797. The decisive advantage of choosing the R/3 line item tables as the data source is that extra fields can be transferred to BW that were not available with the transaction figures from table GLT0, the previous R/3 data source in General Ledger Accounting (FI-GL). This allows more extensive and flexible analysis in BW.
    To enable consistent delta data, four new InfoSources are provided in the OLTP system (with the corresponding DataSources and extractors in the SAP R/3 system):
    FI-GL     0FI_GL_4      General Ledger: Line Items
    FI-AP     0FI_AP_4      Accounts Payable: Line Items (extraction linked to 0FI_GL_4)
    FI-AR     0FI_AR_4      Accounts Receivable: Line Items (extraction linked to 0FI_GL_4)
    FI-TX     0FI_TAX_4     General Ledger: Data for Tax on Sales/Purchases
    For the General Ledger, selection is made from tables BKPF and BSEG, while selection for the subsidiary accounts is made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable). InfoSource 0FI_GL_4 transfers only those fields of the Financial Accounting document (tables BKPF and BSEG) that are relevant for General Ledger Accounting to the BW system. The consistent recording of data from General Ledger Accounting and subledger accounting is achieved by means of coupled delta extraction in the time stamp procedure: General Ledger Accounting is the main process in delta mode and provides the subsidiary ledger extraction with time stamp information (time intervals of previously selected general ledger line items). This time stamp information can also be used as a loading history: it shows which line items have previously been extracted from the SAP R/3 system.
    Delta method: delta extraction enables you to load into the BW system only the data that has been added or changed since the last extraction event. Data that is already loaded and has not changed is retained and does not need to be deleted before a new upload. This procedure improves performance compared with periodic extraction of the overall dataset. Financial Accounting line items are read by the extractors directly from the tables in the SAP R/3 system. A time stamp on the line items serves to identify the status of the delta data; time stamp intervals that have already been read are stored in a time stamp table. The delta dataset is transferred to the BW system directly, without records being written to the delta queue in the SAP R/3 system (extractor delta method). The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method). This delta method is not suitable for filling InfoCubes directly in the BW system; the line items must therefore first be loaded into an ODS object in the BW system that identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
    Time stamp method: for Financial Accounting line items that have been posted in the SAP R/3 system since the last data request, the extractors identify the delta dataset using the time stamp in the document header (BKPF-CPUDT). When a delta dataset has been selected successfully, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST:
    Field name     Key     Description
    MANDT          X       Client
    OLTPSOURCE     X       DataSource
    AEDAT          X       SYSTEM: Date
    AETIM          X       SYSTEM: Time
    UPDMODE                Data update mode (full, delta, delta init)
    TS_LOW                 Lower limit of the time selection (time stamp in seconds since 1.1.1990)
    TS_HIGH                Upper limit of the time selection (time stamp in seconds since 1.1.1990)
    LAST_TS                Flag: 'X' = last time stamp interval of the delta extraction
    TZONE                  Time zone
    DAYST                  Daylight saving time active?
    The time stamps are determined from the system date and time and converted to the format "seconds since 1.1.1990", taking into account the time zone and daylight saving time. To ensure correct and unique reconversion to date and time, the time zone and daylight saving time must be stored in table BWOM2_TIMEST. Table BWOM2_TIMEST therefore documents the loading history of the Financial Accounting line items and also provides defined restart points following incorrect data requests. For a better overview, the time stamps in the example table below are shown in date format; the columns TZONE and DAYST are left out.
    OLTPSOURCE     AEDAT/AETIM            UPD       DATE_LOW        DATE_HIGH       LAST_TS
    0FI_GL_4       16 May 2000/20:15      Init      01 Jan. 1990    15 May 2000
    0FI_GL_4       24 May 2000/16:59      Delta     16 May 2000     23 May 2000
    0FI_GL_4       02 June 2000/21:45     Delta     24 May 2000     01 June 2000
    0FI_GL_4       15 June 2000/12:34     Delta     02 June 2000    14 June 2000
    0FI_GL_4       21 June 2000/18:12     Delta     15 June 2000    20 June 2000    X
    0FI_AP_4       18 May 2000/21:23      Init      01 Jan. 1990    15 May 2000
    0FI_AP_4       30 May 2000/12:48      Delta     16 May 2000     23 May 2000
    0FI_AP_4       10 June 2000/13:19     Delta     24 May 2000     01 June 2000    X
    0FI_AR_4       17 May 2000/18:45      Init      01 Jan. 1990    15 May 2000
    0FI_AR_4       04 June 2000/13:32     Delta     16 May 2000     01 June 2000
    0FI_AR_4       16 June 2000/15:41     Delta     02 June 2000    14 June 2000    X
    0FI_TX_4       17 May 2000/18:45      Init      01 Jan. 1990    15 May 2000
    0FI_TX_4       04 June 2000/13:32     Delta     16 May 2000     01 June 2000
    0FI_TX_4       16 June 2000/15:41     Delta     02 June 2000    14 June 2000    X
    Constraints: per day, no more than one delta dataset can be transferred for InfoSource 0FI_GL_4. The extracted data therefore has the status of the previous day; for further data requests on the same day, the InfoSource does not provide any data. In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in the BW system for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting. If you delete the initialization selection in the source system for InfoSource 0FI_GL_4 in the BW Administrator Workbench, the time stamp entries for InfoSources 0FI_GL_4, 0FI_AP_4, 0FI_AR_4 and 0FI_TX_4 are also removed from table BWOM2_TIMEST.
    Recording changed line items: for Financial Accounting line items that have been changed since the last data request in the SAP R/3 system, there is no reliable time stamp that can document the time of the last change. For this reason, all line items that are changed in a way relevant for BW must be logged in the SAP R/3 system.
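    To make the time stamp mechanism described above a bit more concrete, here is a deliberately simplified ABAP sketch. It is not the SAP standard extractor code: the variable names, the hard-coded example interval and the date-only simplification are assumptions for illustration (BWOM2_TIMEST actually stores seconds since 1.1.1990); only table BKPF and its entry-date field CPUDT are taken from the text above.
    * Simplified illustration of the FI time stamp delta principle (not SAP standard code).
    * New FI document headers are selected by entry date BKPF-CPUDT within the interval
    * that the previous extraction run logged.
    DATA: lt_bkpf      TYPE STANDARD TABLE OF bkpf,
          lv_date_low  TYPE bkpf-cpudt,
          lv_date_high TYPE bkpf-cpudt.

    lv_date_low  = '20000516'.     " example: day after the last logged upper limit
    lv_date_high = sy-datum - 1.   " extracted data has the status of the previous day

    SELECT * FROM bkpf INTO TABLE lt_bkpf
      WHERE cpudt BETWEEN lv_date_low AND lv_date_high.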

  • Initialize Delta Process with Data Transfer - Split the job in two/three

    Hello,
    I am trying to load master data into BI (DEV) and there are 700,000 master data records in my source system. When I tried to load with "Initialize Delta Process with Data Transfer", at some point (after 300,000 records) the load failed in the source system, and in the job log I found the error "ABAP/4 processor: TSV_TNEW_OCCURS_NO_ROLL_MEMORY".
    When I reported this issue to the BASIS consultants, they suggested that I split the job in three. My question: how can I split the job in three? I can do one "Initialize Delta Process" load followed by delta loads.
    Regards,
    Md

    Hi Md.
    Even at InfoPackage level you can reduce the number of records per package.
    Go to the InfoPackage Scheduler menu -> DataS. Default Data Transfer; there you can reduce the maximum size of a data packet in kByte.
    Regards,
    Pratap Sone

  • Explanation of the delta process in SAP BI - ROCANCEL and 0RECORDMODE

    Hello,
    I use the delta process with logistics cockpit DataSources (2LIS_17_I3HDR and 2LIS_17_I0NOTIF).
    I have transferred all data from the extraction queue to the BW queue.
    Afterwards, when I launch the delta process in SAP BI (with the InfoPackage), two ODS objects will be updated. I wanted to know how SAP knows what the delta really is. I have seen that there is a ROCANCEL field in the PSA; how does SAP choose the correct row? Does it recognize ROCANCEL and replace the cancelled row with the new one? Do we have to do anything special (like mapping ROCANCEL to 0RECORDMODE)?
    Can you briefly explain how SAP handles this (ROCANCEL values, 0RECORDMODE, etc.) and what I have to do?
    Thanks a lot,
    Regards,
    Julien

    Check :
    Re: Indicator: Cancel Data Record
    Re: 0RECORDMODE, 0STORNO, ROCANCEL

  • Diff between update queue and delta queue

    hi all
    can anyone tell me what the difference between the update queue and the delta queue is? What are the delta queue and the update queue?
    What are the possible system-generated errors and custom-generated errors?
    Thanks,
    Shreya

    Hi Shreya,
    The update queue (LBWQ) comes into the picture when you choose queued delta for the DataSource: the data first goes to the update queue and then to the delta queue. However, if you have chosen direct delta, the posted record goes directly to the delta queue (RSA7).
    Here are the URLs of Roberto's weblogs explaining the whole LO extraction process:
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Hope it helps you!!!!
    Regards,
    Amit
    Pls do not forget to assign points, if helpful.

  • Specific DataSource in a source system: how to modify the delta process?

    Hello,
    I have a specific DataSource in an SAP CRM 5.0 system that sends sales data by year and Business Partner to an SAP BI 7 system.
    The delta process used (AIE: After-Images Via Extractor (FI-GL/AP/AR)) does not handle deletions: if you delete the 2009 sales data for a Business Partner, the data will no longer be in the CRM system, but it will remain in the SAP BI 7 system after the delta.
    My client would like to change the delta process to send the deletions in CRM to BW.
    In CRM, when I use transaction RSA2, I can display the delta process, but apparently there is no way to modify it.
    Moreover, the RSA6 and RSO2 transactions are useful for modifying my DataSource, but not directly its delta process.
    I found the ROOSOURCE table, where I could manually modify the delta process for my DataSource, but it doesn't seem right to do it that way.
    Can anyone help? Is there any way to modify the delta process for my DataSource, or will I have to create a new one?

    You have to create a new one.
    check these links
    Changing Delta Process of DataSource
    Changing Delta Process Type
    Rgds
    sateesh

  • Before designing a cube/ODS in BW, how can we know that the DataSource is delta-enabled?

    HI All
    Before designing a cube/ODS in BW, how can we know whether the DataSource in the source R/3 system is delta-enabled or not? I mean, what type of DataSource is it?
    And also, what do we mean by standard extractors? Does this mean that there won't be any DataSource or table for them? I am completely new to this part; can anyone explain this to me?
    Regards
    Balji

    Hi Balaji,
    To find out whether a DataSource is delta-enabled, go to R/3 -> SE11/SE16 -> table ROOSOURCE -> your DataSource -> check the value of the "DELTA" field. If it is blank, no delta is supported.
    All extractors have a DataSource and an extract structure associated with them. It does not matter whether it is Business Content or custom.
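    The same check can also be scripted. Here is a minimal ABAP sketch of the ROOSOURCE lookup described above; the DataSource name '0FI_GL_4' is only an example, and OBJVERS 'A' selects the active version:
    * Minimal sketch of the ROOSOURCE check described above.
    * '0FI_GL_4' is only an example DataSource name.
    DATA lv_delta TYPE roosource-delta.

    SELECT SINGLE delta FROM roosource INTO lv_delta
      WHERE oltpsource = '0FI_GL_4'
        AND objvers    = 'A'.

    IF sy-subrc <> 0 OR lv_delta IS INITIAL.
      WRITE / 'DataSource not found or no delta supported (DELTA is blank).'.
    ELSE.
      WRITE: / 'Delta process:', lv_delta.
    ENDIF.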
    Bye
    Dinesh

  • Difference between Direct delta and delta Queue

    Hi,
    1. Please explain the concept of direct delta and the delta queue in LO extraction.
    2. Why are setup tables used, and why is the data in the setup tables deleted initially?
    3. Please explain LO extraction with an example.
    Thanks
    Alok

    Dear Alok,
    1.
    Direct delta:
    Transactional data / application postings are made available directly in the delta queue table, without any intermediate tables; from there the data is extracted from R/3 to BW.
    Each document posting is directly transferred into the BW delta queue:
    • each document posting with delta extraction leads to exactly one LUW in the respective BW delta queues
    Transaction postings lead to:
    1. Records in the transaction tables and, directly with the V1 update, in the BW delta queue
    2. This BW delta queue is read when a delta load is executed.
    Pros:
    • Extraction is independent of V2 update
    • Less monitoring overhead of update data or extraction queue
    Cons:
    • Not suitable for environments with high number of document changes
    • Setup and delta initialization have to be executed successfully before document postings are resumed
    • V1 is more heavily burdened
    Queued delta:
    The posted data first becomes available in the extraction queue; a periodically scheduled collective run then moves it to the BW delta queue.
    • Extraction data is collected for the affected application in an extraction queue
    • Collective run as usual for transferring data into the BW delta queue
    Transaction postings lead to:
    1. Records in the transaction tables and in the extraction queue
    2. A periodically scheduled job transfers these postings into the BW delta queue
    3. This BW delta queue is read when a delta load is executed.
    Pros:
    • Extraction is independent of V2 update
    • Suitable for environments with high number of document changes
    • Writing to extraction queue is within V1-update: this ensures correct serialization
    • Downtime is reduced to running the setup
    Cons:
    • V1 is more heavily burdened compared to V3
    • Administrative overhead of extraction queue
    2.
    Setup tables are used for full and init updates.
    Instead of pulling data from the DB tables directly, we fill the setup tables and extract the data from the setup tables, which avoids a performance impact on the DB tables.
    Setup tables are application-specific.
    It is a cautious practice to delete the setup tables before filling them for a specific DataSource.
    3.
    Re: LO-Cockpit  V1 and V2 update
    http://www.sap-img.com/business/lo-cockpit-step-by-step.htm
    Regards,
    Ram.

  • Generic datasource with Delta queue

    Hi All,
    I've created a generic DataSource on the pricing tables A* and KONP. Next, I used the How-To document "Create generic DataSource that uses the delta queue" and found the right BTE to catch price modifications. Up to this point, everything works fine.
    I've extracted an init delta to BW, and now, on R/3, in RSA7, I can see a delta queue for my generic DataSource.
    When I modify price conditions in VK12, the BTE works fine and I can see the counter in RSA7 increasing. I can also see new entries in table TRFCQOUT with status READY (either in SE16 or in SMQ1).
    The problem is :
    - In RSA7, if I want to display the posted data, the system returns an empty list (whereas the counter is different from 0).
    - When I run a delta InfoPackage, no data is sent to BW, but in table TRFCQOUT the status has been changed to READ for entries that previously had status READY (and old entries that had status READ due to a previous delta upload are deleted).
    Does anyone have any idea about the reason for this issue?
    Thanks for any help
    AJ

    Sure, I can.
    The problem was in the specific coding that updates the delta queue: not all the fields were filled, and some of these fields were used as selection criteria in the delta InfoPackage...
    In RSA7, when you try to see the data in the queue, the system only shows you the records that match the selection criteria of your delta InfoPackage... so in our case, because of those empty fields, no records were selected...
    AJ

  • How to delete the data in update queue and delta queue for Queued delta?

    Dear BWers,
    How do I delete the delta queue and update queue data before I fill the setup tables for an extraction based on queued delta? Please help.
    Thanks
    Raj

    Hi Raj,
    I think you need some groundwork on LO extraction, the same as others here. Please read the three blogs explicitly created for LIS by Roberto Negro:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    Also, OSS note 380078 should clear your doubts regarding BW delta queue maintenance.
    Please let me know if this material has been sufficient.
    regards,
    raj

  • What are the differences between the extraction queue, the delta queue and the update queue?

    hi guru's
    What are the differences between the extraction queue, the delta queue and the update queue? Can you describe them briefly?
    Thanks & Regards
    nandi

    Dear Prabha,
    Basically, when any document is posted in R/3, it is written to the update table; from there it is taken to the delta queue to be sent to the BW side.
    When extraction starts, data is sent to BW from the delta queue; then the cycle starts again.
    When you post any document in the OLTP system (e.g. SAP R/3),
    say you create a sales order with VA01, the posting is made to the application tables (VBAK/VBAP) through V1 and also to some other tables through V2. The communication structure is written to the update queue, the extraction queue or (directly) the delta queue, depending on the update mode selected. V3 always follows V2, and we are supposed to schedule it.
    From this delta queue, data is extracted by BW InfoPackages.
    There are various update methods which determine whether the extraction queue or the delta queue is used: when a document posting takes place, it also writes data into the extraction queue (through the V1 update), and if we use the queued delta method, this data is collected by the collective run and written to the delta queue, from which BW requests the data.
    There are lots of posts on SDN about this; please have a look at them.
    one for your reference...
    https://www.sdn.sap.com/irj/sdn/profile?userid=3507509
    Hope it helps...
    Message was edited by:
            Ashish Tewari

  • Generic DataSource with Delta and function module

    Hi all,
    who can help me?
    I have created a generic DataSource with a function module and
    delta.
    The extractor runs well when I use full update and also initialization.
    If I start the delta extraction, the extractor crashes with a short dump;
    the message is SAPSQL_INVALID_FIELDNAME.
    What can I do, and what is wrong?
    regards
    thorsten Weiss

    Hi Roberto,
    here is the code from the function-module:
    FUNCTION zbw_mm_get_eket.
    ""Lokale Schnittstelle:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE DEFAULT 1000
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SBIW_BOOL DEFAULT SBIW_C_FALSE
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
      INCLUDE lrsalk01.
    * DataSource for table EKET
      TABLES: zv_mm_eket.
    * Internal table for processing
      DATA:   itab_0 TYPE TABLE OF zstr_eket WITH HEADER LINE.
      TYPES: BEGIN OF typ_categ,
              j_4kbwef    TYPE atnam,
              /afs/bwel   TYPE j_4kbwef,
             END OF typ_categ.
      DATA: l_s_data_eket  TYPE zstr_eket,
            ld_cat_struct  TYPE j_4kcsgr,
            lt_cat_fields  TYPE TABLE OF j_4kcif001,
            ls_cat_fields  TYPE j_4kcif001,
            ls_mara        TYPE mara,
            l_tabix        LIKE sy-tabix,
            itab_cat       TYPE TABLE OF typ_categ ,
            ls_cat         TYPE typ_categ,
            h_feldsize1(8)        TYPE c, "due to type conflict in the function module
            h_feldsize2(8)        TYPE c. "due to type conflict in the function module
    * Auxiliary selection criteria structure
      DATA: l_s_select TYPE rsselect.
    * Maximum number of lines for DB table
      STATICS: s_t_select     LIKE rsselect OCCURS 0 WITH HEADER LINE,
               s_t_fields     LIKE rsfieldsel OCCURS 0 WITH HEADER LINE,
    * counter
              s_counter_datapakid LIKE sy-tabix,
    * cursor
              s_cursor TYPE cursor.
    * Select ranges
      RANGES: l_r_ebeln       FOR zv_mm_eket-ebeln,
              l_r_ebelp       FOR zv_mm_eket-ebelp,
              l_r_bsart       FOR zv_mm_eket-bsart.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
        CASE i_dsource.
          WHEN 'ZDS_V_MM_EKET'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * This is a typical log call. Please write every error message like this.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_dsource            "message variable 1
                      ' function modul was created for DS ' &
                      'ZDS_V_MM_EKET"!'.
            "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
        APPEND LINES OF i_t_select TO s_t_select.
    * Fill parameter buffer for data extraction calls
    *   S_T_SELECT-REQUNR    = I_REQUNR.
    *   S_T_SELECT-DSOURCE   = I_DSOURCE.
    *   S_T_SELECT-MAXSIZE   = I_MAXSIZE.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF i_t_fields TO s_t_fields.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First call      OPEN CURSOR + FETCH
    *                Following calls FETCH only
    * First data package -> OPEN CURSOR
        IF s_counter_datapakid = 0.
    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
          LOOP AT s_t_select INTO l_s_select WHERE fieldnm = 'EBELN'.
            MOVE-CORRESPONDING l_s_select TO l_r_ebeln.
            APPEND l_r_ebeln.
          ENDLOOP.
          LOOP AT s_t_select INTO l_s_select WHERE fieldnm = 'EBELP'.
            MOVE-CORRESPONDING l_s_select TO l_r_ebelp.
            APPEND l_r_ebelp.
          ENDLOOP.
          LOOP AT s_t_select INTO l_s_select WHERE fieldnm = 'BSART'.
            MOVE-CORRESPONDING l_s_select TO l_r_bsart.
            APPEND l_r_bsart.
          ENDLOOP.
    * Determine the number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one-to-one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
          OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT (s_t_fields) FROM zv_mm_eket
          WHERE
          ebeln      IN               l_r_ebeln           AND
          ebelp      IN               l_r_ebelp         AND
          bsart    IN             l_r_bsart.
        ENDIF.                             "First data package ?
    * Fetch records into the interface table,
    * named E_T_'Name of extract structure'.
    *   FETCH NEXT CURSOR s_cursor
    *              APPENDING CORRESPONDING FIELDS
    *              OF TABLE e_t_data
    *              PACKAGE SIZE i_maxsize.
        FETCH NEXT CURSOR s_cursor
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE itab_0
                   PACKAGE SIZE i_maxsize.
        LOOP AT itab_0 INTO l_s_data_eket.
          l_tabix = sy-tabix.
    * Read the creation date from EKKO
          SELECT SINGLE aedat FROM ekko INTO l_s_data_eket-sydat
                         WHERE ebeln = l_s_data_eket-ebeln.
    * Read the partner from EKPA
          SELECT SINGLE lifn2 FROM ekpa INTO l_s_data_eket-plief
                         WHERE ebeln = l_s_data_eket-ebeln AND
                               ebelp = l_s_data_eket-ebelp AND
                               ekorg = l_s_data_eket-ekorg AND
                               werks = l_s_data_eket-werks .
          IF NOT l_s_data_eket-matnr IS INITIAL .
    *A Read material for category structure j_4kcsgr (F001 or R002)
            CLEAR ls_mara.
            CALL FUNCTION 'J_3A1_LESEN_MARA_SINGLE'
                 EXPORTING
                      i_matnr         = l_s_data_eket-matnr
                 IMPORTING
                      e_mara          = ls_mara
                 EXCEPTIONS
                      param_not_valid = 1
                      OTHERS          = 2.
            IF sy-subrc NE 0.
            ENDIF.
    *E Read material for category structure j_4kcsgr (F001 or R002)
    *A Split the stock category
            REFRESH lt_cat_fields.
            CALL FUNCTION 'J_4KG_SPLIT_CAT'
              EXPORTING
                client                            = sy-mandt
                csgr                              = ls_mara-j_4kcsgr
                cat_appl                          = 'S'
                cat_value                         = l_s_data_eket-j_4kscat
    *    NECESSARY_SPECIFIED               = ' '
              TABLES
                cat_fields_tab                    = lt_cat_fields
              EXCEPTIONS
                no_category_structure_found       = 1
              OTHERS                            = 2.
            IF sy-subrc <> 0.
    *    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
            ELSE."sy-subrc <> 0
    * Process the results
              LOOP AT lt_cat_fields INTO ls_cat_fields.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_SCONFIG'.
                  l_s_data_eket-zz_bwel_sconfig = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_CONFIG'.
                  l_s_data_eket-zz_bwel_config = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_COUNTRY'.
                  l_s_data_eket-j_3abwel_country = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_COUNTRYGRP'.
                  l_s_data_eket-zz_bwel_coungrp = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_STOCKTYPE'.
                  l_s_data_eket-zz_bwel_stktype = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_ORDER'.
                  l_s_data_eket-zz_bwel_order = ls_cat_fields-j_4kcatv.
                ENDIF.
                IF ls_cat_fields-j_4kbwef EQ 'BW_CAT_QUALITY'.
                  l_s_data_eket-j_3abwel_qual = ls_cat_fields-j_4kcatv.
                ENDIF.
              ENDLOOP."lt_cat_fields
            ENDIF.
    *E Split the stock category
    *A Split the size grid (matrix)
            IF NOT l_s_data_eket-j_3asize IS INITIAL.
              CALL FUNCTION 'J_3A_SPLIT_SIZES'
                   EXPORTING
                        matnr              = l_s_data_eket-matnr
                        j_3asize           = l_s_data_eket-j_3asize
                   IMPORTING
                        j_3akord1          = l_s_data_eket-j_3abwel_color
                        j_3akord2          = h_feldsize1
                        j_3akord3          = h_feldsize2
                   EXCEPTIONS
                        no_grid_determined = 1
                        OTHERS             = 2.
              IF sy-subrc <> 0.
    *    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
              ELSE.
                l_s_data_eket-zz_bwel_size1 = h_feldsize1.
                l_s_data_eket-zz_bwel_size2 = h_feldsize2.
              ENDIF.
            ENDIF."not l_s_data_eket-J_3ASIZE is initial
    *E Split the size grid (matrix)
            MODIFY itab_0 FROM l_s_data_eket INDEX l_tabix.
          ENDIF."not l_s_data_eket-matnr is initial
        ENDLOOP.                                                "itab_0
    * Pass to the output table
        APPEND LINES OF itab_0 TO e_t_data.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
        s_counter_datapakid = s_counter_datapakid + 1.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    regards
    thorsten
