Customized SD Datasource deltas

Hi all,
I have a requirement to append some fields to the Billing Item data source.  These fields are generally from other VB** tables.
If I add them to the extract structure and then write an exit in EXIT_SAPLRSAP_001, is that enough?
Or will I have an issue with deltas for changes to those appended fields as is described in OSS note 576886?
Will I have to do complex code to find the before and after image for all the fields?
Does anyone have more detailed steps for a solution?

Hi,
a generic DataSource is one you create yourself via transaction RSO2.
A generic DataSource can be based on a table, a view, or an extract structure.
In the latter case you also have to write a function module (the extractor); you can see an example in SE37, FM RSAX_BIW_GET_DATA_SIMPLE.
You can delta-enable your generic DataSource; the challenge is to find a suitable field. The easiest case is when your source table has a TIMESTAMP field telling which records have been changed; a calendar day or a sequential number is also supported. In your case, since you don't have any information about when the document was changed, you'll have to use at least a view joining VBPA and VBUK (the header) to get the "changed on" date, and base your delta on that field.
This means you'll potentially extract duplicate information (if you request a delta twice during the same day); therefore you'll have to model an ODS with overwrite update so that this data is not cumulated.
Hoping this sheds some light,
Olivier.
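The extractor FM Olivier mentions follows the interface of the template RSAX_BIW_GET_DATA_SIMPLE: it is called once for initialization and then repeatedly, returning one data package per call until it raises NO_MORE_DATA. A much simplified sketch, with all Z* names (function, view, extract structure) being placeholders:

```abap
FUNCTION zbw_get_billing.
* Simplified sketch of a generic extractor FM, modeled on the
* template RSAX_BIW_GET_DATA_SIMPLE. All Z* names are placeholders.
* Assumed interface (same shape as the template):
*   IMPORTING  i_initflag  i_maxsize ...
*   TABLES     i_t_select  i_t_fields
*              e_t_data STRUCTURE zoxbw_billing
*   EXCEPTIONS no_more_data  error_passed_to_mess_handler

  STATICS: s_cursor TYPE cursor,
           s_opened TYPE abap_bool.

  IF i_initflag = 'X'.
*   Initialization call: remember the selections from i_t_select
*   (omitted here); no data is returned yet.
    RETURN.
  ENDIF.

  IF s_opened = abap_false.
*   First data call: open a cursor on the source view once.
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT * FROM zv_billing_bw.     "e.g. a view over VBRK/VBRP
    s_opened = abap_true.
  ENDIF.

* Every call returns at most one package of i_maxsize records.
  FETCH NEXT CURSOR s_cursor
    INTO CORRESPONDING FIELDS OF TABLE e_t_data
    PACKAGE SIZE i_maxsize.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    RAISE no_more_data.                "tells the service API to stop
  ENDIF.
ENDFUNCTION.
```

The real template keeps the selection criteria in a STATICS structure between calls; compare with RSAX_BIW_GET_DATA_SIMPLE in SE37 for the full pattern.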

Similar Messages

  • CUSTOM ADDRESS DATASOURCE

    Hi Experts,
    I want to know if anyone has developed a custom address DataSource (delta-enabled, of course) using the ADRC table to include all the BP addresses and contract account addresses. I need to know if it is achievable. I am working in a BI 7.0 environment and we are implementing an IS-U CCS project.
    Thanks,
    SB.

    It is a very common requirement.
    Check following datasources if they match your requirement:
    http://help.sap.com/saphelp_nw70/helpdata/EN/2f/fe9a13db67a24dad2766acc03ec0df/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/EN/c6/8d80fc1597417787cf548d04216881/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/EN/77/64c1d51e964f5fbc31f3eefac46000/frameset.htm

  • Red Traffic Light for Datasources Delta Queues

    Hi all,
    Our R/3 source system is connected to our BW system.
    Between the two systems we operated standard logistics datasources delta queues, by filling the setup tables and performing an initial update.
    The datasources delta queues were created and used over a month (according to the RSA7 they all marked with green traffic light).
    Now, we copied the R/3 source system to a new one.
    After doing so, all the delta queues traffic light turned to be red.
    Can anyone provide a technical explanation/reason for this problem?
    Also, is there something we could do to "save" these delta queues, without needing to delete and recreate them?
    Thanks ahead,
    Shani Bashkin

    Hi Eddo,
    Thanks for your help.
    Yes, I'm using the same system name and system ID. The new copied system has the same name and ID as the productive system.
    Also, it seems like the RFC connection to the BW is lost.
    The question is why?
    Thanks,
    Shani Bashkin

  • Master datasource delta problem.

    Hello,
    We are in the process of implementing the logistics module for our client. We are on BI7 and using the BI7 data flow for master and transaction data.
    While working with the master DataSources that support delta functionality, we have faced the following issue. All these DataSources are based on ALE change pointers. Somehow we are getting the same number of records in every delta load; basically it brings the same records each time.
    E.g. 0CUSTOMER_ATTR: when I extracted the data the first time in delta mode, it brought 1200 records, and since that day it has been bringing 1000+ records in every delta load, even if I start the next delta a minute later. I am very confident that not much is changing in ECC; it is reading from the change pointer table but not updating that table once the extraction is completed.
    When we dug into this issue, we realized that somehow the change pointers are not getting updated in the change pointer table. We haven't changed anything in the standard extractor, but still it is happening with all delta-enabled master DataSources.
    Can you please help me solving this issue ?
    Regards,

    Hi Maasum
    Not sure whether the below will help, but give it a try:
    /people/simon.turnbull/blog/2010/04/08/bw-master-data-deltas-demystified
    Thanks
    Kalyan

  • Creating Custom Hierarchy dataSource

    hi all,
    I want to create a custom hierarchy DataSource from a Z-table to extract data from the R/3 side to BW (not for attributes and texts).
    I know we have two approaches:
    1) creating an XLS file;
    2) writing a Z function module.
    Can anybody tell me which one is better, and why?
    What are the steps if I use the second approach, i.e. via a function module?
    Regards,
    San!

    Hi,
    Please check out the whitepaper below:
    https://www.sdn.sap.com/irj/sdn/nw-bi?rid=/library/uuid/50cbb737-b36f-2c10-c78b-b63d116ce313

  • Datamart - Generate Datasource DELTA flag check

    We are on BW 3.5 (source system) and trying to send data across to target system (SCM / BI 70).
    When I perform "generate datasource" for a cube, the resulting "8*" DataSource is generated correctly, but with the delta flag checked (transaction RSA6) for that DataSource.
    With this flag set, we are forced to run an init and then a delta from the target system to extract the data.
    Question :
    1) How do we "uncheck"  the Delta Flag setting ? Is it possible ?
    2) If we cannot uncheck the flag , how do we run Full loads from our target system ?
    Thanks in advance .
    Hitesh

    Hi,
    Even if the DataSource is delta-enabled, that only means you can run delta loads for it; it does not mean that you cannot run full loads.
    To uncheck the delta flag for a DataSource, you would have to disable that particular delta entry for the DataSource in table RODELTAM. This can be done using a custom ABAP program.
    Hope it helps !
    Regards,
    Rose.
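    Before writing such a program, it is worth checking what the delta settings actually are. Assuming the usual OLTP-side metadata tables, the DataSource header, including its delta process, sits in table ROOSOURCE, and the delta process itself is defined in RODELTAM. A small check could look like this (the DataSource name is a placeholder):

```abap
* Quick check of a DataSource's delta capability in the source system.
* ROOSOURCE is the DataSource header table; its DELTA field holds the
* delta process key defined in RODELTAM. An empty DELTA field means
* the DataSource supports full loads only. '8ZMYCUBE' is a placeholder.
DATA lv_delta TYPE roosource-delta.

SELECT SINGLE delta FROM roosource
  INTO lv_delta
  WHERE oltpsource = '8ZMYCUBE'
    AND objvers    = 'A'.              "active version

IF sy-subrc <> 0.
  WRITE: / 'DataSource not found'.
ELSEIF lv_delta IS INITIAL.
  WRITE: / 'Full loads only - no delta process assigned'.
ELSE.
  WRITE: / 'Delta process:', lv_delta.
ENDIF.
```

    Note that directly manipulating these tables bypasses the standard maintenance transactions, so any such change should be tested in a sandbox system first.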

  • 0FI_AR_4 Datasource, Delta

    Hi Experts,
    we are using the 0FI_AR_4 DataSource, which is delta-enabled, but the problem is that we can run the delta only once a day.
    Can anyone please let me know how to change this so that I can run the delta more than once a day?
    Any document or a link would be of great help.
    Thanks in advance.
    Ananth

    hi Ananth,
    take a look Note 991429 - Minute Based extraction enhancement for 0FI_*_4 extractors
    https://websmp203.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=991429&_NLANG=E
    Symptom
    You would like to implement a 'minute based' extraction logic for the data sources 0FI_GL_4, 0FI_AR_4 and 0FI_AP_4.
    Currently the extraction logic allows only for an extraction once per day without overlap.
    Other terms
    general ledger  0FI_GL_4  0FI_AP_4  0FI_AR_4  extraction  performance
    Reason and Prerequisites
    1. There is a huge volume of data to be extracted on a daily basis from FI to BW, and this requires a lot of time.
    2. You would like to extract the data at more frequent intervals during the day, e.g. 3-4 times a day, without re-extracting all the data that you have already extracted that day.
    In situations where there is a huge volume of data, a lot of time is taken up when extracting once daily. Minute-based extraction enables the extraction to be split into convenient intervals and run multiple times during a day. The amount of data in each extraction is thereby reduced, so the extraction can be done more effectively. This should also reduce the risk of extractor failures caused by huge data volumes in the system.
    Solution
    Implement the relevant source code changes and follow the instructions in order to enable minute based extraction logic for the extraction of GL data. The applicable data sources are:
                            0FI_GL_4
                            0FI_AR_4
                            0FI_AP_4
    All changes below have to be implemented first in a standard test system. The new extractor logic must be tested very carefully before it can be used in a production environment. Test cases must include all relevant processes that would be used/carried in the normal course of extraction.
    Manual changes are to be carried out before the source code changes in the correction instructions of this note.
    1. Manual changes
    a) Add the following parameters to table BWOM_SETTINGS:
                             MANDT  OLTPSOURCE    PARAM_NAME          PARAM_VALUE
                             XXX                  BWFINEXT
                             XXX                  BWFINSAF            3600
                  Note: XXX refers to the specific client (e.g. 300) under use/test.
                  This can be done using transaction SE16 for table BWOM_SETTINGS:
                              Menu --> Table Entry --> Create
                              --> Add the above two parameters one after the other
    b) In the views BKPF_BSAD, BKPF_BSAK, BKPF_BSID and BKPF_BSIK,
                           add the following field under the view fields:
                           View Field  Table    Field      Data Element  DType  Length
                           CPUTM       BKPF     CPUTM      CPUTM         TIMS   6
                           This can be done using transaction SE11 for views
                           BKPF_BSAD, BKPF_BSAK, BKPF_BSID, BKPF_BSIK (one after another):
                               --> Change --> View Fields
                               --> Add the above field with exactly these details
    c) In table BWFI_AEDAT, index-1 (for extractors),
                           add the field AETIM (after the existing MANDT, BUKRS and AEDAT)
                           and activate this non-unique index on all database systems (or at least on the database in use).
                           This can be achieved using transaction SE11 for table BWFI_AEDAT:
                               --> Display --> Indexes --> Index-1 for extractors
                               --> Change
                               --> Add the field AETIM at the last position (after the AEDAT field)
                               --> Activate the index on the database
    2. Implement the source code changes as in the note correction instructions.
    3. After implementing the source code changes using the SNOTE instructions, add the following parameters to the respective function modules and activate them.
    a) Function Module: BWFIT_GET_TIMESTAMPS
                        1. Export Parameter
                        a. Parameter Name  : E_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
                        2. Export Parameter
                        a. Parameter Name  : E_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Pass Value      : Ticked/checked (yes)
    b) Function Module: BWFIT_UPDATE_TIMESTAMPS
                        1. Import Parameter (add after I_DATE_HIGH)
                        a. Parameter Name  : I_TIME_LOW
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
                        2. Import Parameter (add after I_TIME_LOW)
                        a. Parameter Name  : I_TIME_HIGH
                        b. Type Spec       : LIKE
                        c. Associated Type : BKPF-CPUTM
                        d. Optional        : Ticked/checked (yes)
                        e. Pass Value      : Ticked/checked (yes)
    4. How the minute-based extraction logic works:
                  The minute-based extraction considers the time, in addition to the date, of the document (whether changed or new) when selecting data. The code is modified to evaluate the new flags in table BWOM_SETTINGS (BWFINEXT and BWFINSAF); if these flags are not set as per the instructions, the earlier extraction logic continues to apply unchanged.
    The safety interval now depends on the flag BWFINSAF (in seconds; default 3600 = 1 hour). It is there to ensure that documents whose posting is delayed, for example by slow update modules, are still captured. There is also specific coding that posts an entry to BWFI_AEDAT with the details of any document that failed to post within the safety limit, so that it is extracted as a changed document at least, if it was missed as a new document. If a large number of documents fail to post within the safety limit, the volume of BWFI_AEDAT grows correspondingly.
    The flag BWFINSAF can be set to a particular value depending on specific requirements (in seconds, but at least 3600 = 1 hour), e.g. 24 hours / 1 day = 24 * 3600 = 86400. With the new logic switched on (BWFINEXT = 'X'), the flags BWFIOVERLA, BWFISAFETY and BWFITIMBOR are ignored, while BWFILOWLIM and DELTIMEST work as before.
    As per the instructions above, index-1 of table BWFI_AEDAT now includes the field AETIM, which lets the new logic extract faster, since AETIM is also used in the selection. The field can be removed again if the standard logic is restored.
    With the new extractor logic implemented, you can switch back to the standard logic at any time by resetting BWFINEXT from 'X' to ' ' and extracting as before. But ensure that no extraction is running (for any of the 0FI_*_4 extractors/DataSources) while switching.
    As with the earlier logic, to restore a previous timestamp in table BWOM2_TIMEST and re-extract the data of a previous run, LAST_TS can be set back to the previous extraction timestamp, provided no extraction is currently running for that particular extractor/DataSource.
    With the extraction frequency increased (say, 3 times a day), the volume of data extracted in each run decreases, and hence each extractor run takes less time.
    You should optimize the interval between extractor runs by testing which intervals give the best performance. We cannot give a definite suggestion here, as it varies from system to system and depends on data volume, the number of postings per day, and other factors.
    To turn on the new logic, BWFINEXT has to be set to 'X', and reset to ' ' when reverting. This change must only be made when no extractions are running, considering all the points above.
                  With the new minute-based extraction logic switched on:
    a) Ensure index-1 of BWFI_AEDAT is enhanced with the addition of AETIM and is active on the database.
    b) Ensure BWFINSAF is at least 3600 (1 hour) in BWOM_SETTINGS.
    c) Maintain an optimal value of DELTIMEST (the recommended/default value is 60).
    d) Perform proper functional and performance testing in a standard test system, with settings and data identical to production, and make sure the results are all positive before moving the changes to the production system.
    http://help.sap.com/saphelp_bw33/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
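    The safety-interval idea from the note can be illustrated in a few lines: the upper limit of the delta selection is kept BWFINSAF seconds behind the current time, so documents still sitting in delayed update tasks are not lost. This is only a sketch of the idea, not the note's actual coding:

```abap
* Sketch: how a safety interval shifts the delta upper limit.
* Illustration only - not the coding delivered with note 991429.
CONSTANTS lc_bwfinsaf TYPE i VALUE 3600.  "safety interval in seconds

DATA: lv_now   TYPE timestamp,
      lv_upper TYPE timestamp.

GET TIME STAMP FIELD lv_now.

* Select only documents posted up to "now minus the safety interval";
* anything younger is left for the next delta run.
lv_upper = cl_abap_tstmp=>subtractsecs( tstmp = lv_now
                                        secs  = lc_bwfinsaf ).
* e.g. ... WHERE <posting timestamp> <= lv_upper
```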

  • Execute infopackage error, using custom FM Datasource

    hi, experts,
    I have a problem with a data load from an FM-based DataSource to the PSA (the source system is the BI system itself).
    Monitor status message:
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    in the source system .
    Monitor details message:
    Requests (messages): Everything OK
    Extraction (messages): Missing messages
    Data request received
    Data selection scheduled
    104 Records sent ( 104 Records received )
    Missing message: Selection completed
    Transfer (IDocs and TRFC): Everything OK
    Processing (data packet): Everything OK
    job log:
    Job started
    Step 001 started (program SBIE0001, variant &0000000001394, user ID sunsh)
    Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
    DATASOURCE = ZDST_DETAILDAYS
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 104 records
    Result of customer enhancement: 104 records
    Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
    tRFC: Data Package = 0, TID = , Duration = 00:00:02, ARFCSTATE =
    tRFC: Start = 2009/10/13 09:43:22, End = 2009/10/13 09:43:24
    tRFC: Data Package = 1, TID = 0A00920601194AD3DB3C19D6, Duration = 00:00:00, ARFCSTATE = RECORDED
    tRFC: Start = 2009/10/13 09:43:24, End = 2009/10/13 09:43:24
    Asynchronous transmission of info IDoc 3 in task 0003 (0 parallel tasks)
    I have validated the data using RSA3, and that is OK; the data is already loaded into the PSA table and can be displayed, but the status always stays "yellow".
    I have re-activated and replicated the DataSource.
    Can someone tell me how to solve this issue?
    thanks in advance
    xwu.

    Hi Ansel,
    if no data is returned, the exception NO_MORE_DATA is raised correctly.
    But it contains 104 records. The FM code looks like this:
       CALL FUNCTION 'GET_DATA'.
       IF sy-subrc <> 0.
         RAISE no_more_data.
       ENDIF.
       s_counter_datapakid = s_counter_datapakid + 1.
    Maybe the FM contains some error, but verifying with RSA3 it is OK.
    Thanks a lot.
    Edited by: xwu wu on Oct 14, 2009 8:23 AM

  • APO-DP Datasource - Delta

    We have a requirement to enable delta for a DataSource that is based on a planning area; it seems SAP has deactivated this functionality for this kind of DataSource.
    Has anyone tried to enable delta for a planning-area-based DataSource? If so, can you please share your experience?
    Thanks,
    Raman

    We are using full loads, as SAP said delta would take more time than a full load.
    The delta functionality might have been improved in later versions; check the OSS notes/help.

  • Customizing the DataSource for my media player application

    Hi,
    I am trying to build a J2ME player that is fed from a list of files (e.g. taking a few MP3 files and playing them as one continuous stream).
    I understand that DataSource can be customized for such needs, but I don't really know how to do that, and I can't find a good, simple explanation.
    Please let me know what the steps are, or point me to a good source.
    Thank in advance

    Hello,
    I think that maybe there's nothing wrong with your jar. If you don't install jmf, it will not find the codecs.
    "Q: What is JMF Registry?
    JMF 2.1.1 maintains a registry of available plugins, package prefixes and other settings in a file called jmf.properties. This is a binary file and should only be modified using the provided JMFRegistry application. This application is a part of the JMF 2.1.1 jmf.jar file and can be run as "java JMFRegistry". It requires that you have Swing-1.1 in your CLASSPATH (or you can use JDK 1.2 or later). "
    I suppose JMF searches for the codecs via jmf.properties. If you don't want to install JMF, try adding just the jmf.properties file to your classpath, or keep it in the same directory as your jar.
    Hope it helps.
    ANeto

  • How to modify customer generated DataSource?

    Hi Experts,
    I have a DataSource "1ZICM_DOCSE" which currently extracts some of the fields from source table "ZICM_DOCSE". After some investigation, I discover that some necessary fields from the table are missing in the extraction structure.
    Questions:
    1. Since the name of the DS starts with 1 (SAP namespace), I supposed that it is a business content DS. However, I could not find it in RSA5 but only in RSA6. Why is it so? Is it a customer generated DS?
    2. In RSA2, it shows that the extraction method is "F1 Function Module (Complete Interface)". When I double click on the extractor "ZICM_EXTR_DOCSE", ABAP editor is shown with the header indicating "This file has been generated. Never change it manually!". How do we generate an extractor in source system?
    3. In this case, how can I modify the extraction structure to include more fields from the source table?
    Thanks in advance for the effort!
    Regards,
    Meng

    Hi Venkatesh, Hi Durgesh,
    first of all, thanks for your replies. However, I could not carry out the procedures you mentioned, and I hope you can help me clear up my doubts.
    1. Judging from your input, I assume that the DataSource "1ZICM_DOCSE" is a customer-generated DS. Am I right? How is the DS generated? Via a standard SAP program?
    2. When I go to RSO2 and try to display the DS, I get following message.
    DataSource 1ZICM_DOCSE is extracted using functional module ZICM_EXTR_DOCSE
    Message no. RJ042
    Diagnosis
    Data from DataSource 1ZICM_DOCSE has been extracted using function module ZICM_EXTR_DOCSE.
    If you edit and save the DataSource by maintaining generic extraction, extraction no longer takes place
    using this function module, but is performed by a database view, a transparent table, a functional area
    of SAP query, or a self-defined extraction module.
    If 1ZICM_DOCSE is included in the deliverd SAP Content, you can activate extraction using function module
    ZICM_EXTR_DOCSE again by transferring this Content with transaction RSA5 or the IMG.
    I can neither display nor change the DS in RSO2.
    3. Since the FM is generated, why should we change it? The changes will be overwritten once the FM is generated again, right?
    Looking forward to your reply!
    Thanks,
    Meng

  • Generic Datasource Delta Settings Change

    Hi Experts,
    Currently the delta field in generic datasource is set to timestamp (LOCAL) and Upper limit is blank.
    I need to change the delta setting to timestamp (UTC), or set the upper limit to 1000 seconds.
    Can anyone tell me how to change this in production? Do I need to make the change in dev and transport it to production?
    If I change the generic DataSource, do I have to initialize the delta again?
    Please provide a step-by-step solution.
    Regards,
    Anand

    What about the delta initialization? After transporting the DataSource from R/3 dev to R/3 production, I have to replicate the DataSource, right?
    In that case, what will happen to my delta?
    One more thing: in case I need to change only the upper limit value, do I still have to change it in dev and transport it to production?
    Regards,
    Anand

  • FI datasources delta query

    Hi ALL
    I want to know the difference between delta in FI and delta in LO.
    1. I know that LO DataSources send after and before images (ABR) and can feed an ODS or a cube.
    2. I know that delta-enabled FI DataSources deliver an after image only, and in such cases cannot feed a cube directly; they need an ODS in between.
    3. I know that the RSA7 queue holds the delta records. This is true for both FI and LO DataSources.
    But in the case of the LO cockpit, we select the queued delta/direct delta option, run LBWQ jobs etc. to push data into the RSA7 queue.
    What needs to be done for FI DataSources? I do see them in the RSA7 queue. Does this mean they pull data directly, and we don't need to run any jobs for FI DataSources?
    Please clarify my doubts.
    SP

    Hi
    Why don't you look at this link?
    http://help.sap.com/saphelp_nw04/helpdata/en/af/16533bbb15b762e10000000a114084/content.htm.
    This should answer your question
    Prakash

  • Generic Datasource Delta upload Issue

    Hi all,
    I have created a generic DataSource for Solution Manager ODS 0crm_process_srv_h, which contains only the user status, transaction number, transaction description and the GUID of the CRM order object (the ODS key field).
    I have taken the transaction number as the delta field, as a timestamp, but the delta upload is not happening. If I use the GUID or the status instead, the delta upload also does not happen. Please suggest the best field to use for the delta upload.
    Thanks and Regards,
    SGK.

    Hi,
    in your case you need to create a function module as the extractor. Create an extract structure as well, and add a timestamp field (let's call it ZDELTAHELP) to be used as the delta-relevant field of the DataSource. After a successful init, the last extraction timestamp is passed to the FM, and you can start selecting all RESB entries for orders created and/or changed from that time on, as well as the deleted ones (for those you need to read the change documents as well). For more information about generic extraction using a FM, search the forum or check out my business card; there you will find a link to a weblog about this topic.
    regards
    Siggi
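    The delta-relevant field Siggi describes reaches the extractor through the selection table: after a successful init, the service API passes the last extraction timestamp as a range on that field. Inside the FM, the selection could be read and applied roughly like this (ZDELTAHELP and the source table are assumed names):

```abap
* Sketch: applying the delta range for the timestamp field ZDELTAHELP
* inside a generic extractor FM. Table and field names are placeholders.
DATA: ls_select TYPE rsselect,                "row of i_t_select
      lr_ts     TYPE RANGE OF timestamp,
      ls_ts     LIKE LINE OF lr_ts.

LOOP AT i_t_select INTO ls_select WHERE fieldnm = 'ZDELTAHELP'.
  MOVE-CORRESPONDING ls_select TO ls_ts.      "sign/option/low/high
  APPEND ls_ts TO lr_ts.
ENDLOOP.

* On a delta request lr_ts holds something like 'I BT <last ts> <now>';
* on an init or full request it is empty and everything is selected.
SELECT * FROM ztab_orders
  INTO CORRESPONDING FIELDS OF TABLE e_t_data
  WHERE zdeltahelp IN lr_ts.
```

    Deletions are the one case this SELECT cannot see, which is why Siggi points to the change documents for deleted entries.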

  • CRM Datasource-Delta not Working

    Folks,
    I am using CRM Standard Datasource 0CMS_RTCM_TRAN-Resale Tracking and Claims Management data..
    I have enhanced this Datasource with four fields and transported to Quality system..
    no problem with the new additions..Datasource is extracting properly..
    but the delta update is not working. The delta queue is not picking up any records, although new transactions are getting updated in the respective source tables.
    When I checked in BWA1, I got these messages:
    Error message:
    (121) MS_RTCM_TRAN: Table SMOXHEAD contains no BDoc name.
    Warning messages:
    (111) MS_RTCM_TRAN: No mapping found for extract field SLSR_ADDRESS_I. (The same "no mapping" warning is shown for all the DataSource fields.)
    It works fine with full update, but for delta update I am facing these issues.
    Can you please explain what could be the reason for this and suggest a proper solution?
    Thanks and Regards,
    Yog

    Hi,
    Are there any transactions to change the BDoc name for standard DataSources?
    In BWA1, when I tried to change these adapter settings, I got this message:
    "DataSource 0CMS_RTCM_TRAN cannot be edited with this transaction".
    I hope that if I specify the BDoc name, the extraction will work,
    so please suggest where I can make these changes.
    Note: currently no extraction is happening because of this.
    Regards,
    Yoga
