Append extract structure of COPA data source

Hi,
  I have created a COPA data source and need to append the extract structure with field 'ZZAUDAT'. Can anyone please suggest how to do this?
Thanks,
Kim.

You will also need to write a user exit in CMOD to populate the values for that enhanced field.
See the thread below:
Re: enhancements
cheers,
Vishvesh
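
For illustration only, here is a minimal sketch of such a CMOD enhancement (project with component RSAP0001, user exit EXIT_SAPLRSAP_001, code placed in include ZXRSAU01). The datasource name 1_CO_PA_EXAMPLE, the extract structure CE11000 and the derivation of the date from the sales order are assumptions, not taken from this thread; ZZAUDAT must already exist in the extract structure (appended as discussed above) before the exit can fill it.

*--- Include ZXRSAU01 - sketch only, all names below are assumptions ---*
DATA: l_s_copa TYPE ce11000.              " hypothetical COPA extract structure

CASE i_datasource.
  WHEN '1_CO_PA_EXAMPLE'.                 " hypothetical COPA datasource name
    LOOP AT c_t_data INTO l_s_copa.
      " Hypothetical derivation: document date of the underlying sales order
      SELECT SINGLE audat FROM vbak INTO l_s_copa-zzaudat
        WHERE vbeln = l_s_copa-kaufn.
      MODIFY c_t_data FROM l_s_copa.
    ENDLOOP.
ENDCASE.

For larger volumes you would buffer the lookup rather than issuing a SELECT SINGLE per record.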

Similar Messages

  • View V/s Extract structure in generic data source!

    Hi,
    Is it mandatory that the view and the extract structure contain the same number of fields in generic extraction?
    Can we enhance the extract structure with a field which does not exist in the view?
    Please clarify with an example.
    Thanks,
    Ravi

    Thanks Diego!
    Actually, the view used in the definition of the data source joins four tables,
    namely AFRU, CRHD, AUFK and AFKO.
    Join conditions are:
    AFRU-MANDT = CRHD-MANDT
    AFRU-ARBID = CRHD-OBJID
    AFRU-MANDT = AUFK-MANDT
    AFRU-AUFNR = AUFK-AUFNR
    AFRU-MANDT = AFKO-MANDT
    AFRU-AUFNR = AFKO-AUFNR
    Now I want to know from which table the field AUFNR is mapped to the BW InfoObject.
    As the field exists in both tables AFRU and AUFK, I am confused about which table the data for the BW InfoObject is filled from.
    I need to document, for each InfoObject, how it is filled and from which table.
    Thanks,
    Ravi

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    I have a couple of questions regarding data extraction.
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide the info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now that there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Thanks!

    Hi,
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, so you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    When we don't find a suitable standard extractor, we go for a generic DataSource (for example, if I want sales information together with finance information in one data source, there is generally no standard extractor, so we create a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Timestamp (if the table has a timestamp field, the last changed records in that table can be identified by it, so it is easy to get the delta based on the timestamp).
    Calday (if the table does not have a timestamp field, look for a calendar-day field so that the delta can be based on the date on which documents were changed).
    Numeric pointer (if the table has neither of the above, use a continuously increasing numeric value, such as a document number, to identify new records).
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic datasource simply extracts data directly from the database table, without any interface between the applications/systems, so no setup table is involved.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    Correct.
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide the info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now that there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DS) you will get data for it from that date onwards, not historical data, so the setup table does not come into it (delta records come from the delta queue, not from the setup table).
    If you want historical data for the new field, then you need setup table deletion and a new setup run, etc.
    Hope it is clear..
    Regards,
    Satya

  • Can generic function module extraction be used with CRM data sources?

    Hi Friends,
    Can generic function module extraction be used with CRM data sources?
    90% of our extractions are generic function module extractions.
    How is it possible without BW adapters?
    It's very urgent... please.
    Thanks,
    Basava Raju

    Hi Madhu,
    Just curious ... if you already have a generic FM extractor, then just go into it and find out where it is reading the data from. If it's reading data from the CRM system, then there is no need for any BW adapter. Just in case you need any ABAP help, post here ... I may be of use to you.
    Best regards,
    Kazmi
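
    For reference, a generic function module extractor (transaction RSO2, extraction by function module) is normally built as a copy of the SAP template RSAX_BIW_GET_DATA_SIMPLE, and it can read CRM tables directly, as Kazmi describes. The sketch below is illustrative only: the function name, the extract structure ZOXCRM0001 and the source table CRMD_ORDERADM_H are assumptions, and selection/delta handling is omitted.

    FUNCTION z_biw_get_crm_data.
    *"  (Interface copied from template RSAX_BIW_GET_DATA_SIMPLE:
    *"   IMPORTING i_requnr, i_dsource, i_maxsize, i_initflag, ...
    *"   TABLES i_t_select, i_t_fields, e_t_data STRUCTURE zoxcrm0001
    *"   EXCEPTIONS no_more_data, error_passed_to_mess_handler)

      STATICS: s_cursor  TYPE cursor,
               s_counter TYPE sy-tabix.

      IF i_initflag = 'X'.
        " Initialization call: store request number and selections (omitted here)
      ELSE.
        IF s_counter = 0.
          " Open a cursor on the source table (assumption: CRM order headers)
          OPEN CURSOR WITH HOLD s_cursor FOR
            SELECT * FROM crmd_orderadm_h.
        ENDIF.
        " Return one package of i_maxsize records per call
        FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE i_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
        s_counter = s_counter + 1.
      ENDIF.

    ENDFUNCTION.

    Such a module is then entered in RSO2 as the extractor of the generic datasource; no BW adapter is involved.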

  • Data Extraction issue from the data source 2LIS_08TRTLP

    Hi All,
    We are facing an issue during extraction from data source 2LIS_08TRTLP (Shipment Delivery Item Data per Stage) in BW. For some outbound delivery numbers the gross weight (BRGEW) comes through as zero, even though a value is shown in the source system (ECC) in VL03 and in the LIPS table. BRGEW (gross weight) is directly mapped in the BW InfoSource. There is no restriction in the extraction, and 0 already arrives for this field in the first extraction layer. Even in the PSA all values are zero.
    We have checked those deliveries; the delivery is being split into batches, but not all split deliveries
    show this problem.
    With Thanks
    Shatadru

    Yes, I have done that: I filled the setup table for the shipment numbers related to that delivery number and checked in RSA3, and the required value does come through in RSA3. That means there is no problem in the extractor. But I cannot pull the data into BW now, as the system is in production and a delta comes every day. Still, in the first-layer ODS there is no value for that entry, which was loaded earlier (although there is no routine and no restriction).
    But I have one observation on the data source: that particular field is marked for INVERSION in the source. However, the particular delivery for which this happens is not cancelled, returned or rejected. PGI was created for that delivery and the delivery status is completed.

  • Error in Data Extraction for Asset Accouting Data Source

    Hi all,
    We have installed the new Asset Accounting flow, i.e. the 0FI_AA_20 data source. We are trying to load the data into BW. The problem is that the data volume is very high, around 35 lakh (3.5 million) records.
    I am getting the errors below while loading into BW:
    "Error 7 when sending an IDoc"
    "No more storage space available for extending an i 3 "
    Please suggest..
    Regards,
    Macwan James.

    Hi Macwan James,
    You can delete requests not only from the DataSource you are loading to, but also from the manage screens of other DataSources, if they are no longer required.
    Or else use the selection screen of the InfoPackage to load the data: change the selections and try the load again.
    Hope it helps!
    Regards,
    Pavan

  • APD using Query with multiple structures as a data source

    All,
    I want to set up an automatic process which executes a query and exports it to a shared drive as a CSV file. I have tried various options. When I try to use the APD to set up the extract, I get an error, because the query I am trying to use has structures in both rows and columns; hence I am unable to use this option. I tried RSCRM_BAPI; it works well, but there is an issue with scheduling it in a process chain. I created an event and scheduled this as a job to be triggered after the "event", as per SAP instructions, but the job does not exist and it is not possible to trigger it through the process chain unless the variables are hard-coded in the query, which I do not want to do.
    Can anyone tell me if there is a way to deal with the APD using a query with multiple structures?
    I would really appreciate it if someone could point me to the right solution...
    Thanks

    Hi Tanu ,
    The APD is an option, but it does not cope well with large amounts of data, hierarchies, or attributes in your query structure.
    Another option for this requirement is a report program using function module RRW3_GET_QUERY_VIEW_DATA.
    This works fine with multiple structures etc.
    There are some overheads with this FM, e.g. if the amount of data is too large the program will dump. The solution is to call the FM in a loop, dividing the amount of data to be fetched, e.g. reading the data quarter by quarter.
    To use this function module, write an ABAP program (SE38) which calls the FM and then writes the output into a flat file saved on the application server (AL11). From there, other systems can read it.
    To automate the whole process you can add the report programs to a process chain (RSPC), which can be scheduled as required.
    To pass input parameters you can use variants that pass the values to the report.
    Check this link for sample code:
    [http://www.tricktresor.de/content/index.php?navID=696&aID=496]
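    For illustration (independent of that link), a rough sketch of such a report follows. The InfoProvider/query names and the file path are assumptions, only a few of the FM parameters are used, and building proper CSV rows from the axis data is left out; verify the exact parameter and structure types in SE37 on your release.

    REPORT z_query_to_csv.

    DATA: lt_cell TYPE rrws_t_cell,          " returned cell values
          lt_axis TYPE rrws_thx_axis_data,   " returned axis (row/column) data
          ls_cell TYPE rrws_s_cell,
          lv_line TYPE string.

    CALL FUNCTION 'RRW3_GET_QUERY_VIEW_DATA'
      EXPORTING
        i_infoprovider = 'ZIC_SALES'         " assumption: InfoProvider name
        i_query        = 'ZQ_SALES_001'      " assumption: query technical name
      IMPORTING
        e_cell_data    = lt_cell
        e_axis_data    = lt_axis
      EXCEPTIONS
        OTHERS         = 1.
    CHECK sy-subrc = 0.

    " Write the formatted cell values to a file on the application server (AL11)
    OPEN DATASET '/tmp/query_output.csv' FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT lt_cell INTO ls_cell.
      lv_line = ls_cell-formatted_value.
      TRANSFER lv_line TO '/tmp/query_output.csv'.
    ENDLOOP.
    CLOSE DATASET '/tmp/query_output.csv'.

    A variant of such a report can then be scheduled in a process chain, as described above.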
    Hope this will be helpful .
    Regards,
    Jaya Tiwari

  • SM37 - "pause" V3-update job while importing data source and extract struct

    Hi
    Instead of deleting an hourly scheduled V3 update job and then recreating it (after an import of an extract structure and a data source), I came to wonder whether it is possible to, kind of, pause the job that I would otherwise delete and recreate.
    Is this somehow possible?
    My thought:
    1. Find the job in released status.
    2. Select the job and change it (top menu Job --> Change).
    3. Change the scheduling to some day in the future.
    4. Change the job status from released to scheduled.
    5. Import the extract structure and data source.
    6. Change the scheduled job back to an hourly run.
    7. Now the job should run on an hourly basis again...
    By the way: what do I import first, the data source or the extract structure?
    Thanks in advance and kind regards,
    Torben

    We did delete and recreate...

  • How to find Table Name and Field Names given a Data Source

    Hi,
    I tried the ROOSOURCE table in R/3 to find the extract structure and extractor names for a specific data source, let us say 2LIS_02_ITM (PO item level).
    I know the extract structure for this data source is MC02M_0ITM.
    I am not able to find where this structure gets its data for each field. I want to know the corresponding table names and the respective field names, for both the existing and the appended fields.
    Thanks,
    Naren

    Hi,
    Check in LBWE, click on Maintenance, and you will see the table names:
    EKKO
    EKPA
    EKPO
    These tables are used for this DS.
    E.g.:
    MCEKKO  BEDAT  Document Date
    MCEKKO  BSART  Document Type
    MCEKKO  BSTYP  Doc. Category
    MCEKKO  BUDAT  Accounting date
    MCEKKO  EBELN  Purchasing Doc.
    MCEKKO  EKGRP  Purch. Group
    Note: here EKKO is the table.
    https://wiki.sdn.sap.com/wiki/display/BI/BW%20SD%20MM%20FI%20DATASOURCES
    thanks
    Reddy

  • Datasources and Extract structure

    Hi guys,
    I am pretty confused about DataSources and extract structures.
    Can someone please explain them to me in simple words?

    Hi Zubin,
    A DataSource is a consolidated list of the available fields, and the extract structure is a data dictionary structure which describes those fields with additional technical elements like data element, domain and so on.
    To make it more clear look at this description from F4 help:
    A DataSource is an object for retrieving data. The DataSource is localized in the OLTP system.
    It has
    an extract structure,
    an extraction type, and
    an extraction method.
    The extract structure describes the fields of the internal table that contain the extracted data. The extraction type describes which type of extractor the DataSource uses. The extraction method extracts the data and transfers it into an internal table with the same type as the extract structure.
    The DataSource also contains information on the type of data that it stages, for example attributes, transactional data, hierarchies, or texts. It can also support different types of data update.
    Extract Structure for a DataSource
    The extraction structure for a data source shows the format in which the DataSource, or the extractor for the DataSource, transfers its data.
    A data element must be assigned to each field of the extraction structure. This allows the Business Information Warehouse to map field names to InfoObjects intelligently using just this data element.
    The extract structure must be created in the DDIC as a dictionary structure or transparent table. A view is not permitted here since it would then not give you the option to add an append.
    Appends enable you to convert your individual requirements and own "Business Logic" in the extraction process. You can fill the fields in the append using function enhancements.
    Hope this helps
    Thanks,
    Raj

  • Enhancing the extract structure for 0FI_GL_4,  0FI_AP_4,  0FI_AR_4

    Hi all,
          Does anyone know how to enhance the extraction structures with additional fields for datasources 0FI_GL_4 (General ledger: line item), 0FI_AP_4 (vendors: line item) and 0FI_AR_4 (customers: line item). 
    Thanks,
    Sabrina.

    Hi
    Here are the two scenarios described in the note:
    1. All the fields of the customer enhancement in the customer include are contained in the read structure (see the table above).  Then no additional action is required. The fields of the customer enhancement are filled automatically by the datasource from the assigned read structure via "move-corresponding".
    Example:   The extraction structure DTFIGL_4 for datasource 0FI_GL_4 (General ledger: line item) should be enhanced by the VALUT (value date) field.
               To do this, create structure CI_BSIS in the data dictionary of the R/3 source system and include the VALUT field in it. The data dictionary in the R/3 source system shows that the VALUT field is contained in the read structure of the datasource (table BSIS). For that reason the VALUT field is automatically filled by datasource 0FI_GL_4.
    2. Fields of the customer enhancement in the customer include are not contained in the read structure (see the table above).  In this case you have to program a function module to fill the field of the customer enhancement. To do this, there is a Business Transaction Event available (open FI interface for process 00005021). Create any function module you like and use function module SAMPLE_PROCESS_00005021 as a template for the interface (input parameter, changing parameter).
               Interface of function module SAMPLE_PROCESS_00005021:
                Input-parameter I_OLTPSOURCE: datasource that is currently extracting data from the R/3 source system.
                Changing-Parameter C_STRUCTURE: Extraction structure of the data source currently extracting including fields from the assigned customer include. When you call this customer defined function module, all the fields of the extract structure are transferred filled.
               Check whether the type pool SBIWA is declared in the TOP include of the function group.
    If not, add it with the statement TYPE-POOLS: SBIWA.
               Then maintain table TPS31 with Transaction SM30. Create the following entry (leave LAND and APPLK empty):
               PROCS = 00005021, FUNCT = <fname>
               <fname> stands for the customer defined function module.
                By doing so, the function module you defined is called for each extracted record. Note that in this case the performance of the extraction may be reduced significantly, depending on the table read accesses and the complexity of the programmed logic.
               Example:
               You want to enhance the extraction structure DTFIAR_3 for datasource 0FI_AR_4 (customers: line item) by the ORT01 (city) field from the customer master record.
                To do this, create structure CI_BSID in data dictionary of the R/3 source system and include the ORT01 field in that. The data dictionary in the R/3 source system displays that the ORT01 field is NOT contained in the read structure of datasource 0FI_AR_4 (Table BSID).
                Therefore you have to program a function module that reads the customer master in the R/3 system (table KNA1) and fills the ORT01 field of the customer enhancement. In the changing parameter C_STRUCTURE the filled fields of the extraction structure DTFIAR_3 are available to you. With the KUNNR field of extraction structure DTFIAR_3, you can select the respective master data record from table KNA1 and determine field ORT01 of the customer enhancement.
               ATTENTION: Using Note 430303, you can enhance DataSource 0FI_GL_4 with all fields from table BSEG (instead of BSIS); those fields are then filled automatically in the extractor.
    After you create the customer include in the R/3 source system, you have to post-process the accompanying datasource with Transaction RSA6. Select the application component according to the above table and change the datasource that fits the customer include. The "Hide fields" flag should be removed in the field list for the fields of the customer include. Then save the field list.
    If the fields of the customer include are not displayed in Transaction RSA6, refer to note 415530.
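    For illustration, a minimal sketch of such a BTE function module for the ORT01 example follows. The module name is hypothetical, the interface is taken over from the template SAMPLE_PROCESS_00005021, and C_STRUCTURE is assumed to carry one record of extraction structure DTFIAR_3 including the CI_BSID append, as described in the note.

    FUNCTION z_sample_process_00005021.
    * Interface copied from SAMPLE_PROCESS_00005021 (I_OLTPSOURCE, C_STRUCTURE)

      FIELD-SYMBOLS: <l_kunnr> TYPE any,
                     <l_ort01> TYPE any.

      " Act only for the customer line item datasource
      CHECK i_oltpsource = '0FI_AR_4'.

      " Access the extracted record generically via its components
      ASSIGN COMPONENT 'KUNNR' OF STRUCTURE c_structure TO <l_kunnr>.
      CHECK sy-subrc = 0.
      ASSIGN COMPONENT 'ORT01' OF STRUCTURE c_structure TO <l_ort01>.
      CHECK sy-subrc = 0.

      " Read the city from the customer master and fill the appended field
      SELECT SINGLE ort01 FROM kna1 INTO <l_ort01>
        WHERE kunnr = <l_kunnr>.

    ENDFUNCTION.

    Since this module is called once per extracted record, a productive version would buffer the KNA1 reads (see the performance warning above).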
    1.  After you create the customer include in the R/3 source system you have to post process the accompanying datasource with Transaction RSA6. Select the application component according to the above table and change the datasource that fits the customer include. The "Hide fields" flag should be removed in the field list for the fields of the customer include. Then save the field list.
    If the fields of the customer include is not displayed in Transaction RSA6, refer to note 415530.
    Please let me know.
    Thanks,
    Sabrina.

  • Data source active/inactive button grayed out in LBWE

    Hello,
    When I was activating the Logistics datasource in LBWE, the data source status became red and the Active/Inactive button was grayed out. How can I make this data source active again?
    Thanks,
    KK

    Hi Kumar,
    The status becomes red and Active/Inactive is disabled when the extract structure has been activated but the data source has not been maintained. This happens once you change the extract structure to include additional fields. Please do the step below:
    In LBWE, click on the data source (the technical name of your data source next to Maintenance) and you will get the maintenance screen. Change something if you want, or leave it as it is, and save. Now the status for the data source in LBWE will become yellow and Activate/Deactivate will be enabled.
    Regards,
    Kams

  • Data source Enhancement with Function Module

    Hi all!
    I have a requirement to enhance my generic data source.
    Let me make it clear:
    I have 10 fields in R/3 which I am able to get into the extract structure using a generic data source on the corresponding table.
    I have another field in my extract structure, say ZEXMFLD1, which is updated by a function module ZZ_FUNC_MODULE_SAMPLE.
    I came to know that data source enhancement can be done, but I don't know how.
    I request your guidance on this. Can you please let me know how I can achieve this?
    Thanks,
    Sri

    Hi Sri,
    You can enhance generic datasources as well.
    For that you use the user exits (function modules) below, depending on your datasource type:
    EXIT_SAPLRSAP_001 - Transaction data
    EXIT_SAPLRSAP_002 - Master data attributes
    EXIT_SAPLRSAP_003 - Texts
    EXIT_SAPLRSAP_004 - Hierarchies
    First check in CMOD (t-code) whether enhancement component RSAP0001 is assigned to any project. If it is not assigned to any project (fresh system), assign it to a project by creating one, e.g. ZBW. If it is already assigned, go to SE37.
    For transaction data, go to function module EXIT_SAPLRSAP_001 (its include ZXRSAU01) and add your logic, for example:
    CASE i_datasource.
      WHEN 'your data source name'.
        LOOP AT c_t_data INTO l_s_data.        " l_s_data typed like your extract structure
          CALL FUNCTION 'ZZ_FUNC_MODULE_SAMPLE'
            IMPORTING
              e_value = l_s_data-zexmfld1.     " importing parameter name is illustrative
          MODIFY c_t_data FROM l_s_data.
        ENDLOOP.
    ENDCASE.
    Let me know if you need more info.
    - Kalyan.

  • Data source and Extract Structure

    Hi all,
    I have a doubt about data sources and extract structures.
    I am using an InfoCube with some characteristics and key figures.
    After extraction, should we change the extract structure frequently, or is it better to keep the predefined extract structure and hide the unneeded fields in the data source?
    If I hide fields in the datasource and, after extraction, I want to use some of the hidden fields, can we use them? If so, how can we extract data for those hidden fields to the BW side?

    If you want to create a generic data source, you can create the cube with some characteristics and key figures.
    After extraction, do you want to modify the structure? If you modify the structure, you have to replicate the data source, delete the data and upload the data from R/3 to BW again; it is not good practice to modify the structure every time.
    If you set a field to Hide, that field does not come over to the BW side.
    If you want to unhide a hidden field, first go to RSA6, edit the data source, remove the Hide checkbox and save. Then, on the BW side, replicate the data source, delete the data and load the data again (if you want full history for the previously hidden field).
    If you do not run a full load, only data from that point onwards will be available for the field you changed from hide to unhide.

  • Standard Extract Structure Data to SAPXI

    Hi
    I am looking for some possibility where the standard extract structures of the R/3 system can be used to transfer data directly to the SAP XI system in one way or another.
    This means that instead of replicating the data source in the BW system and transferring the data there, I wish to utilize the already existing standard extract structures, which are delta enabled, to transfer master data / transaction data to the SAP XI system.
    I know there are standard IDocs/RFCs, and a proxy is also an option, but I wish to know if we can use the standard extract structures to pass data to SAP XI.
    Best Regards
    Paresh

    Hi:
    The extract structures and the S-API technology in place for transferring data to BI are different from EAI technology. Essentially, the extract structures require an extraction, i.e. a job to read the data and transfer it, whereas XI expects an inbound message. Thus, you need an "extractor". So to answer your question directly: no, a direct technique for a delta-enabled R/3 data exchange that uses the existing extract structures is not readily available.
    Your best bet in this scenario is to utilize the ETL (extraction, transformation and load) technology that is already available with your BI system and the S-API as an intermediary.
    The following paper should be quite helpful for you:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e698aa90-0201-0010-7982-b498e02af76b
    Thanks for any points you choose to assign.
    Best Regards -
