Combining Data Sources

Post Author: Sandy100
CA Forum: WebIntelligence Reporting
I need to combine data into a single Web Intelligence table from two different universes, one based on Sybase and one on SQL Server. I've been told I cannot link the data sources in a universe. Is there any way to link the data in Web Intelligence if I have a common field (Employee ID)?

Hi,
when I follow your description, a new problem appears. As you describe, I am able to merge two dimensions (coming from two different queries and two different universes) into one, so I can display the merged dimension in a single column of a WebI table. However, the problem that remains is that I still have a separate column for each key figure (one per universe).
Now I would like to merge the key figures that belong together. Example: merge "revenue_universeA" with "revenue_universeB". In other words, I would like a union so that the revenue coming from the two universes is displayed in one column of a WebI table.
Thanks in advance, Marc.
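A common workaround for the key-figure side of this (a sketch, assuming the two queries are already merged on a common dimension such as Employee ID): create a report variable of type measure that simply adds the two measures. Once the dimensions are merged, measures from both data providers can be used in one formula and shown in a single column. The object names below are the ones from the example; the variable name is illustrative:

```
// hypothetical WebI report variable (measure type), e.g. "Revenue (combined)"
=[revenue_universeA] + [revenue_universeB]
```

If one side can be empty for a given row, the formula may need a guard (e.g. with IsNull()) so the sum does not come out empty.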

Similar Messages

  • Can CLOUD data source be joined to a combined data source?

    Hi, experts,
    I have created a cloud data source and am trying to join it with a combined data source, but I am having a problem.
    This is how I joined the two data sources. I checked the joined data source and it looks OK.
    When I open the report, I get an error. The error message is just one line, so I simply don't know how to solve this.
    Any help is greatly appreciated.
    Thanks in advance.
    Regards,
    Fred.

    Hi Murali,
    I saw that the solution you provided seems to work for the user.
    What I want to do is transfer the hierarchy from 0GL_ACCOUNT to another InfoObject, 0GLACCEXT. However, when I right-click the hierarchy and try to create transfer rules, the Transfer Rules option offers only DataSource in the source selection; I cannot select 0GL_ACCOUNT_HIER as my source.
    Any advice?
    Many thanks,
    Vince

  • Data Source  combine - Customer Invoice and Outbound Delivery Item Details

    Hello,
    I need to create a report that gets the outbound delivery item details (such as ship date, quantities, product, identified stock)
    and also the invoice number (customer invoice number), which is available in the Customer Invoice Header or Detail source.
    But when I try to create a combined data source, I am not able to join Customer Invoice and Outbound Delivery Item Details:
    after adding either one of them as the first data source, the other no longer shows up when I try to add it.
    Does anybody have an idea how to get the invoice information (invoice number) together with the shipping data?
    Thanks,
    Viral.

    Hi Viral,
    you can only combine data sources with the same access context. The customer invoice data sources have access context "company" and the outbound delivery sources have "site" or "sales". So unfortunately, you can't combine these data sources.
    Best regards,
    Andreas

  • XMLdatasets: How to combine multiple xml data sources??

    What I'm trying to do (without any results so far...) is combine data from two different XML sources.
    I have one source with a list of cultural events (agenda.xml) and another with a list of contacts (contacts.xml). Each source has a column with a contact name <co_name>.
    I use a MasterDetail layout. When you choose an event in the MasterContainer (agenda.xml), the DetailContainer should display not only details from agenda.xml: when the name <co_name> in agenda.xml exactly MATCHES a name <co_name> in contacts.xml, the DetailContainer should also display the telephone number <co_phone> from contacts.xml.
    I will put my code down here; I hope somebody can help me find out what I'm doing wrong. (I tried something with "getData" but obviously it doesn't work...)
    Thank you very much in advance!!!!!!
    Véro
    FIRST DATASET: agenda.xml
    <events>
    <event>
        <ev_title>Title event 1</ev_title>
        <co_name>vero</co_name>
    </event>
    <event>
        <ev_title>Title event 2</ev_title>
        <co_name>marc</co_name>
    </event>
    <event>
        <ev_title>Title event 3</ev_title>
        <co_name>vero</co_name>
    </event>
    <event>
        <ev_title>Title event 4</ev_title>
        <co_name>marc</co_name>
    </event>   
    </events>
    SECOND DATASET: contacts.xml
    <contacts>
    <contact>
        <co_name>marc</co_name>
        <co_phone>123 phone of marc</co_phone>
    </contact>
    <contact>
        <co_name>vero</co_name>
        <co_phone>456 phone of vero</co_phone>
    </contact>
    </contacts>
    HTML DOCUMENT
    <html>
    <head>
    <link href="SpryAssets/SpryMasterDetail.css" rel="stylesheet" type="text/css" />
    <script src="SpryAssets/xpath.js" type="text/javascript"></script>
    <script src="SpryAssets/SpryData.js" type="text/javascript"></script>
    <script type="text/javascript">
    var dsAgenda = new Spry.Data.XMLDataSet("agenda.xml", "events/event");
    </script>
    <script type="text/javascript">
    var dsContacts = new Spry.Data.XMLDataSet("contacts.xml", "contacts/contact");
    function matchTheName() {
    var rows = dsContacts.getData();
    for (var i = 0; i < rows.length; i++)
      if (rows[i]["co_name"] == "{dsAgenda::co_name}")
        return rows[i]["dsContacts::co_name"];
    }
    </script>
    </head>
    <body>
        <div class="MasterDetail">
            <div spry:region="dsAgenda" class="MasterContainer">
                <div class="MasterColumn" spry:repeat="dsAgenda" spry:setrow="dsAgenda" spry:hover="MasterColumnHover" spry:select="MasterColumnSelected">
                    <div>{ev_title}</div>
                    <div>{co_name}</div>
                </div>
            </div>
            <div spry:detailregion="dsAgenda dsContacts" class="DetailContainer">
                <div class="DetailColumnTitle">{dsAgenda::ev_title}</div>
                <div class="DetailColumn">{dsAgenda::co_name}</div>
                <div spry:if="'{dsAgenda::co_name}' == '{dsContacts::co_name}'" class="DetailColumn">{dsContacts::co_phone}</div>
            </div>
        </div> 
    </body>
    </html>

    Create a new empty Spry dataset:
    var dsAll = new Spry.Data.DataSet(); // new base dataset
    On your existing datasets, add onPostLoad observers; these are events that are notified once a dataset has been loaded:
    var default_obs = {
         onPostLoad: function(){
              // when both datasets are loaded, call our init fn
              if( ds1.getData() && ds2.getData() ){
                   init();
              }
         }
    };
    Then we add it to both datasets:
    ds1.addObserver( default_obs ); // add the observer object to the datasets
    ds2.addObserver( default_obs );
    Now they will both call the init function once both datasets have been loaded. The init function creates the actual new dataset. The data in a Spry dataset is basically just an array of objects, and Spry has a method that loads an array into a dataset: setDataFromArray.
    We are going to use that function to construct the new dataset:
    function init(){
         if( dsAll.getData() ){
              return; // we already have data in our dsAll dataset, so no use doing it all over again
         }
         var source = ds1.getData();
         var length = source.length;
         var result = [];
         for( var i = 0; i < length; i++ ){
              var matched_row = ds2.findRowsWithColumnValues({id: source[i]['@id']}); // checks if ds2 has a row with this id value
              if( matched_row ){
                   var row = source[i];
                   Spry.Utils.setOptions( row, matched_row ); // merges the matched_row object into the new row object
                   result.push( row );
              }
         }
         dsAll.setDataFromArray( result ); // set the new array as data
    }
    Something like this should be enough to create it.
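Stripped of the Spry specifics, the logic above is a key-based join of two arrays of row objects. A self-contained plain-JavaScript sketch of the same idea (`joinByKey` is a hypothetical helper, not part of Spry; the sample rows come from agenda.xml and contacts.xml above):

```javascript
// Join two arrays of row objects on a common key column.
// Columns from a matching `right` row are folded into the `left` row.
function joinByKey(left, right, key) {
  // index the right-hand rows by their key value for O(1) lookup
  var index = {};
  for (var i = 0; i < right.length; i++) {
    index[right[i][key]] = right[i];
  }
  var result = [];
  for (var j = 0; j < left.length; j++) {
    var row = {};
    // copy the left-hand row
    for (var p in left[j]) { row[p] = left[j][p]; }
    var match = index[left[j][key]];
    if (match) {
      // copy the matching right-hand columns (like Spry.Utils.setOptions)
      for (var q in match) { row[q] = match[q]; }
    }
    result.push(row);
  }
  return result;
}

var events = [
  { ev_title: "Title event 1", co_name: "vero" },
  { ev_title: "Title event 2", co_name: "marc" }
];
var contacts = [
  { co_name: "marc", co_phone: "123 phone of marc" },
  { co_name: "vero", co_phone: "456 phone of vero" }
];
var joined = joinByKey(events, contacts, "co_name");
// joined[0] now carries both ev_title and co_phone
```

The last matching contact wins if contact names are duplicated, which matches the "first 4 rows, lookup table" shape of the data here.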

  • Can we combine multiple data sources in single report?

    Can we combine multiple data sources in a single report?

    If you can't do this at the metalayer (Business Views or a Universe), then subreports and shared variables are the way for Crystal Reports to use multiple data sources in the same report.
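The subreport/shared-variable pattern looks roughly like this in Crystal formula syntax (a sketch; the field name is a placeholder):

```
// formula inside the subreport (which queries the second data source)
WhilePrintingRecords;
Shared NumberVar subTotal := Sum({SecondSource.Amount});

// formula in the main report, placed in a section below the subreport
WhilePrintingRecords;
Shared NumberVar subTotal;
subTotal
```

The main-report formula must evaluate after the subreport has printed, which is why both formulas force WhilePrintingRecords.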

  • Combining Data into one Cube from two Data-sources..

    Dear Experts,
    I am pulling data from two data sources and trying to combine it in one InfoCube. The data looks like this:
    Data-Source 01
    1. GUID  --Common
    2.Document No ( User Entry)
    3.Dist. Channel
    4.Transaction Type
    5.Date and Quantity
    Data-Source 02
    1.GUID -- Common
    2.Billing Document (when the user drills down by Document No, the Billing Document should appear in the report)
    3.Billing date
    4.Net Value
    Of the data, the GUID is common between the two data sources. I was thinking that the data would fall into place, and that if I select the Document No in the report it would automatically fetch all the related data (transaction type, distribution channel, Billing Document No, billing date, ...).
    The problem is that in the report the data is not coming out as I expected.
    Another problem: in future I need to create a MultiProvider between the above-mentioned InfoCube and one ODS, and DOCUMENT NO is common to the cube and the ODS.
    Please Suggest,
    How can I proceed for the following requirement.
    Thanks,
    Sanjana

    Hi Sanjana,
    In your case the cube will create a problem because it will have multiple records. For example:
    Data-Source 01 :
    1. GUID -- 101
    2.Document No - 999
    3.Dist. Channel - DL
    4.Transaction Type - GPRO
    5.Date and Quantity - 20.02.2011 & 20
    Data-Source 02
    1.GUID -- 101
    2.Billing Document - 6000
    3.Billing date - 03.03.2011
    4.Net Value - 500
    Your cube will have 2 records, while your requirement is to show those two records as 1 record in the report.
    Why don't you put an ODS in between, with GUID as the key field and all the other fields as data fields? Create 2 transformations to this DSO from the 2 datasources and let it be updated one load at a time; your DSO will then have only 1 record. Now either report on this DSO or take the data on to the cube.
    Hope the above reply was helpful.
    Kind Regards,
    Ashutosh Singh
    Edited by: Ashutosh Singh on May 19, 2011 1:34 PM
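Ashutosh's DSO suggestion relies on overwrite semantics: with GUID as the only key field, records from both datasources land in the same row, each load filling its own data fields. A minimal sketch of that behaviour in plain JavaScript (`mergeByGuid` and the field names are illustrative, not BW code; the values come from the example above):

```javascript
// Simulate a DSO keyed on GUID: each load upserts into the same row,
// overwriting/filling only the data fields it carries.
function mergeByGuid(loads) {
  var dso = {};
  loads.forEach(function (records) {
    records.forEach(function (rec) {
      // fetch or create the row for this key
      var row = dso[rec.GUID] || (dso[rec.GUID] = { GUID: rec.GUID });
      // overwrite/fill the data fields from this load
      Object.keys(rec).forEach(function (field) {
        row[field] = rec[field];
      });
    });
  });
  return dso;
}

var load1 = [{ GUID: "101", DocumentNo: "999", DistChannel: "DL", TransType: "GPRO" }];
var load2 = [{ GUID: "101", BillingDoc: "6000", BillingDate: "03.03.2011", NetValue: 500 }];
var merged = mergeByGuid([load1, load2]);
// merged["101"] holds the document and the billing fields in one record
```

The cube, by contrast, is additive: it would keep both records, which is exactly the problem described.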

  • Combining heterogeneous data sources in a single data target

    Dear Experts,
    I have a requirement as below for combining multiple data sources (different key fields) into a single target
    Target Key Fields : sales doc, item, schdl line, delivery no and delivery item.
    1 source key fields : sales doc and item  (VAITM)
    2 source key fields :sales doc, item, schdl line, delivery no and delivery item (V_SSL)
    3. source key fields : delivery number and delivery item. (VCITM)
    Is there an innovative way of combining all these records into a single record in the data target, rather than creating multiple entries (due to the different source keys) with conventional lookup routines?
    We are on BI 7.0.
    Thanks,
    Rita
    Edited by: Rita Tripathi on Jan 19, 2012 7:05 PM

    Hi,
    Target Key Fields : sales doc, item, schdl line, delivery no and delivery item.
    1 source key fields : sales doc and item (VAITM)
    2 source key fields :sales doc, item, schdl line, delivery no and delivery item (V_SSL)
    3. source key fields : delivery number and delivery item. (VCITM)
    You can actually create a view on tables (1) and (2) in SE11 and then create a generic datasource on it in RSO2.
    Create one more generic DataSource from table (3).
    Combine these two DataSources in an InfoSource and update to the DSO.
    Regards,
    Suman

  • Regarding combining data from 2 data sources

    Hi All,
    I want to report data from two data sources: orders (2LIS_11_VAITM) and billing (2LIS_13_VDITM). They have Plant, Material, Customer etc. in common. I want the query to be executed based on billing date. However, when I execute the report by billing date, all order figures come out as 0 and I only get invoice quantity and value. The reason is that there is no billing date in 2LIS_11_VAITM. So how can I merge these two data sources?
    Regards
    Jay

    Hi,
    Refer to the link:
    http://help.sap.com/saphelp_nw04/helpdata/en/67/7e4b3eaf72561ee10000000a114084/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Issue with table ROOSPRMSF entries for data source 0FI_AP_4

    Hi Experts,
    I am facing an issue where we found inconsistencies in table ROOSPRMSF in the R/3 system.
    In BW, we have done initializations based on fiscal period selections (none of the selections overlap) for data source 0FI_AP_4.
    We have done 7 initializations in total, so in the BW system table RSSDLINITSEL we have 7 initialization requests.
    But in the R/3 system we have 49 records for data source 0FI_AP_4 in table ROOSPRMSF, of which 42 are invalid.
    I suspect these 42 invalid records were created by executing program RSSM_OLTP_INIT_DELTA_UPDATE while ROOSPRMSF was already holding the 7 initialization request entries. As a result, each initialization request got linked to all the other initialization requests, ending with 49 records in table ROOSPRMSF.
    Our data loads are now running fine, but a short dump is raised daily: in the daily loads the BW init records in RSSDLINITSEL are compared with the ROOSPRMSF entries, the 42 invalid records are written to the system log, and a short dump is raised.
    To fix these inconsistencies I checked OSS note 852443 (point 3).
    It says to delete the delta queue for data source 0FI_AP_4 in RSA7 and to execute program RSSM_OLTP_INIT_DELTA_UPDATE so that table ROOSPRMSF is reconstructed from the valid records available in RSSDLINITSEL.
    From OSS note 852443 point 3
    "3. If the RSSDLINIT table in the BW system already contains entries, check the requests listed there in the RNR column in the monitor (transaction RSRQ). Compare these entries with the entries in the ROOSPRMSF and ROOSPRMSC tables with the INITRNR field. If, in the ROOSPRMSF and ROOSPRMSC tables for your DataSource source system combination, there are more entries with different INITRNR numbers, use transaction RSA7 in an OLTP source system to delete all entries and then use the RSSM_OLTP_INIT_DELTA_UPDATE report mentioned in the next section. For a DataMart source system, delete the entries that you cannot find in the RSSDLINIT table using the procedure described above."
    My question: if we delete the delta queue in RSA7, then the tables in R/3 (ROOSPRMSF, ROOSPRMSC, the timestamp table) and in BW (RSSDLINITSEL and the initialization requests) will all be cleared. How, then, will the program RSSM_OLTP_INIT_DELTA_UPDATE copy entries into table ROOSPRMSF in R/3?
    Could any one please clarify this ?
    Thanks
    Regards,
    Jeswanth

    Hi Amarnath,
    Did you unhide the new field in RSA6 and regenerate the DataSource?
    Often SAP will populate newly added fields (belonging to the same set of tables used for extraction) automatically (e.g. SAP uses MOVE-CORRESPONDING in its extractor code, or, in this case, reads all fields from the DDIC via FM BWFIU_TRANSFORM_FIELDLIST).
    If the DataSource looks fine to you and the field is still not populated in RSA3, you can't get by without a user exit.
    Grtx,
    Marco

  • Any examples of a data template using multiple data sources?

    I'm looking for an example report using multiple data sources. I've seen one doing a master/detail, but I just want to combine results in sorted order (sorted across all data sources). The master/detail example used a bind variable to link the two defined queries; I think what I want won't have that, so I'm lost on how to make it happen. I have reports using multiple SQL queries, and there is a way in the data source pulldown to tell it to combine the data sources. With data templates it appears to be a more manual process, if it's possible at all.
    Any pointers/links would be appreciated.
    Gaff

    Hi Vetsrini :
    That's just it. Mine is simpler than that. There is no master/detail relationship between the two queries. I have the exact same query that I run in two databases, and I want to merge the results (ordered by, say, eventTime) in one report. So I think my results are going to be two separate groups (one per data source) which I'll have to let BI Publisher merge via XSLT or whatever it uses. That's fine for small result sets, but for larger ones it would be nice if the database did the sorting/merging.
    Gaff
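For the small-result-set case, merging two result sets that each arrive already sorted is a standard two-pointer merge. A plain-JavaScript sketch (the eventTime field follows the example in the question; everything else, including the sample dates, is illustrative):

```javascript
// Merge two arrays that are each already sorted by eventTime
// into one array sorted by eventTime.
function mergeSorted(a, b) {
  var out = [], i = 0, j = 0;
  while (i < a.length && j < b.length) {
    // take the earlier of the two heads; ties prefer the first source
    if (a[i].eventTime <= b[j].eventTime) {
      out.push(a[i++]);
    } else {
      out.push(b[j++]);
    }
  }
  // append whatever remains in either input
  while (i < a.length) { out.push(a[i++]); }
  while (j < b.length) { out.push(b[j++]); }
  return out;
}

var db1 = [{ eventTime: "2011-05-01", src: "db1" },
           { eventTime: "2011-05-03", src: "db1" }];
var db2 = [{ eventTime: "2011-05-02", src: "db2" }];
var all = mergeSorted(db1, db2);
// all is ordered 05-01, 05-02, 05-03
```

This is linear in the total row count, so it only becomes a problem when the result sets no longer fit comfortably in memory, which matches Gaff's point about wanting the databases to do it for large sets.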

  • Re: Data Source: 0CRM_PROD_OBJ_LIST

    Dear All
    We are using CRM data source:0CRM_PROD_OBJ_LIST(Delta) and ODS 0CRM_OL.
    Issue: data is loaded into the PSA correctly, but when it is staged to the ODS some of the records are skipped.
    Between the PSA and the ODS there is a start routine in the update rules.
    Update rule name: 4ETV81OKAKCXHUFPSRYXMIVAT
    The code of the start routine is attached.
    1. Can someone please check whether it is standard or Z? I have debugged the program and found no problem in it.
    Sorry to attach the code here; it is just for verification.
    TYPES: BEGIN OF new_data_structure,
             item_guid(32) TYPE c,
             item_changed_ts(8) TYPE p,
             recordmode TYPE c,
           END OF new_data_structure.
    * This global table contains information about relevant update
    * records. This table is filled when the first data package is
    * processed
    DATA:  gt_new_data TYPE SORTED TABLE OF new_data_structure
           WITH NON-UNIQUE KEY item_guid item_changed_ts recordmode.
    DATA: IN    TYPE F,
          OUT   TYPE F,
          DENOM TYPE F,
          NUMER TYPE F.
    data: rate like tcurr-UKURS.
    constants: c_msgty_e value 'E'.
    *$*$ end of global - insert your declaration only before this line   -
    * The following definition is new in BW 3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS0CRM_PROD_OBJ_LIST.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        -
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries
    **declaration
    * DATA: lt_active_data TYPE TABLE OF /bi0/acrm_ol00.
    * DATA: ls_active_data TYPE /bi0/acrm_ol00.
      DATA: lt_active_data TYPE TABLE OF DATA_PACKAGE_structure.
      DATA: ls_active_data TYPE DATA_PACKAGE_structure.
      DATA: lv_fsname(60) TYPE c.
      DATA: lv_psa_name TYPE rsodstech.
      DATA: lv_dso_tabname TYPE tabname.
      DATA: ls_new_data TYPE new_data_structure.
      DATA: ls_new_data_pre TYPE new_data_structure.
      DATA: ls_DATA_PACKAGE TYPE DATA_PACKAGE_structure.
      DATA: help_index LIKE sy-tabix.
      FIELD-SYMBOLS: <fs_minfo_requnr> TYPE ANY.
      FIELD-SYMBOLS: <fs_minfo_datapakid> TYPE ANY.
      FIELD-SYMBOLS: <fs_minfo_updmode> TYPE ANY.
      FIELD-SYMBOLS: <fs_minfo_logsys> TYPE ANY.
    **constants
      CONSTANTS:  lc_dso_name TYPE rsdodsobject VALUE '0CRM_OL'.
    **end of declaration
    **fetch request number
      lv_fsname = 'g_s_minfo-requnr'.
      ASSIGN (lv_fsname) TO <fs_minfo_requnr>.
    **fetch data package number
      lv_fsname = 'g_s_minfo-datapakid'.
      ASSIGN (lv_fsname) TO <fs_minfo_datapakid>.
    **fetch update mode
      lv_fsname = 'g_s_minfo-updmode'.
      ASSIGN (lv_fsname) TO <fs_minfo_updmode>.
    **fetch source system
      lv_fsname = 'g_s_minfo-logsys'.
      ASSIGN (lv_fsname) TO <fs_minfo_logsys>.
    **delete all data_package entries that...
    **...are not object list elements
    **...are dummy deletion records
    **...are erroneous
      DELETE DATA_PACKAGE WHERE flag NE 'O'
                             OR recordmode EQ 'D'.
    **delta handling only processed for update modes 'delta' and 'repeat'
      IF <fs_minfo_updmode> = 'D' OR <fs_minfo_updmode> = 'R'.
    * data selection and deletion handling only for the first data package
        IF <fs_minfo_datapakid> = '000001'.
    *  fetch technical name of actual PSA table
    *  with I_SELTYPE = 'D' (Date)
          CALL FUNCTION 'RSAR_ODS_NAME_GET'
            EXPORTING
              i_logsys                = <fs_minfo_logsys>
              i_isource               = '0CRM_PROD_OBJ_LIST'
              i_istype                = 'D'
              i_date                  = sy-datum
    *        I_VERSION               =
              i_seltype               = 'D'
    *        I_SEGMENT_ID            =
            IMPORTING
    *        E_ODSNAME               =
              e_odsname_db            = lv_psa_name
    *        E_ODS_TABTYPE           =
    *        E_S_ODS                 =
    *        E_T_ODSFIELDS           =
    *        E_PARTITIONED           =
            EXCEPTIONS
              parameter_failure       = 1
              no_ods_found            = 2
              no_fields_to_ods        = 3
          OTHERS                  = 4.
          IF sy-subrc = 0.
    *    select new data from PSA table
            SELECT * FROM (lv_psa_name)
            INTO CORRESPONDING FIELDS OF TABLE gt_new_data
    *    only select data from the actual request
    *    ...do not select erroneous records
    *    ...do not select object list records
            WHERE request      = <fs_minfo_requnr> AND
                  bwsttecsys2  NE '10' AND
                  flag_pl_ol   NE 'P'.
            IF sy-subrc = 0.
    *      delete adjacent duplicates in gt_new_data
              DELETE ADJACENT DUPLICATES FROM gt_new_data
              COMPARING ALL FIELDS.
    *      get technical name of active DSO table
              CALL METHOD cl_rsd_odso=>get_tablnm
                EXPORTING
                  i_odsobject   = lc_dso_name
                  i_tabt        = rsdod_c_tabt-active
                IMPORTING
                  e_tablnm      = lv_dso_tabname
    *          E_TTYPENAME   =
    *          E_VIEWNM      =
    *          E_CHNGLOGNM   =
    *          E_INFOSOURCE  =
    *          E_DATASOURCE  =
                EXCEPTIONS
                  OTHERS        = 1.
              IF sy-subrc <> 0.
                lv_dso_tabname = '/bi0/acrm_ol00'.
              ENDIF.
    *      select active data from the ODS A-table for relevant items
              SELECT * FROM (lv_dso_tabname)
              INTO CORRESPONDING FIELDS OF TABLE lt_active_data
              FOR ALL ENTRIES IN gt_new_data
              WHERE crm_itmgui = gt_new_data-item_guid.
    *      delete older after images, only keep the latest one
              LOOP AT gt_new_data INTO ls_new_data.
                IF ls_new_data-item_guid EQ ls_new_data_pre-item_guid AND
                   ( ls_new_data-item_changed_ts NE
      ls_new_data_pre-item_changed_ts OR
    * for dummy deletion records on item level there could be
    * several entries with the same value of item_changed_ts,
    * because the changing date is not updated in case of
    * deletion of an item;
    * in this case we only keep the dummy deletion record
    * itself
                   ls_new_data-recordmode = 'D' ).
                  help_index = sy-tabix - 1.
                  DELETE gt_new_data INDEX help_index.
                ENDIF.
                MOVE ls_new_data TO ls_new_data_pre.
              ENDLOOP.
    *      now we delete all dummy deletion records in gt_new_data
              DELETE gt_new_data WHERE recordmode = 'D'.
    *      modify data of lt_active_data for field RECORDMODE = 'D';
    *      deletion records are written to the data package (index 1)
              LOOP AT lt_active_data INTO ls_active_data.
                ls_active_data-recordmode = 'D'.
                MOVE-CORRESPONDING ls_active_data TO ls_DATA_PACKAGE.
                INSERT ls_DATA_PACKAGE INTO DATA_PACKAGE INDEX 1.
              ENDLOOP.
    *      delete records of older after images;
    *      do not consider deletion records
              LOOP AT DATA_PACKAGE INTO ls_DATA_PACKAGE
                   WHERE recordmode NE 'D'.
                help_index = sy-tabix.
    *        relevant for update?
                READ TABLE gt_new_data WITH KEY
                  item_guid = ls_DATA_PACKAGE-crm_itmgui
                  item_changed_ts = ls_DATA_PACKAGE-crm_itchts
                  BINARY SEARCH
                  TRANSPORTING NO FIELDS.
    *        if not relevant, delete it!
                IF sy-subrc <> 0.
                  DELETE DATA_PACKAGE INDEX help_index.
                ENDIF.
                CLEAR ls_DATA_PACKAGE.
                CLEAR help_index.
              ENDLOOP.
            ENDIF.
          ENDIF.
    * for data packages > 000001
        ELSE.
    *    delete records of older after images
          LOOP AT DATA_PACKAGE INTO ls_DATA_PACKAGE.
            help_index = sy-tabix.
    *      relevant for update?
            READ TABLE gt_new_data WITH KEY
              item_guid = ls_DATA_PACKAGE-crm_itmgui
              item_changed_ts = ls_DATA_PACKAGE-crm_itchts
              BINARY SEARCH
              TRANSPORTING NO FIELDS.
    *      if not relevant, delete it!
            IF sy-subrc <> 0.
              DELETE DATA_PACKAGE INDEX help_index.
            ENDIF.
            CLEAR ls_DATA_PACKAGE.
            CLEAR help_index.
          ENDLOOP.
        ENDIF.
      ENDIF.
    * if abort is not equal zero, the update process will be cancelled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         -
    ENDFORM.
    Analysis on it:
    1. I read in the SAP help that "Before you perform a delta update for this ODS object, ensure that data previously loaded into the ODS object is activated. Otherwise the delta update could lead to inconsistencies." (http://help.sap.com/saphelp_tm80/helpdata/en/ba/21423fc657e07fe10000000a114084/frameset.htm). But I had seen in the manage screen of the ODS
    (0CRM_OL) that a number of requests had been activated at once, so I deleted the ODS requests and reloaded them one by one, activating them sequentially.
    This went fine up to the request that has the problem, i.e. SID of request 90949 (PSA) dated 06/09/2011. After loading that request, the contract number (0000132672) was still missing from the ODS. I then debugged data package 9 of request 90949, in which contract number 0000132672 was present. I found that after the start routine code completed, the contract number still existed in the internal table, but it was not updated into the new (queue) table of the ODS.
    I was unable to debug the background job that is generated while updating the data from PSA to ODS, as no real data was found at that time.
    Code process in the start routine (it uses both the active table of the ODS and the PSA data of the relevant request):
    Conclusion: there is no problem in the start routine. (It seems to be a standard routine; to be confirmed.)
    2. Found in SAP help that "When you use this InfoSource to load data into the ODS objects Product Lists (0CRM_PL) and Object Lists (0CRM_OL), you must set the processing method Only PSA with the option Update Subsequently in Data Targets in the corresponding InfoPackage. The update rules for the above ODS objects use a global table in their start routine to enable delta updates for this extractor. Once the data to be loaded has been distributed to multiple data packages, the contents of this global table are only retained if you have set the above processing method. If you set any other processing method, the contents of this global table are deleted between processing of data packages" (http://help.sap.com/saphelp_bw33/helpdata/en/05/bb423fd9575003e10000000a114084/frameset.htm).
    To implement this, the data needs to be deleted from the ODS and from the PSA so that it can be reloaded with a new InfoPackage (processing type Only PSA, update subsequently in data targets).
    Edited by: sreekanth goud on Oct 25, 2011 8:02 AM

    Hi Sreekanth,
    First check whether your DSO has the right combination of key fields. Once you are sure about it, check the start routine. Please send the code; as attached, it is hard to read.
    Thanks & Regards
    Rams Thota

  • Create a new dimension in business layer from Data source: text file on the web

    Hi,
    I have a text data source, published every few hours, that is accessible from a certain URL. I followed the instructions in http://scn.sap.com/docs/DOC-43144, which shows in great detail how to create the connection, data foundation and business layer for a universe on this type of data.
    All is well - I can use this universe in my WEBI doc and display the data.
    However, in order to merge the data from this universe with another universe, I need to create a new dimension based on the data from the text file. The new dimension value is simply the first 4 characters of the Subject found in the text file. The "Subject" dimension is of type varchar.
    Following the guide mentioned earlier, the connection uses the SAP BO OpenConnectivity driver, and this driver severely limits the SQL I can use to extract a substring of another string. Here is a screenshot of the SQL expressions available with this driver.
    After hours of searching, I cannot find any other connection driver for a text file published at a URL. The BO OpenConnectivity driver is the best I could find.
    So here are my problems
    1. one of my data sources is a text file published on the web
    2. the only connection I can create does not allow me to create a new dimension in the universe for an important column, "subject ID"
    3. I can create the column in WebI as a variable, but then I cannot merge it with an existing dimension (WebI does not allow merging these two types). And without the merge, the flat-file universe and my database universe can't be combined.
    I'm using WEBI Rich client version 4.1 SP3 Patch 1. Build 14.1.3.1300
    Is there any other idea that you can suggest without requiring to change the extracted data?
    Thanks.
    With warm regards

    Hi Bala,
    Were you able to find a solution for the problem of uploading values for a variable from a text file on the web? I am confronted with the same request from users.
    Thanks,
    BQ

  • Data Source Creation in R/3

    Hi all.
    I need to extract data from multiple tables in R/3. For that I am creating a datasource using RSO2. In this scenario, do I need to choose extraction from view (in RSO2), i.e. create a view combining these tables and use it in the datasource creation? Please let me know the procedure. Could you also let me know when to use the option Extraction by FM (in RSO2)?
    Thanks in advance.
    Kind Regrads,
    sami.

    Hi,
    Creation of DataSource
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d3219af2-0c01-0010-71ac-dbb4356cf4bf
    /people/siegfried.szameitat/blog/2005/09/29/generic-extraction-via-function-module
    Regards,
    Sachinkumar

  • Data Source is in Inactive state while replicate datasource in BI 7.0

    Hello,
    When I replicate the datasource ZW_MAT_PLANT (my own data source, a combination of MARA and MARC) in RSA1 > DataSources, it comes in by default in inactive mode. In this inactive mode I cannot delete the data source assignment and I cannot go inside the data source. When I click the change icon to go inside the datasource, I get an error saying I have no authorization for this datasource.
    But the SU53 screenshot shows that I do have authorization for it.
    Please help me with this.
    Thanks and Best Regards,
    Anitha Sukhaavsi

    Hi,
    I am facing the same issue. Did you get the solution for this?
    Thanks,
    Pal

  • JPA with MySQL-Data-Source

    Hello Forum,
    I have a question regarding the use of a MySQL data source in combination with JPA
    on the SAP NetWeaver Application Server, Java ™ EE 5 Edition.
    I have set up a custom data source as explained in the paper:
    "Working with Database Tables, DataSources and JMS Resources"
    - registered the database driver via telnet (Using mysql-connector-java-5.0.3-bin.jar)
    - created the data-sources.xml file underneath the META-INF dir of the EAR project
    [code]
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE data-sources SYSTEM "data-sources.dtd" >
    <data-sources>
      <data-source>
        <data-source-name>titan_cruises_ds</data-source-name>
        <driver-name>mysql-connector-java-5.0.3-bin.jar</driver-name>
         <init-connections>1</init-connections>
         <max-connections>10</max-connections>
         <max-time-to-wait-connection>60</max-time-to-wait-connection>
         <expiration-control>
              <connection-lifetime>600</connection-lifetime>
              <run-cleanup-thread>60</run-cleanup-thread>
         </expiration-control>
         <sql-engine>native_sql</sql-engine>
        <jdbc-1.x>
          <driver-class-name>com.mysql.jdbc.Driver</driver-class-name>
          <url>jdbc:mysql://ourHost.internal.com:3306/practise_titan_cruises</url>
          <user-name>myUser</user-name>
          <password>myPass</password>
        </jdbc-1.x>
      </data-source>
    </data-sources>
    [/code]
    After that I manually created the persistence.xml underneath the META-INF dir of the EJB project.
    [code]
    <persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
         <persistence-unit name="titan_cruises_pu">
              <jta-data-source>titan_cruises_ds</jta-data-source>
         </persistence-unit>
    </persistence>
    [/code]
    After that I created the Entity named "Cabin" and the corresponding table within the db.
    Entity code:
    [code]
    package de.collogia.beans.pojo.ship;
    import java.io.IOException;
    import java.io.Serializable;
    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.Table;
    /**
     * This persisted POJO class models the cabin data.
     * <p>
     * In this class persistence annotations are placed on the getter methods
     * of the attributes. This tells the persistence manager to access them
     * via the corresponding get- and set-Methods.</p>
     * (Unfortunately this does not work on NetWeaver and I had to place them
     * on field level again...)
     *
     * @author Henning Malzahn ([email protected])
     * svn-revision:         $Rev:: 670                                           $:
     * svn-lasted-edited-by: $Author:: henning                                    $:
     * svn-last-changed:     $Date:: 2007-02-21 21:49:51 +0100 (Wed, 21 Feb 2007) $:
     */
    @Entity
    @Table(name = "cabin")
    public class Cabin implements Serializable {
        /** The generated serial version UID used for serialization. */
        private static final long serialVersionUID = -8522497314498903378L;
        /** The actual version number of this class used for serialization. */
        private static int actualVersion = 1;
        /** The cabin's id. */
        @Id
        @GeneratedValue
        @Column(name = "id")
        private long id;
        /** The cabin's name. */
        @Column(name = "name")
        private String name;
        /** The cabin's deck level. */
        @Column(name = "deck_level")
        private int deckLevel;
        /** The cabin's ship id. */
        @Column(name = "ship_id")
        private int shipId;
        /** The cabin's bed count. */
        @Column(name = "bed_count")
        private int bedCount;

        /* ---- Serialization/deserialization methods ---- */

        /**
         * Method that is responsible for deserialization of the object.
         *
         * @param in The <code>ObjectInputStream</code> object to read
         *           the data from.
         * @throws IOException That may occur when reading from the
         *                     <code>ObjectInputStream</code> object.
         * @throws ClassNotFoundException That may occur when invoking the
         *                                default deserialization mechanism.
         */
        private void readObject(final java.io.ObjectInputStream in)
            throws IOException, ClassNotFoundException {
            /* Invoke default deserialization mechanism. */
            in.defaultReadObject();
            /* Read the actual version number of the class. */
            actualVersion = in.readInt();
        } // End of readObject()

        /**
         * Method that is responsible for serialization of the object.
         *
         * @param out The <code>ObjectOutputStream</code> object to write
         *            the data to.
         * @throws IOException That may occur when writing to the
         *                     <code>ObjectOutputStream</code> object.
         */
        private void writeObject(final java.io.ObjectOutputStream out)
            throws IOException {
            /* Invoke default serialization mechanism. */
            out.defaultWriteObject();
            /* Write the actual version number of the class. */
            out.writeInt(actualVersion);
        } // End of writeObject()

        /* ---- Defining constructors ---- */

        /** Private default constructor. */
        private Cabin() {
        } // End of default constructor

        /**
         * Full constructor.
         *
         * @param name The cabin's name.
         * @param deckLevel The cabin's deck level.
         * @param shipId The cabin's ship id.
         * @param bedCount The cabin's bed count.
         */
        public Cabin(final String name,
                     final int deckLevel,
                     final int shipId,
                     final int bedCount) {
            this.name = name;
            this.deckLevel = deckLevel;
            this.shipId = shipId;
            this.bedCount = bedCount;
        } // End of full constructor

        /* ---- Overridden class methods ---- */

        /**
         * Returns a string representation of the cabin's data.
         *
         * @see java.lang.Object#toString()
         */
        @Override
        public String toString() {
            StringBuffer strBuf = new StringBuffer();
            strBuf.append(this.name);
            strBuf.append("\n");
            strBuf.append(this.deckLevel);
            strBuf.append("\n");
            strBuf.append(this.shipId);
            strBuf.append("\n");
            strBuf.append(this.bedCount);
            return strBuf.toString();
        } // End of toString()

        /* ---- Defining instance methods ---- */

        /**
         * Get method for the member "<code>id</code>".
         *
         * @return Returns the id.
         */
        public long getId() {
            return this.id;
        }

        /**
         * Set method for the member "<code>id</code>".
         * TODO hm: Check whether it is possible to have a setId method
         * using private access level with the NetWeaver JPA provider!
         *
         * @param id The id to set.
         */
        private void setId(final long id) {
            this.id = id;
        }

        /**
         * Get method for the member "<code>name</code>".
         *
         * @return Returns the name.
         */
        public String getName() {
            return this.name;
        }

        /**
         * Set method for the member "<code>name</code>".
         *
         * @param name The name to set.
         */
        public void setName(final String name) {
            this.name = name;
        }

        /**
         * Get method for the member "<code>deckLevel</code>".
         *
         * @return Returns the deckLevel.
         */
        public int getDeckLevel() {
            return this.deckLevel;
        }

        /**
         * Set method for the member "<code>deckLevel</code>".
         *
         * @param deckLevel The deckLevel to set.
         */
        public void setDeckLevel(final int deckLevel) {
            this.deckLevel = deckLevel;
        }

        /**
         * Get method for the member "<code>shipId</code>".
         *
         * @return Returns the shipId.
         */
        public int getShipId() {
            return this.shipId;
        }

        /**
         * Set method for the member "<code>shipId</code>".
         *
         * @param shipId The shipId to set.
         */
        public void setShipId(final int shipId) {
            this.shipId = shipId;
        }

        /**
         * Get method for the member "<code>bedCount</code>".
         *
         * @return Returns the bedCount.
         */
        public int getBedCount() {
            return this.bedCount;
        }

        /**
         * Set method for the member "<code>bedCount</code>".
         *
         * @param bedCount The bedCount to set.
         */
        public void setBedCount(final int bedCount) {
            this.bedCount = bedCount;
        }
    } // End of class Cabin
    [/code]
    After that I created the TravelAgentBean, a Stateless Session Bean implementing
    a remote interface that allows constructing and persisting new Cabin objects:
    [code]
    package de.collogia.beans.session.stateless;
    import javax.ejb.Stateless;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;
    import de.collogia.beans.pojo.ship.Cabin;
    /**
     * Class that implements the <code>TravelAgentRemote</code> interface
     * and defines the business methods of the TravelAgent service.
     *
     * @author Henning Malzahn ([email protected])
     * svn-revision:         $Rev:: 670                                           $:
     * svn-lasted-edited-by: $Author:: henning                                    $:
     * svn-last-changed:     $Date:: 2007-02-21 21:49:51 +0100 (Wed, 21 Feb 2007) $:
     */
    @Stateless
    public class TravelAgentBean implements TravelAgentRemote {
        /** The <code>Log</code> object for this class. */
    //    private static final Log LOGGER;

        /** The <code>PersistenceManager</code> object. */
        @PersistenceContext(unitName = "titan_cruises_pu")
        EntityManager em;

        /* ---- Static initializer ---- */
    //    static {
    //        LOGGER = LogFactory.getLog(TravelAgentBean.class);
    //    } // End of static initializer block

        /* ---- Implementing remote interface methods ---- */

        /** {@inheritDoc} */
        public void createCabin(final Cabin cabin) {
            this.em.persist(cabin);
        } // End of createCabin()
    } // End of class TravelAgentBean
    [/code]
    After that I created a Controller class containing a main method that looks up the remote
    interface of the TravelAgentBean, as explained in the document "Accessing Enterprise JavaBeans Using JNDI
    in SAP NetWeaver Application Server, Java ™ EE 5 Edition" written by Vladimir Pavlov of the SAP NetWeaver
    development team.
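    The post does not show the controller itself; below is a minimal sketch of such a JNDI lookup client. The JNDI name "TravelAgentBean" and the untyped Object result are assumptions only - the real controller would cast to TravelAgentRemote and use the actual name from the server's JNDI tree.

    ```java
    import javax.naming.InitialContext;
    import javax.naming.NamingException;

    public class TravelAgentClient {

        /** Attempts a JNDI lookup and reports the outcome as a string. */
        static String tryLookup(final String jndiName) {
            try {
                final InitialContext ctx = new InitialContext();
                // A real client would cast the result to TravelAgentRemote;
                // Object keeps this sketch self-contained.
                final Object agent = ctx.lookup(jndiName);
                return "Looked up: " + agent;
            } catch (NamingException e) {
                // Without a configured naming provider the lookup fails.
                return "Lookup failed: " + e.getClass().getSimpleName();
            }
        }

        public static void main(final String[] args) {
            // "TravelAgentBean" is a placeholder name, not the actual one.
            System.out.println(tryLookup("TravelAgentBean"));
        }
    }
    ```

    Outside a configured client container this reports a lookup failure, which is expected; on the server, with the proper InitialContext environment, the lookup returns the bean proxy.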
    Unfortunately I receive an Exception after invoking the createCabin(...) method.
    On the console of the NWDS I receive:
    [code]
    javax.ejb.EJBException: Exception in getMethodReady() for stateless bean sap.com/test2Earannotation|test2Ejb.jarannotation|TravelAgentBean;
    nested exception is: com.sap.engine.services.ejb3.util.pool.PoolException: javax.ejb.EJBException: Cannot perform injection over bean instance
    Caused by: java.lang.RuntimeException: The persistence unit is inconsistent:
    The entity >>de.collogia.beans.pojo.ship.Cabin<< is mapped to the table >>cabin<<, which does not exist.
    [/code]
    But if I look at the log file located in "C:\NWAS_JAVAEE5\JP1\JC00\j2ee\cluster\server0\log\defaultTrace.0.trc"
    I see the real reason is:
    [code]
    [EXCEPTION]
    #6#1064#42000#You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax
    to use near '"cabin"' at line 1#collnx02.collogia.de:3306:null:practise_titan_cruises#select * from "cabin"#com.mysql.jdbc.exceptions.MySQLSyntaxErrorException:
    You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"cabin"' at line 1
         at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2870)
         at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1573)
         at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1665)
         at com.mysql.jdbc.Connection.execSQL(Connection.java:3124)
         at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1149)
         at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1262)
         at com.sap.sql.jdbc.basic.BasicPreparedStatement.executeQuery(BasicPreparedStatement.java:99)
         at com.sap.sql.jdbc.direct.DirectPreparedStatement.executeQuery(DirectPreparedStatement.java:307)
         at com.sap.sql.jdbc.direct.DirectPreparedStatement.executeQuery(DirectPreparedStatement.java:264)
         at com.sap.engine.services.dbpool.wrappers.PreparedStatementWrapper.executeQuery(PreparedStatementWrapper.java:274)
    [/code]
    My goodness - what a long post - sorry for this - I hope I have provided all the information
    necessary to deal with the issue.
    Am I right to suspect the attribute [code]<sql-engine>native_sql</sql-engine>[/code]
    of the data-sources.xml file as the cause of this behaviour? Are there any options other than native_sql?
    Thanks in Advance!
    Henning Malzahn

    Hi Henning,
    > Despite the fact it's working now I have to do some
    > changes to my code currently
    > developed using JBoss/ Hibernate combination.
    > Hibernate allows you to have the
    > default no-arg constructor with private visibility -
    > any special reason for the fact that
    > only protected is allowed on NetWeaver?
    Here we strictly implemented the checks according to the requirements of the JPA specification. Technically, we could do with private constructors as well. But the JPA specification requires the constructor to be protected, to allow a JPA implementation to subclass entities if needed.
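    To illustrate the requirement, a minimal spec-compliant entity shape might look like this (a sketch only, assuming the standard javax.persistence annotations on the classpath; the class name is invented and not from this thread):

    ```java
    @Entity
    public class Booking {        // not final: the provider may subclass it

        @Id
        @GeneratedValue
        private long id;

        /** Protected (not private) no-arg constructor, as the JPA spec requires. */
        protected Booking() {
        }

        public Booking(final long id) {
            this.id = id;
        }
    }
    ```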
    > The entities in the project are final classes
    > so declaring a ctor protected doesn't really make
    > sense...
    For the same reason, your entities should not be final. Are we missing a check here?
    > Also the persistence.xml parameter
    > hibernate.hbm2ddl.auto with the value of
    > create-drop is very useful while
    > developing the app - every time you deploy the project
    > you get a fresh database.
    > Is there a comparable option for NetWeaver?
    No, unfortunately, there is no comparable option in SAP JPA (yet). We understand that there is a need for forward mapping. We would have liked to delegate this task to the JPA design time (i.e. Dali). However, we had to discover that Dali does not perform this task properly and we can't recommend using it any more.
    Consequently, there is no automatic schema generation in SAP JPA 1.0.
    >
    > Another thing is the extra TMP_SEQUENCE table which
    > isn't necessary using JBoss and
    > Hibernate - what's the reason for that?
    With Hibernate Entity Manager, the id generation strategy used with GenerationType.AUTO depends on the database dialect. This means that depending on the database dialect, IDENTITY columns, SEQUENCEs, or generator tables (TableHiLo) are required. As Hibernate has the aforementioned schema generation property, this fact can be hidden from the user.
    In SAP JPA, we are always using a table generator if GenerationType.AUTO is used. This allows for better portability across databases. It requires the table TMP_SEQUENCE. As we unfortunately do not have a schema generation capability, the user must create this table.
    Best regards,
    Adrian
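    Assuming the standard JPA annotations, the implicit table generator Adrian describes can also be spelled out explicitly, which documents where TMP_SEQUENCE comes in (the entity and generator names here are invented for illustration, and SAP JPA's internal defaults may differ):

    ```java
    @Entity
    public class Cruise {

        @Id
        @GeneratedValue(strategy = GenerationType.TABLE, generator = "cruiseGen")
        @TableGenerator(name = "cruiseGen",
                        table = "TMP_SEQUENCE",  // the table SAP JPA expects to exist
                        allocationSize = 10)
        private long id;

        protected Cruise() {
        }
    }
    ```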
