OLTP Reporting

Hi,
do you know if it is possible to use interactive reporting for Service Requests (CRM 7.0)?
Thanks a lot.

hi,
please refer to the links below:
https://www.sdn.sap.com/irj/sdn/crm-elearning
CRM Analytics: Better to use integrated OLTP reporting or BI-System
searchsap.techtarget.com/searchSAP/downloads/EIF_and_SAP_EP_and_BI_May_1_2002.ppt
thanks
swamy
reward me if useful

Similar Messages

  • CRM 6.0 ERMS OLTP Report set up

    Has anyone used the CRM ERMS OLTP reports that are available? I would appreciate any comments or advice on set up, based on our simple scenario for processing email. We are on CRM 2007(s) Version 6.0 and at Support Pack Level 5.
    Our CRM ERMS system is configured to use the SAP SAPconnect interface and store all emails in the tables of the SAP CRM system. Every incoming email is routed to the 311 Agent Inbox, where agents have access to all incoming emails. Each OPEN incoming email in our system is processed by an agent and, once interacted with, is set to status COMPLETE. We do not have any additional business rules defined for automated email processing based on content.
    John Burton's book "Maximizing Your SAP CRM Interaction Centre" describes E-mail reports that can work with our configuration of ERMS using SAPconnect and Agent Inbox for email processing.
    In Section 7.1.3, the note on page 232 states that for our current version (6.0), several CRM-based reports, known as CRM interactive reports, are available. These reports use online transaction processing (OLTP), and the data is retrieved directly from the CRM system.
    In Section 7.5.3, E-Mail Reports, on page 258, there is mention of a report called Average Handling Time (the actual amount of time an agent was actively processing the email) that our Contact Centre would like to use. We would like to start with the standard delivered report, but when we click on the link (using IC Administrator access) there is no report available. Any advice on set up or required authorization is appreciated.
    More Info:  Agent Processing of Email & Related BI Reporting we have available
    All of our available BI reports are based on Service Ticket information.
    Agents interact with the emails, processing replies by creating a correctly coded Service Ticket to wrap each e-mail. The Service Ticket and associated contact account record act like a wrapper, containing both incoming and outgoing emails in the business context. This allows the correct Organization ID and categorization to be associated with each email interaction via the Service Ticket. Service Tickets created when interacting with email are of interaction type "INT".
    This is consistent with the manner in which we drive Organization and Categorization for our Service Tickets created by telephone calls, which are of interaction type "202". These Service Ticket interaction types and their statistical information are reportable via the SAP BI 311 reports. We can tell the volume of Service Tickets created by interacting with email versus the volume created by telephone contact. We do not have telephony integration with our CRM system at this time. We do not have any email record data being transferred to BI at this time.

    Hi Donna,
    How did you solve this issue? We would also like to report on emails, but we are using Interactive Reporting and have no integration with BI. So I wonder whether those reports can also be used via Interactive Reporting. I do not see any suitable OLTP DataSource...
    I would be very grateful for any advice.
    Thanks and best regards,
    Joasia

  • CRM OLTP reporting

    Hi,
    I want to know more about CRM OLTP reporting.
    I would like to know the basic architecture (please provide a link if any) and how the reporting really works.
    Regards,
    Rishav

    Hi ,
    You can do CRM Interactive Reporting without the use of a BI system. You need to have one reporting client and one source client in the CRM system.
    You can find more information about the detailed steps in the link below.
    http://help.sap.com/bp_crm70/CRM_DE/HTML/index.htm
    In this link, look for document C41.
    Thanks,
    Ripal

  • OLTP reporting does not end

    Hi,
    I am trying to use CRM interactive reporting, so I created a separate BI client on the CRM system. With this logic, the CRM client extracts data from the BI client via a virtual data provider (OLTP, not OLAP). I found the function module used on the CRM side from RSO2, so I can now debug it with a breakpoint. But it runs in parallel, the function is called many times rather than only once, and it does not end (I tried it on our test system, where there is only a little data).
    Why does it take such a long time, without ending, when I display data from RSA1? What is the logic behind the OLTP provider?
    Thanks.

    Hi!
    To answer your question, you should provide some more details. What Reporting Type are you referring to (ACTIVITIES, OPPORTUNITIES, CAMPAIGNS, etc.)? In which module did you put your breakpoint, and at which line/statement? Also, I do not understand what you mean by "display data from RSA1".
    Basically, when you execute a CRM Interactive Report (IR), this invokes one dedicated Reporting Type (previously called InfoType). You can see which delivered IRs belong to which Reporting Type in transaction ORDYWB in the CRM client. In this transaction you can also find the name of the underlying BI Query that is used in the BI client to read the data. This query belongs to a virtual InfoCube, which is mapped to a transactional real-time DataSource. The function module that extracts the data for this DataSource can be found in transaction RSO2.
    When the IR is executed in the CRM client, the query is executed via RFC in the BI client. Then, at runtime, the data is read via the mapped DataSource directly from the CRM client (again via RFC, with the function module that you can find in RSO2).
    There are more DataSources involved for reading texts, but they cannot be found in the same way as described above.
    Best regards,
    Sven Kriebel

  • CRM 2007 Interactive OLTP Reports - Showing No Data

    Hi,
    I am using CRM 2007 with the in-built BI client and want to activate OLTP interactive reports. I have done the settings as per the C41_BB_ConfigGuide_EN_DE.doc document provided by SAP. But when I execute these OLTP queries, they show these errors:
    1. System error in program SAPLRSOA and form FUNC RSOA_VCUBE_READ_REMOTE_DATA[6]
    2. OLAP: General error on BI side.
    And shows no data.
    Please suggest a solution.....
    Best Regards,
    Lokesh Kumar
    Please reply, someone.

    Hello Lokesh,
    In SM30, maintain the table SMOFPARSFA with the PARAMVAL field value for your BI interactive client.
    I think it should help.
    Best regards,
    Partha Das

  • Best OLTP Reporting Tool

    Hi Gurus,
    My company asked me to build a reporting framework to support their needs on OLTP, and I am trying to do some research on which tool is best for reporting on OLTP.
    I need help finding the best reporting tool for building reports on an OLTP database.
    Right now we are using OBIEE for our OLAP reporting, and we are open to implementing any tools; I need input and comparisons between all the available tools, including pros and cons.
    I appreciate all your inputs/suggestions.
    Thanks,
    Seshu.

    1) "I guess you tried to pull thousands of report rows on screen. Firstly, OBIEE does pagination perfectly. Secondly, a human being cannot handle more than 30-50 lines physically. It is simple psychology. As soon as you've jumped a couple of pages further you have no idea what was on the first pages. I've never seen such an idiot who would need a report with even a few hundred rows. Such reports are an example of bad practice."
    I agree with that, but you missed my point. Enterprises are large "beasts" and as such they have many departments where they use different tools. In an ideal world you would have one and only one DWH with one and only one reporting tool on top. The truth is that this never ever happens in reality. So in many cases users are asked to provide large data extracts to use as sources for other reporting systems or external entities. OBIEE cannot handle these data extracts properly, as it is not designed to work as an "extraction tool". This is what you missed. BO can handle them very well, as it was designed with extraction in mind.
    2) "Here you are right if it is all about the dimensional model. However, here is the difference between OLTP and a data warehouse. In a data warehouse users should be educated to know conceptual and logical models. BTW you are getting requirements from them and you say they don't know which dims can be pulled with facts and which ones cannot? Wow )) Also OBIEE does allow pulling such dims (not connected to facts) using a very simple OBIEE modeling technique (level-based measures). That makes OBIEE reporting/analytics very flexible."
    You are only thinking dimensionally. That's very good for an OBIEE developer, but the world is more than just dimensional. You can't answer all reporting requirements with a dimensional model, I'm afraid, so keep this in mind. A DWH is not the solution to all report requirements, just like saying "a truck is the best vehicle in the world because it's big and can transport big loads". BTW, have you tried to implement cross-fact analysis in OBIEE? Do you know that you need to set the content level for each non-conformed dimension in each measure individually, or OBIEE will bring back the measure as NULL? It's nonsense really; it's too tricky and too flaky.
    3) "OBIEE has a GUI object called PROMPT. There you have full power to force users to apply filters. And you can easily pre-define your prompt's filter values."
    Again you missed a key word in my statement: "ad-hoc users". Prompts are fine, but they don't exist unless the OBIEE developer creates them in advance, typically in a dashboard. I am talking about ad-hoc here, so that means Answers. Try to restrict Answers users and force them to put a filter on a column and you will know what I mean. BO can do this in the metadata very easily; OBIEE can't. Prompts are fine (*), but they need to be created in advance by the report developer. (*) Actually they are not that fine: you can't even force users OOTB to use a prompt, or stop a dashboard from running without any values. Sure, there are workarounds, but such a basic feature should be part of the product!
    4) "OBIEE has different concepts of dimension hierarchies and visual presentations. Here you are mixing them together. Indeed OBIEE 10g has such a restriction in the GUI, a maximum of two embedded levels, but it is only visual presentation. In the business model you can have N levels of hierarchy and multiple hierarchies in a dim. My goodness!!! Do you really think that ORACLE does not follow dimensional modeling ABCs? BTW please check Gartner's magic quadrants. BTW did you see the OBIEE 11g demo?"
    Yes, it's only visual presentation, but that's one of the most important things! It's a stupid restriction. If you can have N levels in the BMM, why shouldn't you have the ability to present the data in more levels? Yes, I have seen the magic quadrant, and Microsoft is on top of OBIEE (see http://www.gartner.com/technology/media-products/reprints/microsoft/vol2/article15/article15.html). Yes, I have seen OBIEE 11g and have it running on one of our dev boxes. It's one of the most buggy releases I have seen in years. Only last week the client tools were released for Windows. How can you have a proper ecosystem without proper client tools? There is no Solaris or HP-UX release, so it's a partial release, and there is no support for other J2EE web app servers. From our point of view OBIEE 11g is a beta release. We are sitting and waiting for Oracle to clean up this mess and get the product stable. And if you think Oracle Support will help, read this: http://searchoracle.techtarget.com/news/2240031826/Has-the-phrase-Oracle-Support-become-an-oxymoron
    5) "Inheritance... Hm, maybe it is the simple dimensional modeling concept of supertype/subtype described in any BI/DW book? I am not arguing. Maybe BO does it with one click. Yeah, convenient enough, but not critical."
    Consider this problem. You have a large DWH that a large organisation wants to share with different departments, which they would then extend by adding their own data sources and customise to their own needs. How would you handle changes in the "core" model that need to feed into the different departments' metadata stores? RPD merging and MUDE are a mess to be honest; they barely work and are too prone to human error. Once you have worked on a very large project with many OBIEE developers working on different parallel projects, you will realise that inheritance is the only sensible solution. I wish Oracle would see this as well. If the metadata was in a database it would have been so much easier...
    6) "Again, just lack of knowledge of dimensional modeling in OBIEE. It supports ALL types of models: snowflakes (3NF) and star. The only thing OBIEE requires is that you cannot have a standalone logical table without any relation to at least one other table. Heh... I cannot recall such an example even from Bill Inmon. Regarding 3 layers: this gives you a tremendous advantage over BO )) Why? It is a long story ))"
    You can support all models in a dimensional tool, but should you be forced to use it? The 3 layers are not an advantage; they are an interesting approach to solving how you need to model things in OBIEE. You have become too used to it and now think it's the best in the world. It's not: in a relational reporting solution two layers are more than enough (physical, with all your joins, and presentation, with all your business names and hierarchies).
    8) "So I understood you correctly ))))))) None of the BI tools today can beat OBIEE on the number of supported data sources. I'm just wondering: when you add a new column in a dim/fact table, you DO NOT change your metadata????? My assumption is that BO always does "select * from <table name>"? Yes, then you don't have to change your metadata. Then what kind of query does BO generate against the database??? select d1.*, f1.* from dim d1, fact f1...????"
    OK, consider this requirement. You work for a top US bank. You have invested $5m in a state-of-the-art DWH with OBIEE that covers every requirement you may ever need. The US Fed comes in and tells you that it wants one of your excellent reports, but filtered for a number of companies. They give the user a spreadsheet with 20k companies that you need to filter your report on. Can any user in OBIEE perform the filtering without any changes in OBIEE? No. In BO you can. End of the argument.
    9) "That is not true AT ALL."
    You are saying that OBIEE can "merge" two Answers reports and get attributes from both queries in an INNER/OUTER/FULL OUTER fashion? Please show me how; I'd love to learn from you! (BO indeed can.)
    10) "You're kidding me )) Have you forgotten what times we are living in??))) Do you read the latest news? By 2016 most businesses will be in social networks. Everyone wants to be a power user holding a mobile device in his hands!!! AND..... OBIEE already DOES it!!!"
    The web client works for 90% of the user population; as said in my comment, for real "power users" a fat-client approach works best. Browsers are still not powerful enough to handle large data sets. Going back to Gartner's 2011 Quadrant for Business Intelligence Platforms, which you seem to love, I quote from it: "[...] it (Oracle) has lacked innovation around mobile, in-memory, consumerization, interactive visualization and search [...]".
    "Please forgive me for being so sarcastic. But today the only real advantage of BO over OBIEE is PRICE. I would really advise you to look at OBIEE 11g. It is almost a different product than OBIEE 10g."
    I love sarcasm; sarcasm is my middle name. But you need to open your mind a bit. OBIEE has taken control of your brain! :-) One solution does not fit all. The world of IT is too big for that...

  • Delivery status Report

    Hi ,
    Can anybody tell me which processes (Picking, PGI, etc.) have to be finished before the delivery is set to completed? I want to know the order of processing
    of the delivery document, along with the relevant table fields.
    Regards,
    Vigneswaran S

    Hi Vignes
    it would be better to post your thread in the R/3 SD forum; this seems to be OLTP reporting.
    thanks
    ViVa

  • Interactive reports vs BW reports

    Hello Friends,
    Please provide clarification on the points below.
    1. Limitations of CRM Interactive reports (I know they are used for real-time OLTP reporting and use a BW client).
    2. Can we customize and enhance BW reports or objects in the BW client?
    3. Can we use a separate BW system for Interactive reports as well as use it as a general BW system?
    Regards,
    Ravi

    You can find some information at www.businessobjects.com or www.crystalreports.com, for example the following:
    http://www.businessobjects.com/global/pdf/solutions/xi_sap_insight.pdf

  • CRM Interactive Reports not working

    hi,
    In CRM 2007 we have this new feature of interactive reports, where we can create OLTP reports. I am trying to explore this feature, but when I click on the Create Report option, I get the message "Maintain RFC Destination for local BI".
    When I go to SM59 and look at the destinations, I can see a connection created to the local system under the "Internal Connections" folder.
    Please help me resolve this issue.
    regards
    Raghav

    Hi Raghav,
    Follow this and you will get it:
    http://help.sap.com/saphelp_crm60/helpdata/de/44/8e32e6e1663674e10000000a114a6b/frameset.htm
    reward points if helpful
    cheers
    Manohar

  • Delta for Generic extractor using function module

    Hi,
    I am using the following function module for a generic extractor, but it always shows me an extraction error. Could anyone please suggest how to resolve the issue?
    Thanks in advance for any suggestion.
    FUNCTION Z_BW_SALESDATA_EXTRACT_CHNG2.
    ""Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_ISOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_PRIVATE_MODE) OPTIONAL
    *"     VALUE(I_CALLMODE) LIKE  ROARCHD200-CALLMODE OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA OPTIONAL
    *"      E_T_SOURCE_STRUCTURE_NAME OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * The input parameter I_DATAPAKID is not supported yet !
    * Example: InfoSource containing TADIR objects
      TABLES: VBFA,VBRK,tadir.
    * Auxiliary Selection criteria structure
      DATA: l_s_select TYPE sbiwa_s_select.
    * Maximum number of lines for DB table
      STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
    * user defined variables.
      DATA : X_UPDMODE(1) Type c,        " Update Type
    *        X_LastUpdate Like Sy-Datum. " Last Update Date
             X_LastUpdate TYPE RODELTAID. " Last Update Date   "TCS_KAP
    * Select ranges
      RANGES: L_R_VBELN   FOR VBAK-VBELN,
              L_R_AUDAT   FOR VBAK-AUDAT.
    * Parameter I_PRIVATE_MODE:
    * Some applications might want to use this function module for other
    * purposes as well (e.g. data supply for OLTP reporting tools). If the
    * processing logic has to be different in this case, use the optional
    * parameter I_PRIVATE_MODE (not supplied by BIW !) to distinguish
    * between BIW calls (I_PRIVATE_MODE = SPACE) and other calls
    * (I_PRIVATE_MODE = X).
    * If the message handling has to be different as well, define Your own
    * messaging macro which interprets parameter I_PRIVATE_MODE. When
    * called by BIW, it should use the LOG_WRITE macro, otherwise do what
    * You want.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * The input parameter I_DATAPAKID is not supported yet !
    * Invalid second initialization call -> error exit
        IF NOT g_flag_interface_initialized IS INITIAL.
          IF 1 = 2. MESSAGE e008(r3). ENDIF.
          log_write 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE error_passed_to_mess_handler.
          ENDIF.
    * Check InfoSource validity
        CASE i_isource.
          WHEN 'Z_BW_SDDATA_CREATEON'.
          WHEN 'Y'.
          WHEN 'Z'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_isource            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
    * Check for supported update mode
       CASE i_updmode.
         WHEN 'F'.
         WHEN OTHERS.
           IF 1 = 2. MESSAGE e011(r3). ENDIF.
           log_write 'E'                  "message type
                     'R3'                 "message class
                     '011'                "message number
                     i_updmode            "message variable 1
                     ' '.                 "message variable 2
           RAISE error_passed_to_mess_handler.
       ENDCASE.
    * Check for obligatory selection criteria
       READ TABLE i_t_select INTO l_s_select WITH KEY fieldnm = 'VBELN'.
       IF sy-subrc <> 0.
         IF 1 = 2. MESSAGE e010(r3). ENDIF.
         log_write 'E'                    "message type
                   'R3'                   "message class
                   '010'                  "message number
                   'PGMID'                "message variable 1
                   ' '.                   "message variable 2
         RAISE error_passed_to_mess_handler.
       ENDIF.
       APPEND LINES OF i_t_select TO g_t_select.
    * Fill parameter buffer for data extraction calls
        g_s_interface-requnr    = i_requnr.
        g_s_interface-isource   = i_isource.
        g_s_interface-maxsize   = i_maxsize.
        g_s_interface-initflag  = i_initflag.
        g_s_interface-updmode   = i_updmode.
        g_s_interface-datapakid = i_datapakid.
        g_flag_interface_initialized = sbiwa_c_flag_on.
    * Store Update mode in static variable...
       X_UPDMODE = I_UPDMODE.
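    * The SELECT below reads the last delta pointer (DELTAID) stored for this
    * DataSource in table ROOSGENDLM (generic delta management); if no pointer
    * exists yet, an initial date is used instead.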
        Select Single DELTAID
               Into   X_LastUpdate
               From   ROOSGENDLM
               Where  OLTPSOURCE = 'Z_BW_SDDATA_CREATEON'.
        If X_LastUpdate Is Initial.
          X_LastUpdate = '19800101'. " The oldest..., this should not happen
        Endif.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF i_t_fields TO g_t_segfields.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
        X_UPDMODE = I_UPDMODE.
    * First data package -> OPEN CURSOR
        IF g_counter_datapakid = 0.
    * Fill range tables for fixed InfoSources. In the case of generated
    * InfoSources, the usage of a dynamical SELECT statement might be
    * more reasonable. BIW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'VBELN'.
            MOVE-CORRESPONDING l_s_select TO L_R_VBELN.
            APPEND L_R_VBELN.
          ENDLOOP.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'AUDAT'.
            MOVE-CORRESPONDING l_s_select TO L_R_AUDAT.
            APPEND L_R_AUDAT.
          ENDLOOP.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between InfoSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
    *added by Yogesh
          DELETE FROM ZBWSALEDATA.
    *ended by Yogesh
          l_maxsize = g_s_interface-maxsize.
    * Check for supported update mode
          CASE X_UPDMODE.
            WHEN 'F' Or 'I'. " Full or Init
    **get sales orders
              SELECT *
                           INTO CORRESPONDING FIELDS OF TABLE I_SALESDATA
                           FROM VBAK AS K
                           INNER JOIN VBAP AS P
                              ON  P~VBELN = K~VBELN
                           WHERE K~VBELN IN L_R_VBELN
                           AND   K~AUDAT IN L_R_AUDAT
                           AND   K~VBTYP = 'C'.
              IF I_SALESDATA[] IS NOT INITIAL.
                PERFORM F_GET_DATA.                   "get data
                PERFORM F_DEL_VATCOND.                "delete vat conditions
                PERFORM F_GET_TAR_COSTUNIT.           "get tariff cost unit
                PERFORM F_GET_NETVAL TABLES IT_VBDPA1. "get net val of the item
                PERFORM F_GET_SALESDATA.
    *write perform with name F_INSERT_DB_ZBWSALEDATA.
                PERFORM F_INSERT_DB_ZBWSALEDATA.      "insert into table ZBWSALESDATA
               LOOP AT I_SALESDATA WHERE NETPR = 0.
                 MOVE-CORRESPONDING I_SALESDATA TO I_SALESDATA_FIN.
                 APPEND I_SALESDATA_FIN.
               ENDLOOP.
               SORT I_SALESDATA_FIN BY VBELN POSNR KSCHL.
               DELETE ADJACENT DUPLICATES FROM I_SALESDATA_FIN
                                               COMPARING VBELN POSNR
                                                         KSCHL.
               INSERT ZBWSALEDATA FROM TABLE I_SALESDATA_FIN.
              ENDIF.
            WHEN 'D'. " Delta
              SELECT *
                             INTO CORRESPONDING FIELDS OF TABLE I_SALESDATA
                             FROM VBAK AS K
                             INNER JOIN VBAP AS P
                                    ON  P~VBELN = K~VBELN
    *                         WHERE K~ERDAT >= X_LastUpdate
                              WHERE K~ERDAT >= X_LastUpdate+0(8)          "TCS_KAP
                             AND   K~VBTYP = 'C'.
              IF I_SALESDATA[] IS NOT INITIAL.
                PERFORM F_GET_DATA.                   "get data
                PERFORM F_DEL_VATCOND.                "delete vat conditions
                PERFORM F_GET_TAR_COSTUNIT.           "get tariff cost unit
                PERFORM F_GET_NETVAL TABLES IT_VBDPA1. "get net val of the item
                PERFORM F_GET_SALESDATA.
    *write perform with name F_INSERT_DB_ZBWSALEDATA.
                PERFORM F_INSERT_DB_ZBWSALEDATA.      "insert into table ZBWSALESDATA
               LOOP AT I_SALESDATA WHERE NETPR = 0.
                 MOVE-CORRESPONDING I_SALESDATA TO I_SALESDATA_FIN.
                 APPEND I_SALESDATA_FIN.
               ENDLOOP.
               SORT I_SALESDATA_FIN BY VBELN POSNR KSCHL.
               DELETE ADJACENT DUPLICATES FROM I_SALESDATA_FIN
                                               COMPARING VBELN POSNR
                                                         KSCHL.
               INSERT ZBWSALEDATA FROM TABLE I_SALESDATA_FIN.
              ENDIF.
            WHEN OTHERS.
              IF 1 = 2. MESSAGE E011(R3). ENDIF.
              LOG_WRITE 'E' "message type
              'R3' "message class
              '011' "message number
              I_UPDMODE "message variable 1
              ' '. "message variable 2
              RAISE ERROR_PASSED_TO_MESS_HANDLER.
          ENDCASE.
    **end get sales orders
          OPEN CURSOR WITH HOLD g_cursor FOR
          SELECT (g_t_fields) FROM ZBWSALEDATA.
    *      WHERE pgmid  IN l_r_pgmid AND
    *            object IN l_r_object.
        ENDIF.                             "First data package ?
    * Fetch records into interface table. There are two different options:
    * - fixed interface table structure for fixed InfoSources have to be
    *   named E_T_'Name of assigned source structure in table ROIS'.
    * - for generating applications like LIS and CO-PA, the generic table
    *   E_T_DATA has to be used.
    * Only one of these interface types should be implemented in one API !
        FETCH NEXT CURSOR g_cursor
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE E_T_DATA
                   PACKAGE SIZE l_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR g_cursor.
          RAISE no_more_data.
        ENDIF.
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDIF.              "Initialization mode or data extraction ?
    Best regards,
    Yogesh

    Dear Yogesh,
    Try to debug the code by putting a breakpoint at the SELECT statement. Is the FM syntactically correct?
    Check the SELECT statements: if not all fields are required, avoid using SELECT *, and prefer FOR ALL ENTRIES over JOINs (see the sketch after this reply).
    Note: for joins you have to specify the fields in the SELECT; don't use SELECT *.
    Hope it helps..
    Thanks,
    Krish
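
    To make the FOR ALL ENTRIES suggestion concrete, here is a minimal sketch (not from the original post; the lt_vbak/lt_vbap names are invented, while VBAK/VBAP and the L_R_* ranges are the ones used in the FM above) of reading with explicit field lists instead of SELECT * with a join:
    * Sketch only: explicit field lists plus FOR ALL ENTRIES instead of
    * SELECT * with an inner join (lt_vbak/lt_vbap are illustrative names).
      DATA: lt_vbak TYPE STANDARD TABLE OF vbak,
            lt_vbap TYPE STANDARD TABLE OF vbap.
      SELECT vbeln audat vbtyp
        FROM vbak
        INTO CORRESPONDING FIELDS OF TABLE lt_vbak
        WHERE vbeln IN l_r_vbeln
          AND audat IN l_r_audat
          AND vbtyp = 'C'.
      IF lt_vbak[] IS NOT INITIAL.
    * FOR ALL ENTRIES needs a non-empty driver table, otherwise the
    * condition is ignored and all rows would be selected.
        SELECT vbeln posnr matnr netpr
          FROM vbap
          INTO CORRESPONDING FIELDS OF TABLE lt_vbap
          FOR ALL ENTRIES IN lt_vbak
          WHERE vbeln = lt_vbak-vbeln.
      ENDIF.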

  • How to populate the ranges using FM for the SELECTs

    Hi,
    I am still working on the FM to create a generic extractor. I went through the debugger but I am still unable to determine how the ranges are populated. RSA3 always gives me zero values for the results.
    There is a RANGES statement in the sample FM, and the following statements for the SELECTs:
      RANGES: L_R_CARRID  FOR SFLIGHT-CARRID,
              L_R_CONNID  FOR SFLIGHT-CONNID.
    and...
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CARRID'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_CARRID.
            APPEND L_R_CARRID.
          ENDLOOP.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CONNID'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_CONNID.
            APPEND L_R_CONNID.
          ENDLOOP.
    My question is: how are L_R_CONNID and L_R_CARRID populated with low and high values for the SELECT statements? I tried to find the DataSource 0SAPI_SFLIGHT_SIMPLE to run it and see how it is set up, but there is no such DataSource (see the sketch at the end of this post for how the selections arrive).
    Would someone take the time to explain this in several sentences? I have my own code, and it seems that it is not populating the values for the SELECTs when I debug from RSA3 and provide the low and high values.
    Would I normally populate the low and high values from the InfoPackage 'Data Selection' tab once I have implemented this in BW or am ready to test in BW? That would mean I have to mark those fields as selection fields in RSO2. Anyway, I think I have asked about this before, but I am hoping to get an answer to get this going...
    Appreciate any replies.
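
    For reference, a minimal sketch (illustrative only; the literal values are invented, and the loop and names are the ones from the sample FM quoted above) of how a selection entered in RSA3 or in the InfoPackage reaches the ranges:
    * One S_S_IF-T_SELECT row arrives per selection interval, for example:
    *   FIELDNM = 'CARRID', SIGN = 'I', OPTION = 'BT', LOW = 'AA', HIGH = 'LH'.
    * MOVE-CORRESPONDING copies SIGN, OPTION, LOW and HIGH into the range row:
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CARRID'.
        MOVE-CORRESPONDING L_S_SELECT TO L_R_CARRID.
        APPEND L_R_CARRID.
      ENDLOOP.
    * If no selection is entered for CARRID, nothing is appended, the range
    * table stays empty, and "WHERE carrid IN l_r_carrid" selects all rows.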

    Hi
    Here is an example of an extractor that uses
    both methods: if the InfoPackage selection exists, it overrides the TVARV selection (which is the default).
    FUNCTION ZBW_TC_FORECAST_SO_EXTRACTOR.
    ""Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_CHABASNM) TYPE  SBIWA_S_INTERFACE-CHABASNM OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_PRIVATE_MODE) OPTIONAL
    *"     VALUE(I_CALLMODE) TYPE  ROARCHD200-CALLMODE OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBW_TC_FORECASTING_EXT_STR OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    *"      HIERARCHY_NOT_FOUND
    * Change History                                                      *
    * Mod. #  |  Date    |  Developer     |  Description                  *
    *RD3K915762|06/21/2005| SRangaraj      | Change selection of open SO   *
    *          |          |                | data to include deleted matls *
    *          |          |                | and obsolete items too        *
    *RD3K915888|06/29/2005| SRANGARAJ      | Add ext matl grp and lab offce
    *          |          |                | filters for data-selection    *
    * The input parameter I_DATAPAKID is not supported yet !
    * Auxiliary Selection criteria structure
      DATA: L_S_SELECT TYPE SBIWA_S_SELECT.
    * Maximum number of lines for DB table
      STATICS L_MAXSIZE TYPE SBIWA_S_INTERFACE-MAXSIZE.
    * Parameter I_PRIVATE_MODE:
    * Some applications might want to use this function module for other
    * purposes as well (e.g. data supply for OLTP reporting tools). If the
    * processing logic has to be different in this case, use the optional
    * parameter I_PRIVATE_MODE (not supplied by BIW !) to distinguish
    * between BIW calls (I_PRIVATE_MODE = SPACE) and other calls
    * (I_PRIVATE_MODE = X).
    * If the message handling has to be different as well, define Your own
    * messaging macro which interprets parameter I_PRIVATE_MODE. When
    * called by BIW, it should use the LOG_WRITE macro, otherwise do what
    * You want.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * The input parameter I_DATAPAKID is not supported yet !
    * Invalid second initialization call -> error exit
        IF NOT G_FLAG_INTERFACE_INITIALIZED IS INITIAL.
          IF 1 = 2. MESSAGE E008(R3). ENDIF.
          LOG_WRITE 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDIF.
    * Check InfoSource validity
        CASE I_DSOURCE.
          WHEN 'ZBW_TC_SO_EXTRACT'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE            "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
    * Check for supported update mode
       CASE I_UPDMODE.
         WHEN 'F'.
         WHEN OTHERS.
           IF 1 = 2. MESSAGE E011(R3). ENDIF.
           LOG_WRITE 'E'                  "message type
                     'R3'                 "message class
                     '011'                "message number
                     I_UPDMODE            "message variable 1
                     ' '.                 "message variable 2
           RAISE ERROR_PASSED_TO_MESS_HANDLER.
       ENDCASE.
    * Check for obligatory selection criteria
       READ TABLE I_T_SELECT INTO L_S_SELECT WITH KEY FIELDNM = 'PGMID'.
       IF SY-SUBRC <> 0.
         IF 1 = 2. MESSAGE E010(R3). ENDIF.
         LOG_WRITE 'E'                    "message type
                   'R3'                   "message class
                   '010'                  "message number
                   'PGMID'                "message variable 1
                   ' '.                   "message variable 2
         RAISE ERROR_PASSED_TO_MESS_HANDLER.
       ENDIF.
        APPEND LINES OF I_T_SELECT TO G_T_SELECT.
    * Fill parameter buffer for data extraction calls
        G_S_INTERFACE-REQUNR    = I_REQUNR.
        G_S_INTERFACE-ISOURCE   = I_DSOURCE.
        G_S_INTERFACE-MAXSIZE   = I_MAXSIZE.
        G_S_INTERFACE-INITFLAG  = I_INITFLAG.
        G_S_INTERFACE-UPDMODE   = I_UPDMODE.
        G_S_INTERFACE-DATAPAKID = I_DATAPAKID.
        G_FLAG_INTERFACE_INITIALIZED = SBIWA_C_FLAG_ON.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF I_T_FIELDS TO G_T_FIELDS.
    * Fill range tables for fixed InfoSources. In the case of generated
    * InfoSources, the usage of a dynamical SELECT statement might be
    * more reasonable. BIW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
        LOOP AT G_T_SELECT INTO L_S_SELECT.
          CASE L_S_SELECT-FIELDNM.
            WHEN 'PRDHA'.
              WGF_PRDHA_LENGTH = STRLEN( L_S_SELECT-LOW ).
              IF WGF_PRDHA_LENGTH = 6.  "PARTIAL PRDHA
                 WGF_PRDHA = L_S_SELECT-LOW.
                 CONCATENATE WGF_PRDHA '%' INTO WGF_PRDHA.
              ELSEIF WGF_PRDHA_LENGTH = 12.  "FULL PRDHA
              MOVE-CORRESPONDING L_S_SELECT TO L_R_PRDHA.
              APPEND L_R_PRDHA.
              ENDIF.
            WHEN 'MATKL'.
              MOVE-CORRESPONDING L_S_SELECT TO L_R_MATKL.
              APPEND L_R_MATKL.
          ENDCASE.
        ENDLOOP.
    * reset the index of where we are in the gt_header table
      g_tabix = 0.
      perform populate_default_variables.
      perform get_data.
      perform build_detail.
        EXIT.
      ENDIF.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
       IF G_COUNTER_DATAPAKID = 0.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between InfoSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
      DESCRIBE TABLE LT_DATA LINES l_count.
      IF g_tabix GE l_count.
        RAISE no_more_data.
      ENDIF.
    *CLEAN UP THE OUTPUT TABLE
      refresh E_T_DATA.
      LOOP AT LT_DATA FROM G_TABIX INTO LS_DATA.
        APPEND LS_DATA TO E_T_DATA.
    * Set global counter
        g_tabix = g_tabix + 1.
      ENDLOOP.
       G_COUNTER_DATAPAKID = G_COUNTER_DATAPAKID + 1.
    ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    * Forms
    ***INCLUDE LZBW_TC_FORECAST_SO_EXTF01 .
    *&      Form  populate_default_variables
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM populate_default_variables.
      data: wlf_name like tvarv-name.
      clear: R_prdh3[], wlf_name.
    *get the exclusion range from tvarv for the product hierarchy in
    *question
      concatenate 'ZBW_EXL_' WGF_PRDHA(6) INTO WLF_NAME.
    SELECT LOW FROM TVARV INTO R_prdh3-low WHERE
                                 NAME = WLF_NAME.
      move:  'I'    to R_prdh3-sign,
             'EQ'   to R_prdh3-option.
      append R_prdh3.
      clear R_prdh3.
    ENDSELECT.
    *{Start of insert by SRangaraj on June 29, 2005 >>RD3K915888
    CLEAR L_R_LABOR[].
    SELECT LOW FROM TVARV INTO L_R_LABOR-Low WHERE
                                 NAME = 'ZBW_TC_FORECAST_LAB_OFF'.
      move:  'I'    to L_R_LABOR-sign,
             'EQ'   to L_R_LABOR-option.
      append L_R_LABOR.
      clear L_R_LABOR.
    ENDSELECT.
    *}End of insert by SRangaraj on June 29, 2005 >>RD3K915888
    ENDFORM.                    " populate_default_variables
    *&      Form  get_data
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM get_data.
      data: wlf_lmeng like vbep-lmeng.
    *get all deliveries for date range for either a range of product hrchy
    *or a like value
      refresh int_records1.
      if wgf_prdha ne space.
        select i~vbeln i~posnr i~KLMENG j~vkorg i~werks
               i~matnr i~meins m~prdha m~matkl
        from vbap as i
           INNER JOIN VBAK AS j
           ON ( j~vbeln = i~vbeln
                and j~vbtyp = 'C' )
           INNER JOIN vbuk AS k
           ON ( k~vbeln = i~vbeln
                and k~lfgsk <> 'C'
                and k~gbstk <> 'C' )
           INNER JOIN vbup AS l
           ON ( l~vbeln = i~vbeln and
                l~posnr = i~posnr
                and l~lfgsa <> 'C'
                and l~gbsta <> 'C' )
           INNER JOIN mara AS m
           ON ( m~matnr = i~matnr
    *{Start of insert by SRangaraj on June 21, 2005 >>RD3K915762
                and m~lvorm eq ' '
                and m~mstae ne '99'
    *{Start of insert by SRangaraj on June 29, 2005 >>RD3K915888
                and m~extwg = '080' )
    *}End of insert by SRangaraj on June 29, 2005 >>RD3K915888
           INNER JOIN marc AS n
           ON ( n~matnr = i~matnr
                and n~werks = i~werks
                and n~lvorm eq ' ' )
    *}End of insert by SRangaraj on June 21, 2005 >>RD3K915762
        into table int_records1 where ( i~abgru = '  '
                                     and i~klmeng > 0
                                     and m~prdha like wgf_prdha
                                     and m~matkl in l_r_matkl
    *{Start of insert by SRangaraj on June 29, 2005 >>RD3K915888
                                     and m~labor in l_r_labor ).
    *}End of insert by SRangaraj on June 29, 2005 >>RD3K915888
      elseif not l_r_prdha[] is initial and wgf_prdha = space.
        select i~vbeln i~posnr i~KLMENG j~vkorg i~werks
               i~matnr i~meins m~prdha m~matkl
        from vbap as i
           INNER JOIN VBAK AS j
           ON ( j~vbeln = i~vbeln
                and j~vbtyp = 'C' )
           INNER JOIN vbuk AS k
           ON ( k~vbeln = i~vbeln
                and k~lfgsk <> 'C'
                and k~gbstk <> 'C' )
           INNER JOIN vbup AS l
           ON ( l~vbeln = i~vbeln and
                l~posnr = i~posnr
                and l~lfgsa <> 'C'
                and l~gbsta <> 'C' )
           INNER JOIN mara AS m
           ON ( m~matnr = i~matnr
    *{Start of insert by SRangaraj on June 21, 2005 >>RD3K915762
                and m~lvorm eq ' '
                and m~mstae ne '99'
    *{Start of insert by SRangaraj on June 29, 2005 >>RD3K915888
                and m~extwg = '080' )
    *}End of insert by SRangaraj on June 29, 2005 >>RD3K915888
           INNER JOIN marc AS n
           ON ( n~matnr = i~matnr
                and n~werks = i~werks
                and n~lvorm eq ' ' )
    *}End of insert by SRangaraj on June 21, 2005 >>RD3K915762
        into table int_records1 where ( i~abgru = '  '
                                     and i~klmeng > 0
                                     and m~prdha in l_r_prdha
                                     and m~matkl in l_r_matkl
    *{Start of insert by SRangaraj on June 29, 2005 >>RD3K915888
                                     and m~labor in l_r_labor ).
    *}End of insert by SRangaraj on June 29, 2005 >>RD3K915888
    endif.
        sort int_records1 by vbeln posnr.
        delete adjacent duplicates from int_records1 comparing
        vbeln posnr.
    *remove unnecessary records
        if not r_prdh3[] is initial.
        DELETE INT_RECORDS1 WHERE PRDHA+6(3) IN r_prdh3.
        endif.
    *get the schedule lines for all of the above records and
    *get the lowest schedule line date per so line item
         if not int_records1[] is initial.
         refresh int_records3.
         select vbeln posnr etenr mbdat into table int_records3
         from vbep for all entries in int_records1
                           where vbeln = int_records1-vbeln and
                                 posnr = int_records1-posnr and
                                 lmeng > 0.
         sort int_records3 by vbeln posnr etenr mbdat ascending.
         loop at int_Records1.
           loop at int_records3 where vbeln = int_records1-vbeln
                                  and posnr = int_records1-posnr.
             int_records1-mbdat = int_records3-mbdat.
             modify int_records1.
             exit.
            endloop.
         endloop.
         refresh int_records3. free int_records3.
         refresh int_records2.
    *get the deliveries and calculate the open quantities
        select vbelv posnv vbeln posnn rfmng plmin
                    from vbfa into table int_records2
                                  for all entries in int_Records1
                                  where vbelv = int_records1-vbeln
                                    and posnv = int_records1-posnr
                                    and VBTYP_N = 'J'. "Dels
    *calculate open quantities next
         loop at int_records1.
           clear wlf_lmeng.
           clear int_records2.
           loop at int_records2 where vbelv = int_records1-vbeln
                                    and posnv = int_records1-posnr.
           case int_records2-plmin.
             when '-'.
              wlf_lmeng = wlf_lmeng - int_records2-rfmng.
             when others.  "just add
              wlf_lmeng = wlf_lmeng + int_records2-rfmng.
            endcase.
           endloop.
           int_records1-klmeng = int_records1-klmeng - wlf_lmeng.
           int_records1-vbeln_dl = int_records2-vbeln.
           int_records1-posnr_dl = int_records2-posnn.
           modify int_records1.
        endloop.
        endif.
        delete int_records1 where klmeng le 0.
        refresh int_records2. free int_Records2.
    ENDFORM.                    " get_data
    *&      Form  build_detail
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM build_detail.
        LOOP AT int_records1.
    *DO INDIVIDUAL MOVES - ITS FASTER THAN MOVE-CORRESPONDING
        move: int_records1-vkorg    to LS_DATA-VKORG,
              int_records1-werks    to LS_DATA-WERKS,
              int_records1-matnr    to LS_DATA-MATNR,
              int_records1-klmeng   to LS_DATA-KLMENG,
              int_records1-mbdat(6) to LS_DATA-YEARMONTH,
              int_records1-meins    TO LS_DATA-MEINS,
              int_records1-vbeln    TO LS_DATA-VGBEL,
              int_records1-posnr    TO LS_DATA-VGPOS,
              int_records1-vbeln_dl TO LS_DATA-VBELN,
              int_records1-posnr_dl TO LS_DATA-POSNR,
              int_records1-mbdat    to LS_DATA-WADAT_IST,
              int_records1-PRDHA    to LS_DATA-PRDHA,
              int_records1-matkl    to LS_DATA-MATKL.
        APPEND LS_DATA TO LT_DATA.
        clear: LS_DATA.
        ENDLOOP.
    ENDFORM.                    " build_detail

  • Discoverer 10g and Cognos

    Does anybody have any document outlining differences between capabilities between discoverer and cognos?
    How effective a reporting tool is Discoverer when it comes to OLTP reporting?
    Thanks

    You should check with your Oracle or Cognos sales consultant. Depending on whom you ask you will get an answer weighted towards one vendor or the other :)
    What do you mean by reporting? Discoverer is a tool that you can use for both ad-hoc querying and reporting. It also offers integrated reporting against both relational and OLAP data sources, leverages the Oracle database's analytic and BI platform capabilities, is integrated with the Oracle Application Server, offers a quick and easy way to create BI dashboards in Oracle Portal, is the only BI tool certified against and working with the Oracle eBusiness Suite security model out of the box, and lots more...
    Of course, if enterprise reporting capabilities are all you are looking for, then I would suggest you take a look at Oracle Reports and the just announced Oracle XML Publisher.
    You should also check out product demos, white papers, presentations, documentation, etc. - all available on OTN. Some links are provided below. Finally, you could even try out the software for free (under a developer license), which is more than can be said about Cognos or other vendors that are too scared of letting people even try out their software without first purchasing it.
    Thanks
    Abhinav
    Oracle Business Intelligence Product Management
    BI on Oracle: http://www.oracle.com/bi/
    BI on OTN: http://www.oracle.com/technology/products/bi/
    Discoverer: http://www.oracle.com/technology/products/discoverer/
    BI Software: http://www.oracle.com/technology/software/products/ias/devuse.html
    Documentation: http://www.oracle.com/technology/documentation/appserver1012.html
    BI Samples: http://www.oracle.com/technology/products/bi/samples/
    Blog: http://oraclebi.blogspot.com/

  • Delta extraction(date) with Function module problem

    Hi All,
    FUNCTION zrsax_biw_get_data_pr_d.
    *"*"Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_ISOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_PRIVATE_MODE) OPTIONAL
    *"     VALUE(I_CALLMODE) LIKE  ROARCHD200-CALLMODE OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZPR_ST OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * The input parameter I_DATAPAKID is not supported yet !
    * Example: InfoSource containing TADIR objects
    *  TABLES: tadir.
    *DATA: BEGIN OF zpr_st_copy,
    *        banfn TYPE zpr_st-banfn,
    *        bnfpo TYPE zpr_st-bnfpo,
    *        zebkn TYPE zpr_st-zebkn,
    *        knttp TYPE zpr_st-knttp,
    *        vbeln TYPE zpr_st-vbeln,
    *        ps_psp_pnr TYPE zpr_st-ps_psp_pnr,
    *        gsber TYPE zpr_st-gsber,
    *        werks TYPE zpr_st-werks,
    *        statu TYPE zpr_st-statu,
    *        ekgrp TYPE zpr_st-ekgrp,
    *        menge TYPE zpr_st-menge,
    *        frgdt TYPE zpr_st-frgdt,
    *        meins TYPE zpr_st-meins,
    *         loekz TYPE zpr_st-loekz,
    *      END OF zpr_st_copy.
    * Auxiliary Selection criteria structure
      DATA: l_s_select TYPE sbiwa_s_select.
    * Maximum number of lines for DB table
      STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
    * Select ranges
    *  RANGES: l_r_pgmid  FOR tadir-pgmid,
    *          l_r_object FOR tadir-object,
        RANGES: PRNO for zpr_st-banfn,
                DATE1 FOR zpr_st-frgdt.
    * Parameter I_PRIVATE_MODE:
    * Some applications might want to use this function module for other
    * purposes as well (e.g. data supply for OLTP reporting tools). If the
    * processing logic has to be different in this case, use the optional
    * parameter I_PRIVATE_MODE (not supplied by BIW !) to distinguish
    * between BIW calls (I_PRIVATE_MODE = SPACE) and other calls
    * (I_PRIVATE_MODE = X).
    * If the message handling has to be different as well, define Your own
    * messaging macro which interprets parameter I_PRIVATE_MODE. When
    * called by BIW, it should use the LOG_WRITE macro, otherwise do what
    * You want.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls) ?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * The input parameter I_DATAPAKID is not supported yet !
    * Invalid second initialization call -> error exit
        IF NOT g_flag_interface_initialized IS INITIAL.
          IF 1 = 2. MESSAGE e008(r3). ENDIF.
          log_write 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE error_passed_to_mess_handler.
        ENDIF.
    * Check InfoSource validity
        CASE i_isource.
          WHEN 'ZPR_ST_DS_D' OR ''.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_isource            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
    * Check for supported update mode
        CASE i_updmode.
          WHEN 'F' OR ''.
          WHEN 'C'.
          WHEN 'R'.
          WHEN 'S'. " DELTA INITIALIZATION
          WHEN 'I'. "DELTA INITIALIZATION FOR NON CUMULATIVE
          WHEN 'D'. "DELTA
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e011(r3). ENDIF.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '011'                "message number
                      i_updmode            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
    BREAK-POINT.
    * Check for obligatory selection criteria
    *    READ TABLE i_t_select INTO l_s_select WITH KEY fieldnm = 'ZPR_ST-FRGDT'.
    *    IF sy-subrc <> 0.
    *      IF 1 = 2. MESSAGE e010(r3). ENDIF.
    *      log_write 'E'                    "message type
    *                'R3'                   "message class
    *                '010'                  "message number
    *                'PGMID'                "message variable 1
    *                ' '.                   "message variable 2
    *      RAISE error_passed_to_mess_handler.
    *    ENDIF.
        APPEND LINES OF i_t_select TO g_t_select.
    * Fill parameter buffer for data extraction calls
        g_s_interface-requnr    = i_requnr.
        g_s_interface-isource   = i_isource.
        g_s_interface-maxsize   = i_maxsize.
        g_s_interface-initflag  = i_initflag.
        g_s_interface-updmode   = i_updmode.
        g_s_interface-datapakid = i_datapakid.
        g_flag_interface_initialized = sbiwa_c_flag_on.
    * Fill field list table for an optimized select statement
    * (in case that there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from beeing trivial)
        APPEND LINES OF i_t_fields TO g_t_segfields.
    *   Start tracing of extraction
    *    bice_trace_open g_r_tracer i_t_fields.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
        IF g_counter_datapakid = 0.
    * Fill range tables for fixed InfoSources. In the case of generated
    * InfoSources, the usage of a dynamical SELECT statement might be
    * more reasonable. BIW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
    *      LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'PGMID'.
    *        MOVE-CORRESPONDING l_s_select TO l_r_pgmid.
    *        APPEND l_r_pgmid.
    *      ENDLOOP.
    *      LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'OBJECT'.
    *        MOVE-CORRESPONDING l_s_select TO l_r_object.
    *        APPEND l_r_object.
    *      ENDLOOP.
    LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'ZPR_ST-FRGDT'.
                MOVE-CORRESPONDING l_s_select to DATE1.
                DATE1-sign = 'I'.
                DATE1-option = 'GE'.
                clear DATE1-high.
                APPEND DATE1.
          ENDLOOP.
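    * Note: DATE1 is filled from the selection above, but the OPEN CURSOR
    * SELECT below does not reference it - the date filter there is
    * hard-coded (a~frgdt ge '20130401'), so this range has no effect yet.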
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between InfoSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
          l_maxsize = g_s_interface-maxsize.
          OPEN CURSOR WITH HOLD g_cursor FOR
    *      SELECT (g_t_fields) FROM tadir
    *                               WHERE pgmid  IN l_r_pgmid AND
    *                                     object IN l_r_object.    "#EC CI_GENBUFF
    *@ CODE FOR THE STRUCTURE TO FILL IN FROM EBAN AND EBKN@*
                SELECT a~banfn a~bnfpo k~zebkn a~knttp k~vbeln k~ps_psp_pnr k~gsber a~werks a~statu a~ekgrp a~menge a~menge a~frgdt a~meins
    * INTO CORRESPONDING FIELDS OF TABLE IT1
                FROM  eban AS a INNER JOIN ebkn AS k ON ( a~banfn = k~banfn AND a~bnfpo = k~bnfpo )
                WHERE a~banfn GE '2000000000' AND a~banfn LE '2999999999' and a~loekz eq ' ' and a~frgdt ge '20130401' and a~FRGKZ eq '2'.
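* NOTE: the DATE1 range filled above from the buffered selections is not
* referenced in this WHERE clause; the release date and document number
* limits are hard-coded literals instead.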
    *BREAK-POINT.
        ENDIF.                             "First data package ?
    * Fetch records into interface table. There are two different options:
    * - fixed interface table structure for fixed InfoSources have to be
    *   named E_T_'Name of assigned source structure in table ROIS'.
    * - for generating applications like LIS and CO-PA, the generic table
    *   E_T_DATA has to be used.
    * Only one of these interface types should be implemented in one API !
        FETCH NEXT CURSOR g_cursor
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE e_t_DATA
                   PACKAGE SIZE l_maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR g_cursor.
    *      bice_trace_close g_r_tracer.
          RAISE no_more_data.
        ENDIF.
    *    bice_collect_table g_r_tracer e_t_data.
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    In the above code the I_T_FIELDS and I_T_SELECT structures are not getting any data.
    Can anyone please let me know what mistake I have made in my code?
    Also, I am unable to change the extraction method of the data source to F1; when I try to change it, the function module and the extract structure disappear.
    I don't understand why this is happening.
    Thank you
    Vijay

    Hi Yasemin,
    I am a bit confused by the line below:
    IF i_initflag = sbiwa_c_flag_on. (if we pass 'X' it enters this branch, checks for F or D, and then values are assigned to the I_T_SELECT and I_T_FIELDS structures)
    According to my code, I feel it will work for only one of them, not for both (F and D).
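    For reference, here is a compressed sketch of the two call phases the RSAX_BIW_GET_DATA template expects. Declarations are omitted; the selection field, table, and range names are placeholders taken from the code above, so this is not the full function module:
    IF i_initflag = sbiwa_c_flag_on.
* Initialization call: BW fills I_T_SELECT / I_T_FIELDS only on this call;
* buffer them in global variables for the later data calls.
      APPEND LINES OF i_t_select TO g_t_select.
      APPEND LINES OF i_t_fields TO g_t_segfields.
      g_s_interface-updmode = i_updmode.     "F = full, D = delta, ...
    ELSE.
* Data calls: I_T_SELECT / I_T_FIELDS are empty here; only the buffered
* copies from the initialization call are available.
      IF g_counter_datapakid = 0.
* First data package: build the range from the buffered selection
* and open the cursor.
        LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'ZPR_ST-FRGDT'.
          MOVE-CORRESPONDING l_s_select TO DATE1.
          APPEND DATE1.
        ENDLOOP.
        OPEN CURSOR WITH HOLD g_cursor FOR
          SELECT * FROM eban WHERE frgdt IN DATE1.
      ENDIF.
      FETCH NEXT CURSOR g_cursor
            APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
            PACKAGE SIZE l_maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR g_cursor.
        RAISE no_more_data.
      ENDIF.
      g_counter_datapakid = g_counter_datapakid + 1.
    ENDIF.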
    One more doubt: the extraction method of RSAX_BIW_GET_DATA is F2, but a few people are asking me to change it to F1.
    When I tried that, the extract structure and the function module both disappeared in RSO2, and I did not understand why.
    Can you please let me know why that happened?
    Thanks
    Vijay

  • Process Chains across multiple BW instances?

    We are considering implementing a "CIF" (corporate information factory) approach for BW, using multiple BW layers (staging, storage, analysis) across regions.  It seems to make sense from a performance management perspective.
    However, one concern is administration and monitoring.
    Can the standard BW tools (e.g. process chains, admin cockpit) be customized to work across multiple BW instances (plus, ideally, the OLTP reporting we also have)? Or would we end up having to log in to each system individually?  My gut feeling tells me it is the latter, but I would appreciate any thoughts.
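    On the triggering side at least, a central system can start chains in a remote BW instance over RFC using the process chain API function modules, so scheduling does not strictly require a logon to each system; central monitoring via the admin cockpit is a separate question. A minimal sketch, assuming a configured RFC destination and an existing chain (both names below are placeholders):
    DATA lv_logid TYPE rspc_logid.

* Start a process chain in the remote BW system via an RFC destination
    CALL FUNCTION 'RSPC_API_CHAIN_START'
      DESTINATION 'BW_REGION_EMEA'          "placeholder RFC destination
      EXPORTING
        i_chain = 'ZPC_DAILY_LOAD'          "placeholder chain ID
      IMPORTING
        e_logid = lv_logid.

* The returned log ID can be passed to RSPC_API_CHAIN_GET_STATUS (same
* destination) to poll the run status from the central system.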
    Thanks!

    Hi Ingo,
    Thanks for the information. I have the same scenario to work out.
    Is there any possibility of automatic source/server name conversion from system A to system B while copying all the universes and queries?
    For example, in BI we have an option for data source server name conversion while transporting from one system to another throughout the landscape.
    As you said in your reply, in the lifecycle manager, do we need to edit the connection for each and every universe, or is there an option that covers all universes at once?
    Any documentation on this would be appreciated.
    Please provide me with the information.
    Thanks in advance.
    Regards,
    Ravi Kanth

  • ASM BEST PRACTICES FOR 'DATA' DISKGROUP(S)

    In our quest to reduce operating costs we are consolidating databases and eliminating RAC in favor of standalone servers. This is a business decision that is a certainty.  Our SAN has been upgraded, and the new database servers are newer, faster, etc.
    Our database version is 11.2.0.4 with Grid Infrastructure 12.1.0.1. Our data diskgroup is RAID-5 and our fra is RAID-1+0.  ASM has external redundancy.  All disks are of equal size with equal storage performance and availability.
    Previously our databases were on separate clusters by function: OLTP, REPORTING and ENTERPRISE CONTENT MANAGEMENT. Development/Acceptance shared a cluster, while production was separate.
    The new architecture combines different functions onto one server for dev/acc and another for production.  This means they will all be using the same ASM instance.  Typically we followed Oracle's recommendation to have two diskgroups, one for data and the other for the FRA.  That worked well when a single database was the only one using the data diskgroup.  Now that we are combining databases, is the best practice still to have one data diskgroup and one FRA diskgroup?  For example, production will house 3 databases: OLTP is 500 GB, Reporting is 1.3 TB, and Enterprise Content Management is 6 TB and growing.
    My concern is that if all 3 databases access the same data diskgroup, the smaller OLTP database must traverse the 6 TB of content management data.  Or is this thinking flawed?
    Does this warrant separate diskgroups?  Are there pros and cons to this?
    Any insights are appreciated.
    Best Regards,
    Sherrie

    I have many issues to deal with in this 'consolidation', but budget reduction is happening in state and regional government.  Our SAN storage serves our enterprise infrastructure and is not part of my cost-savings directive.  We are also migrating to UCS blades for the infrastructure, which is likewise outside my budget reduction contribution.  Oracle licensing is our biggest software cost, and that is where my directive lies.  We've always been conservative and done more with less; now we will do with less, but differently, because the storage and hardware are awesome.
    We've been consolidating databases onto RAC clusters and standalone servers since we started using Oracle.  For the last 7 years we've supported ASM, 6 databases, and 2 passive standby instances (with Data Guard) on a 2-node cluster totalling 64 GB of memory.  The new UCS blades have 256 GB of memory.  I understand that each database must support its own background processes.  If I add up the SGA, the allocated PGA, and the background processes, they take up about 130 GB of memory, and there is also the overhead of RAC to consider.  In all the years we've had Oracle, most of our failures, outages, or downtime was because of RAC.  On the plus side, the seamless failover saved us most times (not all), but it required administrative time for troubleshooting.
    I would love to go to Oracle 12c and use its multitenant architecture, but I have third-party applications that don't yet support it.  11.2 might be our last release unless I can reduce costs.  Consolidation is real and much needed, which I believe is why Oracle responded to the market with multitenancy.
    But back to my first question about how many diskgroups should service a group of databases.  What I am hearing, and think I agree with, is that one data diskgroup will suffice, because the ASM instance knows where to retrieve the data, and both waste and management effort will be reduced.
    I still need to do some ciphering and by no means have a final plan, but thank you all for your insights and contributions.
