Generic Data Source is being called multiple times in RSA3

<Moderator Message: This topic has already been discussed a lot of times. Additionally there are weblogs related to it. Please search the forums and/or the blogs for this issue>
Hi experts,
I have a requirement to get data from a Generic Data Source with a function module.
After finishing the FM I checked it in the extractor checker (RSA3):
the internal table I_T_DATA displays 281 records,
but RSA3 shows 112560 records. I found that the FM is being called multiple times for a single execution in RSA3.
1. What could be the problem?
2. Where is the problem: in the FM coding, or in some other place I should check?
<removed by moderator>
Regards
Vijay
Edited by: Siegfried Szameitat on Feb 3, 2009 11:45 AM
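
A note on what is happening here: the extraction framework (and the extractor checker RSA3) calls the extractor FM repeatedly, once per data package, until the FM raises the exception NO_MORE_DATA. If the FM fills E_T_DATA with the complete result set on every call and never raises that exception, the same rows are returned over and over and the record count in RSA3 multiplies. Below is a minimal sketch of the expected call protocol, modelled on the standard template RSAX_BIW_GET_DATA_SIMPLE (whose interface declares NO_MORE_DATA); the selection on BSIS is simplified and only illustrative, it is not the poster's code.

  STATICS: s_s_if              TYPE srsc_s_if_simple,
           s_counter_datapakid LIKE sy-tabix,
           s_cursor            TYPE cursor.

  IF i_initflag = sbiwa_c_flag_on.
*   initialization call: only remember the request parameters
    s_s_if-requnr  = i_requnr.
    s_s_if-maxsize = i_maxsize.
    APPEND LINES OF i_t_select TO s_s_if-t_select.
  ELSE.
    IF s_counter_datapakid = 0.
*     first data call: open the cursor (no INTO clause here)
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT * FROM bsis.              "selection criteria omitted
    ENDIF.
*   every data call: hand over at most one package ...
    FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE s_s_if-maxsize.
*   ... and stop the call loop once nothing is left
    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter_datapakid = s_counter_datapakid + 1.
  ENDIF.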

Hi Savita,
I don't understand clearly from your reply which flow you transported and what exactly you transported.
You need to first transport the objects in R/3 and import them.
Then transport the InfoProvider and DataSource in BI, assuming the dependent InfoArea, InfoObjects and application component are already transported.
Then transport your InfoSource, update rules, transfer rules and InfoPackage.
Hope you understood.
Ravi

Similar Messages

  • Design Studio 1.3 : one large generic Data Source or multiple smaller Data Sources

    Dear all,
    In DS 1.3, is it still a best practice to have one large generic Data Source? Or is having multiple smaller Data Sources a better solution?
    Minimizing the number of Data Sources remains a golden rule, but what is the better solution:
    One large generic Data Source : and using setDataSource
    or
    Multiple smaller Data Sources : and using Load in Script and Background Processing
    Many thanks for sharing your ideas,
    Hans

    It depends on your application, how much data it is pulling in, and how you want to present that to the user/consumer of the application.
    At TechEd Las Vegas last year, SAP showed 9 dashboards (3 per row) with background processing for each row.

  • Generic Data Source Delta on Time Stamp Issue

    HI,
    I have a Z DataSource on a Z table with a time stamp.
    The generic delta of the DataSource is based on the time stamp, with an upper safety interval of 1 second.
    When I extract the delta repeatedly with a 1-hour gap, I am not getting any delta even though the table has been updated with entries.
    How do I investigate and resolve this issue?
    Thanks

    Hi
    please elaborate, the question is not clear.
    Ashish

  • Time Stamp Delta field for Generic Data Source

    Hi Experts,
    I have a requirement to create a Generic Data Source based on a view. The view is on the PA0006 and PA0105 tables.
    Can you please suggest which field I should take as the delta field?
    Thanks in advance.

    Also, make sure that whenever a new record is created in those tables, the field AEDTM is updated with the system date; otherwise you will miss new records in the delta unless there is a change to those newly created records.
    Regards,
    Gaurav

  • Regarding: Loading data from R/3 To BI for a Generic Data source

    Hi everyone,
    Need help, urgent.
    I have created a generic DataSource with a function module as the extractor; in RSA3 it is working fine.
    1 -> I replicated the DataSource to BI, created an InfoPackage and executed it. It gets the data and the number of records is shown in the request monitor, but the status does not change from yellow to green.
    In the step-by-step analysis every step is green except "Data selection successfully finished?" (RED).
    2 -> Then I looked at the background job in the source system, which was still executing. I waited for it for a long time, nearly 30 minutes.
    (I have done steps 1 and 2 a number of times, activating and replicating the DataSource and so on, but still there is no change.)
    3 -> Then I cancelled that background job with the help of Basis (as I felt that something was going wrong).
    4 -> I feel that there is something wrong in the code of the extractor.
    Please help...
    ""Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBI_MATGRIR OPTIONAL
    *Need to get the data only for two G/L accounts which are for material purchase during MIGO
    *G/L Account Numbers: 0010502001 0010502002
    data: E_T_DATA1 type table of ZBI_MATGRIR.
      RANGES: R_BUKRS FOR BSIS-BUKRS,
              R_BUDAT FOR BSIS-BUDAT,
              R_GJAHR FOR BSIS-GJAHR,
              R_HKONT FOR BSIS-HKONT.
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
               S_COUNTER_DATAPAKID LIKE SY-TABIX,
               S_CURSOR TYPE CURSOR.
    *Declare
      TYPES: BEGIN OF TY_FAGL,
        RBURS TYPE FAGLFLEXA-RBUKRS,
        RYEAR TYPE FAGLFLEXA-RYEAR,
        DOCNR TYPE FAGLFLEXA-DOCNR,
        BUZEI TYPE FAGLFLEXA-BUZEI,
        DOCLN TYPE FAGLFLEXA-DOCLN,
        PRCTR TYPE FAGLFLEXA-PRCTR,
        SEGMENT TYPE FAGLFLEXA-SEGMENT,
      END OF TY_FAGL.
      DATA: GT_FAGL TYPE TABLE OF TY_FAGL,
            GS_FAGL TYPE TY_FAGL.
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
        CASE I_DSOURCE.
          WHEN 'ZFI_GL_M4'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
    *     this is a typical log call. Please write every error message like this
    *     LOG_WRITE 'E'                  "message type
    *               'R3'                 "message class
    *               '009'                "message number
    *               I_DSOURCE            "message variable 1
    *               ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
    * Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR    = I_REQUNR.
        S_S_IF-DSOURCE = I_DSOURCE.
        S_S_IF-MAXSIZE   = I_MAXSIZE.
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
      ELSE.
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.
    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUKRS'.
            MOVE-CORRESPONDING L_S_SELECT TO R_BUKRS.
            APPEND R_BUKRS.
          ENDLOOP.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'GJAHR'.
            MOVE-CORRESPONDING L_S_SELECT TO R_GJAHR.
            APPEND R_GJAHR.
          ENDLOOP.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUDAT'.
            MOVE-CORRESPONDING L_S_SELECT TO R_BUDAT.
            APPEND R_BUDAT.
          ENDLOOP.
    *GRIR Inventory (RM/Stores/Spares/FG)   10502001
    *GRIR Services & Others Payable   10502002
          R_HKONT-SIGN = 'I'. "i_t_select-sign.
          R_HKONT-OPTION = 'BT'." i_t_select-option.
          R_HKONT-LOW = '0010502001'.
          R_HKONT-HIGH = '0010502002'. "i_t_select-high.
          APPEND R_HKONT.
    * Determine number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one to one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
          OPEN CURSOR WITH HOLD S_CURSOR FOR
           SELECT BUKRS
                  AUGBL
                  ZUONR
                  BELNR
                  GJAHR
                  BUZEI
                  BUDAT
                  HKONT
                  BLART
                  MONAT
                  BSCHL
                  SHKZG
                  DMBTR
                  WAERS
                  FROM BSIS
                 INTO TABLE E_T_DATA
                  WHERE BUKRS  IN R_BUKRS
                    AND GJAHR IN R_GJAHR
                    AND BUDAT IN R_BUDAT
                    AND HKONT IN R_HKONT.
    * Fetch records into interface table,
    *   named E_T_'Name of extract structure'.
        FETCH NEXT CURSOR S_CURSOR
                   APPENDING CORRESPONDING FIELDS
                   OF TABLE E_T_DATA1
                   PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR S_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.
    DELETE E_T_DATA WHERE BLART NE 'WE'.
          SELECT BUKRS
                 AUGBL
                 ZUONR
                 BELNR
                 GJAHR
                 BUZEI
                 BUDAT
                 HKONT
                 BLART
                 MONAT
                 BSCHL
                 SHKZG
                 DMBTR
                 WAERS
                 FROM BSAS
                 into table   E_T_DATA
                 WHERE BUKRS  IN R_BUKRS
                   AND GJAHR IN R_GJAHR
                   AND BUDAT IN R_BUDAT
                   AND HKONT IN R_HKONT.
               FETCH NEXT CURSOR S_CURSOR
                  APPENDING CORRESPONDING FIELDS
                  OF TABLE E_T_DATA
                  PACKAGE SIZE S_S_IF-MAXSIZE.
    append LINES OF e_t_data1 TO E_T_DATA.
    DELETE E_T_DATA WHERE BLART NE 'WE'.
    ENDIF.                             "First data package ?
        DATA: F_YEAR TYPE BKPF-GJAHR.
        DATA: F_PERI TYPE BAPI0002_4-FISCAL_PERIOD.
    IF E_T_DATA[] IS NOT INITIAL.
       SELECT RBUKRS
              RYEAR
              DOCNR
              BUZEI
              DOCLN
              PRCTR
              SEGMENT
              FROM FAGLFLEXA
              INTO TABLE GT_FAGL
              FOR ALL ENTRIES IN E_T_DATA
           WHERE RYEAR = E_T_DATA-GJAHR
               AND DOCNR = E_T_DATA-BELNR
               AND RLDNR = '0L'
               AND RBUKRS = E_T_DATA-BUKRS
               AND BUZEI = E_T_DATA-BUZEI.
    *        WHERE RYEAR = E_T_DATA-GJAHR
    *          AND DOCNR = E_T_DATA-BELNR
    *          AND RBUKRS = E_T_DATA-BUKRS.
    *      AND DOCLN = E_T_DATA-BUZEI.
    ENDIF.
    LOOP AT E_T_DATA.
          IF E_T_DATA-SHKZG = 'H'.
            E_T_DATA-DMBTR = E_T_DATA-DMBTR * -1.
          ENDIF.
          CLEAR: F_YEAR.
          CALL FUNCTION 'BAPI_COMPANYCODE_GET_PERIOD'
            EXPORTING
              COMPANYCODEID       = E_T_DATA-BUKRS
              POSTING_DATE        = E_T_DATA-BUDAT
           IMPORTING
             FISCAL_YEAR         = F_YEAR
             FISCAL_PERIOD       = F_PERI.
          DATA: V_DOC(6) TYPE C .
          CLEAR: V_DOC.
          V_DOC =  E_T_DATA-BUZEI.
          IF V_DOC  IS NOT INITIAL.
            CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
              EXPORTING
                INPUT  = V_DOC
              IMPORTING
                OUTPUT = V_DOC.
          ENDIF.
    *    As PROFIT center is not updated in all the lines in BSIS
          READ TABLE GT_FAGL INTO GS_FAGL WITH KEY  RYEAR = E_T_DATA-GJAHR
                                                    DOCNR = E_T_DATA-BELNR
                                                    RBURS = E_T_DATA-BUKRS
                                                    BUZEI = E_T_DATA-BUZEI.
          IF SY-SUBRC = 0.
            E_T_DATA-PRCTR = GS_FAGL-PRCTR.
            E_T_DATA-SEGMENT = GS_FAGL-SEGMENT.
          ENDIF.
    *As we are using the amount DMBTR in which the amount
    *will be in company code currency that is Local currency
    *group currency always in the main company code currency.
          CONCATENATE   F_YEAR '0' F_PERI INTO E_T_DATA-FISCPER.
          MODIFY E_T_DATA. " from gs_bsis transporting dmbtr fiscper.
          CLEAR: E_T_DATA.
        ENDLOOP.
          S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
       ENDIF.

    Hi,
    Please check the log of the same job for the last week and compare whether it is taking more time today; also check with Basis whether any backup was initiated at the same time.
    Moreover, until the background job fails on its own it is difficult to guess the exact issue.
    Thanks,
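
    A note on the extractor code quoted above, offered as a sketch rather than a definitive fix: an OPEN CURSOR ... FOR SELECT must not contain an INTO TABLE clause (rows are transferred only by FETCH NEXT CURSOR ... PACKAGE SIZE), and the later SELECT ... FROM BSAS INTO TABLE E_T_DATA plus the extra FETCH run again on every data package, which appends the same rows repeatedly. The first-package block could look roughly like this (names taken from the post):

      IF s_counter_datapakid = 0.
        " open items from BSIS: open the cursor only; the rows are
        " transferred later, package by package, by FETCH NEXT CURSOR
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT bukrs augbl zuonr belnr gjahr buzei budat
                 hkont blart monat bschl shkzg dmbtr waers
            FROM bsis
            WHERE bukrs IN r_bukrs
              AND gjahr IN r_gjahr
              AND budat IN r_budat
              AND hkont IN r_hkont.
      ENDIF.

    The BSAS data would then need its own handling (for example a second cursor processed after the first one is exhausted) instead of a full SELECT INTO E_T_DATA inside every package, and the FETCH / RAISE NO_MORE_DATA block stays as in the template.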

  • Problem with native SQL cursor in generic data source

    Hi, All!
    I am implementing a generic data source based on an FM.
    Because of complicated SQL I can't use Open SQL and the RSAX_BIW_GET_DATA_SIMPLE example "as is".
    So, I have to use Native SQL. But I've got a problem with a cursor. When I test my data source in RSA3, everything is OK. But if I start the appropriate InfoPackage, I get the error "ABAP/4 processor: DBIF_DSQL2_INVALID_CURSOR". It happens after selection of the 1st data package, in the line "FETCH NEXT S1 INTO...". It seems to me that when the system performs the second call of my FM the opened cursor has already disappeared.
    Did anyone do things like this, and what is incorrect?
    Is it realistic to make a generic data source based on an FM using Native SQL open, fetch, close...?
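
    A possible explanation and workaround, offered only as a sketch: between two data packages the framework commits (the packages are sent via tRFC), and a database commit closes cursors opened with native SQL, which would produce exactly DBIF_DSQL2_INVALID_CURSOR on the second FETCH. One common pattern is therefore to run the native SQL completely in the first data call, buffer the result in a STATICS internal table, and hand it out package by package. All names below (Z_TMP_RESULT, the row type, the field names) are made up for illustration, and the row type stands in for the extract structure; E_T_DATA and I_MAXSIZE are assumed to come from an extractor interface like RSAX_BIW_GET_DATA_SIMPLE.

      TYPES: BEGIN OF ty_row,
               id     TYPE n LENGTH 10,
               amount TYPE p LENGTH 15 DECIMALS 2,
             END OF ty_row.
      STATICS: s_t_buffer TYPE STANDARD TABLE OF ty_row,
               s_counter  TYPE sy-tabix.
      DATA: ls_row TYPE ty_row,
            l_from TYPE sy-tabix,
            l_to   TYPE sy-tabix.

      IF s_counter = 0.
        " first data call: run the native SQL once and buffer everything
        EXEC SQL.
          OPEN c1 FOR
            SELECT id, amount FROM z_tmp_result
        ENDEXEC.
        DO.
          EXEC SQL.
            FETCH NEXT c1 INTO :ls_row-id, :ls_row-amount
          ENDEXEC.
          IF sy-subrc <> 0.
            EXIT.
          ENDIF.
          APPEND ls_row TO s_t_buffer.
        ENDDO.
        EXEC SQL.
          CLOSE c1
        ENDEXEC.
      ENDIF.

      " every call: hand over the next slice of the buffer
      l_from = s_counter * i_maxsize + 1.
      l_to   = l_from + i_maxsize - 1.
      APPEND LINES OF s_t_buffer FROM l_from TO l_to TO e_t_data.
      IF e_t_data[] IS INITIAL.
        RAISE no_more_data.
      ENDIF.
      s_counter = s_counter + 1.

    This keeps the whole result set in memory on the application server, so it only makes sense when the result of the native SQL is reasonably small; otherwise the logic has to be restructured so that each package can be selected independently.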

    Hi Jason,
    I don't think this SQL is very valuable. It is just an aggregation with some custom rules. This aggregation is performed on an InfoProvider which consists of two InfoCubes. Here we have about 2 billion records in the InfoProvider and about 30 million records in the custom DB table Z_TMP (certainly, it has indexes). I have to do this operation on 21 InfoProviders like this, and I have to do it 20 times for each InfoProvider (with different values of the host variable p_GROUP).
    SELECT T.T1, SUM( T.T2 ), SUM( T.T3 ), SUM( T.T4 )
            FROM (
                    SELECT F."KEY_EVENT06088" AS T1,
                            F."/BIC/EV_COST" + F."/BIC/EV_A_COST" AS T2,
                            DECODE( D.SID_EVENTTYPE, 23147, 0,
                                                          23148, 0,
                                                          23151, 0,
                                                          23153, 0,
                                                          23157, 0,
                                                          23159, 0,
                                                          24896734, 0,
                                                          695032768, 0,
                                                          695029006, 0,
                                                          695029007, 0,
                                                          695036746, 0, F."/BIC/EV_COST") +
                              DECODE( D.SID_EVENTTYPE, 23147, 0,
                                                          23148, 0,
                                                          23151, 0,
                                                          23153, 0,
                                                          23157, 0,
                                                          23159, 0,
                                                          24896734, 0,
                                                          695032768, 0,
                                                          695029006, 0,
                                                          695029007, 0,
                                                          695036746, 0, F."/BIC/EV_A_COST") AS T3,
                            DECODE( D.SID_EVENTTYPE, 23147, F."/BIC/EV_DURAT",
                                                          23148, F."/BIC/EV_DURAT",
                                                          23151, F."/BIC/EV_DURAT",
                                                          23153, F."/BIC/EV_DURAT",
                                                          23157, F."/BIC/EV_DURAT",
                                                          23159, F."/BIC/EV_DURAT",
                                                          24896734, F."/BIC/EV_DURAT",
                                                          695032768, F."/BIC/EV_DURAT",
                                                          695029006, F."/BIC/EV_DURAT",
                                                          695029007, F."/BIC/EV_DURAT",
                                                          695036746, F."/BIC/EV_DURAT", 0) AS T4
                      FROM "/BIC/VEVENT0608F" F,
                           Z_TMP G,
                           "/BIC/DEVENT06085" D
                      WHERE F."KEY_EVENT06088" = G.ID
                            AND F."KEY_EVENT06085" = D.DIMID
                            AND G.GROUP_NO = :p_GROUP
                            AND ( F."/BIC/EV_COST" < 0 OR F."/BIC/EV_A_COST" < 0 )
                            AND D.SID_EVENTTYPE <> 695030676 AND D.SID_EVENTTYPE <> 695030678
                    UNION
                    SELECT F."KEY_EVNA06088" AS T1,
                            F."/BIC/EV_COST" + F."/BIC/EV_A_COST" AS T2,
                            DECODE( D.SID_EVENTTYPE, 23147, 0,
                                                          23148, 0,
                                                          23151, 0,
                                                          23153, 0,
                                                          23157, 0,
                                                          23159, 0,
                                                          24896734, 0,
                                                          695032768, 0,
                                                          695029006, 0,
                                                          695029007, 0,
                                                          695036746, 0, F."/BIC/EV_COST") +
                              DECODE( D.SID_EVENTTYPE, 23147, 0,
                                                          23148, 0,
                                                          23151, 0,
                                                          23153, 0,
                                                          23157, 0,
                                                          23159, 0,
                                                          24896734, 0,
                                                          695032768, 0,
                                                          695029006, 0,
                                                          695029007, 0,
                                                          695036746, 0, F."/BIC/EV_A_COST") AS T3,
                            DECODE( D.SID_EVENTTYPE, 23147, F."/BIC/EV_DURAT",
                                                          23148, F."/BIC/EV_DURAT",
                                                          23151, F."/BIC/EV_DURAT",
                                                          23153, F."/BIC/EV_DURAT",
                                                          23157, F."/BIC/EV_DURAT",
                                                          23159, F."/BIC/EV_DURAT",
                                                          24896734, F."/BIC/EV_DURAT",
                                                          695032768, F."/BIC/EV_DURAT",
                                                          695029006, F."/BIC/EV_DURAT",
                                                          695029007, F."/BIC/EV_DURAT",
                                                          695036746, F."/BIC/EV_DURAT", 0) AS T4
                    FROM "/BIC/VEVNA0608F" F,
                         Z_TMP G,
                         "/BIC/DEVNA06085" D
                    WHERE F."KEY_EVNA06088" = G.ID
                          AND F."KEY_EVNA06085" = D.DIMID
                          AND G.GROUP_NO = :p_GROUP
                          AND ( F."/BIC/EV_COST" < 0 OR F."/BIC/EV_A_COST" < 0 )
                          AND D.SID_EVENTTYPE <> 695030676 AND D.SID_EVENTTYPE <> 695030678
                 ) T
            GROUP BY T.T1

  • Urgent ---- loading error through Generic Data Source

    Hi, BI gurus,
    I am trying data marting, i.e. I want to extract data from the RSMONICTAB and RSREQDONE tables of BW (3.5) into an ODS in the same BW (3.5).
    From the tables RSMONICTAB and RSREQDONE I created a view
    with the required fields,
    then I created a generic data source on it.
    --> I created the corresponding InfoObjects --> ODS.
    --> I loaded this ODS from that generic DataSource.
    And I am getting the error below.
    What should I do?
    Error Message
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.

    Hi,
    Thanks again.
    I did as you said (loading up to the PSA).
    The job's status in SM37 is FINISHED, duration 8 seconds.
    JOB LOG
                                                                                    Job started                                                                                00         516
    Step 001 started (program RSBATCH1, variant &0000000004563, user ID CHAKRAVARTYS)              00           550
    Start InfoPackage ZPAK_AVA2JBPAT43JP7DYC87X6CFAD                                              RSM1          797
    Synchronized transmission of info IDoc 1 (0 parallel tasks)                                    R3           414
    tRFC: Data Package = 0, TID = , Duration = 00:00:02, ARFCSTATE =                               R3           038
    tRFC: Start = 2007.10.11 01:29:13, End = 2007.10.11 01:29:15                                   R3           039
    InfoPackage ZPAK_AVA2JBPAT43JP7DYC87X6CFAD created request REQU_CWHVP35OSN90MVCAOSB2JNUWC     RSM1          796
    Job finished                                                                                00           517

  • Generic data source

    Hi experts,
       I have created a generic data source for master data and replicated it into the BW system, and created the InfoPackage for loading into the data target. When I load, the data comes into the PSA, but when I load to the data target it gives an error message. I have tried many times and the same error keeps coming, please help me. The error message is "InfoObject SMID does not have an alpha-conforming value 1" (this SMID object is from the source system).

    Hi,
    That characteristic is having the problem, so implement the following routine for your problem characteristic.
    See the following program; the same logic can be applied in the transfer rules/transformations. Take an ABAPer's help. Change the code as per your requirements; first copy and paste this code into SE38 and see the result.
    REPORT  ZALPHA_INPUT.
    DATA: ZI(18) TYPE C,
          ZO(12) TYPE C.
    DATA: ZS(12) TYPE C,
          ZR(18) TYPE C.
          ZI = '000000099999889925'.
          ZS = '099999889925'.
      CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
        EXPORTING
          INPUT         = ZI
       IMPORTING
         OUTPUT        = ZO.
      CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
        EXPORTING
          INPUT         = ZS
       IMPORTING
         OUTPUT        =  ZR  .
    Write:/   ZI.
    Write:/   ZO.
    Write:/   ZS.
    Write:/   ZR.
    Thanks
    Reddy

  • Finalize() method being called multiple times for same object?

    I got a dilly of a pickle here.
    Looking at the Tomcat output log file, the finalize method of class User is being called MANY more times than the constructor is.
    Here is the User class:
    package com.db.multi;
    import java.io.*;
    import com.db.ui.*;
    import java.util.*;
    /**
     * @author DBriscoe
     */
    public class User implements Serializable {
        private String userName = null;
        private int score = 0;
        private SocketImage img = null;
        private boolean gflag = false;
        private Calendar timeStamp = Calendar.getInstance();
        private static int counter = 0;
        /** Creates a new instance of User */
        public User() { counter++; }
        public User(String userName) {
            this.userName = userName;
            counter++;
        }
        public void setGflag(boolean gflag) {
            this.gflag = gflag;
        }
        public boolean getGflag() {
            return gflag;
        }
        public void setScore(int score) {
            this.score = score;
        }
        public int getScore() {
            return score;
        }
        public void setUserName(String userName) {
            this.userName = userName;
        }
        public String getUserName() {
            return userName;
        }
        public void setImage(SocketImage img) {
            this.img = img;
        }
        public SocketImage getImage() {
            return img;
        }
        public void setTimeStamp(Calendar c) {
            this.timeStamp = c;
        }
        public Calendar getTimeStamp() {
            return this.timeStamp;
        }
        public boolean equals(Object obj) {
            try {
                if (obj instanceof User) {
                    User comp = (User)obj;
                    return comp.getUserName().equals(userName);
                } else {
                    return false;
                }
            } catch (NullPointerException npe) {
                return false;
            }
        }
        public void finalize() {
            if (userName != null && !userName.startsWith("OUTOFDATE"))
                System.out.println("User " + userName + " destroyed. " + counter);
        }
    }
    As you can see...
    Every time a User object is created, a static counter variable is incremented and then when an object is destroyed it appends the current value of that static member to the Tomcat log file (via System.out.println being executed on server side).
    Below is the log file from an example run in my webapp.
    Dustin
    User Queue Empty, Adding User: com.db.multi.User@1a5af9f
    User Dustin destroyed. 0
    User Dustin destroyed. 0
    User Dustin destroyed. 0
    User Dustin destroyed. 0
    User Dustin destroyed. 0
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 1
    User Dustin destroyed. 1
    User Dustin destroyed. 1
    User Dustin destroyed. 1
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    User Dustin destroyed. 2
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    User Dustin destroyed. 3
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    User Dustin destroyed. 4
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    User Dustin destroyed. 5
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    User Dustin destroyed. 6
    Joe
    USER QUEUE: false
    INSIDE METHOD: false
    AFTER METHOD: false
    User Dustin pulled from Queue, Game created: Joe
    User Already Placed: Dustin with Joe
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    User Dustin destroyed. 7
    INSIDE METHOD: false
    INSIDE METHOD: false
    USER QUEUE: true
    INSIDE METHOD: false
    INSIDE METHOD: false
    User Dustin destroyed. 9
    User Joe destroyed. 9
    User Dustin destroyed. 9
    User Dustin destroyed. 9
    User Dustin destroyed. 9
    User Dustin destroyed. 9
    INSIDE METHOD: true
    INSIDE METHOD: false
    USER QUEUE: true
    INSIDE METHOD: false
    INSIDE METHOD: false
    INSIDE METHOD: true
    INSIDE METHOD: false
    USER QUEUE: true
    INSIDE METHOD: false
    INSIDE METHOD: false
    It really does seem to me like finalize is being called multiple times for the same object.
    That number should increment for every instantiated User, and finalize can only be called once for each User object.
    I thought this was impossible?
    Any help is appreciated!

    Thanks...
    I am already thinking of ideas to limit the number of threads.
    Unfortunately there are two threads of execution in the servlet handler: one handles requests and the other parses the collection of User objects to check for out-of-date timestamps, and then eliminates them if they are out of date.
    The collection-parsing thread is currently a javax.swing.Timer thread (bad design, I know...), so I believe I can routinely check the timestamps in another way and fix that problem.
    I just found out that Tomcat was also throwing a ConcurrentModificationException, which may help explain the slew of mysterious behavior from my servlet!
    The Timer thread has to go. I got to think of a better way to routinely weed out User objects from the collection.
    Or perhaps, maybe I can attempt to make it thread safe???
    Eg. make my User collection volatile?
    Any opinions on the best approach are well appreciated.

  • Error while doing DELTA Extraction (generic data source)

    Hi BW Experts,
    In my R/3, I have a generic data source ZBUT_VW.
    It receives data from a view which is created based on a couple of tables.
    When I do a full load to the corresponding ODS, it is successful.
    But after that I deleted the ODS and created a cube with delta,
    and I have initialized the delta.
    When I do the delta extraction, it fails.
    The error: "The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended."
    I deleted the initialization load and tried the delta extraction again, and it fails
    with the same error.
    Please let me know how I can resolve this problem.
    URGENT
    gaurav

    I don't think it is possible to have an additive delta with creation date.
    Try a full load to the cube to check whether the extractor is working.

  • How to enable delta in a generic Data Source

    Hi,
       I am developing a Generic Data Source (Z) based on a view.
       How can I make it delta capable?
       What is the concept of calendar day, time stamp and numeric pointer?
       How can I make a decision (what factors do I need to consider) to go for calendar day, time stamp or numeric pointer?
    Please provide detailed steps and a description, as this is my first data source.
    Thanks

    hi
    how to ... create a generic data source:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    1. Time stamp - the field is a DEC15 field which always contains the time stamp of the last change to a record, in the local time format.
    2. Calendar day - the field is a DATS8 field which always contains the day of the last change.
    3. Numeric pointer - the field contains a numeric pointer that increases with each new record.
    Please check this link on generic delta:
    http://help.sap.com/saphelp_nw04s/helpdata/en/37/4f3ca8b672a34082ab3085d3c22145/frameset.htm
    When you check your DataSource in RSA6 you will find that the delta update checkbox is checked.
    hope it helps,
    cheers.
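
    To illustrate the three delta field types above with a small, purely hypothetical piece of application code (the table ZTAB_ORDERS and its ZZ_* fields are made up): the generic delta can only work if the chosen field is filled reliably whenever a record is created or changed.

      DATA: ls_rec TYPE ztab_orders.        " hypothetical Z table

      " option 1: time stamp field (DEC15); note GET TIME STAMP returns
      " a UTC time stamp, which has to match the generic delta settings
      GET TIME STAMP FIELD ls_rec-zz_changed_ts.

      " option 2: calendar day field (DATS8)
      ls_rec-zz_changed_on = sy-datum.

      " option 3 (numeric pointer) would instead be a number that only
      " grows with each new record, e.g. taken from a number range

      " ... fill the remaining fields and update the table as usual
      MODIFY ztab_orders FROM ls_rec.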

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    Extraction, Flat File, Generic Data Source, Delta Initialization
    I have a couple of questions regarding data extraction.
    1. If your Data Source is a flat file, e.g. an Excel file, I know that you have to create the Data Source on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    2. Can you please give me an example of a situation where you had to create a Generic Data Source? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    3. I read somewhere that a Generic Data Source does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it in two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? Which is the most common method and why?
    5. If you want to add a field to a Data Source after using it for 6 months, you want to do it without re-initializing the delta queue. You add the field in RSA6, then provide info for ABAP to populate the new field (info: name of Data Source, extract structure, field added, name of the application table which contains the field). How does it work now, as there is no setup table (it has been deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    THANKS

    Hi,
    1. If your Data Source is a flat file, e.g. an Excel file, I know that you have to create the Data Source on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, hence you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a Generic Data Source? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    When we don't find a suitable standard extractor we can go for a generic one (e.g. if I want sales information together with finance information in one data source, generally we don't get a standard one, hence we go for a generic DS).
    Check the link below for more about generic DS:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Time stamp (if the table has a time stamp field, it stamps the last changed record in that table, hence it is easy to get the delta based on the time stamp).
    Calendar day (if the table doesn't have a time stamp field, look for a calendar day field on which you can base the delta, i.e. the date when documents are changed).
    Numeric pointer (if the table has neither of the above, we go for this option, where the delta is based on an increasing numeric value).
    3. I read somewhere that a Generic Data Source does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic DataSource is nothing but extracting data directly from the database table, without any interface between the applications/systems.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe you can do it in two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? Which is the most common method and why?
    Correct.
    5. If you want to add a field to a Data Source after using it for 6 months, you want to do it without re-initializing the delta queue. You add the field in RSA6, then provide info for ABAP to populate the new field (info: name of Data Source, extract structure, field added, name of the application table which contains the field). How does it work now, as there is no setup table (it has been deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DS) you will get data from that date onwards, not historical data; hence the setup table concept does not apply here (delta records come from the delta queue, not from the setup table).
    If you want historic data for the new field then you need setup table deletion, etc.
    Hope it is clear.
    Regards,
    Satya

  • Generic Data Source with View

    Hi Experts.....
         Previously I created a view based on VBRP and VBRK, with the common field VBELN, but I have been confused for a long time: these two tables already have the same standard DataSource, i.e. 2LIS_13_VDHDR, so why create a view? Please explain one real-time scenario.

    Hi,
    When we don't find a suitable standard extractor we can go for a generic one (e.g. if I want sales information together with finance information in one data source, generally we don't get a standard one, hence we go for a generic DS).
    Re: Extraction, Flat File, Generic Data Source, Delta Initialization
    E.g.: if you want information about all customers across the regions, and
    table 1 has all the information about the customer number but not the customer address, region and pin code, while the same information is in another table,
    table 2: customer no, address, region and pin code,
    then since the two tables have the common field customer no, you can create a view to get all the information in a single place, create a generic DS based on it, and extract the data to BW.
    Regards,
    Satya

  • Php include file getting called multiple times

    I have created a form in the file career.php. To carry out the validation, user_registration.php is called at the beginning of the file.
    user_registration.php in turn calls library.php. The issue is that library.php is getting called multiple times.
    I have attached phperror.log, which confirms this.
    career.php
    <?php
    session_start();
    require_once('phpScript/user_registration.php');
    ?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    user_registration.php
    <?php // This file is used for validating user form data. It calls process_upload.php to validate file uploaded by user
    $errors = array();  
    $messages = array();
    $testing = TRUE;
    error_log("Enter- user_registration.php. Testing mode $testing", 0);
    try {
              error_log("user_registration.php: Entered try catch block",0);
    require_once('library.php');
              error_log("Successfully called library.php",0);
              $public_key = '6LeDsNUSAAAAAIoaYuNhGm2zx_qgBECiybUWUkt6';
              $private_key = '6LeDsNUSAAAAACuvmJrec-_5seLdmhqUMngt8AiU';
              $recaptcha = new Zend_Service_ReCaptcha($public_key, $private_key);
              if(isset($_POST['submit'])) {          // Only execute if form is submitted by user
                error_log("user_registration.php. Submit button pressed by user", 0);
                require_once('Zend/Mail.php');
                $mail = new Zend_Mail('UTF-8');
                if (isset($_POST['name'])) {
                        $val = new Zend_Validate_Regex('/^\p{L}+[-\'\p{L} ]+$/u');
                        if (!$val->isValid($_POST['name'])) {
                          $errors['name'] = "Please enter your name, no numbers permitted";
    error_log("Exit- user_registration.php.", 0);
    library.php
    <?php
    error_log("Enter- library.php. Testing mode $testing", 0);
    if ($testing)  // If true then go to testing mode
              $library = '/Applications/MAMP/htdocs/mgtools-india/ZendFramework/library';
              set_include_path(get_include_path() . PATH_SEPARATOR . $library);
    require_once('Zend/Loader/Autoloader.php');
    try {
    Zend_Loader_Autoloader::getInstance();
    error_log("Exit- library.php. Testing mode $testing", 0);
    phperror.log
    Marker - 12-Oct-2012 10:27:26 AM
    [12-Oct-2012 04:57:33 UTC] Enter- user_registration.php. Testing mode 1
    [12-Oct-2012 04:57:33 UTC] user_registration.php: Entered try catch block
    [12-Oct-2012 04:57:33 UTC] Enter- library.php. Testing mode 1
    [12-Oct-2012 04:57:33 UTC] Exit- library.php. Testing mode 1
    [12-Oct-2012 04:57:33 UTC] Successfully called library.php
    [12-Oct-2012 04:57:36 UTC] Enter- user_registration.php. Testing mode 1
    [12-Oct-2012 04:57:36 UTC] Entered try catch block
    [12-Oct-2012 04:57:36 UTC] Enter- library.php. Testing mode 1
    [12-Oct-2012 04:57:36 UTC] Exit- library.php. Testing mode 1
    [12-Oct-2012 04:57:36 UTC] Successfully called library.php
    [12-Oct-2012 04:58:38 UTC] Enter- user_registration.php. Testing mode 1
    [12-Oct-2012 04:58:38 UTC] Entered try catch block
    [12-Oct-2012 04:58:38 UTC] Enter- library.php. Testing mode 1
    [12-Oct-2012 04:58:38 UTC] Exit- library.php. Testing mode 1
    [12-Oct-2012 04:58:38 UTC] Successfully called library.php
    [12-Oct-2012 04:58:38 UTC] user_registration.php. Submit button pressed by user
    [12-Oct-2012 04:58:39 UTC] user_registration.php, Processing enquiry form
    [12-Oct-2012 04:58:39 UTC] Enter- process_upload.php. Testing mode 1
    [12-Oct-2012 04:59:01 UTC] Exit- user_registration.php.
    I am not able to understand why user_registration.php and library.php are executing again and again.
    I ran this test on local testing server MAMP.
    The results are the same on remote server also.
    Please guide.

    I'm not sure what's happening there. The important question is whether the code is executed successfully.
    Since you're using the Zend Framework, it might be better to ask in a dedicated ZF forum.

  • Generic Data Source Creation Problem....URGENT!!!!!!!!!!!!!

    Hello BW Experts...!
    I need to create a Generic Data Source from a table called VBSEGK. I was trying in the usual way with RSO2, but when I press the save button after entering the table name, the following error comes up:
    "Invalid Extract Structure Template VBSEGK of DataSource ZPARK_01"
    and when I click on the error message it shows:
    Diagnosis
    You tried to generate an extract structure with the template structure VBSEGK. This operation failed because the template structure contains quantity or currency fields (for example, field DMBTR) that refer to a different table.
    Procedure
    Use the template structure to create a view or DDIC structure that does not contain the inadmissible fields.
    VBSEGK is a standard table, so I can't change the table structure. Can anyone give me some idea of how to create a DataSource with this table, ASAP please?
    Please ask me questions if anything is unclear.
    thanks

    Hi Harish,
    Please check OSS note 335342.
    Symptom
    The creation of a generic DataSource which is extracted from a DDIC view or a transparent table terminates with error message R8359:Invalid extract structure template &2 of Datasource &1
    Other terms
    OLTP, DataSource, extractor, data extraction, generic extractor
    Reason and Prerequisites
    The problem is caused by a program error.
    Solution
    The table or view used for extraction must have currency and unit fields within the field list of the table/view for all currency and quantity key figures. Otherwise the consistency of the extracted data cannot be ensured. To make the generation possible, check whether all key figures of your table refer to unit fields that are within the field list. If this is not the case, there are two possibilities:
    1. A table is used for extraction.
    Create a view in which you have a currency / unit field contained in the view for each key figure. The currency / unit field from the table must be included in the view to which the key figure actually refers.
    Example:
    Field WKGBTR of table COEP refers to the unit field WAERS of table TKA01. In a view that contains field COEP-WKGBTR, table TKA01 and field WAERS must be included in the field list.
    If the currency or the unit a key figure refers to is not located in a table but in a structure, the key figure has to be removed from the view and read via a customer exit (see below). Structures cannot be included in a view.
    ATTENTION!! Often the key of the table in which the referenced unit is located does not agree with the key of the table containing the corresponding key figure. In this case, the join condition of the two tables is not unique in the view definition; that means that for each key line of the table with the key figure, several lines of the table with the unit may be read. The result is a multiplication of the number of lines in the view by a factor corresponding to the number of lines in the unit table that match the key figure. To be able to deliver consistent data to a BW system, check whether the unit of the key figure in question should always have a fixed value. If yes, then you can set that in the view definition via 'Goto -> Selection conditions'. If no, then you must proceed as follows:
    a) Remove the key figure from the view
    b) Define the DataSource
    c) Enhance the extract structure by key figure and unit for each append (Transaction RSA6)
    d) Add the key figure and the unit in a customer exit
    2. A view is already used for the extraction
    If it is not possible to obtain a unit of measure from a table on which the view is based, the unit field must be deleted from the field list.
    A note in relation to the upward compatibility of BW-BCT InfoSources: BW-BCT 1.2B was not yet able to check units and currency fields. For this reason, it is possible that InfoSources which were defined in the source system as of BW-BCT 1.2B must be redefined as of 2000.1 in the manner described above. However, checks are absolutely essential for the consistency of the extracted data.
    Regards,
    Praveen
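
    For the customer-exit route described in steps a) to d) above, here is a rough sketch of what the coding part usually looks like (the extract structure name ZOXEXT0001 and the appended currency field ZZ_WAERS are assumptions, not taken from the note; ZPARK_01 is the DataSource from the question). The fields appended in RSA6 are filled in include ZXRSAU01 of enhancement RSAP0001, i.e. user exit EXIT_SAPLRSAP_001 for transaction data; filling the key figure itself would work the same way.

      DATA: l_s_data TYPE zoxext0001.        " assumed extract structure

      CASE i_datasource.
        WHEN 'ZPARK_01'.
          LOOP AT c_t_data INTO l_s_data.
            " derive the appended unit/currency from the table it really
            " belongs to; here simply the company code currency
            SELECT SINGLE waers FROM t001
              INTO l_s_data-zz_waers
              WHERE bukrs = l_s_data-bukrs.
            MODIFY c_t_data FROM l_s_data.
          ENDLOOP.
      ENDCASE.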
