Query taking a lot of time (Tuning query... help)

SELECT d.loc_channel, A.DEPOT_CODE, F.DEPOT_NAME, E.EMP_CODE, E.EMP_NAME, A.CUST_CODE, B.CUST_NAME,
       A.INSTRUMENT_NO, to_char(A.INSTRUMENT_DT,'dd/mm/yyyy') as INSTRUMENT_DT, nvl(A.amount,0) as amount,
       G.REMARKS indorcvdfrombank, G.DOC_NO, to_char(G.DOC_DT,'dd/mm/yyyy') as doc_dt, g.doc_dt docdt,
       nvl(G.DOC_AMOUNT,0) as doc_amount, decode(nvl(c.BDSNo,'N'),'N','No','Yes') PAYRCVD,
       c.BDSNO, to_char(c.BDSDT,'dd/mm/yyyy') as BDSDT, nvl(c.amount,0) as bdsamount
FROM T_PAYRECEIPT A, M_CUSTOMER B, T_PAYRECEIPT C, M_LOCATION D, M_EMPLOYEE E, M_DEPOT F, T_CRD_DBT G
WHERE A.CUST_CODE = B.CUST_CODE
  AND B.SE_LOCCODE = D.LOCATION_CODE
  AND D.EMP_CODE = E.EMP_CODE
  AND A.DEPOT_CODE = F.DEPOT_CODE
  AND A.REF_DOCNO = G.DOC_NO
  AND A.DEPOT_CODE = G.DEPOT_CODE
  AND nvl(A.status,' ') = 'B'
  AND a.voucher_no = c.ref_instrumentno(+)
  AND trunc(a.voucher_dt) BETWEEN to_date('19/04/2005','dd/mm/yyyy') AND to_date('19/05/2005','dd/mm/yyyy')
  AND A.Cust_Code exists (select A.cust_code from M_Customer A)
  AND A.Depot_code exists (select depot_Code from M_DEPOT where DEPOT_CATEGORY in ('D','L') and status = 'A')
order by a.cust_code, g.doc_no, docdt

While I agree with all of the other comments, as a first cut I would rewrite the statement this way:
SELECT d.loc_channel, a.depot_code, f.depot_name,e.emp_code,e.emp_name,
       a.cust_code, b.cust_name, a.instrument_no,
       TO_CHAR(a.instrument_dt,'dd/mm/yyyy') instrument_dt,
       NVL(a.amount,0) amount, g.remarks indorcvdfrombank,
       g.doc_no,TO_CHAR(g.doc_dt,'dd/mm/yyyy') doc_dt, g.doc_dt docdt,
       NVL(g.doc_amount,0) doc_amount,
       DECODE(NVL(c.bdsno,'N'), 'N', 'No', 'Yes') payrcvd,
       c.bdsno,TO_CHAR(c.bdsdt,'dd/mm/yyyy') bdsdt,
       NVL(c.amount,0) bdsamount
FROM t_payreceipt a, m_customer b,t_payreceipt c,m_location d,
     m_employee e, m_depot f, t_crd_dbt g
WHERE a.cust_code = b.cust_code and
      b.se_loccode = d.location_code and
      d.emp_code = e.emp_code and
      a.depot_code = f.depot_code and
      a.ref_docno = g.doc_no and
      a.depot_code = g.depot_code and
      a.status = 'B' and
      a.voucher_no = c.ref_instrumentno(+) and
      TRUNC(a.voucher_dt) BETWEEN TO_DATE('19/04/2005','dd/mm/yyyy') AND
                                  TO_DATE('19/05/2005','dd/mm/yyyy') and
      f.depot_category IN ('D','L') and
      f.status = 'A'
ORDER BY a.cust_code,g.doc_no,docdt

The predicate:
a.cust_code EXISTS (SELECT a.cust_code FROM m_customer a)
can be removed because the join condition:
a.cust_code = b.cust_code
does exactly the same thing.
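John's point that this predicate is redundant can be sanity-checked outside Oracle. A minimal sketch in SQLite via Python (table names borrowed from the post, data invented; `IN` stands in for the post's non-standard `EXISTS` syntax): the membership test against m_customer adds nothing once the inner join to m_customer is already there.

```python
import sqlite3

# Illustrative sketch (SQLite, not the original Oracle schema): a predicate
# that only re-checks membership in a table you already inner-join to
# returns exactly the same rows, so it can be dropped.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t_payreceipt (cust_code TEXT, amount REAL);
    CREATE TABLE m_customer   (cust_code TEXT, cust_name TEXT);
    INSERT INTO t_payreceipt VALUES ('C1', 100), ('C2', 200), ('C9', 300);
    INSERT INTO m_customer   VALUES ('C1', 'Acme'), ('C2', 'Globex');
""")

with_membership = conn.execute("""
    SELECT a.cust_code, b.cust_name
    FROM t_payreceipt a JOIN m_customer b ON a.cust_code = b.cust_code
    WHERE a.cust_code IN (SELECT cust_code FROM m_customer)
    ORDER BY a.cust_code
""").fetchall()

without_membership = conn.execute("""
    SELECT a.cust_code, b.cust_name
    FROM t_payreceipt a JOIN m_customer b ON a.cust_code = b.cust_code
    ORDER BY a.cust_code
""").fetchall()

print(with_membership == without_membership)  # the extra predicate changed nothing
```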
The predicate:
a.depot_code EXISTS (SELECT depot_code
                     FROM m_depot
                     WHERE depot_category IN ('D','L') and
                           status = 'A' )
can be removed because the join:
a.depot_code = f.depot_code
will take care of it if you add the conditions into the main query like:
f.depot_category IN ('D','L') and
f.status = 'A'
The predicate:
NVL(a.status,' ') = 'B'
can be changed to a simple equality, since NULL is not equal to anything. Removing the function may allow the optimizer to choose a better access path.
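The NVL removal is safe to verify the same way. A small sketch (SQLite's COALESCE standing in for Oracle's NVL; table and data are made up): because NULL never compares equal to 'B', the wrapped and unwrapped predicates return identical rows.

```python
import sqlite3

# Sketch: COALESCE(status,' ') = 'B' and plain status = 'B' select exactly
# the same rows, because a NULL status fails both predicates anyway.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t_payreceipt (voucher_no TEXT, status TEXT);
    INSERT INTO t_payreceipt VALUES ('V1', 'B'), ('V2', NULL), ('V3', 'X');
""")

wrapped = conn.execute(
    "SELECT voucher_no FROM t_payreceipt WHERE COALESCE(status, ' ') = 'B'"
).fetchall()
plain = conn.execute(
    "SELECT voucher_no FROM t_payreceipt WHERE status = 'B'"
).fetchall()

print(wrapped, plain)  # both return only V1
```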
I would try to change the predicate:
TRUNC(a.voucher_dt) BETWEEN TO_DATE('19/04/2005','dd/mm/yyyy') AND
                            TO_DATE('19/05/2005','dd/mm/yyyy')
to get rid of the TRUNC on voucher_dt. If there really are times in voucher_dt, I would probably use something like:
a.voucher_dt BETWEEN TO_DATE('19/04/2005','dd/mm/yyyy') AND
                     TO_DATE('19/05/2005 23:59:59','dd/mm/yyyy hh24:mi:ss')
If there are no times in voucher_dt then just drop the TRUNC.
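The reason for avoiding a function on the filtered column can be seen in any optimizer, not just Oracle's. A hedged sketch using SQLite's EXPLAIN QUERY PLAN (table and index names are illustrative): wrapping the column in a function forces a full scan, while the bare range predicate can use the index.

```python
import sqlite3

# Sketch (SQLite, not the original Oracle system): a function on an indexed
# column (TRUNC in Oracle, date() here) hides the column from the index,
# while a plain range predicate allows an index range scan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_payreceipt (voucher_dt TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_voucher_dt ON t_payreceipt (voucher_dt)")

def plan(sql):
    # Last column of each EXPLAIN QUERY PLAN row is the detail text.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

# Function on the column: the optimizer cannot use idx_voucher_dt.
p1 = plan("SELECT * FROM t_payreceipt WHERE date(voucher_dt) "
          "BETWEEN '2005-04-19' AND '2005-05-19'")

# Bare column with a range: an index range scan is possible.
p2 = plan("SELECT * FROM t_payreceipt WHERE voucher_dt >= '2005-04-19' "
          "AND voucher_dt < '2005-05-20'")

print(p1)
print(p2)
```

Note the second predicate also shows the half-open alternative (`>= start AND < day-after-end`), which avoids having to spell out 23:59:59.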
TTFN
John

Similar Messages

  • Select query is taking lot of time to fetch data.....

    Select query is taking lot of time to fetch data.
    SELECT a~lgnum a~tanum a~bdatu a~bzeit a~bname a~benum b~matnr b~maktx b~qdatu b~qzeit b~vlenr b~nlenr b~vltyp b~vlber b~vlpla
           b~nltyp b~nlber b~nlpla b~vsola b~vorga INTO TABLE it_final FROM ltak AS a
                   INNER JOIN ltap AS b ON b~tanum EQ a~tanum AND a~lgnum EQ b~lgnum
                       WHERE a~lgnum = p_whno
                       AND a~tanum IN s_tono
                       AND a~bdatu IN s_tocd
                       AND a~bzeit IN s_bzeit
                       AND a~bname IN s_uname
                       AND a~betyp = 'P'
                       AND b~matnr IN s_mno
                       AND b~vorga <> 'ST'.
    Moderator message: Please Read before Posting in the Performance and Tuning Forum
    Edited by: Thomas Zloch on Mar 27, 2011 12:05 PM

    Hi Shiva,
    I am using two more select queries with the same manner ....
    here are the other two select query :
    ***************1************************
    SELECT * APPENDING CORRESPONDING FIELDS OF TABLE tbl_summary
        FROM ztftelpt LEFT JOIN ztfzberep
         ON  ztfzberep~gjahr = st_input-gjahr
         AND ztfzberep~poper = st_input-poper
     AND ztfzberep~cntr  = ztftelpt~rprctr
        WHERE rldnr  = c_telstra_projects
          AND rrcty  = c_actual
          AND rvers  = c_ver_001
          AND rbukrs = st_input-bukrs
          AND racct  = st_input-saknr
          AND ryear  = st_input-gjahr
          and rzzlstar in r_lstar             
          AND rpmax  = c_max_period.
    and the second one is
    *************************2************************
      SELECT * APPENDING CORRESPONDING FIELDS OF TABLE tbl_summary
        FROM ztftelnt LEFT JOIN ztfzberep
         ON  ztfzberep~gjahr = st_input-gjahr
         AND ztfzberep~poper = st_input-poper
     AND ztfzberep~cntr  = ztftelnt~rprctr
        WHERE rldnr  = c_telstra_networks
          AND rrcty  = c_actual
          AND rvers  = c_ver_001
          AND rbukrs = st_input-bukrs
          AND racct  = st_input-saknr
          AND ryear  = st_input-gjahr
          and rzzlstar in r_lstar                              
          AND rpmax  = c_max_period.
    For both of the above queries the program takes very little time, although both tables used in the above queries have a similar amount of data. And I cannot remove the APPENDING CORRESPONDING FIELDS, because I have to append the data after fetching from the tables; if I do not use it, it will delete all the data fetched earlier.
    Thanks in advance......
    Sourabh

  • Partition switching taking lots of time

    I have a partitioned table. All the partitions of this table are mapped to a single filegroup.
    When I try to switch a partition, it takes a lot of time: more than 5 minutes for a single partition.
    I can see that there is a file in the filegroup and its size is very high, 234 GB.
    Can partition switching take more time due to file size,
    or is there some other issue?
    I need help in finding out what is the problem.
    Thanks,

    So you are running ALTER TABLE SWITCH? Check for blocking. The operation requires an Sch-M lock, which means that it is blocked by any query that is running, including queries using NOLOCK.
    Erland Sommarskog, SQL Server MVP, [email protected]
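Erland's Sch-M point can be illustrated with a rough analogy (SQLite in its default rollback-journal mode, not SQL Server; file name invented): a connection holding an exclusive lock blocks every other reader, even ones that would normally not block anything, until the lock is released.

```python
import sqlite3, tempfile, os

# Rough analogy (SQLite, not SQL Server): an exclusive, schema-change-style
# lock stalls all readers until it is released, which is the same reason
# ALTER TABLE ... SWITCH queues behind, and in front of, running queries.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.commit()

writer.execute("BEGIN EXCLUSIVE")          # take an exclusive lock now
writer.execute("INSERT INTO t VALUES (1)")

reader = sqlite3.connect(path, timeout=0)  # fail fast instead of waiting
try:
    reader.execute("SELECT COUNT(*) FROM t").fetchone()
    blocked = False
except sqlite3.OperationalError:           # "database is locked"
    blocked = True

writer.commit()                            # release the lock...
count = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(blocked, count)                      # ...and the reader gets through
```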

  • (Urgent) form taking lots of time to load and fetch the data

    Hi
    I have a very serious performance problem. I have installed Oracle Portal and configured it properly.
    Now everything is working perfectly except forms: the forms and links are taking lots of time to load
    as well as to fetch data. I tried to tune my SGA and also checked the hits and misses, but I did not
    find any problem there. Can you please help me make my forms fast?
    My operation system is NT.
    version of database oracle 8.1.7
    Oracle Portal 3.0.8
    Thanks in advance
    raju parmar
    [email protected]

    hi chetan
    Lots of Thanks.
    Now it working perfect.
    Raju
    Originally posted by Chetan Kashyap ([email protected]):
    The workaround is provided at: http://technet.oracle.com:89/ubb/Forum81/HTML/000395.html

  • Web-I Report taking lots of time to refresh when filters changed

    In BO XI 3.1, a Web-I report is taking a lot of time to refresh when any changes are made to the filters in edit query mode.
    When the query is run for the first time it runs well, but if we make any changes in the filters pane, it takes a lot of time to refresh, and when we cancel the query the Web-I server does not release the server resources (CPU, memory).
    Did anyone face this kind of problem and resolve it?
    Please let me know your thoughts
    Thank you

    Hi,
    why do you need 100K rows in your reports? Is there a way to consolidate/aggregate your data on the database?
    Comparing sqlplus and BOBJ is a little bit unfair. Let me explain you why:
    sqlplus starts displaying data after the query returns the first results. I am sure that if you measure the time that sqlplus needs to display ALL rows, this should be way more than just 10 secs.
    On the other hand, BOBJ will display something as soon as the first page can be created. Depending on the way your report is structured, it may be the case that the WebI server must first fetch ALL rows in order to display the first page of your report. This is the case, for example, if you have the total number of pages displayed, or if you display total sums in the header of your report, or if you have added a chart based on your data in the header of your report. Of course, after the first call the report will be cached (and thus fast to display) until you change the parameter values for your query again.
    Since you do not display the total number of pages in your report (or do you use this function in a variable or a formula, maybe?), I can only assume that you are either trying to display total sums (or other kinds of aggregated values) or a chart on the first page of your report. Can you confirm this?
    Regards,
    Stratos
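Stratos's explanation of why the first page can require the whole result set can be sketched with a toy generator in Python (no BusinessObjects involved; names and sizes invented): rendering just the first page consumes a few rows, but computing the total page count forces the entire fetch before page 1 can appear.

```python
# Toy sketch: a "report" that streams only its first page touches a handful
# of rows, while one that must show the total page count has to pull every
# row before the first page can be rendered.
PAGE_SIZE = 50

def fetch_rows(n, counter):
    for i in range(n):
        counter[0] += 1          # count how many rows the "server" fetched
        yield i

fetched = [0]
rows = fetch_rows(100_000, fetched)
first_page = [next(rows) for _ in range(PAGE_SIZE)]   # stream one page
print(fetched[0])                # only PAGE_SIZE rows pulled so far

fetched_all = [0]
all_rows = list(fetch_rows(100_000, fetched_all))     # need total pages?
total_pages = -(-len(all_rows) // PAGE_SIZE)          # ceiling division
print(fetched_all[0], total_pages)                    # every row pulled
```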

  • Delete taking lot of time

    Hi
    I have a delete statement which is taking a lot of time. If I select with the same criteria, only 500 records come back, but the delete takes a lot of time.
    Please advise.
    delete from whs_bi.TRACK_PLAY_EVENT a
    where a.time_stamp >= to_date('5/27/2013','mm/dd/yyyy')
    and a.time_stamp < to_date('5/28/2013','mm/dd/yyyy')
    Thanks in adv.
    KPR

    Let's check the wait events.
    Open two sessions: one for running the DELETE and another for monitoring wait events. In the session in which you want to run the DELETE, find the SID of that session ( SELECT userenv('SID') FROM dual ).
    Now run the DELETE in one session (of which we have already found the SID).
    Run the following query in the other session:
    select w.sid sid,
           p.spid PID,
           w.event event,
           substr(s.username,1,10) username,
           substr(s.osuser, 1,10) osuser,
           w.state state,
           w.wait_time wait_time,
           w.seconds_in_wait wis,
           substr(w.p1text||' '||to_char(w.P1)||'-'||
                  w.p2text||' '||to_char(w.P2)||'-'||
                  w.p3text||' '||to_char(w.P3), 1, 45) P1_P2_P3_TEXT
    from v$session_wait w, v$session s, v$process p
    where s.sid=w.sid
      and p.addr  = s.paddr
      and w.event not in ('SQL*Net message from client', 'pipe get')
      and s.username is not null
      and s.sid = &your_SID
    While the DELETE is running in the other session, run the above query in the second session 5-6 times, with a gap of (say) 10 seconds. If you can give us the output of the monitoring query (from all 5-6 runs), that might throw more light on what is going on under the hood.

  • Function Module Extraction from KONV Table taking lot of time for extractio

    Hi
    I have a requirement wherein I need to get records from the KONV table (Conditions (Transaction Data)). I need the data corresponding to Application (KAPPL) = 'F'.
    For this I had written one function module, but it is taking a lot of time (about 2.5 hrs) to fetch records, as there are a large number of records in the KONV table.
    I am pasting the Function Module code for reference.
    <b>kindly guide me as to how the extraction performance can be improved.</b>
    <b>Function Module Code:</b>
    FUNCTION ZBW_SHPMNT_COND.
    ""Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_ISOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_PRIVATE_MODE) OPTIONAL
    *"     VALUE(I_CALLMODE) LIKE  ROARCHD200-CALLMODE OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBW_SHPMNT_COND OPTIONAL
    *"      E_T_SOURCE_STRUCTURE_NAME OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    The input parameter I_DATAPAKID is not supported yet !
      TABLES: KONV.
    Auxiliary Selection criteria structure
      DATA: l_s_select TYPE sbiwa_s_select.
    Maximum number of lines for DB table
      STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
    Maximum number of lines for DB table
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    counter
              S_COUNTER_DATAPAKID LIKE SY-TABIX,
    cursor
              S_CURSOR TYPE CURSOR.
    Select ranges
      RANGES: L_R_KNUMV  FOR KONV-KNUMV,
              L_R_KSCHL  FOR KONV-KSCHL,
              L_R_KDATU  FOR KONV-KDATU.
    Declaring internal tables
    DATA : I_KONV LIKE KONV OCCURS 0 WITH HEADER LINE.
      DATA : Begin of I_KONV occurs 0,
             MANDT LIKE konv-mandt,
             KNUMV LIKE konv-knumv,
             KPOSN LIKE konv-kposn,
             STUNR LIKE konv-stunr,
             ZAEHK LIKE konv-zaehk,
             KAPPL LIKE konv-kappl,
             KSCHL LIKE konv-kschl,
             KDATU LIKE konv-kdatu,
             KBETR LIKE konv-kbetr,
             WAERS LIKE konv-waers,
             END OF I_KONV.
    Initialization mode (first call by SAPI) or data transfer mode
    (following calls) ?
      IF i_initflag = sbiwa_c_flag_on.
    Initialization: check input parameters
                    buffer input parameters
                    prepare data selection
    The input parameter I_DATAPAKID is not supported yet !
    Invalid second initialization call -> error exit
        IF NOT g_flag_interface_initialized IS INITIAL.
          IF
            1 = 2.
            MESSAGE e008(r3).
          ENDIF.
          log_write 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE error_passed_to_mess_handler.
        ENDIF.
    Check InfoSource validity
        CASE i_isource.
          WHEN 'X'.
         WHEN 'Y'.
         WHEN 'Z'.
          WHEN OTHERS.
           IF 1 = 2. MESSAGE e009(r3). ENDIF.
           log_write 'E'                  "message type
                     'R3'                 "message class
                     '009'                "message number
                     i_isource            "message variable 1
                     ' '.                 "message variable 2
           RAISE error_passed_to_mess_handler.
        ENDCASE.
    Check for supported update mode
        CASE i_updmode.
    For full upload
          WHEN 'F'.
          WHEN 'D'.
          WHEN OTHERS.
           IF 1 = 2. MESSAGE e011(r3). ENDIF.
           log_write 'E'                  "message type
                     'R3'                 "message class
                     '011'                "message number
                     i_updmode            "message variable 1
                     ' '.                 "message variable 2
           RAISE error_passed_to_mess_handler.
        ENDCASE.
        APPEND LINES OF i_t_select TO g_t_select.
    Fill parameter buffer for data extraction calls
        g_s_interface-requnr    = i_requnr.
        g_s_interface-isource   = i_isource.
        g_s_interface-maxsize   = i_maxsize.
        g_s_interface-initflag  = i_initflag.
        g_s_interface-updmode   = i_updmode.
        g_s_interface-datapakid = i_datapakid.
        g_flag_interface_initialized = sbiwa_c_flag_on.
    Fill field list table for an optimized select statement
    (in case that there is no 1:1 relation between InfoSource fields
    and database table fields this may be far from beeing trivial)
        APPEND LINES OF i_t_fields TO g_t_fields.
    Interpretation of date selection for generic extraktion
       CALL FUNCTION 'RSA3_DATE_RANGE_CONVERT'
         TABLES
           i_t_select = g_t_select.
      ELSE.                 "Initialization mode or data extraction ?
       CASE g_s_interface-updmode.
         WHEN 'F' OR 'C' OR 'I'.
    First data package -> OPEN CURSOR
        IF g_counter_datapakid = 0.
       L_MAXSIZE = G_S_INTERFACE-MAXSIZE.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'KNUMV'.
            MOVE-CORRESPONDING l_s_select TO l_r_knumv.
            APPEND l_r_knumv.
          ENDLOOP.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'KSCHL'.
            MOVE-CORRESPONDING l_s_select TO l_r_kschl.
            APPEND l_r_kschl.
          ENDLOOP.
          Loop AT g_t_select INTO l_s_select WHERE fieldnm = 'KDATU'.
            MOVE-CORRESPONDING l_s_select TO l_r_kdatu.
            APPEND l_r_kdatu.
          ENDLOOP.
    *In case of full upload
    Fill field list table for an optimized select statement
    (in case that there is no 1:1 relation between InfoSource fields
    and database table fields this may be far from beeing trivial)
       APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
          OPEN CURSOR G_CURSOR FOR
            SELECT MANDT
                   KNUMV
                   KPOSN
                   STUNR
                   ZAEHK
                   KAPPL
                   KSCHL
                   KDATU
                   KBETR
                   WAERS
            FROM   KONV
            WHERE KNUMV IN l_r_knumv
            AND   KSCHL IN l_r_kschl
            AND   KDATU IN l_r_kdatu
            AND   KAPPL EQ 'F'.
        ENDIF.
        Refresh I_KONV.
        FETCH NEXT CURSOR G_CURSOR
                   APPENDING CORRESPONDING FIELDS OF TABLE I_KONV
                   PACKAGE SIZE S_S_IF-MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR G_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.
        LOOP AT I_KONV.
         IF I_KONV-KAPPL EQ 'F'.
          CLEAR :E_T_DATA.
          E_T_DATA-MANDT = I_KONV-MANDT.
          E_T_DATA-KNUMV = I_KONV-KNUMV.
          E_T_DATA-KPOSN = I_KONV-KPOSN.
          E_T_DATA-STUNR = I_KONV-STUNR.
          E_T_DATA-ZAEHK = I_KONV-ZAEHK.
          E_T_DATA-KAPPL = I_KONV-KAPPL.
          E_T_DATA-KSCHL = I_KONV-KSCHL.
          E_T_DATA-KDATU = I_KONV-KDATU.
          E_T_DATA-KBETR = I_KONV-KBETR.
          E_T_DATA-WAERS = I_KONV-WAERS.
          APPEND E_T_DATA.
       ENDIF.
        ENDLOOP.
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDIF.
    ENDFUNCTION.
    Thanks in Advance
    Regards
    Swapnil.

    Hi,
    one option to investigate is to select the data with a condition on KNUMV (the primary index).
    Since shipment costs are stored in VFKP, I would investigate whether all your 'F' condition records are used in this table (field VFKP-KNUMV).
    If this is the case then something like
    SELECT *
    FROM KONV
    WHERE KNUMV IN (SELECT DISTINCT KNUMV FROM VFKP)
    or
    SELECT DISTINCT KNUMV
    INTO CORRESPONDING FIELDS OF TABLE <itab>
    FROM VFKP
    and then
    SELECT *
    FROM KONV
    FOR ALL ENTRIES IN <itab>
    WHERE...
    will definitely speed it up.
    hope this helps....
    Olivier
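Olivier's restriction can be sketched in SQLite via Python (table names reused from the post, data invented): limiting KONV to the KNUMV values present in VFKP is a semi-join, which avoids touching condition records for documents that can never appear in the result.

```python
import sqlite3

# Sketch of the semi-join idea (SQLite, invented data): only KONV rows whose
# document number occurs in VFKP survive, duplicates in VFKP notwithstanding.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE konv (knumv TEXT, kappl TEXT, kbetr REAL);
    CREATE TABLE vfkp (knumv TEXT);
    INSERT INTO konv VALUES ('D1','F',10), ('D2','V',20), ('D3','F',30);
    INSERT INTO vfkp VALUES ('D1'), ('D1'), ('D3');
""")

rows = conn.execute("""
    SELECT k.knumv, k.kbetr
    FROM konv k
    WHERE k.kappl = 'F'
      AND k.knumv IN (SELECT DISTINCT knumv FROM vfkp)
    ORDER BY k.knumv
""").fetchall()

print(rows)  # only the 'F' conditions whose documents appear in VFKP
```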

  • Interface is taking lots of time for first time execution

    Hello all,
    I have one inbound interface in which i am updating T-code CO01 ,CO15 , MIGO and MB1A respectively
    depending upon the condition in the incoming file.
    Whenever I run this interface for the first time it takes a lot of time to execute, but when I run the same interface again with the same file it takes half the time of the first execution.
    I am not able to understand why it takes a lot of time for the first execution, and why after that the execution time is reduced so much with the same file.
    Kindly help
    Thanks
    Sachin Yadav

    thank you santhosh for your reply..
    as this program is in the Production system it is taking 3 to 4 hours to execute.
    Previously it was OK, but for the last 2 to 3 months it has been taking more time.
    When I debug the code it works fine. I didn't find any point at which it is taking more time.
    I am not aware of buffers and SAP memory. Can you please help?
    Do you have any idea why this is happening.
    Or how can i rectify the problem?
    Thanks
    Sachin

  • Extractor taking lots of time.

    hi experts,
    we are using extractor 2LIS_13_VDKON to extract the data,
    but it is taking lots of time. How can I solve this problem?
    any help appreciated.

    Hi,
    Have you done any enhancements (appends) to this DataSource? If so, make sure that the code is performance-optimized.
    Basically, the setup table for 2LIS_13_VDKON holds a lot of data. For example, it holds on average 10 records for each billing item record.
    So also check whether the system is uploading records to the New table of the ODS or not.
    With rgds,
    Anil Kumar Sharma .P

  • Browser is taking lot of time in loading forms

    Hi
    I am working on forms 10g..
    When I run my form, the browser loads the applet and jar files.
    After loading these it takes a lot of time to show the forms,
    even though I clear all my temp files frequently.
    Why is it taking so much time?
    What do I have to do so that it can load forms in the least time?
    Thanks

    After loading this it is taking lot of time to show forms...
    This could be caused by a resource issue on the Client PC. By resource I mean available RAM, JRE memory settings, etc. It could also be a conflict between the Client OS, installed JRE and the Browser. Please tell us what the Client OS, JRE and Browser are, as well as how much RAM is installed on the client PC.
    Even I am clearing all my temp files frequently...
    Are you clearing the Browser's temporary internet files or the JRE's JAR cache? If the JRE's JAR cache, then you are forcing the client to download the JARs each time you clear the JAR cache.
    The slowness can also be caused by how large your FORMS files are and what you are doing in the Pre-Form and When-New-Form-Instance triggers. If you are performing numerous calls to the database or calling SYNCHRONIZE too many times in any of the triggers that fire when a form initially loads, then this will cause your form to load more slowly.
    Hope this helps,
    Craig B-)
    If someone's response is helpful or correct, please mark it accordingly.

  • Taking lot of time for loading

    Hi,
      We are loading data from HR to BI. The connection between HR and BI was set up recently. When I try to load data from HR to BI it takes a lot of time: for example, loading 5 records takes 8 hours. It is the same for every DataSource. Should we change anything in the settings to make the IDocs work properly? Thanks

    You have to isolate the part that is taking the time.
    - Is R/3 extraction quick? (You can check with RSA3 and see how long does it take.)
    - If R/3 extraction is slow, then is the selection using an index? How many records are there in the table / view?
    - Is there a user exit? Is user exit slow?
    You can find out using monitor screen:
    - After R/3 extraction completed, how long did it take to insert into PSA?
    - How long did it take in the update rules?
    - How long did it take to activate?
    Once you isolate the problem area, post your findings here and someone will help you.

  • Activation of datatargets  in bw 3.5 is taking lot of time

    Hi
    I am using BW 3.5. I have created a lot of InfoCubes, ODS objects etc. for practice's sake. Maybe due to this or some other reason, it is taking a lot of time to execute anything. Sometimes the same screen stays for a long time (nearly half an hour) even when I am trying to activate any data target. In this case I stop BW in sapmmc and start it again.
    What may be the reason for both of these things? Is it due to overload of data?
    If so, shall I delete the previous data targets?
    Can anyone please give me the reason for this kind of problem?
    Thank you
    Deepthi matturi

    Hi Deepthi
    You are working on an IDES or training system; if no batch jobs were scheduled to clear down the logs,
    it seems that it is building up logs, which need to be deleted,
    OR
    there may be too little table space; you can check table space using DB02.
    Please ask your Basis team to check your BW system/server.
    Hope this helps
    sukhi

  • Rich Text Email taking lot of time opening in outlook

    Hi,
    We faced an issue where rich text email with embedded images takes a lot of time to open in Outlook.
    Outlook even hangs, and it takes 2 to 3 minutes to open.
    We are using Exchange 2010 SP2, and the issue occurs with both Outlook 2007 and 2010 in online mode.
    The size of the mail is usually 2-3 MB.

    Hi,
    I would like to first verify whether this problematic email originates from inside your organization.
    About rich text, I would like to clarify the following things:
    1. We can use Rich Text when we send messages inside an organization that uses Microsoft Exchange, but it is recommended to use the HTML format.
    2. When you send a Rich Text Format message to someone outside your organization, Outlook automatically converts it to HTML, so the message keeps its formatting and its attachments.
    Hope my clarification is helpful.
    Best regards,
    Amy
    Amy Wang
    TechNet Community Support

  • Background job is taking lot of time for processing the job.

    One background job, which processes IDocs, has been taking more than 2000 seconds per run, although it does complete.
    But this has been happening for the past few days. Is there any way to troubleshoot, or to find from any logs of this completed job why it was taking a lot of time?
    Can you please tell me the steps for analyzing / troubleshooting why it is taking a lot of time daily?
    Regards,
    Satish.

    Hi Satish,
    Run DB stats from DB13; it will improve the performance.
    Check the number of IDocs. You can send them part by part, instead of sending huge data volumes at once.
    Check the SM58 logs.
    Suman

  • Data load taking lot of time

    Hi All,
    I am trying to upload 2LIS_02_SCL data; I am trying to refresh the entire data using an init load, and it is taking a lot of time in Production. This is creating a problem for the next background job which runs in the night for deltas. Please can anybody guide me on how to speed up the load? This DataSource is connected to 2 InfoCubes and 1 ODS.
    Regards,
    Rajesh

    Hi Sal,
    I have done the same things as you said, my mistake is
    R/3:
    Locked all R/3 users.
    Deleted from LBWG.
    Filled setup tables using Document Date from 2006 to 2008 (Month wise i have filled).
    BW:
    I have cleaned entire data from 0PUR_C01 and 0PUR_C04.
    Loaded data at a time to 2 Cubes and 1 ODS using Init upload.
    It started taking a long time. The actual problem is that the load was not finished by the time the daily load process chain starts (at night).
    I cancelled the job and made only the delta available to the process chain.
    Now the problem has escalated; today I again started the same init for 1 cube only, and again it is taking a long time.
    Regards,
    Rajesh
