Time taken by secondary TMUSREVT to remove dead entries

Background -
We used to have a huge number of dead entries in our TMUSREVT.DAT (Tuxedo Version: 8.1.362), which were caused by timeouts on the WSL. These were never getting cleaned out and the file size kept growing.
Based on the response to the thread "tmusrevt.dat - dead entries do not seem to be removed", we started a secondary TMUSREVT (with the -S option) and we have a working solution.
Problem -
However, the time it takes to delete the dead entries varies a lot - sometimes 20 minutes, sometimes 2+ hours.
Is there any documentation that says when the secondary TMUSREVT will remove the dead entries? I was hoping it would be the -p option in the CLOPT of the TMUSREVT, but that did not seem to have any effect on the time taken to clean out dead entries.
So, the question is - When will the secondary TMUSREVT remove the dead entries? Is the time taken inversely proportional to how busy the environment is?
Edited by: Arvind Ramanan on Feb 11, 2011 11:38 AM
Edited by: Arvind Ramanan on Feb 11, 2011 2:07 PM

From Oracle Doc ID 1057214.1
The -p option is configured differently for primary and secondary servers.
For primary TMUSREVT, -p means polling interval for garbage collection of unnecessary entries.
For the secondary TMUSREVT, -p means the polling interval at which the secondary server polls the primary server for changes to tmusrevt.dat and updates its local control file if necessary. At each -p interval, the secondary server does a tpcall() to get new data from the primary server.
So if the -p of the secondary TMUSREVT is large, a subscriber can miss a posted message: the secondary's control file still holds old data if the subscription was made before the next polling interval.
The polling time (-p option) for secondary TMUSREVT depends on your application requirement.
1) A client calls tpsubscribe("event1", ...).
2) TMUSREVT (primary) adds the event to tmusrevt.dat.
3) TMUSREVT (secondary) picks up the latest tmusrevt.dat at the next -p interval.
When any process calls tppost() for "event1", if the call is serviced by the secondary TMUSREVT (usually after the primary is booted) between steps 2) and 3), the client in 1) cannot get the posted message because the event database (tmusrevt.dat) of the secondary TMUSREVT is stale.
So consider the above scenario when setting the -p option of the secondary TMUSREVT. The recommendation is to set a small value for the -p option on the secondary TMUSREVT.
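To make the race concrete, here is a minimal sketch (plain Python, not Tuxedo API code) that models a primary broker, a secondary that copies the primary's tmusrevt.dat only at each -p interval, and a post serviced by the secondary. All class names and timings are illustrative.

```python
# Minimal sketch (plain Python, not Tuxedo API code) of the race between
# a subscription registered at the primary TMUSREVT and a tppost()
# serviced by the secondary before its next -p sync. Times are logical
# seconds; all names are illustrative.

class PrimaryBroker:
    def __init__(self):
        self.subscriptions = set()          # contents of tmusrevt.dat

    def subscribe(self, event):
        self.subscriptions.add(event)

class SecondaryBroker:
    def __init__(self, primary, poll_interval):
        self.primary = primary
        self.poll_interval = poll_interval  # the -p value
        self.local_copy = set()             # local control file
        self.last_sync = 0.0                # pretend we last synced at t=0

    def maybe_sync(self, now):
        # At each -p interval the secondary pulls fresh data (the tpcall()).
        if now - self.last_sync >= self.poll_interval:
            self.local_copy = set(self.primary.subscriptions)
            self.last_sync = now

    def post(self, event, now):
        self.maybe_sync(now)
        # Delivery works only if the local control file already has the event.
        return event in self.local_copy

primary = PrimaryBroker()
secondary = SecondaryBroker(primary, poll_interval=30.0)

primary.subscribe("event1")                  # steps 1) and 2), at t=1
missed = secondary.post("event1", now=10.0)  # post arrives before the next sync
caught = secondary.post("event1", now=31.0)  # post arrives after the next sync
print(missed, caught)                        # False True
```

With poll_interval=30.0 the post at t=10 is lost; shrinking -p shrinks this window, which is exactly the recommendation above.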
The other thing you need to be cautious about is the size of tmusrevt.dat. If tmusrevt.dat is very big, it affects both system resources and performance.
For example, if a client calls tpsubscribe() and does not call tpunsubscribe() before the process exits, tmusrevt.dat grows, because the primary server's garbage collection can delete only two entries per interval. If many clients call tpsubscribe() and do not call tpunsubscribe() before exiting, many dead entries will remain in tmusrevt.dat. Use of TPEVPERSIST also causes entries to remain in the tmusrevt.dat file.
To keep the tmusrevt.dat file small, call tpunsubscribe() when the event subscription is no longer necessary. A large file causes TMUSREVT to consume many CPU cycles.
A size of around 100 KB should be fine for tmusrevt.dat.
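As a rough illustration of the growth effect described above (assuming, per the note, that the primary's garbage collection removes at most two dead entries per interval; the leak rate is a made-up workload figure):

```python
# Back-of-the-envelope sketch: the primary's garbage collection removes
# at most 2 dead entries per interval, so if clients leak subscriptions
# faster than that, tmusrevt.dat grows without bound.
# `leaks_per_interval` is an illustrative workload figure.

def dead_entry_backlog(leaks_per_interval, intervals, gc_per_interval=2):
    backlog = 0
    history = []
    for _ in range(intervals):
        backlog += leaks_per_interval                 # new dead entries
        backlog = max(0, backlog - gc_per_interval)   # one GC pass
        history.append(backlog)
    return history

print(dead_entry_backlog(5, 4))   # grows: [3, 6, 9, 12]
print(dead_entry_backlog(1, 4))   # GC keeps up: [0, 0, 0, 0]
```

Whenever the leak rate exceeds 2 per interval, the backlog, and with it the file size, grows without bound.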
Hope this answers your question.
-Venkat

Similar Messages

  • Got a MacBook Pro a month back; boot-up now takes much longer. Requesting possible solutions. Thanks.

    Got a MacBook Pro a month back; boot-up now takes much longer than it did earlier. Requesting some possible solutions.
    Thanks,
    Sid

    Linc,
    As per your instructions, these were the messages that were logged in the first few minutes after start up.
    Jul 17 21:17:18 localhost com.apple.launchd[1]: *** launchd[1] has started up. ***
    Jul 17 21:17:37 localhost com.apple.usbmuxd[22]: usbmuxd-211 built on Feb  8 2011 at 13:49:43, running 64 bit
    Jul 17 21:17:39 localhost bootlog[41]: BOOT_TIME: 1310917638 0
    Jul 17 21:17:39 localhost mDNSResponder[29]: mDNSResponder mDNSResponder-258.21 (May 26 2011 14:40:13) starting
    Jul 17 21:17:39 localhost configd[13]: bootp_session_transmit: bpf_write(en1) failed: Network is down (50)
    Jul 17 21:17:39 localhost configd[13]: DHCP en1: INIT-REBOOT transmit failed
    Jul 17 21:17:39 Siddharth-Jaswas-MacBook-Pro configd[13]: setting hostname to "Siddharth-Jaswas-MacBook-Pro.local"
    Jul 17 21:17:39 Siddharth-Jaswas-MacBook-Pro configd[13]: network configuration changed.
    Jul 17 21:17:42 Siddharth-Jaswas-MacBook-Pro blued[16]: Apple Bluetooth daemon started
    Jul 17 21:17:43 Siddharth-Jaswas-MacBook-Pro /System/Library/CoreServices/loginwindow.app/Contents/MacOS/loginwindow[30]: Login Window Application Started
    Jul 17 21:17:43 Siddharth-Jaswas-MacBook-Pro com.apple.kextd[10]: Can't load /System/Library/Extensions/IOSerialFamily.kext/Contents/PlugIns/InternalModemSupport.kext - no code for running kernel's architecture.
    Jul 17 21:17:43 Siddharth-Jaswas-MacBook-Pro com.apple.kextd[10]: Failed to load /System/Library/Extensions/IOSerialFamily.kext/Contents/PlugIns/InternalModemSupport.kext - (libkern/kext) requested architecture/executable not found.
    Jul 17 21:17:43 Siddharth-Jaswas-MacBook-Pro com.apple.kextd[10]: Load com.apple.driver.InternalModemSupport failed; removing personalities.
    Jul 17 21:17:44 Siddharth-Jaswas-MacBook-Pro loginwindow[30]: Login Window Started Security Agent
    Jul 17 21:17:45 Siddharth-Jaswas-MacBook-Pro loginwindow[30]: Login Window - Returned from Security Agent
    Jul 17 21:17:45 Siddharth-Jaswas-MacBook-Pro loginwindow[30]: USER_PROCESS: 30 console
    Jul 17 21:17:45 Siddharth-Jaswas-MacBook-Pro com.apple.launchd.peruser.501[86] (com.apple.ReportCrash): Falling back to default Mach exception handler. Could not find: com.apple.ReportCrash.Self
    Jul 17 21:17:45 Siddharth-Jaswas-MacBook-Pro configd[13]: network configuration changed.
    Jul 17 21:17:50 Siddharth-Jaswas-MacBook-Pro com.apple.launchd.peruser.501[86] (com.apple.Kerberos.renew.plist[110]): Exited with exit code: 1
    Jul 17 21:17:50 Siddharth-Jaswas-MacBook-Pro com.apple.usbmuxd[22]: HandleUSBMuxDictionary client 0x101800ab0-iTunesHelper/com.apple.iTunesHelper using library usbmuxd-211 built on Jan 13 2011 at 04:19:31, running usbmuxd-211 built on Feb  8 2011 at 13:49:43
    Jul 17 21:18:10 Siddharth-Jaswas-MacBook-Pro [0x0-0xf00f].com.google.Chrome[140]: [0717/211810:INFO:breakpad_mac.mm(89)] Breakpad disabled
    Jul 17 21:18:16 Siddharth-Jaswas-MacBook-Pro [0x0-0xf00f].com.google.Chrome[140]: [140:519:54119099370:ERROR:CONSOLE(6465)] "Uncaught TypeError: Cannot read property 'can_uninstall' of undefined", source: chrome://newtab/ (6465)
    Thanks,
    Sid

  • OWB 10g - The time taken for data load is too high

    I am loading data on the test datawarehouse server. The time taken for loading data is very high. The size of data is around 7 GB (size of flat files on the OS).
    The time it takes to load the same amount of data on the production server from the staging area to the presentation area(datawarehouse) is close to 8 hours maximum.
    But, in the test environment, the time taken to execute one mapping (containing 300,000 records) is itself 8 hours.
    The version of Oracle database on both the test and production servers is the same i.e., Oracle 9i.
    The configuration of the production server is : 4 Pentium III processors (2.7 GHz each), 2 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache, 512 kilobyte secondary memory cache, 440.05 Gigabytes Usable Hard Drive Capacity, 73.06 Gigabytes Hard Drive Free Space
    The configuration of the test server is : 4 Pentium III processors (2.4 GHz each), 1 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache,
    512 kilobyte secondary memory cache, 144.96 Gigabytes Usable Hard Drive Capacity, 5.22 Gigabytes Hard Drive Free Space.
    Can you please help me detect the possible causes of such erratic behaviour of the OWB 10g tool?
    Thanks & Best Regards,
    Harshad Borgaonkar
    PwC

    Hello Harshad,
    2 GB of RAM doesn't seem like very much to me. I guess your bottleneck is I/O; you've got to investigate this (keep an eye on long-running processes). You didn't say very much about your target database design. Do you have a lot of indexes on the target tables, and if so, have you tried dropping them before loading? Do your OWB mappings require a lot of lookups (if so, appropriate indexes on the lookup tables are very useful)? Do you use external tables? Are you talking about loading dimension tables, fact tables, or both? You've got to supply some more information so that we can help you better.
    Regards,
    Jörg

  • CMD=Ping query showing a high time-taken in IIS logs... Exchange 2010 SP3

    Query regarding ActiveSync and the time-taken parameter in the ActiveSync IIS logs.
    Here is what I see in the logs:
    [email protected] 45.101.90.185 Apple-iPad2C3/1202.410 200 0 0 1501129
    443 [email protected] 45.101.90.185 Apple-iPad2C3/1202.410 200
    0 0 22105
    443 [email protected] 45.101.90.185 Apple-iPad2C3/1202.410 200
    0 0 452
    443 [email protected] 45.101.90.185 Apple-iPad2C3/1202.410 200
    0 0 936
    443 [email protected] 45.101.90.185 Apple-iPad2C3/1202.410 200
    0 0 656238 
    In the log above, the last number on each line is the time-taken value. I just want to check what an ideal time-taken value is; some of the values above should be causing a problem, like the 1501129 at the top.
    And I see it is for the POST event with CMD=Ping.
    We have MobileIron in the environment and we are seeing a few timeout errors on the MobileIron server for users intermittently. We don't see any end-user issues, but we would like to get rid of the error below. The MobileIron team points to the time-taken value, which is intermittently high.
    IOException connection to server [email protected] -- java.io.IOException:
    awaitUninterruptibly was stopped by timeout
    @BALA

    Hi,
    To understand more about the issue, I’d like to confirm the following information:
    1. What’s your Exchange 2010 version? 
    http://support.microsoft.com/kb/2536517/en-us
    2. Do you install other software, like SQL, on the same Exchange Server?
    3. Try accessing the EMS with another admin account.
    Thanks,
    Angela Shi
    TechNet Community Support

  • How to show the processing time taken for a BPEL process in BAM report.

    Hi All,
    I have the data as below in the Data object. I would like to show the time taken for each order to complete in the report.
    instance Id     order Id     product Name     product Code     price     status     instance Time      updaterName
    1360010     ord004     Guitar     prod003     2000     requested     9/22/2008 12:12:11 PM     Invoke_InsertSalesOrder
    1360010     ord004     Guitar     prod003     2000     Approved     9/22/2008 12:15:11 PM     Invoke_OrderStatusUpdate
    This data comes from a simple BPEL process where sensors are configured at the start and end of the process. There is also a human task activity in between, which creates the time gap.
    In Enterprise Link Design Studio, I tried to calculate the time difference using the expression calculator and store it as a calculated field. But that doesn't seem to work: when I execute the plan, the second sensor's data arrives only after human approval, while the first sensor's data sits waiting for the calculation, and ultimately nothing reaches the data object.
    How and where should the calculation be done to show the processing time in the report? Could someone please throw some light on this?
    Regards
    Jude.
    Edited by: user600726 on Sep 30, 2008 1:30 AM

    I would suggest modifying your data object so that the data can all be in a single row and use the sensor at the end of the process to upsert (update) the row created by the sensor at the start of the process. The time difference between two fields in the same row is then an easy calculation on a BAM report -- No EL plan should be needed.
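    The suggested upsert pattern can be sketched like this (plain Python with an in-memory dict standing in for the data object; the function names are illustrative, not BAM API calls):

```python
# Sketch of the upsert pattern suggested above: the start sensor creates
# a row, the end sensor updates the same row, and the elapsed time is a
# simple difference of two fields. Names are illustrative.
from datetime import datetime

data_object = {}   # stand-in for the BAM data object, keyed by instance id

def start_sensor(instance_id, order_id, ts):
    data_object[instance_id] = {"order": order_id, "start": ts, "end": None}

def end_sensor(instance_id, ts):
    data_object[instance_id]["end"] = ts   # upsert into the existing row

start_sensor(1360010, "ord004", datetime(2008, 9, 22, 12, 12, 11))
end_sensor(1360010, datetime(2008, 9, 22, 12, 15, 11))

row = data_object[1360010]
elapsed = row["end"] - row["start"]
print(elapsed)   # 0:03:00
```

    Because both timestamps end up in one row, the report-side calculation is a plain column difference, with no plan logic waiting on late sensor data.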

  • Report to display Average time taken for processing payments".

    Hi,
    I have been asked to develop a report for "Report to display Average time taken for processing payments".
    Could anyone guide me, technically, on which tables I need to use to generate the report? This is very urgent. Please provide sample code too.
    Thanks in advance....

    Given below is the set up for credit card payment processing:
    Set Up Credit Control Areas:
    Define Credit Control Area
    Transaction: OB45 
    Tables: T014
    Action: Define a credit control area and its associated currency.  The Update Group should be ‘00012’.  This entry is required so the sales order will calculate the value to authorize
    Assign Company Code to Credit Control Area
    Transaction: OB38
    Tables: T001
    Action: Assign a default credit control area for each company code
    Define Permitted Credit Control Area for a Company
    Code
    Transaction: 
    Tables: T001CM
    Action: For each company code enter every credit control area that can be used
    Identify Credit Price
    Transaction: V/08
    Tables: T683S
    Action: Towards the end of the pricing procedure, after all pricing and tax determination, create a subtotal line to store the value of the price plus any sales tax.  Make the following entries:
    Sub to:  “A”
    Reqt:  “2”
    AltCTy:  “4”
    Automatic Credit Checking
    Transaction: OVA8
    Tables: T691F
    Action: Select each combination of credit control areas, risk categories and document types for which credit checking should be bypassed.  You need to mark the field “no Credit Check” with the valid number for sales documents.
    Set Up Payment Guarantees
    Define Forms of Payment Guarantee
    Transaction: OVFD
    Tables: T691K
    Action: R/3 is delivered with form “02” defined for payment cards.  Other than the descriptor, the only other entry should be “3” in the column labeled “PymtGuaCat”
    Define Payment Guarantee Procedure
    Transaction: 
    Tables: T691M/T691O
    Action: Define a procedure and a description. 
    Forms of Payment Guarantee and make the following entries Sequential Number  “1” 
    Payment Guarantee Form “02”
    Routine Number   “0”    Routine Number can be used to validate payment card presence.
    Define Customer Payment Guarantee Flag
    Transaction: 
    Tables: T691P
    Action: Define a flag to be stored in table. 
    Create Customer Payment Guarantee = “Payment Card Payment Cards (All Customers can use Payment Cards)”.
    Define Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: T691R
    Action: Define the flag that will be associated with sales document types that are relevant for payment cards
    Assign Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: TVAK
    Action: Assign the document flag type the sales documents types that are relevant for payment cards.
    Determine Payment Guarantee Procedure
    Transaction: OVFJ
    Tables: T691U
    Action: Combine the Customer flag and the sales document flag to derive the payment guarantee procedure
    Payment Card Configuration
    Define Card Types
    Transaction: 
    Tables: TVCIN
    Action: Create the different card types plus the routine that validates the card for length and prefix (etc…) 
    Visa , Mastercard, American Express, and Discover 
    Create the following entries for each payment card 
    AMEX  American Express ZCCARD_CHECK_AMEX Month
    DC  Discover Card  ZCCARD_CHECK_DC  Month*****
    MC  Mastercard  ZCCARD_CHECK_MC  Month
    VISA  Visa   ZCCARD_CHECK_VISA  Month
    The Routines can be created based on the original routines delivered by SAP. 
    *****SAP does not deliver a card check for Discover Card. We created our own routine.
    Define Card Categories
    Transaction: 
    Tables: TVCTY
    Action: Define the card category to determine if a
    payment card is a credit card or a procurement card.
    Create the following two entries
    Cat Description  One Card  Additional Data
    CC Credit Cards  No-check  No-check
    PC Procurement Cards No-check  Check
    Determine Card Categories
    Transaction: 
    Tables: TVCTD
    Action: For each card category map the account number range to a card category.  Multiple ranges are possible for each card category or a masking technique can be used.  Get the card number ranges from user community.  Below is just a sample of what I am aware are the different types of cards. 
    Visa Credit  Expires in 7 days. 
        400000   405500
        405505   405549
        405555   415927
        415929   424603
        424606   427532
        427534   428799
        428900   471699
        471700   499999
    Visa Procurement  Expires in 7 days.
        405501   405504
        405550   405554
        415928   415928
        424604   424605
        427533   427533
        428800   428899
    Mastercard Credit Expires in 30 days
        500000   540499
        540600   554999
        557000   599999
    Mastercard Procurement Expires in 30 days
        540500   540599
        555000   556999
    American Express Credit Expires in 30 days
        340000   349999
        370000   379999
    Discover Card Credit Expires in 30 days
        601100   601199
    Set Sales Documents to accept Payment Card Information Transaction: 
    Tables: TVAK
    Action: Review the listing of Sales Document types and enter “03” in the column labeled “PT” for each type which can accept a payment card
    Configuration for Authorization Request
    Maintain Authorization Requirements
    Transaction: OV9A
    Tables: TFRM
    Action: Define and activate the abap requirement that determines when an authorization is sent.  Note that the following tables are available to be used in the abap requirement (VBAK, VBAP, VBKD, VBUK, and VBUP).
    Define Checking Group
    Transaction: 
    Tables: CCPGA
    Action: Define a checking group and enter the
    description.  Then follow the below guidelines for the remaining fields to be filled.
    AuthReq Routine 901 is set here.
    PreAu  If checked R/3 will request an authorization for a .01 and the authorization will be flagged as such. (Insight does not use pre-authorization check).
    A horizon This is the days in the future SAP will use to determine the value to authorize
    (Insight does not use auth horizon period).
    Valid  You will get warning message if the payment card is expiring within 30 days of order entry date. 
    Assign Checking Group to Sales Document
    Transaction: 
    Tables: TVAK
    Action: Assign the checking group to the sales order types relevant for payment cards
    Define Authorization Validity Periods
    Transaction: 
    Tables: TVCIN
    Action: For each card type enter the authorization validity period in days.
    AMEX American Express 30
    DC Discover card  30
    MC Master card  30
    VISA Visa   7
    Configuration for clearing houses
    Create new General Ledger Accounts
    Transaction: FS01
    Tables: 
    Action: Two General Ledger accounts need to be created for each payment card type.  One for A/R reconciliation purposes and one for credit card clearing.
    Maintain Condition Types
    Transaction: OV85
    Tables: T685
    Action: Define a condition type for account determination and assign it to access sequence “A001”
    Define account determination procedure
    Transaction: OV86
    Tables: T683 / T683S
    Action: Define procedure name and select the procedure for control.  Enter the condition type defined in the previous step.
    Assign account determination procedure
    Transaction: 
    Tables:
    Action: Determine which billing type we are using for payment card process.
    Authorization and Settlement Control
    Transaction: 
    Tables: TCCAA
    Action: Define the general ledger accounts for reconciliation and clearing and assign the function modules for authorization and settlement along with the proper RFC destinations for each.
    Enter Merchant ID’s
    Transaction: 
    Tables: TCCM
    Action: Create the merchant id’s that the company uses to process payment cards
    Assign merchant id’s
    Transaction: 
    Tables: TCCAA
    Action: Enter the merchant id’s with each clearinghouse account

  • PO Lead Time cannot capture the time taken for shipping!

    Dear All
    I understand that we have PO lead time = PO Processing Time (Working Day) + Planned Delivery Time (Calendar Day) + GR processing time (working day).
    And this PO lead time will be added on top of my PO Creation Date to defer the actual goods availability date.
    My question:
    1. Planned delivery time is the time taken for the vendor to send the goods to my warehouse. What if it is an overseas purchase, where goods leaving the vendor's port first arrive at my country's customs, take 3 days to clear, and are then delivered by the forwarding agent from customs to my warehouse? In that case, how do I capture the planned delivery time in the SAP system, as it now has 4 periods of time:
    a. Time taken from vendor's port to reach my country's port
    b. Time taken for my country custom to do clearing
    c. Time taken for forwarding agent to fetch goods from custom to my warehouse
    d. Time taken to unpack, take out, count, inspect and put to use (GR processing time)
    Do I need to use user Exit?
    Thanks
    Edited by: Daimos on Apr 27, 2009 6:52 PM

    Dear dogboy.
    I think we must use feature on the PO Confirmation Control (CC) Key at PO Item Level:
    ED - Estimated Time of Departure from Overseas Port.
    EA - Actual Time of Departure from Overseas Port.
    EA - Estimated Time of Arrival
    AA - Actual Time of Arrival
    And the purchaser will maintain the value of each of the CC Key each time they are notified by the vendor.
    And we need to come up with a customised report to capture the CC dates entered, so that finance can prepare funds in advance the moment the EA is maintained, meaning the estimated date of arrival at customs.
    But the problem is that the PO user exit covers only the header of the confirmation control key and does not capture the DATE field entered for each CC.
    That was the problem I last encountered.

  • Does PSE 13 (Mac) support arranging pictures by EXIF Date-Time Taken?

    Does PSE 13 (Mac) support arranging pictures by EXIF Date-Time Taken?
    If not, is this an option under consideration?

    billqaz wrote:
    Does PSE 13 (Mac) support arranging pictures by EXIF Date-Time Taken?
    If not, is this an option under consideration?
    It's the default sort order. You can choose chronological order (ascending or descending), or sort by import batch or by filename. Contrary to older versions, these sort orders are available not only in thumbnail view but also in Folders view and in albums. Within an album you can also sort as you like ('custom order'). There is no difference between Mac and Windows.

  • Sort photos by time taken

    In Photoshop Elements 9, can I view or sort photos by time taken, not just by date taken?

    You can check the Edit >> Preferences >> General dialog.
    Thanks
    Andaleeb

  • Sorting photos by date/time taken, when date/time created is not an option?

    I'm so sorry if this has been answered elsewhere, but I can't find it on Google or in the forums...
    I need to sort wedding photos from multiple photographers by date taken so I can go through the wedding as a full event. Sorting by the date/time the file was created will not suffice, because I had to convert all photos from one of my photographers from NEF to DNG (D610 files are not compatible with CS5), so the 'date created' for those DNG files is days later than for all the other images. Thus, sorting by 'date created' batches all the photos from one photographer together at the end.
    Is there any option in Bridge that will actually read the metadata and sort by the shoot time of the image?
    Thanks for any help!

      There has been a known bug in PSE9 although only affecting raw images and caused by the Camera Raw 6.4 update. The timestamp gets changed by one hour each time an image is modified. I don’t know if your issue is related but the work around is to revert to ACR 6.3
    http://kb2.adobe.com/cps/915/cpsid_91582.html
    The only other work around is to temporarily switch to folder location view, if your images were imported from a single folder. Click on the display button (near top right in Organizer) to change the browser view. If you click on the folder in the left hand pane the files will normally be in sequential file name order which usually mirrors the date/time taken. From the folder location it’s possible to create an instant album, which can then be used as the basis of a slideshow.
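    As a general illustration of sorting by capture time rather than file-creation time (reading real EXIF needs a tool such as exiftool or an imaging library; here the DateTimeOriginal values are stubbed in a dict and all filenames are made up):

```python
# Sketch: interleave shots from multiple cameras by capture time rather
# than by file-creation date. EXIF reading is stubbed as a dict mapping
# filename -> DateTimeOriginal string; filenames are made up.
from datetime import datetime

photos = {
    "d610_0012.dng": "2014:06:07 14:03:21",   # converted NEF -> DNG
    "d700_0451.nef": "2014:06:07 13:58:02",
    "d700_0452.nef": "2014:06:07 14:05:44",
}

def taken_at(name):
    # EXIF DateTimeOriginal uses colons in the date part.
    return datetime.strptime(photos[name], "%Y:%m:%d %H:%M:%S")

for name in sorted(photos, key=taken_at):
    print(name)
# d700_0451.nef, d610_0012.dng, d700_0452.nef
```

    Sorting on the capture timestamp interleaves the converted DNGs correctly, regardless of when the files themselves were created.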

  • In the numbers app, using the "date and time" function, is it possible to remove the time? I need to put together a list of dates, but I don't need or want times.

    In the numbers app, using the "date and time" function, is it possible to remove the time? I need to put together a list of dates, but I don't need or want times.

    When formatting your column to date/time, pick Date & time, and then pick the letter i in the circle to the right. Then scroll down and pick "No time"
    Jason

  • How much time is taken by DCs/GCs to respond to a client

    Hi
    How can I get answers to the questions below, which were asked by clients?
    1. The time taken by AD to respond to an authentication request.
    2. The time taken, after a user submits a request, for the relevant application sub-service to serve it.
    3. How much time a directory client takes to perform forest-wide searches through the global catalog interface by querying a single server.

    > 1. The time taken by AD to respond to an authentication request.
    > 2. The time taken, after a user submits a request, for the relevant
    > application sub-service to serve it.
    > 3. How much time a directory client takes to perform forest-wide searches
    > through the global catalog interface by querying a single server.
    A client-side network capture can answer all of these :)
    Greetings/Grüße,
    Martin

  • Query Execution time - Elapsed time v Actual time taken

    Hi All,
    I have a scenario where I am querying a single table, with the following results. It is a very heavy query, with multiple aggregate functions and multiple UNIONs. Even if the query were written poorly (I doubt it is), why would the actual time taken to execute the query be much more than the statistics provided through the following commands?
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;
    Attached are the stats provided for the relevant query in question.
    Table '123456789_TEMP_DATA'. Scan count 178, logical reads 582048, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    SQL Server Execution Times:
       CPU time = 936 ms,  elapsed time = 967 ms.
    2014-01-06 17:36:41.383
    Now, although the CPU Time/Elapsed time shows that it takes less than a second, it actually takes more than 15 seconds to fetch the results. (This is the actual time that you get on the bottom bar of the Query pane as well.)
    What is the reason? Why is it that there is such a big discrepancy between the numbers? How can I improve this situation?
    Thanks!

    Yes. I am returning a huge number of rows to the client. 
    The query is simply against a single table. 
    SELECT 'First Record', AVG(COLUMN1), STDEV(COLUMN1), COUNT(COLUMN1)
    FROM [TABLE1] WHERE (SOME CONDITION)
    UNION ALL
    SELECT 'Second Record', AVG(COLUMN2), STDEV(COLUMN2), COUNT(COLUMN2)
    FROM [TABLE1] WHERE (SOME OTHER CONDITION)
    Imagine 178 result rows fetched in this manner, with 178 UNIONs; the WHERE clause changes for each SELECT statement.
    Now, the question is not so much about the query itself, but why the actual execution time is 15 seconds while the SQL statistics show 936 ms (under 1 second).
    Thanks!
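    One way to see why the two numbers can diverge: SET STATISTICS TIME reports server-side execution only, while the time on the status bar also includes streaming the result set to the client and rendering it in the grid. A toy arithmetic sketch, with entirely made-up numbers:

```python
# Toy arithmetic only: SET STATISTICS TIME covers server execution,
# while the client-observed time adds result transfer and rendering.
# All figures below are hypothetical, for illustration.
server_cpu_ms = 936                 # what SET STATISTICS TIME reports
rows = 2_000_000                    # hypothetical result-set size
ms_per_1000_rows_transfer = 7       # hypothetical network + render cost

total_ms = server_cpu_ms + rows / 1000 * ms_per_1000_rows_transfer
print(total_ms)   # 14936.0 -> roughly the 15 s observed at the client
```

    So a sub-second execution time and a 15-second wall-clock wait are consistent whenever the result set is large; reducing the rows returned (or testing with results discarded) separates the two costs.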

  • Time-taken in access.log

    hi,
    I would like to know whether the time-taken parameter in the extended-format log represents the total time of the request (including the wait time when all the threads are occupied) or not.
    thanks
    Alain

    "joerg" <[email protected]> wrote:
    > Does the time-taken in access.log include network latency to the web client?
    > For example, given a request for xyz.jsp, what is included in the transaction time?
    > We are using WLS 6.1 SP3.
    I think the access log will only record how long the WebLogic server took to process a request.

  • Compare time taken by sqls

    Hi experts,
    Can anyone refer me to a tool or utility with which I can get the time taken by different SQL statements, without using explain plan?
    Regards,
    SKP

    If you are running it from SQL*Plus, just set timing on
    SQL> set timing on
    SQL> select empno,ename from emp where deptno=10;
    EMPNO ENAME
    7782 CLARK
    7839 KING
    7934 MILLER
    Elapsed: 00:00:01.97
    SQL> select empno,ename from emp where deptno=20;
    EMPNO ENAME
    7369 SMITH
    7566 JONES
    7788 SCOTT
    7876 ADAMS
    7902 FORD
    Elapsed: 00:00:01.91
    SQL>
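    If you are driving the queries from a script rather than SQL*Plus, you can take the same measurement yourself. A minimal sketch using Python's built-in sqlite3 as a stand-in for any database driver (table and data are made up):

```python
# Sketch: timing individual SQL statements from a script, using Python's
# built-in sqlite3 as a stand-in for any database driver.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT, deptno INTEGER)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [(7782, "CLARK", 10), (7839, "KING", 10), (7369, "SMITH", 20)])

def timed(sql, params=()):
    start = time.perf_counter()
    rows = conn.execute(sql, params).fetchall()   # fetch included in timing
    elapsed = time.perf_counter() - start
    return rows, elapsed

rows, elapsed = timed("SELECT empno, ename FROM emp WHERE deptno = ?", (10,))
print(len(rows), elapsed >= 0.0)   # 2 True
```

    Like `set timing on`, this measures elapsed wall-clock time per statement, including the fetch, without touching explain plan.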
