AWR Statistics gathering is taking hours...

Is it normal in 10g to have the following job run for hours?
EXEC SYS.DBMS_SCHEDULER.RUN_JOB('GATHER_STATS_JOB');
It sometimes takes about 4 hours to run - we run it once a day. Thank you!

AWR is the automatic workload repository - which is similar in mechanism to statspack, taking regular snapshots of the dynamic performance views.
The gather_stats_job has nothing to do with the operation of the AWR, beyond the fact that AWR data is stored in tables, so the gather_stats_job may decide to collect stats on those tables from time to time.
The default action for gather_stats_job is to collect stats for all tables with missing or stale statistics. The sample size for each table is chosen automatically (effectively by trial and error, starting with a very small sample). Histograms are also collected automatically, based on a check of which columns have been used historically in "where" clauses, combined with a sample to check whether such columns show skewed data patterns.
If you do a lot of inserts, updates, and deletes on this particular database, table statistics will become stale more frequently, leading to longer lists of tables that need their stats recalculated.
You may find that Oracle is generating too many histograms, and histograms can take a long time to construct. If this is the case, then you could consider changing the default setting for stats collection to skip the automatic histogram generation and add code to build histograms only on the columns that you think need them.
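A sketch of that approach in 10g (the schema, table, and column names here are placeholders, not from your system):
EXEC DBMS_STATS.SET_PARAM('METHOD_OPT', 'FOR ALL COLUMNS SIZE 1');
-- then build histograms only where you know the data is skewed and the column appears in predicates:
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCOTT', tabname => 'EMP', method_opt => 'FOR COLUMNS status SIZE 254');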
[addendum: you say you are running gather_stats_job daily - but it runs automatically every weekday at 10:00 pm and all weekend; did you disable the standard job and schedule your own, or did you mean that you were just letting the standard job run?]
Regards
Jonathan Lewis
http://jonathanlewis.wordpress.com
http://www.jlcomp.demon.co.uk
"The greatest enemy of knowledge is not ignorance,
it is the illusion of knowledge." Stephen Hawking.

Similar Messages

  • Awr Statistics

    Hi all,
    11.2.0.3
    Aix 6.1
I run AWR snapshots every hour, on the hour, in our PROD database, 24 hrs a day. Is it possible to select the window (which hourly time range) in which our database was busiest on a certain day, meaning most resource intensive, just by using a SQL query? Or do I need to run 24 reports, one per time range, and compare the 24 results?
    Thanks a lot,
    zxy

    Hi,
normally a database is busy when the application is busy. An application should have its own performance metrics (KPIs, or key performance indicators) defined. E.g. a database for an online store application is probably busiest when the application is processing the maximum number of orders per unit time.
If you are unfamiliar with the workload periodicity of a certain application, you can approach the problem from a different angle, e.g. look at physical or logical reads, or CPU usage, and pick the hour when this metric has its maximum value. With this approach you should first of all be looking at the resource which is scarce for this database, i.e. if a database is CPU-bound, then you should be looking first of all at periods of maximum CPU usage.
    For every metric it's possible to compose a query which would identify the hour or the snapshot when it reaches a local maximum.
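    For example, a sketch along those lines for CPU (the statistic name is one plausible choice, and this assumes a single instance; cumulative values reset at instance restart, hence the nulls handling):
    select *
      from (select sn.begin_interval_time, s.snap_id,
                   s.value - lag(s.value) over (order by s.snap_id) delta
              from dba_hist_sysstat s, dba_hist_snapshot sn
             where s.stat_name = 'CPU used by this session'
               and sn.snap_id = s.snap_id
               and sn.dbid = s.dbid
               and sn.instance_number = s.instance_number
             order by delta desc nulls last)
     where rownum <= 5;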
    Best regards,
    Nikolay

  • With one condition included my discoverer report is taking hours to run

    Hi,
I am trying to build a Discoverer report based on a view, and I have defined a couple of conditions as per the business requirement. Here is my view, which executes fine from the database and at Discoverer level.
    Now the requirement is to get the details with conditions
    1. Application_date is not equal to apply date
    2. Applied_amount>0
    parameters should be
    Application_date low and high
The report executes successfully when I use condition 1, i.e. application_date <> apply_date, with the parameters mentioned.
But when I define the second condition at report level, applied_amount > 0, it takes hours to run.
Please advise me on how to use condition 2 so that the report executes in normal time.
    Thanks,
    Prathyusha.
    Edited by: 958913 on Oct 10, 2012 1:28 PM

    Hi Srini,hussein,
    Thanks so much.
Yes Srini, that is related to my post. I have tried all the options to make that condition work, but no luck - I have not been able to make it work.
I'm still puzzled: my query executes fine in the database, but when I put any condition on applied amount it takes hours to run from Discoverer Desktop, and I have never seen it complete.
I am not sure about enabling the trace option: I reviewed the documents you suggested, and it seems all of it needs to be done with the help of a DBA, but I don't have that much time. This is an immediate business requirement, and I have never seen this problem with Discoverer Desktop before.
    Any help would be appreciated.
    Thanks,
    Prathyusha.

• HT4623 So my upgrade to iOS 6.1.2 on the iPhone is taking hours by Wifi. I was told I can connect the phone to my computer, then upgrade by way of iTunes - how do I do this? The site seems particularly unhelpful! Thanks to all who respond

So my upgrade to 6.1.2 is taking hours by Wifi. How do I do it through iTunes, connected to my computer? Thanks!

    Once the Device is asking to be Restored with iTunes... it is too late to save anything...
    To minimise loss... Connect to iTunes on the computer you Usually Sync with and Restore from the most recent Backup...
    http://support.apple.com/kb/HT1414
    Restore from Backup
    http://support.apple.com/kb/ht1766

• Why is my materialized view taking hours and yet not created?

    hi,
    i am trying to create the following MV
    create materialized view MV1
    force on demand
    start with date 1 next date2
    AS
select query1 --> complex query that returns approximately 2 million rows
    union all
select query2 --> complex query that returns 300K rows
it has been taking hours, i.e. > 4 hrs, and I am still unable to create the MV
w/o the create materialized view statement I am able to return the results from the union of the 2 queries
what might be the possibility?
    tks & rdgs

The issue could be due to the complexity of your query. I faced the same issue when I worked on materialized views: it was taking too long to create the materialized view, so we split the query and created two materialized views.
Finally we created a view over those MVs. You can try the same.
    Materialized View Log
When you create a materialized view using the FAST option you will need to create a materialized view log on the base tables. A fast refresh uses materialized view logs to update only the rows that have changed since the last refresh.
When DML changes are made to the master table's data, Oracle stores rows describing those changes in the materialized view log and then uses the materialized view log to refresh materialized views based on the base table. This process is called an incremental or fast refresh. Without a materialized view log, Oracle must re-execute the materialized view query to refresh the materialized view. This process is called a complete refresh. Usually, a fast refresh takes less time than a complete refresh.
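    A minimal sketch of that setup, assuming a hypothetical base table T that has a primary key:
    create materialized view log on t;   -- defaults to a primary-key log
    create materialized view mv_t
    refresh fast on demand
    as
    select * from t;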
    You can go through the below link
    http://www.lc.leidenuniv.nl/awcourse/oracle/server.920/a96567/repmview.htm
    Thanks
    Suni

  • My sql query is taking hours to run from toad

    SELECT * from (select hr.NAME org_name, ractla.org_id, acct_site.attribute_category,
              acct_site.attribute10 general_agent, arc.reversal_category,arr.reversal_gl_date,ard.reversed_source_id,
              DECODE (acct_site.attribute_category,
                      'COMM', acct_site.attribute9,
                      'IPLAN', acct_site.attribute11,
                      NULL
                     ) broker_number,                       -- changed by Michelle
              SUBSTR (hca.account_number, 1, 12) customer_number,
              ractla.attribute3 plan_type,
              SUBSTR (hp.party_name, 1, 15) customer_name,
              DECODE (ractla.attribute_category,
                      'Commercial Receivables', ractla.attribute1,
                      'Subscriber Receivables', ractla.attribute1,
                      NULL
                     ) product,                                        --x Product
              rax.trx_number trx_number, rax.trx_date transaction_date,
          DECODE
             (SIGN (INSTR (rax.attribute4, '/')),
              1, TO_DATE
                        (   '01'
                         || TO_CHAR
                                  (DECODE (SIGN (INSTR (rax.attribute4, '/')),
                                           1, TO_DATE
                                                     (SUBSTR (rax.attribute4,
                                                              1,
                                                              10),
                                                      'rrrr/mm/dd'),
                                           NULL),
                                   'MON-RRRR'),
                         'DD-MON-RRRR'),
              NULL
             ) coverage_month,
              DECODE (SIGN (INSTR (rax.attribute4, '/')),
                      1, TO_DATE (SUBSTR (rax.attribute4, 1, 10), 'rrrr/mm/dd'),
                      NULL
                     ) due_date,
              DECODE (acct_site.attribute_category,
                      'COMM', SUBSTR (hca.account_number, 1, 6),
                      acct_site.attribute10
                     ) employer_group,
          DECODE (arc.reversal_category,
                  'NSF', TRUNC (arr.creation_date),
                  'REV', TRUNC (arr.creation_date),
                  DECODE (arr.reversal_gl_date,
                          NULL, TRUNC (arr.apply_date),
                          DECODE (ard.reversed_source_id,
                                  NULL, TRUNC (arr.apply_date),
                                  TRUNC (arr.creation_date)
                                 )
                         )
                 ) as application_date,
                     arr.apply_date,   
                     arr.creation_date,                       --9/8/03
              arc.receipt_number receipt_number, arc.receipt_date receipt_date,
              arr.amount_applied applied_amount, rax.attribute1 company,
              rax.attribute2 division, ractla.attribute2 coverage_plan,                                                              -- (Plan Code)
              DECODE (acct_site.attribute_category,
                      'IPLAN', acct_site.attribute13,
                      'SH', acct_site.attribute8,
                      NULL
                     ) coverage_type,
              ps.amount_due_original trans_amount,
              NVL (ps.amount_due_remaining, 0) trans_amount_remaining,
              SUBSTR (hre.first_name || ' ' || hre.middle_name || ' '
                      || hre.last_name,
                      1,
                      25
                     ) salesrep_name,
              SUBSTR (acct_site.attribute5, 1, 5) salesrep_extension,
              hca.attribute4 SOURCE 
         FROM apps.ra_customer_trx_all rax,                        -- invoice info
              apps.ar_payment_schedules_all ps,
              apps.ra_cust_trx_types_all rat,
              apps.hz_cust_accounts_all hca,
              apps.hz_parties hp,
              apps.ar_receivable_applications_all arr,
              apps.ar_cash_receipts_all arc,                      --- receipt info
              apps.hr_operating_units hr,
              apps.hr_employees hre,
              apps.hz_cust_acct_sites_all acct_site,
              apps.hz_cust_site_uses_all acct_site_use,   --added by tapas on 7/16
              apps.ra_customer_trx_lines_all ractla,
              apps.ar_distributions_all ard
        WHERE
        hca.cust_account_id = rax.bill_to_customer_id 
          AND ps.customer_trx_id = rax.customer_trx_id
          AND rat.cust_trx_type_id = rax.cust_trx_type_id
          AND rat.org_id = rax.org_id
          AND arr.applied_customer_trx_id = rax.customer_trx_id
          AND arr.status = 'APP'
          AND arr.cash_receipt_id = arc.cash_receipt_id
          AND hr.organization_id = rax.org_id
          AND rax.customer_trx_id = ractla.customer_trx_id
          AND ractla.line_number = 1
          AND rax.bill_to_site_use_id = acct_site_use.site_use_id
          AND hca.cust_account_id = acct_site.cust_account_id
          AND acct_site.cust_acct_site_id = acct_site_use.cust_acct_site_id
          AND hca.party_id = hp.party_id 
          AND acct_site.attribute4 = hre.employee_id(+)
          AND hr.NAME = 'Commercial Group'
          AND ard.source_table = 'RA'
          AND ard.source_id = arr.receivable_application_id) app
          where app.application_date between :start_date and :end_Date
         and app.application_date <> app.apply_date
         and app.applied_amount>0;
This is my query. Due to a requirement I have written an inline view. The query executes without the inline view, and even after including the inline view, but when I include the condition "app.applied_amount>0" it takes hours and I never see it complete. Please advise. This is needed urgently, as a Discoverer report has to be built based on it.
    Thanks.
    Edited by: 958913 on Oct 8, 2012 3:52 PM
    Edited by: 958913 on Oct 8, 2012 4:19 PM
    Edited by: 958913 on Oct 8, 2012 4:33 PM

    Hi,
the next action I have taken is this: as I need a condition based on application date (which is not a database column, but is derived by a DECODE statement in the select), I removed that DECODE statement from the select, wrote it in an inline view, and thereby added it to the where condition. Here is the query, and it is still taking a long time to run.
    /* Formatted on 2012/10/09 13:51 (Formatter Plus v4.8.8) */
    SELECT hr.NAME org_name, ractla.org_id, acct_site.attribute_category,
           acct_site.attribute10 general_agent, arc.reversal_category,
           arr.reversal_gl_date, ard.reversed_source_id,
           DECODE (acct_site.attribute_category,
                   'COMM', acct_site.attribute9,
                   'IPLAN', acct_site.attribute11,
                   NULL
                  ) broker_number,                          -- changed by Michelle
           SUBSTR (hca.account_number, 1, 12) customer_number,
           ractla.attribute3 plan_type,
           SUBSTR (hp.party_name, 1, 15) customer_name,
           DECODE (ractla.attribute_category,
                   'Commercial Receivables', ractla.attribute1,
                   'Subscriber Receivables', ractla.attribute1,
                   NULL
                  ) product,                                           --x Product
           rax.trx_number trx_number, rax.trx_date transaction_date,
       DECODE
          (SIGN (INSTR (rax.attribute4, '/')),
           1, TO_DATE (   '01'
                       || TO_CHAR (DECODE (SIGN (INSTR (rax.attribute4, '/')),
                                           1, TO_DATE
                                                     (SUBSTR (rax.attribute4,
                                                              1,
                                                              10),
                                                      'rrrr/mm/dd'),
                                           NULL),
                                   'MON-RRRR'),
                       'DD-MON-RRRR'),
           NULL
          ) coverage_month,
           DECODE (SIGN (INSTR (rax.attribute4, '/')),
                   1, TO_DATE (SUBSTR (rax.attribute4, 1, 10), 'rrrr/mm/dd'),
                   NULL
                  ) due_date,
           DECODE (acct_site.attribute_category,
                   'COMM', SUBSTR (hca.account_number, 1, 6),
                   acct_site.attribute10
                  ) employer_group,
           app.application_date,
           arr.apply_date, arr.creation_date,                             --9/8/03
                                             arc.receipt_number receipt_number,
           arc.receipt_date receipt_date, arr.amount_applied applied_amount,
           rax.attribute1 company, rax.attribute2 division,
           ractla.attribute2 coverage_plan,                         -- (Plan Code)
           DECODE (acct_site.attribute_category,
                   'IPLAN', acct_site.attribute13,
                   'SH', acct_site.attribute8,
                   NULL
                  ) coverage_type,
           ps.amount_due_original trans_amount,
           NVL (ps.amount_due_remaining, 0) trans_amount_remaining,
           SUBSTR (hre.first_name || ' ' || hre.middle_name || ' '
                   || hre.last_name,
                   1,
                   25
                  ) salesrep_name,
           SUBSTR (acct_site.attribute5, 1, 5) salesrep_extension,
           hca.attribute4 SOURCE
      FROM apps.ra_customer_trx_all rax,                           -- invoice info
           apps.ar_payment_schedules_all ps,
           apps.ra_cust_trx_types_all rat,
           apps.hz_cust_accounts_all hca,
           apps.hz_parties hp,
           apps.ar_receivable_applications_all arr,
           apps.ar_cash_receipts_all arc,                         --- receipt info
           apps.hr_operating_units hr,
           apps.hr_employees hre,
           apps.hz_cust_acct_sites_all acct_site,
           apps.hz_cust_site_uses_all acct_site_use,      --added by tapas on 7/16
           apps.ra_customer_trx_lines_all ractla,
           apps.ar_distributions_all ard,
     (select rax1.customer_trx_id,
             DECODE (arc1.reversal_category,
                     'NSF', TRUNC (arr1.creation_date),
                     'REV', TRUNC (arr1.creation_date),
                     DECODE (arr1.reversal_gl_date,
                             NULL, TRUNC (arr1.apply_date),
                             DECODE (ard1.reversed_source_id,
                                     NULL, TRUNC (arr1.apply_date),
                                     TRUNC (arr1.creation_date)
                                    )
                            )
                    ) as application_date
    from  apps.ar_receivable_applications_all arr1,
              apps.ar_cash_receipts_all arc1, 
              apps.ra_customer_trx_all rax1,
              apps.ar_distributions_all ard1
    where  arr1.applied_customer_trx_id = rax1.customer_trx_id
          AND arr1.status = 'APP'
          AND arr1.cash_receipt_id = arc1.cash_receipt_id
           AND ard1.source_table = 'RA'
          AND ard1.source_id = arr1.receivable_application_id
          ) app
    WHERE hca.cust_account_id = rax.bill_to_customer_id
       AND app.customer_trx_id = rax.customer_trx_id
       AND arr.apply_date <> app.application_date
       and app.application_date between :start_date and :end_date
       --AND rax.trx_number IN ('52715888', '52689753')
       AND ps.customer_trx_id = rax.customer_trx_id
       AND rat.cust_trx_type_id = rax.cust_trx_type_id
       AND rat.org_id = rax.org_id
       AND arr.applied_customer_trx_id = rax.customer_trx_id
       AND arr.status = 'APP'
       AND arr.cash_receipt_id = arc.cash_receipt_id
       AND hr.organization_id = rax.org_id
       AND rax.customer_trx_id = ractla.customer_trx_id
       AND ractla.line_number = 1
       AND rax.bill_to_site_use_id = acct_site_use.site_use_id
       AND hca.cust_account_id = acct_site.cust_account_id
       AND acct_site.cust_acct_site_id = acct_site_use.cust_acct_site_id
       AND hca.party_id = hp.party_id
       AND acct_site.attribute4 = hre.employee_id(+)
       AND hr.NAME = 'Commercial Group'
       AND ard.source_table = 'RA'
       AND ard.source_id = arr.receivable_application_id
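    One more thing may be worth trying (a sketch, on the assumption that applied_amount is still a plain alias for arr.amount_applied with no aggregation involved): push the amount filter into the base WHERE clause itself, rather than filtering the inline view's output, so the optimizer can apply the predicate while joining, i.e. add
        AND arr.amount_applied > 0   -- pushed-down equivalent of applied_amount > 0
    directly to the main query's predicate list.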

  • Cgi statistics gathering under 6.1 and Solaris 9

    Hello all,
is it possible to log, for CGI requests, the time spent handling each request?
    I see a lot of editable parameters in the 'Performance, Tuning and Scaling Guide' but can't figure out how to do that.
    Once in a thread I read "...enable statistics gathering, then add %duration% to your access log format line".
I can't find the term %duration% in the guide; which parameter is used?
    Regards Nick

    Hello elvin,
thanks for your reply. Now I think I managed to get the webserver to log the duration of a CGI request, but I'm unsure how to interpret the value, e.g. in the access log I get
    ..."GET /cgi/beenden.cgi ... Gecko/20040113 MultiZilla/1.6.3.1d" 431710"
    ..."GET /pic.gif ... Gecko/20040113 MultiZilla/1.6.3.1d" 670"
    so the last value corresponds to my %duration% in the magnus.conf.
    431710 ... in msec? - makes no sense
    670 ... in msec?
    The complete string in magnus.conf reads as follows:
Init fn="flex-init" access="$accesslog" format.access="%Ses->client.ip% - %Req->vars.auth-user% [%SYSDATE%] \"%Req->reqpb.clf-request%\" %Req->srvhdrs.clf-status% %Req->srvhdrs.content-length% \"%Req->headers.user-agent%\" %duration%"
Regards Nick

• HT4946 I am trying to update my iPad 1 but it is taking hours on the backup - why would this be, and how can I make it faster so I can get the latest iOS update?

I am trying to update my iPad 1 but it is taking hours on the backup. What can I do to make this faster?

    I'm thinking that you have a corrupt backup. You can try this if you want to give it a shot.
    Without connecting the iPad to the computer, launch iTunes. Go to Edit>Preferences>Devices. See how many backups you have for the iPad. If you have multiple backups, delete all but one. Click on the backup in the window and then click on Delete.
    Quit iTunes, connect the iPad and then try again.

  • Statistics gathering

Hello,
Everyone, I'm a little confused about "statistics gathering" in EBS, so I have some questions in mind, which are the following.
Kindly, can anyone clear up my understanding of it? I would really appreciate it.
1. What is statistics gathering?
2. What is the benefit of it?
3. Can our ERP performance be better after this?
One question outside this subject: if anyone wants to be an APPS DBA, must they first be a full DBA (Oracle 10g, 9i etc.), or is it enough to have a grasp of Oracle DBA concepts like backup, recovery, and cloning?
    Regards,
    Shahbaz khan

1. What is statistics gathering?
Statistics gathering is a process by which Oracle scans some or all of your database objects (such as tables, indexes etc.) and stores the information in dictionary views such as dba_tables and dba_histograms. Oracle uses this information to determine the best execution plan for statements it has to execute (such as select, update etc.)
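    As a sketch of what a manual gather looks like (the schema and table here are placeholders; note that in EBS the supported route is the Gather Schema Statistics concurrent program rather than calling DBMS_STATS directly):
    EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCOTT', tabname => 'EMP', cascade => TRUE);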
    2. What is the benefit of it?
    It does help the queries become more efficient.
3. Can our ERP performance be better after this?
    Typically, if you are experiencing performance issues, this is one of the first remedies.
One question outside this subject: if anyone wants to be an APPS DBA, must they first be a full DBA (Oracle 10g, 9i etc.), or is it enough to have a grasp of Oracle DBA concepts like backup, recovery, and cloning?
I will let Hussein or Helios answer that question. They can offer a lot of helpful advice. You can also refer to Hussein's recent thread on a similar topic.
    See Re: Time Management and planned prep
    Hope this helps,
    Sandeep Gandhi

• Downloaded games from suppliers to iPad - these are now downloading to iTunes on my M/S computer and taking hours and hours to do so. If I delete them from iTunes will this affect the games now on my iPad? Any advice would be appreciated.

Games downloaded directly to iPad. They are now auto-downloading to iTunes on my M/S PC and taking hours to do so. If I delete those partially downloaded to the PC, will it affect those on the iPad when I next sync the iPad with the PC?
    Any advice appreciated - thanks.
    Reflo

To start things off, I like to manually manage my media and will instruct you to do the same. To enable this, you check off the box for it on your phone's summary page in iTunes. Now transferring media is as easy as dragging and dropping.
(1) To get songs from your phone onto your iTunes, you open up your phone's dropdown menu on the iTunes sidebar and click Music. You will be able to drag music files into your Music folder in the Library section of iTunes. You can also just right click your phone's name and click "Transfer Purchases" if you want to move everything new on your iPhone onto your computer's library.
(2) You open your Music library on your computer and simply drag highlighted songs onto your phone's name on the sidebar. When in album grid view or cover flow, you can click and drag a whole album right on. The downside of manual management is that you can easily accidentally add doubles of songs onto your phone. Of course, if you notice this on your phone, you can just swipe to the right across the song's name and you should have a nice little delete button so you can remove extras anywhere.
(3) I heard about this online and it actually is buried in the Terms and Conditions somewhere. I don't think this rule will apply if you are exchanging the music over your iTunes account devices. I think this applies to burning the iTunes material to CDs. I think these special media files can somehow detect what they are moved or copied onto and keep a statistic buried in their coding. This is more of a logical assumption than a fact, but what else could these terms possibly mean? If you want to know more about this, Bruce Willis was having a big argument about this online because he wanted to give his media to his children. You can probably find an article about this using those search terms.
I worked hard on this response and would appreciate it if you marked it correct if it served as any help to you. Thank you in advance.

  • Setting of Optimizer Statistics Gathering

I'm checking my db settings and the database is analyzed each day. But I notice there are a lot of tables whose information shows the last analysis was about a month ago... Do I have to change some parameters?

lesak wrote:
I don't have any data that shows my idea is good. I'd like to confirm on this forum whether my idea is good or not. I've planned to make some changes to get better performance for queries that read from the top-used tables. If this is a bad solution, that's also important information for me.
One point of view is that your idea is bad. That point of view would be to figure out what the best access path for your query is and set that as a baseline, or figure out what statistics get you the correct plans on a single query that has multiple plans that are best with different values sent in through bind variables, and lock the statistics.
    Another point of view would be to gather current plans for currently used queries, then do nothing at all unless the optimizer suddenly decides to switch away from one, then figure out why.
Also note the default statistics gathering is done in a window; if you have a lot of tables changing, it could happen that you can't gather stats in a timely fashion within the window.
Whether the statistics gathering is appropriate may depend on how far off the histograms are from describing the actual data distribution you see. What may be an appropriate worry for one app may be obsessive tuning disorder for another. 200K rows out of millions may make no difference at all, or may make a huge difference if the newly added data is way off from what the statistics make the optimizer think it is.
    One thing you are probably doing right is to recognize that tuning particular queries may be much more useful than obsessing over statistics.
    Note how much I've used the word "may" here.
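    As a sketch of the "lock the statistics" option mentioned above (the schema and table names are placeholders):
    EXEC DBMS_STATS.LOCK_TABLE_STATS('SCOTT', 'EMP');     -- freeze a known-good set of stats
    EXEC DBMS_STATS.UNLOCK_TABLE_STATS('SCOTT', 'EMP');   -- unlock only when you decide to re-gather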

• Performance impact after changing the AWR snapshot timing from 1 hour to 15 minutes

I want to know the performance impact of changing the AWR snapshot timing from 1 hour to 15 minutes.

    Hi,
    1) typically performance impact is negligible
2) we have no way of knowing whether our system fits into the definition of "typical"
    3) the best way would be to do that on a test system and measure the impact
4) I would be concerned more about SYSAUX growth than performance impact -- you need to make sure that you won't run out of space because of 4x more frequent snapshots (a sketch of the call itself is below)
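    A minimal sketch of the change itself (the interval argument is in minutes):
    EXEC DBMS_WORKLOAD_REPOSITORY.MODIFY_SNAPSHOT_SETTINGS(interval => 15);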
    Best regards,
      Nikolay

• Trying to download updates to CoPilot Live and CoPilot GPS with maps. File sizes are large and taking hours to download on a wireless connection. How can I download app updates and new maps while connected to PC and iTunes through a hard-wired internet link?

Trying to download updates to CoPilot Live and CoPilot GPS with maps. File sizes are large and taking hours to download on a wireless connection. How can I download updates and new maps while connected to PC and iTunes through a hard-wired internet link?

I'm on my iPad, so I don't know if this is the page with an actual download. I don't see a button, but assume that is because I am on an iPad. It is in the DL section of Apple downloads.
    http://support.apple.com/kb/DL1708

• Excel ADODB SQL query execution taking hours when manipulating excel tables

    Hello All 
I have 28000 records with 8 columns in a sheet. When I convert the sheet into an ADODB database and copy it to a new excel file using the code below, it executes in less than a min:
    Set Tables_conn_obj = New ADODB.Connection
    Tables_conn_str = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & Table_Filename & ";Extended Properties=""Excel 12.0;ReadOnly=False;HDR = Yes;IMEX=1"""
    Tables_conn_obj.Open Tables_conn_str
First_Temp_sqlqry = "Select * INTO [Excel 12.0;DATABASE=C:\Prod Validation\Database\Second Acat Table.xlsb].[Sheet1] from [first - Table$];"
Tables_conn_obj.Execute First_Temp_sqlqry
But when I change the query to manipulate one column in the current table based on another table in the same excel file, and try to copy the results into another excel file, it takes more than one hour. Why does it take this much time when both queries return the same number of rows and columns? I have spent almost a week and still have not been able to resolve this issue.
I even tried CopyFromRecordset, GetRows(), GetString(), and looping over each recordset's fields; all of them take the same amount of time. Why is there such a huge difference in execution time?
Important note: without the INTO statement, even the query below executes in a few seconds.
    select ( ''''''manipulating first column based on other table data''''''''''''''
    iif(
[Second - Table$].[Policy Agent] = (select max([ACAT$].[new_Agent_number]) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] > '2014-10-01') ) , (select max([ACAT$].[Old_Agent_number]) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] > '2014-10-01')) ,
iif( [Second - Table$].[Policy Agent] = (select max([ACAT$].[Old_Agent_number]) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] <= '2014-10-01') ), (select max([ACAT$].[new_Agent_number]) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] <= '2014-10-01')) ,
    [Second - Table$].[Policy Agent] ))) as [Policy Agent],
    ''''''summing up all other columns''''''''''''''
    (iif(isnull(sum([Second - Table$].[Auto BW-Line Of Business Detail])),0,sum([Second - Table$].[Auto BW-Line Of Business Detail]))) as [Auto BW-Line Of Business Detail],(iif(isnull(sum([Second - Table$].[Auto Farmers])),0,sum([Second - Table$].[Auto Farmers]))) as [Auto Farmers],(iif(isnull(sum([Second - Table$].[MCA])),0,sum([Second - Table$].[MCA]))) as [MCA],(iif(isnull(sum([Second - Table$].[CEA])),0,sum([Second - Table$].[CEA]))) as [CEA],(iif(isnull(sum([Second - Table$].[Commercial P&C])),0,sum([Second - Table$].[Commercial P&C]))) as [Commercial P&C],(iif(isnull(sum([Second - Table$].[Comm WC])),0,sum([Second - Table$].[Comm WC]))) as [Comm WC],(iif(isnull(sum([Second - Table$].[Fire Farmers])),0,sum([Second - Table$].[Fire Farmers]))) as [Fire Farmers],(iif(isnull(sum([Second - Table$].[Flood])),0,sum([Second - Table$].[Flood]))) as [Flood],(iif(isnull(sum([Second - Table$].[Kraft Lake])),0,sum([Second - Table$].[Kraft Lake]))) as [Kraft Lake],(iif(isnull(sum([Second - Table$].[Life])),0,sum([Second - Table$].[Life]))) as [Life],(iif(isnull(sum([Second - Table$].[Foremost])),0,sum([Second - Table$].[Foremost]))) as [Foremost],(iif(isnull(sum([Second - Table$].[Umbrella])),0,sum([Second - Table$].[Umbrella]))) as [Umbrella],(iif(isnull(sum([Second - Table$].[MCNA])),0,sum([Second - Table$].[MCNA]))) as [MCNA]
    INTO [Excel 12.0;DATABASE=C:\Prod Validation\Database\Second Acat Table.xlsb].[Sheet1]
    from [Second - Table$] group by [Second - Table$].[Policy Agent] ;

Hi Fei,
Thank you so much for the reply. I just executed the same SQL above without the INTO statement and assigned the SQL result to an ADODB recordset as below. If the time difference were due to the SQL query, then the statement below should also execute for hours, right? But it executes in seconds. Copying the recordset to excel, however, again takes hours. I tried CopyFromRecordset, GetRows(), GetString(), and looping over each recordset's fields, and all of them take the same amount of time. Please let me know why there is such a delay for this small amount of data.
I even tried to typecast all columns to double or string in SQL, and the execution time is still not reduced.
First_Temp_Recordset.Open sql_qry, Tables_conn_obj, adOpenStatic, adLockOptimistic
' or: Set First_Temp_Recordset = Tables_conn_obj.Execute(sql_qry)
