Question on dates

Hi all,
Please clarify my doubt: how do I display an employee's experience,
e.g. 3 years 4 months 19 days?
Thanks,
Pavani

Try
select trunc(months_between(sysdate, to_date('01-apr-2005','dd-mon-yyyy')) / 12) || ' Years ' ||
       trunc(mod(months_between(sysdate, to_date('01-apr-2005','dd-mon-yyyy')), 12)) || ' Months ' ||
       trunc(sysdate - add_months(to_date('01-apr-2005','dd-mon-yyyy'),
             trunc(months_between(sysdate, to_date('01-apr-2005','dd-mon-yyyy'))))) || ' Days'
from dual;
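
If the hire date comes from a table rather than a literal, the same expression can be applied per row. A minimal sketch, assuming a hypothetical employees table with ename and hire_date columns:

select ename,
       trunc(months_between(sysdate, hire_date) / 12)     || ' Years '  ||
       trunc(mod(months_between(sysdate, hire_date), 12)) || ' Months ' ||
       trunc(sysdate - add_months(hire_date,
             trunc(months_between(sysdate, hire_date))))  || ' Days' as experience
from employees;  -- employees, ename and hire_date are assumed names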

Similar Messages

  • Basic questions on data modeling

    Hi experts,
    I have some basic questions regarding data modeling within MDM. I understand the available table types and the concept of lookup fields. I know that the MDM data modeling concept is different from the relational concept, but having a strong database background my first step was to design a relational data model which I would like to transfer to an MDM repository. Unfortunately I didn't find good material on this, so here are some questions; maybe you can help me:
    1) Is it the right approach to model n:m relationships with multivalued lookup fields? E.g. main table Users with lookup field from subtable SapAccounts (a user can have accounts in different SAP systems, that means more than one account).
    2) Does a record always have to be unique in MDM repositories (e.g. should we use Auto IDs in every table, or do we have to mark a combination of fields as unique)? Is a composite key of 2 or more fields represented by marking these fields as unique?
    3) Is the concept of relationships in MDM based only on relationships between single records (not valid for all records in a table)? Is it necessary to define all relationships, similar to the relational data model, in MDM? Is there something similar to referential integrity in MDM?
    4) Is it possible to change the main table to a sub table later on if we realize that it has also to be used as a lookup table for another table (when extending the data model) or do we have to create a new repository from scratch?
    Thank you for your answers.
    Regards, bd

    Yes, you are correct. It is quite difficult to map a relational database to an MDM one. But then MDM is not 'just' a database; it holds much more 'master' information than any relational DB.
    1) Is it the right approach to model n:m relationships with multivalued lookup fields? E.g. main table Users with lookup field from subtable SapAccounts (a user can have accounts in different SAP systems, that means more than one account).
    Yes. Here you need to use MV lookup tables, or you can also try qualified tables if it gets more complex.
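    For comparison, this is roughly what the same n:m relationship looks like in the relational model the poster is starting from; a minimal sketch with hypothetical table and column names (in MDM the junction table disappears and becomes a multivalued lookup field on the main table):

    -- hypothetical relational equivalent of the Users / SapAccounts n:m relationship
    create table users (
        user_id   number primary key,
        user_name varchar2(100)
    );

    create table sap_accounts (
        account_id   number primary key,
        sap_system   varchar2(30),
        account_name varchar2(100)
    );

    -- junction table carrying the n:m relationship (replaced by an MV lookup field in MDM)
    create table user_sap_accounts (
        user_id    number references users (user_id),
        account_id number references sap_accounts (account_id),
        primary key (user_id, account_id)
    );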
    2) Does a record always have to be unique in MDM repositories (e.g. should we use Auto IDs in every table, or do we have to mark a combination of fields as unique)? Is a composite key of 2 or more fields represented by marking these fields as unique?
    The concept of uniqueness differs here in that you also have something called Display Fields (DF). A combination of DFs can also be treated as a unique key. For instance, while importing records, if you select these DFs as a combination you will eliminate any possible duplicates based on this combination. Auto ID is one way to have a unique id once a record is within MDM, while you use UF or DF to eliminate any possible duplicates at import level.
    3) Is the concept of relationships in MDM based only on relationships between single records (not valid for all records in a table)? Is it necessary to define all relationships, similar to the relational data model, in MDM? Is there something similar to referential integrity in MDM?
    Hmm... good one. Referential integrity. What I assume you are talking about is that if you have relationships between tables, then removing a record will not be possible because it is a foreign key for some other record. MDM does not allow that, as relationships within MDM are physical and not conceptual. For instance, a material can have components. Now if the material does not exist, then any relationship to its components is not worthwhile to maintain, hence the relationship is eliminated. In the relational model, relationships are more conceptual. So with MDM's usage of lookups and the main table, you do not need to maintain these kinds of relationships on your own.
    4) Is it possible to change the main table to a sub table later on if we realize that it has also to be used as a lookup table for another table (when extending the data model) or do we have to create a new repository from scratch?
    No, it is not possible to convert the main table. There is only one main table and it cannot be changed.
    I went for the same option but it did not work. What I suggest is to look at your legacy systems one by one and see which fields in general can be classified as Master, Reference, or Transactional - you will start getting answers immediately.

  • Hello, I want to change my secret question and date of birth

    Hello,
    I want to change my secret question and date of birth.

    The Three Best Alternatives for Security Questions and Rescue Mail
        1. Use Apple's Express Lane.
              Go to https://expresslane.apple.com ; click 'See all products and services' at the
              bottom of the page. In the next page click 'More Products and Services', then
              'Apple ID'. In the next page select 'Other Apple ID Topics' then 'Forgotten Apple
              ID security questions' and click 'Continue'. Please be patient waiting for the return
              phone call. It will come in time depending on how heavily the servers are being hit.
         2.  Call Apple Support in your country: Customer Service: Contact Apple support.
         3.  Rescue email address and how to reset Apple ID security questions.
    A substitute for using the security questions is to use 2-step verification:
    Two-step verification FAQ Get answers to frequently asked questions about two-step verification for Apple ID.

  • A very basic question regarding data block

    Hi All,
    I've a very basic question concerning data blocks in oracle forms 10g.
    I want to make a view only screen (only query allowed, no update, insert or delete).
    I'll have 6-7 fields on the screen but all the fields are not from a single table.
    For example, let's say the field names to display on the screen are f1, f2, f3, f4.
    Of these, f1 and f2 will come from table A and f3, f4 will come from table B.
    Now, my question: is it possible to create a data block using the Data Block Wizard for such a situation if we select the 'create data block from table' option?
    If not, can you please tell me an approach to do this?
    Regards,
    Navnit

    Hello,
    First write your query, then select the data block and change the properties below:
    Query Data Source Type = From Clause Query
    Query Data Source Name = (paste the query here)
    Now add the block items, give them names matching the query's column names, show the columns on the canvas and run.
    Best regards,
    skyniazi
    Edited by: SKYNIAZI on Mar 29, 2009 1:32 PM
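    A rough sketch of the kind of join that might be pasted into Query Data Source Name; the table and column names (table_a, table_b, f1..f4, common_key) are just the placeholders from the question, not real objects, and the join condition is an assumption:

    -- placeholder query for the block's From Clause Query data source
    select a.f1, a.f2, b.f3, b.f4
      from table_a a,
           table_b b
     where a.common_key = b.common_key

    The block items F1..F4 would then map one-to-one onto the columns this query returns.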

  • Hello, I got a new computer with the old hard disk. Can't play my iTunes music because it's not authorised. My Apple ID is a very old e-mail address that doesn't exist anymore. When I chose the secret question, the birth date seems wrong. Can't reach anyone!!!

    Hello, I got a new computer with the old hard disk. I can't play my iTunes music because it's not authorised. My Apple ID is a very old e-mail address that doesn't exist anymore and I forgot the password (5 years old??). When I chose the secret question, the birth date seems wrong. I've tried everything. I need to make a new Apple ID but cannot connect the music that I bought under my old name to my new name. I can't reach anyone!!! The automatic FAQ system is of no help. What should I do?

    That doesn't help me. For iTunes it brings me to the Express Lane, which doesn't help because my case is not in it. It's all standard procedure things. I understand those, but my situation is different: the combination of a forgotten password and a non-existing e-mail address (and birth dates that are not correct or not accepted). What I need is my password emailed to another e-mail address than the original one, because that one no longer exists.

  • I have a question about Data Rates.

    Hello All.
    This is a bit of a noob question I'm sure. I don't think I really understand Data Rates and how it applies to Motion... therefore I'm not even sure what kind of questions to ask. I've been reading up online and thought I would ask some questions here. Thanks to all in advance.
    I've never really worried about Data Rates until now. I am creating an Apple Motion piece with about 15 different video clips in it. And 1/2 of them have alpha channels.
    What exactly is Data Rate? Is it the rate at which video clip data is read (in bits/second) from the disc and placed onto my screen? In Motion, is the Data Rate for video only? What if the clip has audio? If a HDD is simply a plastic disc with a dye read by "1" laser... how come my computer can pull "2" files off the disc at the same time? Is that what data transfer is all about? Is that where RAM comes into play?
    I have crunched my clips as much as I can. They are short clips (10-15seconds each). I've compressed them with the Animation codec to preserve the Alpha channel and sized them proportionally smaller (320x240). This dropped their data rate significantly. I've also taken out any audio that was associated with them.
    Is data rate what is slowing my system down?
    The data rates are all under 2 MB/s. Some are as low as 230 KB/s. They were MUCH higher. However, my animation still plays VERY slowly.
    I'm running a 3 GB RAM PowerBook Pro, 2.33 GHz.
    I store all my media on a 1TB GRaid Firewire 800 drive. However for portability I'm using a USB 2 smartdisk external drive. I think the speed is 5200rpm.
    I'm guessing this all plays into the speed at which motion can function.
    If I total my data rate transfer I get somewhere in the vicinity of 11 MB/s. Is that what Motion needs for it to play smoothly, an 11 MB/s data connection? USB 2.0 is, what, 480 Mbit/s? So there is no way it's going to play quickly. What if I played it from my hard drive? What is the data rate of my internal HDD?
    I guess my overall questions are:
    #1. Is my thinking correct on all of these topics? Do my bits, bytes and megs make sense? Is my thought process correct?
    #2. Barring getting a new machine or buying new hardware, what can I do to speed up this workflow? Working with 15 different video clips is bogging Motion down and becoming frustrating to work with. Even if only 3-4 of the clips are up at a time it bogs things down, especially if I throw on a glow effect or something.
    Any help is greatly appreciated.
    -Fraky

    Data rate DOES make a difference, but I'd say your real problem has more to do with the fact that you're working on a PowerBook. Motion's real-time capabilities derive from the capability of the video card, not the processor. Some cards do better than others, but laptops are not even recommended for running Motion.
    Your options for improving the workflow on a laptop are limited, but there are a few things that you can try.
    Make sure that thumbnails and previews are turned off.
    Make sure that you are operating in Draft Mode.
    Lower the display resolution to half, or quarter.
    Don't expect to be getting real time playback. Treat it more like After Effects.
    Compressing your clips into smaller Animations does help because it lowers the data rate, but you're still dealing with the animation codec which is a high data rate codec. Unfortunately, it sounds necessary in your case because you're dealing with alpha channels.
    The data rate comes into play with your setup trying to play through your USB drive. USB drives are never recommended for editing or Motion work; their throughput is not consistent enough for video work. A small FW drive would be better, though your real problem, as I said, is the PowerBook.
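    As a rough unit check on the figures in the question (taking the poster's numbers at face value): 480 Mbit/s divided by 8 bits per byte is about 60 MB/s nominal for USB 2.0, so an aggregate clip data rate of roughly 11 MB/s fits within the nominal bandwidth on paper. The practical problem is that sustained real-world USB 2.0 throughput is far lower and much less consistent than the nominal figure, which is why the drive still struggles.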
    If you must work on the powerbook, then don't expect real-time playback. Instead, build your animation, step through it, and do RAM previews to view sections in real time.
    I hope this helps.
    Andy

  • End of contract questions (Unlimited data+4G)

    I currently have 2 lines on my account. The main line is mine and the contract is up on 11/07, while the secondary line is up on 12/04. I have been waiting for the new gen 4G and Galaxy Nexus to be announced before making a decision on what to upgrade my phone to.
    Here are the questions I have:
    I have the unlimited data plan now. If I extend my contract with a 3G or a 4G phone that is currently on sale today then switch to Galaxy Nexus later in November, will I keep the unlimited data plan still?
    If I do not renew the contract, do I lose the loyalty discount + the unlimited data plan?
    Does switching from a 3G device to a 4G device affect the data plan status at all (unlimited to 5 GB capped)?
    Thanks for the replies.

    stiglet wrote:
    I currently have 2 lines on my account. The main line is mine and the contract is up on 11/07, while the secondary line is up on 12/04. I have been waiting for the new gen 4G and Galaxy Nexus to be announced before making a decision on what to upgrade my phone to.
    Here are the questions I have:
    I have the unlimited data plan now. If I extend my contract with a 3G or a 4G phone that is currently on sale today then switch to Galaxy Nexus later in November, will I keep the unlimited data plan still?  As of right now, Verizon's policy is to allow you to keep the unlimited data feature as long as you keep a 3G or 4G smartphone active on your line.  It is possible that might change by November, but highly unlikely.  In any case, you would know if you could not keep the plan because Verizon would announce that they are ending the grandfathering and/or releasing phones that aren't compatible with the feature.  In general, I would expect to be able to keep your unlimited data pretty easily until Verizon finishes implementing the majority of its 4G network in 2013, but that's just my guess.
    If I do not renew the contract, do I lose the loyalty discount + the unlimited data plan? You won't lose your unlimited data.  Your plan will stay the same until you do something to change it, but you'll be on month-to-month status.  The loyalty discount expires 6 months after it becomes available (so two months after your contract ends).
    Does switching from a 3G device to a 4G device affect the data plan status at all (unlimited to 5 GB capped)?  As far as I'm aware, unlimited is still unlimited.  The only change is that Verizon reserves the right to throttle the top 5% of users in congested markets during peak hours.  It doesn't necessarily kick in at 5 GB, and it won't affect you at all if you don't use a congested tower.
    Thanks for the replies.

  • Question concerning data removal from external hard drive

    I have been backing up via Time Machine to a 500 Gig external hard drive. I will be adding a 1 TB external hard drive and switch the Time Machine backups to that unit. I want to then use the 500 Gig unit to store photos, music and video, to free up space on the iMac. I have found the tutorials/threads that explain how to move the photos and music. I have two questions.
    First, will moving video from iMovie work basically the same as moving the photos and/or music?
    Secondly, what is the proper way to remove the existing Time Machine data from the 500 gig unit, once I have the 1TB unit up and running and know that Time Machine is backing up to it?

    I can help with number two. There are two things you can do, one quickly, the second more securely.
    Both can be done with Disk Utility in the Apps/Utilities folder. Open DU with the HDD attached to your Mac, highlight the HDD in the left hand pane and choose Erase in the right hand pane. Press the Erase button. It should happen pretty fast, but because it is a big drive it may take a couple of minutes. But this does not really erase the data, and some recovery software could do a good job of getting a lot of your data back.
    Since you want to store media libraries on this drive perhaps a better way to condition the drive is to select a Security Option and write zeros to the drive. After this any bad sectors on the drive will be locked out and not used in the future. Because this is a big drive this could take a few hours.
    Dah•veed

  • General question about data recovery from RAID on Windows Server

    hi,
    I would like to buy data recovery software for Windows Server to recover data with any RAID (or without RAID).
    The company tells me I should be sure the RAID can be recognized correctly as a local disk drive first, otherwise their product will not help me. So if I want to recover data for my customers who have Windows Server with any RAID form, how
    can I make the server's hard disks (which are RAID 1, 5, 6 etc.) be recognized correctly as a local disk drive?
    thanks
    johan
    h.david

    I myself have a NAS which is connected to my computer through a router.
    I have started a small business, and I also want to offer a data recovery service for deleted documents, photos, formatted drives etc. for my customers - things that were accidentally erased from RAID or non-RAID storage.
    So I found some data recovery software:
    1 - Stellar Phoenix data recovery, which has remote access and network access.
    2 - EaseUS data recovery software. I liked this second one, but they tell me:
    you should be sure the RAID can be recognized correctly as a local disk drive first, otherwise our product will not help you.
    I understand both situations you are telling me about, thanks.
    But my question is: can I not run data recovery software from a bootable USB stick on small servers (like Essentials) with RAID, to recover data in case of any problem that caused data loss?
    I know that if I install data recovery software on the server to scan it, it will overwrite some places on the hard disk, and that is not wise.
    h.david

  • Differences between Oracle BAM and Oracle BI and Question BAM  Data Objects

    Hi,
    I have two questions.
    1. Can someone tell me differences between Oracle BAM and Oracle BI?
    My understanding about Oracle BAM is, we use BAM to build Dashboards or Reports.
    We can also build DashBoards or reports using Oracle BI.
    I am not able to understand why Oracle has two tools for the same purpose.
    Which tool is more powerful and user friendly (Oracle BI or Oracle BAM)?
    2. Every time we plan to develop a dashboard or report in BAM, we need to create a BAM ADC Data Object to store the data (i.e. the first step is to get data from an external database or application and the second step is to store the data in a BAM ADC data object).
    My understanding is that we have an extra step (i.e. creating a Data Object) in Oracle BAM to develop a report or dashboard.
    If I am wrong, please correct me.
    Regards,
    Shanti Nagulapalli.

    Oracle 11g has many advanced features in PL/SQL over Oracle 9i.
    refer here,
    http://www.oracle.com/technetwork/database/features/manageability/9i-to-11g-real-world-customer-exper-133754.pdf
    http://www.oracle.com/global/de/upgradecommunity/artikel/upgrade11gr2_workshop2.pdf
    http://www.compuworks.com/events/view/233.pdf
    http://education.oracle.com/pls/web_prod-plq-dad/db_pages.getCourseDesc?dc=D52601GC10&p_org_id=15942&lang=US
    Thanks

  • ICR Process 003 - question about data selection (table FBICRC003A)

    Hello, I am implementing ICR process 003. We are doing several tests and I have some questions that I hope you can help me with:
    1 - If I run transaction FBICS3 - Customer/Vendor (Select Documents) and then FBICA3 - Customer/Vendor (Document Assignment) several times with the same selection criteria, will the same documents be selected redundantly and stored redundantly in table FBICRC003A? I expected that this would not happen, but it seems to happen in my test environment. (?)
    2 - If I need to delete the data stored by the ICR '003' functionality I need to use transaction GCDE. The problem is that, as I am using ledger '0L' for the '003' process, I cannot use the "delete data of one ledger" functionality - which allows selection criteria to be set - and I have to "delete the data of an entire data group", which deletes all data stored in the FBICRC003A & FBICRC003T tables. Should I set up another ledger for the '003' process in order to delete data using selection criteria? Is it recommended not to use '0L' and to create a new one?
    I have read in the reference documentation that it "is not necessary to set up a SL", but since all my productive companies are running in the same client as the ICR client, I am wondering if it could be better to create and set up a SL.
    Thanks in advance
    Rafael Barreda
    Edited by: Rafael Barreda on Sep 14, 2009 1:27 PM

    Hi Ralph,
    we have created an RFC in order to get data from client B to client A (where the ICR system is placed). The summary is:
    I need to import vendors/customer data from Client B that belongs to a certain company code "0001". The company is called "X" in both clients A & B although it only exists as a company code (FI) in Client B and just as a company (and trading partner) in client A.
    Client A:
    1- have set company "0001" as a "Company to be reconciled" at FBIC032:
    RFC destination = ""
    RFC destination for data selection = "ZRFC0001"
    Local company= ""
    Data Source="Documents of Current Process"
    Separate Selection Process= "X"
    Data Transfer Type="Asynchronous via Direct RFC Connection"
    Sender field for reference number = "XBLNR"
    Client B:
    1- I have created the companies (V_T880) that I will need in order to inform trading partner on vendor's master data.
    2- I have assigned company code '0001' to company "X"
    3- I have assigned trading partners created on step 1 to vendors
    4- I have posted a few FI documents with trading partners filled in.
    Then I run FBICS3 - Customer/Vendor: Select Documents in the background, but the program takes a long time and does not select any documents.
    Do you think that I am missing something?
    Thanks very much in advance.

  • New to smartphones and i have a question about data usage...

    I bought my wife a Droid 2; we are both new to these phones (I still have my ENVY3). Since she doesn't spend too much time on the web I got her the $15 data package, as I was told it should be enough for most people who are occasional web surfers. My question is, though: if she has logged on to Facebook, since it's constantly running and updating, would that mean it's constantly using data?
    Thanks, folks, for any help.

    Yes, anything that updates will use data. This can be turned off when not in use (menu> settings> wireless & networks> mobile networks, uncheck mobile data) or place the power control widget on your homescreen and it can be toggled from there.

  • Question about dates shown in finder

    I have a problem and need help.
    My first question is: which date is shown for files in the default Finder settings? Creation date, modified date, or added date? On Mac? On Windows?
    The problem lies here: I have to give clients JPEG picture files. We had to re-shoot (which the client does not need to know); the re-shoot was necessary because of a mistake I made :-( Anyway... I have to create zip folders to give him the pictures, but the creation and modified dates do not change when zipped. Does anyone know how to change these dates (without re-rendering the files)?
    Any help is highly appreciated!

    Hi GM1941,
    When using Finder>All Images, the ones that you have more than one copy of are images that you do in fact have more than one of. An image stored under the same name and possibly placed in another location from its original.
    A good example would be if you downloaded all your images to a pictures folder on your desktop. From there you might copy a picture of a family member in a genealogical program and not rename it. There you have two copies.
    If you opened the picture in Photoshop or some other image manipulation software and saved it in a different file format (ex: TIFF, PNG, PICT, GIF) you are able to save it again - even to the same file - without renaming it.
    When you use the Finder to search for all images, remember that you are looking for ALL images on your computer, not just the ones in iPhoto.
    The files listed under today, yesterday and last week are a "search" list of documents/files you have opened during those times, not another copy. But, they are connected to the item they represent and, if you drag them to the trash they will indeed be in the trash and can be deleted from there.
    Good Luck,
    John

  • Some questions on data warehousing

    My questions are:
    (1.) Is ANY dimension, except the time dimension, a candidate for an SCD (slowly changing dimension)? I.e. a sales rep dimension can have a slowly changing column like the state he belongs to, and an employee dimension can have columns
    such as highest education level changing slowly over time. So, any dimension, except the time one, can have columns which are candidates for an SCD?
    (2.) When designing a DW, do you have to think about SCDs at design time? Or will the need for an SCD come later, when the system is running live? When designing a DW, is it best practice to look at all columns in the dimensions and see if
    the data can change slowly over time and make room for that, or do we do it as and when the requirement comes, after the system goes live?
    (3.) Can a dimension have more than 1 column which changes slowly over time? For example, for a product dimension, the product price and the supplier both change slowly over time. So, what is the solution to this scenario?
    (4.) What is the the MOST COMMON solution practiced in real life to a SCD problem? Is it creating more than one row with a version number or begin/end dates?
    (5.) Does a solution to a SCD require rebuilding the fact table?

    1. Yes.
    2. We make them SCDs at design time itself. In case historic data analysis is not needed, we make the attributes Type 1 to preserve only the latest values.
    3. The same dimension can have attributes (columns) that are handled differently, i.e. some columns might be processed the Type 1 way (update with the latest value), some the historic way (multiple records, one for each intermediate value) and some the fixed way.
    4. The common method is to use Valid From and Valid To date fields. The latest record will have ValidTo as NULL to indicate it is the currently valid record. We also add a bit field IsCurrent for quick retrieval; it will be 1 only for the latest records (see the sketch after this list).
    5. Yes, as the surrogate key changes when you implement Type 2 changes. Each intermediate value gets a different surrogate key, and the fact table has to carry the correct surrogate key reference to make sure you get the correct attribute value as of the required period.
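    A minimal sketch of the Valid From / Valid To approach described in point 4, using hypothetical table, column and sequence names (dim_product, product_sk_seq etc. are illustrative only):

    -- hypothetical Type 2 dimension
    create table dim_product (
        product_sk    number        primary key,  -- surrogate key, new value per version
        product_id    varchar2(20),               -- natural / business key
        product_price number(10,2),
        supplier      varchar2(100),
        valid_from    date          not null,
        valid_to      date,                       -- null for the current version
        is_current    number(1)     default 1     -- 1 only on the latest version
    );

    -- when an attribute changes: close the current version ...
    update dim_product
       set valid_to = sysdate, is_current = 0
     where product_id = 'P100' and is_current = 1;

    -- ... and insert the new version with a new surrogate key
    insert into dim_product
        (product_sk, product_id, product_price, supplier, valid_from, valid_to, is_current)
    values
        (product_sk_seq.nextval, 'P100', 129.99, 'New Supplier Ltd', sysdate, null, 1);

    Fact rows then reference whichever surrogate key was current at load time, which is how the correct attribute value is tied to the required period (point 5 above).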
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Quick Question re Data Source

    Hi
    I have a quick question re datasource login.
    I haven't touched my Crystal Reports for about a year. Since I created the reports (which users have been using through Crystal Reports Server) I have changed my PC.
    I need to make a change in the formulas of the reports, so I have reinstalled Crystal Reports 11 onto my PC and opened the reports. The formula change is simple but I am having a nightmare with the data source login. The report runs against a SQL database. From memory, all I did the last time was create an ODBC connection on my PC (which used the SQL sa login account) and configure the reports to use that ODBC connection. When I open the report it asks me for the data source location, which I tell it to use the one I created, and the report runs fine.
    If I close Crystal Reports and then re-open it and press F5 to run the report, it again asks me for the data source credentials.
    Any ideas, I'm sure it is something simple I am missing?
    Thanks

    Guys
    Thanks for your help
    I've managed to fix it, but I'm not really sure how, although I'm sure a few of the things you have mentioned are what I did.
    I basically restored the reports from last night's backup (before I did the changes), re-did the changes and saved, and now it seems to work. I think the difference here is that I had created the ODBC connection prior to changing the reports - if that makes sense?
    What I would like to do is fully understand how this data source connection works.
    Don, when I open the report now and hit F5, the report presents me with the parameters and not the database login screen as it was doing before - how is this? What credentials is it using? The SQL sa account (which I configured the ODBC connection to use), or my domain account, for which I assume permissions must be set on the actual SQL database?
    As you and Jeff suggested, when I look at the data source connection screen it shows "Trusted Connection: 1" - what does this mean and how is it configured? Have I configured this in Crystal somewhere?
    Thanks
    Andy

  • Performance question - Caching data of a big table

    Hi All,
    I have a general question about caching; I am using an Oracle 11g R2 database.
    I have a big table, about 50 million rows, that is accessed very often by my application. Some queries run slowly and some are OK. But (obviously) when the data of this table is already in the cache (so basically when a user requests the same thing twice or many times) it runs very quickly.
    Does anybody have any recommendations about caching the data / a table of this size?
    Many thanks.

    Chiwatel wrote:
    With better formatting (I hope), sorry I am not used to the new forum !
    Plan hash value: 2501344126
    | Id | Operation                             | Name           | Starts | E-Rows | E-Bytes | Cost (%CPU) | Pstart | Pstop | A-Rows | A-Time      | Buffers | Reads | OMem  | 1Mem | Used-Mem  |
    |  0 | SELECT STATEMENT                      |                |      1 |        |         |  7232 (100) |        |       |  68539 | 00:14:20.06 |    212K | 87545 |       |      |           |
    |  1 |  SORT ORDER BY                        |                |      1 |   7107 |    624K |   7232  (1) |        |       |  68539 | 00:14:20.06 |    212K | 87545 | 3242K | 792K | 2881K (0) |
    |  2 |   NESTED LOOPS                        |                |      1 |        |         |             |        |       |  68539 | 00:14:19.26 |    212K | 87545 |       |      |           |
    |  3 |    NESTED LOOPS                       |                |      1 |   7107 |    624K |   7230  (1) |        |       |  70492 | 00:07:09.08 |    141K | 43779 |       |      |           |
    |* 4 |     INDEX RANGE SCAN                  | CM_MAINT_PK_ID |      1 |   7107 |    284K |     59  (0) |        |       |  70492 | 00:00:04.90 |     496 |   453 |       |      |           |
    |  5 |     PARTITION RANGE ITERATOR          |                |  70492 |      1 |         |      1  (0) |    KEY |   KEY |  70492 | 00:07:03.32 |    141K | 43326 |       |      |           |
    |* 6 |      INDEX UNIQUE SCAN                | D1T400P0       |  70492 |      1 |         |      1  (0) |    KEY |   KEY |  70492 | 00:07:01.71 |    141K | 43326 |       |      |           |
    |* 7 |    TABLE ACCESS BY GLOBAL INDEX ROWID | D1_DVC_EVT     |  70492 |      1 |      49 |      2  (0) |  ROWID | ROWID |  68539 | 00:07:09.17 |   70656 | 43766 |       |      |           |
    Predicate Information (identified by operation id):
      4 - access("ERO"."MAINT_OBJ_CD"='D1-DEVICE' AND "ERO"."PK_VALUE1"='461089508922')
      6 - access("ERO"."DVC_EVT_ID"="E"."DVC_EVT_ID")
      7 - filter(("E"."DVC_EVT_TYPE_CD"='END-GSMLOWLEVEL-EXCP-SEV-1' OR "E"."DVC_EVT_TYPE_CD"='STR-GSMLOWLEVEL-EXCP-SEV-1'))
    Your user has executed a query to return 68,000 rows - what type of user is it? A human being cannot possibly cope with that much data, and it's not entirely surprising that it might take quite some time to return it.
    One thing I'd check is whether you're always getting the same execution plan - Oracle's estimates here are out by a factor of about 9.5 (7,100 rows predicted vs. 68,500 returned), so perhaps some of your variation in timing relates to plan changes.
    If you check the figures you'll see about half your time came from probing the unique index, and half came from visiting the table. In general it's hard to beat Oracle's caching algorithms, but indexes are often much smaller than the tables they cover, so it's possible that your best strategy is to protect this index at the cost of the table. Rather than trying to create a KEEP cache for the index, though, you MIGHT find that you get some benefit from creating a RECYCLE cache for the table, using a small percentage of the available memory - the target is to fix things so that table blocks you won't revisit don't push index blocks you will revisit from memory.
    Another detail to consider is that if you are visiting the index and table completely randomly (for 68,500 locations) it's possible that you end up re-reading blocks several times in the course of the visit. If you order the intermediate result set from the driving table first, you may find that you're walking the index and table in order and don't have to re-read any blocks. This is something only you can know, though. The code would have to change to include an inline view with a no_merge and no_eliminate_oby hint.
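    A minimal sketch of the RECYCLE-pool idea above, with placeholder sizes and object names (256M, big_table and big_table_ix are illustrative only, not recommendations; a partitioned table may need MODIFY DEFAULT ATTRIBUTES instead):

    -- give the recycle pool some memory (requires DBA privileges; size is a placeholder)
    alter system set db_recycle_cache_size = 256M scope=both;

    -- send the big table's blocks to the recycle pool so they age out quickly ...
    alter table big_table storage (buffer_pool recycle);

    -- ... while leaving the frequently revisited index in the default (or a KEEP) pool
    alter index big_table_ix storage (buffer_pool default);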
    Regards
    Jonathan Lewis
