Lousy performance

Hi,
We have installed Oracle 9i on a dual-processor 2.0 GHz Compaq system with three 18 GB disks in RAID 5 (Windows 2000 Server).
None of us has much experience with Oracle, but we have been using SQL Server for a decade.
We accepted all of the defaults during installation and have noticed really long wait times with deletes. Deleting 50,000 records, where the only criterion is on an indexed column (e.g. DELETE FROM REBATE_SIGNUPS WHERE REBATE_PROGRAM_ID IN (1000000056, 1000000059, 1000000060, 1000000061, 1000000062, 1000000063)), takes over 10 minutes to complete.
Also, doing things like nested queries really slows the system down. When we change the query to a join, things speed up dramatically.
Currently, only I am using the system, so there is no other load on the DB.
Does anybody have a suggestion on what I could do to speed the system up? Something must be configured incorrectly to get this type of response time...
Thanks in advance,
Ned

ANALYZE
Purpose
Use the ANALYZE statement to collect non-optimizer statistics, for example, to:
Collect or delete statistics about an index or index partition, table or table partition, index-organized table, cluster, or scalar object attribute.
Validate the structure of an index or index partition, table or table partition, index-organized table, cluster, or object reference (REF).
Identify migrated and chained rows of a table or cluster.
Note:
Oracle Corporation strongly recommends that you use the DBMS_STATS package rather than ANALYZE to collect optimizer statistics. That package lets you collect statistics in parallel, collect global statistics for partitioned objects, and fine tune your statistics collection in other ways. Further, the cost-based optimizer, which depends upon statistics, will eventually use only statistics that have been collected by DBMS_STATS. See Oracle9i Supplied PL/SQL Packages and Types Reference for more information on this package.
However, you must use the ANALYZE statement (rather than DBMS_STATS) for statistics collection not related to the cost-based optimizer, such as:
To use the VALIDATE or LIST CHAINED ROWS clauses
To collect information on freelist blocks
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96540/statements_46a.htm#SQLRF01105
Joel Pérez
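Following the recommendation above, a minimal sketch of gathering statistics with DBMS_STATS for the table in the original question (the estimate_percent value is illustrative, not a tuned recommendation):

```sql
-- Gather optimizer statistics for REBATE_SIGNUPS and its indexes
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => USER,              -- current schema
    tabname          => 'REBATE_SIGNUPS',
    estimate_percent => 10,                -- sample 10% of the rows
    cascade          => TRUE);             -- also gather index statistics
END;
/
```

Fresh statistics may help the cost-based optimizer choose an index access path for predicates like the one in the DELETE above, rather than a full table scan.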

Similar Messages

  • Lousy Performance of Mac Pro X1900 XT in Leopard

    Being the paranoid person I am, I installed Leopard to an external FW 800 drive to test it. Immediately, the window drawing in Leopard did not seem as snappy to me as it did in Tiger. Then, I pulled up iTunes, to check what version of iTunes comes with Leopard 10.5.0... I was stunned, not by the version of iTunes, but at the graphics lameness I witnessed.
    If you launch iTunes and then choose iTunes | About iTunes, you will see that the credits scroll by all choppy and herky-jerky. Yes folks, that's right: Apple's fastest and most top-of-the-line machine -- a Mac Pro with an X1900 XT -- is not swift enough to properly display the credits for iTunes.
    Yes, I tried 10.5.1, same difference.
    Needless to say, I'm sticking with Tiger until Apple fixes the X1900 XT driver(s) in Leopard.
    Can anyone else with a Mac Pro and X1900 XT please go to iTunes | About iTunes and commiserate?

    Just for info; Leopard is not a full 64-bit operating system. In fact, there is only one Apple GUI app that is compiled for x86_64, and that is Xcode, and even that runs in 32-bit mode by default, unless you uncheck the "Run in 32 bit mode" box when you do a Get Info on it.
    There are many 64-bit frameworks and libraries, but the rest of the operating system applications are all 32-bit.
    You can see that for yourself by running the following in a Terminal:
    file /path/to/applications/* | grep x86_64
    Hyperion:~ will$ file /Applications/Utilities/*/Contents/MacOS/* | grep x86_64
    /Applications/Utilities/OSXPM.app/Contents/MacOS/OSXPM: Mach-O 64-bit executable x86_64
    Hyperion:~ will$
    I have 2 x86_64 apps, but only because I recompiled them to see if they worked.
    There are a handful of x86_64 apps in /usr/bin and /usr/sbin, but that's about it. Oh, and Apache is 64-bit, by the way.

  • Why spotlight Diablo 3 on rMBA when it gets such lousy performance?

    I have the rMBA base model.  It is unable to play the game at full resolution even with all the "extras" turned to their lowest settings.  Chugging along at 20fps is unplayable if you are a "hardcore" player.  Are there any driver fixes in the future, or is the rMBP unable to play this game at its retina settings?
    If that is the case, then why spotlight the game during the WWDC?  I have a week or so left before having to return this thing, and I'd rather not, but it would be cheaper/wiser for me to buy a MBA + gaming desktop.
    Anyone?

    It's Diablo 3's Mac client, not the computer; at least that's what I'm assuming, given how bad and inconsistent the Mac client currently is. I run Diablo 3 on a 13-inch MacBook Pro with Intel HD 3000 at 40-45 fps, but others with better computers have been struggling to get 20 fps. Try running it in Windows with Boot Camp and see how that goes, because that has helped a lot of others.

  • Pdf drawings (from AutoCAD) have lousy performance

    When viewing PDF text documents, zoom/pan/leafing through pages works excellently.
    I'm using a HTC Flyer and pan/zoom etc with AutoCAD-WS app (dwg format) works like a charm, but when opening the same drawing in pdf format,  Adobe Reader hangs for 10-30 secs and zoom/pan either crashes or take another 25-60 secs. 
    Any similar experiences?

    Yes. On my existing iMac, 2GB, 2GHz Core Duo.
    I'm looking for a new computer because of this problem.  I often review drawings that are provided to me as PDF files.
    I checked out a new iMac, 3.1 GHz quad-core i5, 4GB, and it was disappointing.  I tried both Preview (the Mac PDF reader) and Adobe Reader, and they both had problems of flickering or delays.  Surprisingly, GoodReader on my iPad does a smoother but ultimately slower pan and zoom transition with a gradual increase to full resolution.
    My delays are not as bad as yours and my computer does not usually crash.  It seems that there should be some hardware or software fix for this.

  • Will these intel iMacs still offer flawless performance?...

    Or will there be problems like PCs have? I have a PC right now...and I'm fed up with its lousy performance.
    I want to get an iMac...will the new intel iMac be a better choice than the standard iMac?

    As Timon mentioned, the problems with Windows PCs are related to Windows, not the x86 architecture.
    As for which iMac is the better buy... it really depends on what you need right now.
    If you want access to the entire library of Macintosh software today and throughout the near future, then the iMac G5 is your choice. In the short term, you have a marginally wider selection of software available to you. And it's still viable in the long term; there's a huge installed base of PowerPC Macs, and we won't stop seeing new software for them for likely at least a couple years.
    If you want the performance benefits of the iMac Core Duo, then you give up some compatibility in the short term. Some current applications won't run on the Intel processors, and pricing and target dates for updates are all over the map depending on what title you need. If, though, you never intend to stray beyond the apps that ship with the iMac and maybe Microsoft Office, then you'll probably never notice the difference between now and wider availability of Universal applications.

  • Do not buy the USB3 PNY Turbo Flash Drive!

    I have highly recommended the USB 3 PNY Turbo Flash Drive in the recent past as a fantastic read-storage device.  Well, do not buy it.  PNY drastically changed the memory chips inside without changing the model number, and now it is a terrible device for holding your media for editing operations.
    Thanks to Randall for making me aware of this problem.  I now have 4 of the older good type and 3 bum ones.  Here are two Windows 8 transfer plots of a large AVI file from the USB 3 flash drive to a Samsung 840 Pro on my laptop.  The first one is the original drive that I purchased last July, with a file transfer rate of 185 MB/s; the second plot shows the lousy performance of 120 MB/s for one of the last devices, purchased in November.
    The new ones are usable but nowhere near as desirable as the 1st generation.  I checked several websites and see that they no longer advertise the old faster specifications.  Also, I have tested all my devices and scoring is consistent and very repeatable.  Now that I know it is possible, I will start looking for a new vendor that gives the great read rate ideal for laptop editing.  Of course it is not as convenient as a USB 3 flash drive, but my Vantec NexStar 6G USB 3 SSD enclosure with a Samsung 840 Pro drive has a read rate of 250 MB/s.

    Bill,
    It looks like you may have to settle for a slower flash drive at this price point for the foreseeable future: Every single one of the USB 3.0 flash drives at this capacity point that delivers at least the same level of read performance as the original PNY Turbo costs much more money than the PNY. ($150-ish for a SanDisk Extreme Pro 128GB USB 3.0 flash drive, anyone?)
    With that said, the PNY Turbo 3.0 is only intended to compete with the slower USB 3.0 flash drives such as the SanDisk Ultra USB 3.0 drive (whose maximum read speed is only 100 MB/sec even in its largest-capacity 128GB version).

  • Bulk Naming/Renaming in Lightroom?

    Without getting into a big urination festival over file naming standards, I've been struggling since LR V1 with the way it does file naming.
    If I want to rename a set of images in LR, how do I do this?  For example,
    I import 10 images.  They were something _D3J1234 through _D3J1243.  In LR, I want them to be
    DP_<yymmdd>_<4-digit-sequence>_<whatever I specify>
    I can do this, but I have a couple problems.  First, nothing seems to be persistent.  Every time I do an import, I'm having to go in and
    re-edit my template.  Even worse, unlike Bridge, Lightroom doesn't appear to have a persistent sequence number.
    I figure I'm missing something or doing something dumb...
    NOW, here's my even bigger problem......  I now have 100 images in LR that I need to RENAME......
    Something like DP_101121_5555_buffalo through DP_101121_5654_buffalo that I goofed on, and they're actually bighorn sheep.
    SO, I'd like to rename these 100 images to be DP_101121_5555_bighorn_sheep through DP_101121_5654_bighorn_sheep
    AND, my BIGGEST problem, renaming files because of HFR, HDR, or panoramas......
    So, I have 6 images that are an HDR, named DP_110101_5432_orchid through DP_110101_5437_orchid.  I need to know these
    are an HDR set, and WHICH HDR set they are...  So, I want to name them DP_110101_5432_hdr3_orchid through DP_110101_5437_hdr3_orchid...
    For this bunch of images on this day, there may be 20 or 30 HDR sets that need to be renamed...
    How do I do this stuff in Lightroom 3?
    Oh, and YES, I know I can sort-of stack them, but you can't stack images in different directories so that's less than optimal, AND, having
    information that's ONLY in the catalog isn't something I want to do...  I've had catalogs puke and fail in V2 and already in V3.  In V3, I was
    one of the umpteen people that converted the V2 catalog, and got such lousy performance the recommendation was to "RE-CREATE
    THE CATALOG".  Which, of course, loses everything that was only in the old catalog.  So, everything needs to be in the files...
    Unless I'm wrong, and lightroom stacks and collections are persistent for the images contained in them?
    But if not, as long as Adobe support's second recommendation is "completely recreate the catalog"...

    I struggle with this too, but just a couple of suggestions on your filenaming structure.
    You'll notice that most of the suggestions that were given have the sequence number at the end - and that's a big part of answering your issue of getting a persistent sequence number using LR's renaming tools.  In order for LR to keep the original sequence number assigned to an image through subsequent renamings, you need to make sure that the sequence number is at the very end of the filename.  That is the only way LR will recognize the existing sequence number when you pick "Original Number Suffix" as part of your renaming template.
    You have to be careful with this because if there is ANYTHING, like say '-Edit' after the sequence number, LR will ignore the number and really screw up your filenames.
    So, with your current file-naming template, there's no way to use batch rename in LR to change "DP_101121_5654_buffalo" to "DP_101121_5654_bighorn_sheep".  On the other hand, if you change your initial template from "DP_<yymmdd>_<4-digit-sequence>_<whatever I specify>" to
    "DP_<yymmdd>_<whatever I specify>_<4-digit-sequence>", you've got it made.
    Paul Wasserman

  • Time Capsule external HD slow after joining an existing network.

    Hi all,
    I have recently moved and had to reconfigure my TC accordingly. Previously my TC was connected to the internet and an external HD which hosted my iTunes music (amongst other files). After moving home, I was required to join the existing wifi, so I changed the configuration of my TC. Now I am unable to play music from the external HD as the performance is so slow. I'm also assuming backups will have lousy performance too, as I haven't yet seen evidence that a backup has successfully concluded.
    The house has a netgear DGN2000 connected to ADSL. Internet access from my mac seems OK. I'm using Remote with Client Association in the configuration, adding the TC's MAC address to the netgear router.
    Any ideas why disk performance has gone down so much? I can discount iTunes, as previewing photos in Finder from the same HD is also painfully slow.

    Hello Shogododo. Welcome to the Apple Discussions!
    The house has a netgear DGN2000 connected to ADSL. Internet access from my mac seems OK. I'm using Remote with Client Association in the configuration, adding the TC's MAC address to the netgear router.
    I don't think this Netgear feature is supported by the TC. How do you have the TC configured? That is do you have it so that the option "Wireless Mode = Join a wireless network" is selected?

  • Hashmap containsKey() method does not appear to work

    I have an amazingly simple custom class called CalculationKey, with my own amazingly simple custom equals() method. For some reason, when I call containsKey() on my HashMap, it does not use the equals() method defined in my key class. Do HashMaps have their own tricky way of establishing whether two keys are equal or not?
    THIS IS MY AMAZINGLY SIMPLE CUSTOM KEY CLASS
    private class CalculationKey {
        private LongIdentifier repID;
        private LongIdentifier calcID;

        public CalculationKey(LongIdentifier repID, LongIdentifier calcID) {
            this.repID = repID;
            this.calcID = calcID;
        }

        public boolean equals(Object o) {
            CalculationKey key = (CalculationKey) o;
            if (key.getCalcID().equals(calcID) &&
                key.getRepID().equals(repID))
                return true;
            else
                return false;
        }

        public LongIdentifier getCalcID() {
            return calcID;
        }

        public LongIdentifier getRepID() {
            return repID;
        }
    }
    THIS IS MY AMAZINGLY SIMPLE CALLS TO MY HASHMAP WHICH ADDS, CHECKS, AND GETS FROM THE HASHMAP.
    private HashMap calculationResults = new HashMap();

    public boolean containsCalculationResult(LongIdentifier repID, LongIdentifier calcID) {
        if (calculationResults.containsKey(new CalculationKey(repID, calcID)))
            return true;
        else
            return false;
    }

    public Double getCalculationResult(LongIdentifier repID, LongIdentifier calcID) {
        return (Double) calculationResults.get(new CalculationKey(repID, calcID));
    }

    public void addCalculationResult(LongIdentifier repID, LongIdentifier calcID, Double value) {
        calculationResults.put(new CalculationKey(repID, calcID), value);
    }
    ...cheers

    "You can make a trivial implementation to return a constant (not recommended)"
    What do you mean by that? Hmm.. I guess you mean that you shouldn't use the same constant for all objects? But don't you see the int value of an (immutable) Integer as constant?
    /Kaj
    You can write hashCode to just always return, say, 42. It will be correct because all objects that are equal will have equal hashcodes. Objects that are not equal will also have equal hashcodes, but that's legal--it just causes a performance hit.
    The value is that it's really, really simple to implement:
    public int hashCode() {
        return 42;
    }
    So you can use it temporarily while you're concentrating on learning other stuff, or during debugging as a way to confirm that hashCode is not the problem. (Returning a constant from hashCode(), rather than computing a value, is always legal and correct, so if something's behaving wrong, and you replace your hashCode method with the one above, and it still breaks, you know hashCode isn't the problem.)
    The downside is that you're defeating the purpose of hashing, and any non-trivially sized map or set is going to have lousy performance.
    For a decent hashCode recipe, look here:
    http://developer.java.sun.com/developer/Books/effectivejava/Chapter3.pdf
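    Putting the advice above together, here is a minimal self-contained sketch of the fix for the CalculationKey class from the original post (a plain long stands in for the poster's LongIdentifier type, an assumption made for brevity):

```java
import java.util.HashMap;
import java.util.Map;

public class CalculationKeyDemo {

    // Simplified stand-in for the poster's key class; plain long replaces
    // the LongIdentifier type from the original post (an assumption here).
    static final class CalculationKey {
        private final long repID;
        private final long calcID;

        CalculationKey(long repID, long calcID) {
            this.repID = repID;
            this.calcID = calcID;
        }

        @Override
        public boolean equals(Object o) {
            if (!(o instanceof CalculationKey)) {
                return false; // also rejects null, unlike the original unchecked cast
            }
            CalculationKey key = (CalculationKey) o;
            return key.calcID == calcID && key.repID == repID;
        }

        // Without this override, HashMap uses Object's identity-based hash,
        // so two equal keys land in different buckets and containsKey() fails.
        @Override
        public int hashCode() {
            int result = 17;
            result = 31 * result + (int) (repID ^ (repID >>> 32));
            result = 31 * result + (int) (calcID ^ (calcID >>> 32));
            return result;
        }
    }

    public static void main(String[] args) {
        Map<CalculationKey, Double> results = new HashMap<>();
        results.put(new CalculationKey(1L, 2L), 3.14);

        // A fresh but equal key is found, because hashCode() agrees with equals()
        System.out.println(results.containsKey(new CalculationKey(1L, 2L))); // prints: true
    }
}
```

    With both methods overridden consistently, a freshly constructed but equal key hashes to the same bucket, so containsKey() finds it.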

  • To know if my laptop is sufficient or not

    My laptop specifications are:
    i5 3rd-gen processor, 4 GB RAM, 1 TB hard disk, 1 GB graphics.
    Is it sufficient to run Premiere Pro and After Effects?

    Technically, it will work. But practically, it's way too slow: All mobile i5 CPUs have only two physical cores (albeit with hyper threading). 4GB of installed RAM is way too little to do much. What's more, if your i5 CPU has a "U" or "Y" at the end of its model number (as in i5-4200U), then it only supports single-channel memory operation, which in turn currently limits the maximum installable RAM to a single 8GB stick (which would then completely replace the existing installed 4GB stick).
    Third, your laptop may not have any provision at all to add additional fast disks (likely because the stock hard drive eats up the laptop's only internal bay, the laptop's optical drive is non-removable, and your laptop has only USB 2.0 ports).
    Finally, you did not mention which specific graphics chip your laptop is using. With that laptop, any installed GPU will likely be so lousy (performance-wise) to begin with that you might as well permanently lock Premiere to the MPE software-only mode (no GPU acceleration whatsoever).
    In other words, your laptop is a piece of junk (as far as practical performance in the Adobe CS and CC programs is concerned).

  • Understanding AP Network Speed

    Just typed up a long and thorough description of the configuration and performance of my AirPort network, which the forum software proceeded to throw away (except for the title) when I navigated away to edit my product list. Lovely.
    Nutshell...
    My FiOS Quantum 75/35 service is giving me about 80Mbps straight out of the Ethernet port, as measured via Speedtest.net.
    By and large, my 2008 model MBP is getting me around a quarter of that when connected wirelessly via the 5GHz network, more or less regardless of where I go around the house.
    My iOS devices are giving me about a third to a quarter of what the laptop is providing (six to eight percent of at-the-port throughput, in other words), with the iPad 2 on the 5GHz network and the iPhone 4S on the 2.4 (it can't ever seem to find the five).
    It's a fairly big, fairly brick-y house, 170 years old. Stringing Ethernet throughout is not an option, as much as I'd love to. In addition to the dual-band base station, my network has three AirPort Express 802.11n units, all set up to "Extend a wireless network."
    I know that my house is not nirvana for wireless, and that I'm not going to get test bench throughput from my Airport devices in any event. But a factor of 15 degradation with my iOS devices seems kinda severe. I can live with factor of four with the laptop, I guess, but even that seems like kinda lousy performance.
    Thoughts? Ideas? Pointers? Any and all would be appreciated.

    So, a couple of things.
    First, in preparing to test out a powerline network, I took down all of my AirPort Express units. Before doing anything further, my throughput went up by factors of two or more in most locations around the house. Since AirPlay, Bluetooth, and Apple TV can handle all our media streaming needs, I will no longer be using the APEs. So, addition—really, multiplication—by subtraction. Progress!
    I've been utterly frustrated, however, in trying to establish the powerline network. I bought two D-Link 500MBps adapters. Plugged one into the wall by my router, then plugged my AirPort Extreme base station into it. Plugged the other into the wall in my dining room, where I wanted to test its performance. Went through the configuration per the manual, and followed Apple's steps for setting up a roaming network. No freakin' joy; I can't get a valid IP address out of the "remote" adapter (yes, I confirmed that the base station was sending a non-bogus one to the "base" adapter).
    Called D-Link tech support. Turns out, the "only problem" with the product is that they won't work unless they're plugged into the same electrical circuit.
    Gosh, was that the "only problem" with the play, Mrs Lincoln? I mean, my house is over 3,000 sq ft and must have 15 separate circuits; what decent-size house doesn't? What are the odds that two arbitrary rooms will share the same breaker? Not great, at least in my 19th century Victorian.
    I had read that there could be difficulty if the units were plugged into different power *phases* (120 vs 240). I checked my breaker box, and all seems to be right on that score; both circuits appear to be on the 120 phase, flowing from the main power distribution panel.
    What really frosted my nads was that the adapter features an LED that is alleged to indicate that the device is connected to the network and communicating with its comrade(s). Turns out, not so much. Even though both of mine are shining a lovely, steady green—indicating not just that they are in contact, but are communicating at 80+ MBps—they're really not connected at all. In fact, according to D-Link tech support, all the "PowerLine Network LED" is telling me is that the unit is receiving power. Which I thought was the job of the cleverly-named "Power LED." Silly me.
    Are there any powerline products out there that don't suffer from this insignificant, teensy-weensy, little flaw?

  • How to reorganise an RDA application without getting error RSSTATMAN042?

    Hello RDA experts,
    According to this [presentation|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/40784126-7503-2b10-b586-f6d5acd41f60] SAP wants us to reorganise an RDA application frequently (see slide 37): "Data has to be deleted frequently from 'Real-Time Data Acquisition' DataStore Object to avoid redundant data". Another reason for reorganisation would be the lousy performance of DSO objects...
    Here we go again: after having transferred the RDA data to the cube and deleted it from the RDA DSO, the next daemon run will cause this error message:
    Real-time upload not possible; request xyz must first be passed
    Message no. RSSTATMAN042
    where xyz is the ID of the last processed RDA PSA request. Even after an additional deletion of the PSA table, this error appears (with xyz = 0).
    I think it's not the right approach to reorganise manually at regular intervals - what is your solution for this task? Did you also experience this error message? We have BI 7.0, SP 18.
    Many regards
    Martin Lehmann
    Edited by: Martin Lehmann on Sep 24, 2009 9:32 AM
    Meanwhile, I have posted this problem to the OSS for further analysis and am still waiting for a response...

    Hello there, ALGroves.
    Great job so far in troubleshooting your issue. If you haven't already, feel free to review the information in this Knowledge Base article that goes over your particular error message:
    Resolve issues between iTunes and security software
    http://support.apple.com/kb/TS3125
    If the issue persists, you can try the information in this article as another option for restoring your device:
    What to do before selling or giving away your iPhone, iPad, or iPod touch
    http://support.apple.com/kb/ht5661
    Particularly:
    Before you sell or give away your iOS device, make sure that you've removed all of your personal information. Follow these steps to protect your data and get your device to its factory default state for the new owner:
    Back up your device.
    Go to Settings > General > Reset, then tap Erase All Content and Settings.
    This will completely erase your device and turn off iCloud, iMessage, FaceTime, Game Center, and other services.
    If you're using iOS 7 and have Find My iPhone turned on, your Apple ID and password will be required. After you provide your password, the device will be erased and removed from your account so that the next owner can activate it.
    Contact your carrier for guidance on transferring service to the new owner.
    When the device is turned on for the first time by the new owner, Setup Assistant will guide them through the setup process.
    Important: Do not manually delete contacts, calendars, reminders, documents, photo streams, or any other iCloud data while signed in to your iCloud account, or the content will also be deleted from the iCloud servers and all of your iCloud devices.
    Thanks for reaching out to Apple Support Communities.
    Cheers,
    Pedro

  • AppleCare or Customer Relations help requested

    I have been an Apple customer for years. Started with a II+, followed by a Mac+, Mac II, iMac G3, Mac Mini G4, then another Mini G4, and now a MacBook Pro. Macs have worked well for me, and in return, I have bought several Apple products.
    Unfortunately, the string of luck with my Macs has come to an end.
    I got on the waiting list to upgrade to a MacBook Pro as soon as they came out. By the second week they were shipping, the store called me and told me they had my MacBook. Very good service.
    I get it home and first find that even though it's a notebook, it's NOT a laptop. Even Apple won't call it a laptop; they call it a portable. You know why? That's because they run so darn hot that if you sit one in your lap while running, it will literally burn your legs, and I have to put a pillow between it and my legs.
    After 3 months of using my new MacBook Pro, it started rebooting randomly, had lousy performance, and was overheating. I took it to the Apple Store for repair. They replaced the motherboard, heat riser, and a couple of other items and returned it to me. The random reboots are gone, but it still runs mighty hot.
    Take it home. Use it for a couple more months. Now it starts having other problems. Like it just randomly powers off. No warning. No battery-low prompts. Just boom, and it shuts off, even with a battery that has a 90% charge.
    It also locks up frequently. Not completely locked up, just fits of no response. For 15 or 30 seconds, the computer just won't respond. No mouse clicks. No keyboard entry. Won't even respond to trackpad movements.
    Then, after using it for a couple of hours, it gets dog slow. The same apps that ran okay right after you fired it up just start running slower and slower. It's not running out of memory, as I have 1.5GB of RAM in it. For some reason, it just gets slower with use, even if you close all apps and restart just one app. It's not what's running at all; it's just how long you use the computer after startup that affects its performance. And no, I don't run PPC apps. I only run universal apps.
    I post this information on a popular mac forum, and someone tells me that the fan is defective, the computer is overheating, and therefore, the processors are slowing down to protect themselves from generating more heat.
    I know Intel processors are designed to do this through their TM1 and TM2 thermal monitoring, so this makes complete sense.
    I go to the Apple Store; they do some tests and tell me that I have a bad battery. They replace it under warranty, call it fixed, and tell me to pick it up.
    This may explain the random shutdowns, but it doesn't explain the constant performance problems. I told them about the overheating, and again they failed to listen, although they did run 24 hours of testing. I guess they get so many hot MacBooks that they don't really know when one runs hotter than it is supposed to.
    I guess I'm just frustrated. I'm going to pick it up again tomorrow, but with only a battery replacement, I really don't think they have addressed the other issues, and I know I will be bringing it back in again.
    As a devoted Apple customer, I am starting to lose faith. Can someone in AppleCare or customer relations help?

    Don't take this wrong. But I'm not going to spend $60 more on this computer to turn it from a portable into a laptop.
    They sell these things to the unsuspecting public (at least in the beginning) as laptops. If the thing is designed wrong and gets too hot to put in a lap, then don't call it a laptop - which they don't now anymore.
    I may sound stubborn here, but let me make my point. Everyone wants me to spend money to cover Apple's less-than-stellar product. Buy an iLap to make up for the heating problem. Buy AppleCare to ensure that my lemon keeps on running after the 1-year warranty. Someone else has suggested that I try to debug the problem with this computer and take pride in solving it myself - and to some point, I agree with him.
    But the thing that makes me mad is that I paid a premium for this computer. I was an early adopter that bought this computer in week 2. I bought this computer to move over to the Intel Core Macs for the extra performance they offer, and because I have had such good experiences with Apple and Macs before. I'm just not going to invest more of my time in this thing. It's time Apple stepped up and took responsibility.

  • Oracle Schemas

    I have come to Oracle (specifically 9i) from SQL Server 2005 and have discovered that the concept of a "schema" seems to be quite different between the two databases.
    In Oracle, a schema appears to model a user (e.g. HR, OE, and QS), while in SQL Server it models a "namespace" (e.g. HumanResources in the AdventureWorks database).
    I find that the SQL Server model makes more sense. In fact, I am puzzled as to how an Oracle database can be partitioned into functional areas (like HumanResources or Engineering).
    Can anyone shed some light on this? Does Oracle 11g use the same concept of a schema as 9i?
    Thanks in advance for any help.
    Best regards,
    Frank J. Reashore
    Vancouver, Canada

    Others have given you good advice with respect to your specific issue but I would like to give you some generic advice.
    If you want to be successful with Oracle, you will need to put away the attitude that what you know is better than what you don't know. If you can't, just walk away.
    Almost nothing in Oracle works the way it does in SQL Server. Almost none of the capabilities are as limited as those in SQL Server. And whether you want to talk about the definition of common words (database, instance, cluster, page, block) or how log files work, or how locking takes place, or whether temp tables are used ... if you try to do it the SQL Server way you are guaranteed to either fail or build insecure, unscalable, systems with corrupt data and lousy performance.
    Reconsider your approach.
    PS: The same is true for people schooled in Oracle moving to another company's RDBMS product. A failure to adjust one's thinking, and learn to read, is a road to failure.
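    To make the Oracle side of this concrete: in Oracle, a schema comes into existence along with its user, and that user's name serves as the namespace for the objects it owns. A minimal sketch (the user, password, and object names here are invented for illustration, not taken from anyone's actual database):

    ```sql
    -- In Oracle, creating a user also creates the schema of the same name.
    -- A "functional area" is modeled as a user that owns the objects,
    -- even if nobody ever logs in as that user.
    CREATE USER human_resources IDENTIFIED BY a_password;
    GRANT CREATE SESSION, CREATE TABLE TO human_resources;

    -- Objects live in the owner's schema and are qualified with dot
    -- notation, much like SQL Server's schema.object syntax:
    CREATE TABLE human_resources.employees (
      employee_id NUMBER PRIMARY KEY,
      last_name   VARCHAR2(50)
    );

    -- Other users can reference the objects, given suitable privileges:
    GRANT SELECT ON human_resources.employees TO some_other_user;
    SELECT last_name FROM human_resources.employees;
    ```

    So the usual way to partition an Oracle database into functional areas is to create one schema-owning user per area. And yes, Oracle 11g uses the same user-equals-schema model as 9i.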

  • Encore CS3 - Hangs and import problems with menus and avi footage with audio tracks present.

    I'm trying to import some footage files in .avi format into Encore CS3 and am having several problems.
    In what follows when I say "crash" I mean Encore quits with a standard Windows error. When I say "hang" I mean Encore does not quit but rather becomes an unresponsive process that must be killed manually using Windows Task Manager.
    My System:
    WinXP Pro SP2
    2.93 GHz Intel Celeron CPU - single core
    Intel PERL mobo
    3GB 400MHz RAM
    NVidia 7800GS video
    I mention these specs, but I am not experiencing any hardware-related
    bugs or issues (like out-of-memory errors or video driver crashes)
    when using any CS3 app.
    Here are the footage files (as reported by GSpot) and the problems I am seeing with each. In all cases GSpot reports that I have the appropriate codec installed:
    footage1.avi - Dx50 XVid 1.0.3, Dolby AC3 48kHz 192kb/s, 716834 KB
    Attempting to import it as a timeline causes Encore CS3 to hang.
    Importing it as an asset works, but when I try to click
    on it in the project window to add it to a new empty timeline,
    Encore hangs. It plays fine in Windows Media Player.
    footage2.avi - Dx50 XVid 1.0.3, MPEG-1 Layer 3 48kHz 128kb/s, 718302 KB
    Imports fine as either a timeline or asset but no audio track comes
    with it, even though there is an audio track in the footage. It plays
    fine in Windows Media Player. Video imports perfectly, just no audio.
    footage3.avi - Dx50 XVid 1.0.3, MPEG1-Layer3 48kHz 112kb/s, 717606 KB
    Imports fine as either a timeline or asset but no audio track comes
    with it, even though there is an audio track in the footage. It plays
    fine in Windows Media Player. Video imports perfectly, just no
    audio. I tried converting this one using AVS to both MPEG3 and DivX,
    each using MPEG1-Layer3 44kHz 192kb/s audio, and got the same issue
    upon importing into Encore CS3. I also loaded it into Camtasia
    Studio and it looked fine over there. The audio was present and I
    was able to export it as a .WAV file. When I tried looking at the
    file in Soundbooth CS3, there was no audio track or, more
    accurately, the audio track was a flat line.
    By way of comparison these are importing perfectly:
    footage4.avi - XVID XviD 1.1.0 Beta2, MPEG1-Layer3 48kHz 189kb/s Two mono
    channels LAME3.96r, 717436 KB.
    footage5.avi - XVID XviD 1.1.0 Final, Dolby AC3 48kHz 448kb/s Six
    channels (3/2 .1), 716418 KB.
    footage6.avi - XVID XviD 1.1.0 Final, Dolby AC3 48kHz 448kb/s Six
    channels (3/2 .1), 716418 KB.
    footage7.avi - XVID XviD 1.0.3, Dolby AC3 48kHz 448kb/s 6 channels
    (3/2 .1), 717502 KB
    footage8.avi - XVID XviD 1.0.3, Dolby AC3 48kHz 448kb/s 6 channels
    (3/2 .1), 717180 KB
    footage9.avi - XVID XviD 1.1.2 Final, Dolby AC3 48kHz 448kb/s 6 channels
    (3/2 .1), 2291904 KB
    Although this one imported & transcoded correctly, when I went to
    build it to a DVD image file I got some strange errors during the
    "planning video" stage that prevented the process from completing.
    Unfortunately I neglected to write down the error message I received,
    which read like nonsensical babble to me, and I have since
    deleted the project folder. I remember it was not a memory or
    display related error and it included a timestamp where the error
    was happening. I played the footage in the preview panel thru the
    identified timestamp but it previewed perfectly without even a
    jitter in either the video or audio. Took forever to transcode so I
    doubt I'll be doing this one again anytime soon.

    Another issue I am having is with the motion menus. I burned a DVD and it does not load correctly in one of my DVD players (a Sony rdr-gx330 recorder/player). It appears to take extra time to recognize the disk, the startup (first-play) menu does not show automatically, and I must press the Main Menu button on the remote control to make it appear. Other than that it seems to play ok in that player. Also tested it in 2 other DVD players (a Panasonic and Zenith) and it worked just fine like it's supposed to in them. Also plays perfectly in my computer's DVD player/burner using Windows Media Player v10. I tried burning to Memorex +RW single-layer single-sided disks at 6, 7, & 8Mbps project settings but got the exact same result every time.
    Encore CS3 crashes fairly often during authoring, particularly during imports and when I go to edit things like calling up a timeline in the timeline panel, dragging footage into an existing timeline, or changing button links in a menu. I have also experienced seemingly random hangs even more often, probably 5 or 6 times as often as a crash. Saving the project after every little change I make, then closing and restarting Encore, allows me to make progress, albeit at a significantly reduced pace.
    Accessing Photoshop from Encore appears to be working flawlessly so far, aside from the slow load times. Encore sometimes hangs during video transcoding; repeat the process and it works OK. If I try to import a menu from the library after importing my footage files, Encore hangs at either the 50% mark or the 99% mark (it varies, even with the exact same menu selected from the library). If I import the menu first and the footage after, there are no problems. Trying to render motion menus after importing footage likewise causes Encore to hang; render before the footage import and everything works OK.
    Interestingly enough, previewing my project in Encore's monitor or preview panels works perfectly even in High quality mode; in fact it operates quite smoothly in comparison to other apps I have tried. Although the footage files with missing audio tracks obviously played silently, I have yet to experience a hang or crash during a preview or when scrubbing through a timeline in the monitor panel.
    The last problem I've seen is with the project cache files. Encore doesn't always get rid of cache files when material is deleted from the project. I was working on one project and surprisingly ran out of disk space. When I tracked it down, I found some two dozen copies of a footage file I had repeatedly imported into and deleted from the project still sitting in the cache folder. The file was an AVI of about 2 GB, so it was taking up some 40 GB of disk space for material that was no longer in the project. I tried shutting down Encore and manually deleting all the cache files, and that worked for most of them, but one 2 GB file kept reappearing in the cache folder even though it was no longer in the project. The XML file in the cache folder still had references to it, so I can only assume that was the reason it kept showing up. I eventually had to delete the project and recreate it from scratch to regain my lost disk space. This included having to sit through a 6-hour transcoding session, which pissed me off to no end.
    Encore CS3 appears on the surface to be a very powerful, feature-rich, and approachable app with a simple, easy-to-use interface, but it is awfully buggy in just about every area I have explored and still needs lots and lots of work to be really useful. Frankly, it is pretty damn annoying to have invested so much money in this suite only to find several of its apps (Encore and Premiere in particular) behaving this poorly. In my opinion, every single app in the suite exhibits lousy performance whenever you must do anything at all involving the disk, from the time you click to start it up until the time you click to close it down. Several of them are just as seriously bugged as Encore, and as a long-time Photoshop v5.5 user I expected much more from Adobe. Even Photoshop is two or three times as slow now (e.g. it takes about 20-30 seconds simply to create an empty 500x500-pixel image file, and almost a full minute to load a 30x60-pixel single-layer GIF). So far I have attempted to create 6 DVDs using Encore, and after 3 weeks of effort I have only succeeded in getting one of them onto a final disc, and that one is limping along in my DVD player. Three of those projects became so corrupted while I was working on them that I had to delete them completely and recreate them from scratch, only to find that I still couldn't get them to completion for one reason or another. Seriously disappointing.

Maybe you are looking for

  • Dunning Letter on Receivables Aging SAP B1 2007

    You run the customer receivables aging report and double-click a row on the receivables aging report window. You highlight one row; there is no letter checkbox compared to SAP 2005 A. There is no Dunning Letter PLD as well. Therefore dunning letter report

  • DRM for iPod/iTunes for multiple iPods, shared computers

    In my family there are four people who use iTunes/iPod. Two are adults with Apple IDs, two are children who do not (and will not) have credit cards. We have two shared Windows XP computers and two shared Mac OS 9 machines. We just got our third iPod.

  • Error in Control record

    Hi All,           i used to work on US payroll all these days in my IDES, but today when i was working with india specific, when i am creating the control record, and entering the payroll period as 01 2010, it is not showing it as 01.04.2010 instead

  • Gross Weight is not getting copied between Delivery and Billing Document

    Hello, I'm trying to determine freight costs based on data from the Delivery document (essentially weight). Although, when I change manually the "Total weight" in the Delivery screen (which is actually the gross weight, I assume), this new weight is not c

  • RE: Query Design

    Hi, I have a requirement like this: we have 4 KFs (KF1, KF2, KF3, KF4), characteristic calmonth, and we have variable version week (by default it is the current week). So my requirement is I need to have the 4 KFs (KF1, KF2, KF3, KF4) for the first 3 months based on the