Some thoughts on adding RAM to a G4 Sawtooth

I've gone down the list and read all I can about adding RAM to my G4. It now has two 64 MB PC100 sticks, one 256 MB PC100, and one 128 MB PC133-333. I would like to max it out, since it's been freezing up during graphics work when I have more than one Adobe application running. Am I correct in thinking that's the problem, or could there be something else going on? I have already upgraded with a 1 GHz processor and a 120 GB hard drive, and I'm running Panther. Any suggestions? Also, I read somewhere that it matters which slot you put each RAM chip in. Is there some order to this?
Power Mac G4 (AGP Graphics), PowerBook Titanium   Mac OS X (10.3.9)

Claire, in the Finder, under Apple > About This Mac > More Info > Memory, you will see the memory slot order for each stick, followed by its size. Take a look here; she is replacing memory in the first slot. Also, the size should be printed on the side of each stick.
http://www.info.apple.com/usen/cip/html/g4ge/g4gbenet-memory-cip.mov.html
Joe
Power Mac G4 Gigabit Ethernet   Mac OS X (10.3.9)  

Similar Messages

  • Some Thoughts On An OWB Performance/Testing Framework

    Hi all,
I've been giving some thought recently to how we could build a performance tuning and testing framework around Oracle Warehouse Builder. Specifically, I'm looking at ways in which we can use some of the performance tuning techniques described in Cary Millsap/Jeff Holt's book "Optimizing Oracle Performance" to profile and performance tune mappings and process flows, and to use some of the ideas put forward in Kent Graziano's Agile Methods in Data Warehousing paper http://www.rmoug.org/td2005pres/graziano.zip and Steven Feuerstein's utPLSQL project http://utplsql.sourceforge.net/ to provide an agile/test-driven way of developing mappings, process flows and modules. The aim of this is to ensure that the mappings we put together are as efficient as possible, work individually and together as expected, and are quick to develop and test.
At the moment, most people's experience of performance tuning OWB mappings is firstly to see if the mapping runs set-based rather than row-based, then perhaps to extract the main SQL statement and run an explain plan on it, then check to make sure indexes etc. are being used OK. This involves a lot of manual work, doesn't factor in the data available from the wait interface, doesn't store the execution plans anywhere, and doesn't really scale out to encompass entire batches of mappings (process flows).
    For some background reading on Cary Millsap/Jeff Holt's approach to profiling and performance tuning, take a look at http://www.rittman.net/archives/000961.html and http://www.rittman.net/work_stuff/extended_sql_trace_and_tkprof.htm. Basically, this approach traces the SQL that is generated by a batch file (read: mapping) and generates a file that can be later used to replay the SQL commands used, the explain plans that relate to the SQL, details on what wait events occurred during execution, and provides at the end a profile listing that tells you where the majority of your time went during the batch. It's currently the "preferred" way of tuning applications as it focuses all the tuning effort on precisely the issues that are slowing your mappings down, rather than database-wide issues that might not be relevant to your mapping.
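To make this concrete, here is a minimal sketch of switching an extended SQL trace on and off by hand for the current session; the trace file identifier is just a label of our choosing, and level 12 is the variant that captures both wait events and bind values:

-- Tag the trace file so it is easy to find in user_dump_dest
ALTER SESSION SET tracefile_identifier = 'owb_mapping_trace';
-- Event 10046 at level 12 = SQL trace plus waits plus bind variable values
ALTER SESSION SET EVENTS '10046 trace name context forever, level 12';
-- ... run the mapping or batch SQL here ...
-- Switch tracing back off
ALTER SESSION SET EVENTS '10046 trace name context off';

The resulting trace file can then be fed through tkprof or a profiler to produce the response time profile described above.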
    For some background information on agile methods, take a look at Kent Graziano's paper, this one on test-driven development http://c2.com/cgi/wiki?TestDrivenDevelopment , this one http://martinfowler.com/articles/evodb.html on agile database development, and the sourceforge project for utPLSQL http://utplsql.sourceforge.net/. What this is all about is having a development methodology that builds in quality but is flexible and responsive to changes in customer requirements. The benefit of using utPLSQL (or any unit testing framework) is that you can automatically check your altered mappings to see that they still return logically correct data, meaning that you can make changes to your data model and mappings whilst still being sure that it'll still compile and run.
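To give a flavour of what this could look like for a mapping, here is a minimal sketch of a unit test package in the utPLSQL v1 style; the package, table and test names are invented for illustration, and the exact assertion API should be checked against the utPLSQL documentation:

-- Hypothetical utPLSQL v1 test package for a customer dimension mapping.
-- The framework discovers and runs procedures prefixed with ut_.
CREATE OR REPLACE PACKAGE ut_load_customers AS
  PROCEDURE ut_setup;
  PROCEDURE ut_teardown;
  PROCEDURE ut_row_counts;
END ut_load_customers;
/
CREATE OR REPLACE PACKAGE BODY ut_load_customers AS
  PROCEDURE ut_setup IS
  BEGIN
    NULL;  -- stage known test data into the source table here
  END;
  PROCEDURE ut_teardown IS
  BEGIN
    NULL;  -- remove the test data here
  END;
  PROCEDURE ut_row_counts IS
    l_src NUMBER;
    l_tgt NUMBER;
  BEGIN
    SELECT COUNT(*) INTO l_src FROM stg_customers;  -- hypothetical source
    SELECT COUNT(*) INTO l_tgt FROM dim_customers;  -- hypothetical target
    utAssert.eq('every staged row reaches the dimension', l_tgt, l_src);
  END;
END ut_load_customers;
/

Running utplsql.test('load_customers') after each change to the mapping would then re-check the logic automatically, which is exactly the safety net described above.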
    Observations On The Current State of OWB Performance Tuning & Testing
    At present, when you build OWB mappings, there is no way (within the OWB GUI) to determine how "efficient" the mapping is. Often, when building the mapping against development data, the mapping executes quickly and yet when run against the full dataset, problems then occur. The mapping is built "in isolation" from its effect on the database and there is no handy tool for determining how efficient the SQL is.
OWB doesn't come with any methodology or testing framework, and so apart from checking that the mapping has run, and that the number of rows inserted/updated/deleted looks correct, there is nothing really to tell you whether there are any "logical" errors. Also, there is no OWB methodology for integration testing, unit testing, or any other sort of testing, and we need to put one in place. Note - OWB does come with auditing, error reporting and so on, but there's no framework for guiding the user through a regime of unit testing, integration testing, system testing and so on, which I would imagine more complete developer GUIs come with. Certainly there's no built-in ability to use testing frameworks such as utPLSQL, or a part of the application that lets you record whether a mapping has been tested, and changes the test status of mappings when you make changes to ones that they are dependent on.
OWB is effectively a code generator, and this code runs against the Oracle database just like any other SQL or PL/SQL code. There is a whole world of information and techniques out there for tuning SQL and PL/SQL, and one particular methodology that we quite like is the Cary Millsap/Jeff Holt "Extended SQL Trace" approach that uses Oracle diagnostic events to find out exactly what went on during the running of a batch of SQL commands. We've been pretty successful using this approach to tune customer applications and batch jobs, and we'd like to use this, together with the "Method R" performance profiling methodology detailed in the book "Optimizing Oracle Performance", as a way of tuning our generated mapping code.
Whilst we want to build performance and quality into our code, we also don't want to overburden developers with an unwieldy development approach, because we know what will happen: after a short amount of time, it won't get used. Given that we want this framework to be used for all mappings, it's got to be easy to use, cause minimal overhead, and have results that are easy to interpret. If at all possible, we'd like to use some of the ideas from agile methodologies such as eXtreme Programming, SCRUM and so on to build in quality but minimise paperwork.
    We also recognise that there are quite a few settings that can be changed at a session and instance level, that can have an effect on the performance of a mapping. Some of these include initialisation parameters that can change the amount of memory assigned to the instance and the amount of memory subsequently assigned to caches, sort areas and the like, preferences that can be set so that indexes are preferred over table scans, and other such "tweaks" to the Oracle instance we're working with. For reference, the version of Oracle we're going to use to both run our code and store our data is Oracle 10g 10.1.0.3 Enterprise Edition, running on Sun Solaris 64-bit.
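As a concrete illustration, tweaks of this sort can be applied without touching the mapping code at all; the values below are placeholders rather than recommendations:

-- Session-level: switch to manual workarea sizing and enlarge the sort area
ALTER SESSION SET workarea_size_policy = MANUAL;
ALTER SESSION SET sort_area_size = 104857600;  -- 100 MB
-- Instance-level (10g, spfile in use): resize memory targets dynamically
ALTER SYSTEM SET pga_aggregate_target = 1024M SCOPE = BOTH;
ALTER SYSTEM SET db_cache_size = 512M SCOPE = BOTH;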
    Some initial thoughts on how this could be accomplished
- Put in place some method for automatically / easily generating explain plans for OWB mappings (issue - this is only relevant for mappings that are set-based, and what about pre- and post-mapping triggers?) - see the EXPLAIN PLAN sketch after this list
    - Put in place a method for starting and stopping an event 10046 extended SQL trace for a mapping
    - Put in place a way of detecting whether the explain plan / cost / timing for a mapping changes significantly
    - Put in place a way of tracing a collection of mappings, i.e. a process flow
    - The way of enabling tracing should either be built in by default, or easily added by the OWB developer. Ideally it should be simple to switch it on or off (perhaps levels of event 10046 tracing?)
    - Perhaps store trace results in a repository? reporting? exception reporting?
- At an instance level, come up with some stock recommendations for instance settings
- Identify the set of instance and session settings that are relevant for ETL jobs, and determine what effect changing them has on the ETL job
- Put in place a regime that records key instance indicators (STATSPACK / ASH) and allows reports to be run / exceptions to be reported
- Incorporate any existing "performance best practices" for OWB development
- Define a lightweight regime for unit testing (as per agile methodologies) and a way of automating it (utPLSQL?) and of recording the results so we can check the status of dependent mappings easily
- Other ideas around testing?
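For the explain plan item above, a minimal sketch of what "automatically generating" could mean, using the standard DBMS_XPLAN package against the kind of set-based statement OWB generates (the INSERT itself is a placeholder):

-- Capture the plan for the main INSERT ... SELECT behind a mapping
EXPLAIN PLAN SET STATEMENT_ID = 'MAP_CUSTOMERS'
  FOR INSERT INTO dim_customers SELECT * FROM stg_customers;
-- Render the stored plan; the output could be copied into repository tables
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY('PLAN_TABLE', 'MAP_CUSTOMERS'));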
    Suggested Approach
- For mapping tracing and generation of explain plans, a pre- and post-mapping trigger that turns extended SQL trace on and off, places the trace file in a predetermined spot, formats the trace file and dumps the output to repository tables (see the sketch after this list).
    - For process flows, something that does the same at the start and end of the process. Issue - how might this conflict with mapping level tracing controls?
    - Within the mapping/process flow tracing repository, store the values of historic executions, have an exception report that tells you when a mapping execution time varies by a certain amount
- Get the standard set of preferred initialisation parameters for a DW, and use these as the start point for the stock recommendations. Identify which ones have an effect on an ETL job.
- Identify the standard steps Oracle recommends for getting the best performance out of OWB (workstation RAM etc.) - see OWB Performance Tips http://www.rittman.net/archives/001031.html and Optimizing Oracle Warehouse Builder Performance http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - Investigate what additional tuning options and advisers are available with 10g
    - Investigate the effect of system statistics & come up with recommendations.
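A minimal sketch of the tracing on/off pair, assuming a 10g target and using DBMS_MONITOR against the current session; the procedure names are invented here, and in OWB they would be wired in as pre- and post-mapping operators:

CREATE OR REPLACE PROCEDURE trace_mapping_on (p_map_name IN VARCHAR2) IS
BEGIN
  -- Stamp the trace file name so it can be matched back to the mapping
  EXECUTE IMMEDIATE
    'ALTER SESSION SET tracefile_identifier = ''' || p_map_name || '''';
  -- Equivalent to event 10046 level 12: waits plus bind variable values
  DBMS_MONITOR.SESSION_TRACE_ENABLE(waits => TRUE, binds => TRUE);
END;
/
CREATE OR REPLACE PROCEDURE trace_mapping_off IS
BEGIN
  DBMS_MONITOR.SESSION_TRACE_DISABLE;
  -- A later step would pick the file up from user_dump_dest, run it
  -- through tkprof or a profiler, and load the results into the repository
END;
/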
    Further reading / resources:
- "Diagnosing Performance Problems Using Extended Trace", Cary Millsap
    http://otn.oracle.com/oramag/oracle/04-jan/o14tech_perf.html
    - "Performance Tuning With STATSPACK" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-sep/index.html?o50tun.html
    - "Performance Tuning with Statspack, Part II" Connie Dialeris and Graham Wood
    http://otn.oracle.com/deploy/performance/pdf/statspack_tuning_otn_new.pdf
    - "Analyzing a Statspack Report: A Guide to the Detail Pages" Connie Dialeris and Graham Wood
    http://www.oracle.com/oramag/oracle/00-nov/index.html?o60tun_ol.html
    - "Why Isn't Oracle Using My Index?!" Jonathan Lewis
    http://www.dbazine.com/jlewis12.shtml
    - "Performance Tuning Enhancements in Oracle Database 10g" Oracle-Base.com
    http://www.oracle-base.com/articles/10g/PerformanceTuningEnhancements10g.php
    - Introduction to Method R and Hotsos Profiler (Cary Millsap, free reg. required)
    http://www.hotsos.com/downloads/registered/00000029.pdf
    - Exploring the Oracle Database 10g Wait Interface (Robin Schumacher)
    http://otn.oracle.com/pub/articles/schumacher_10gwait.html
    - Article referencing an OWB forum posting
    http://www.rittman.net/archives/001031.html
    - How do I inspect error logs in Warehouse Builder? - OWB Exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case10.pdf
    - What is the fastest way to load data from files? - OWB exchange tip
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case1.pdf
    - Optimizing Oracle Warehouse Builder Performance - Oracle White Paper
    http://www.oracle.com/technology/products/warehouse/pdf/OWBPerformanceWP.pdf
    - OWB Advanced ETL topics - including sections on operating modes, partition exchange loading
    http://www.oracle.com/technology/products/warehouse/selfserv_edu/advanced_ETL.html
    - Niall Litchfield's Simple Profiler (a creative commons-licensed trace file profiler, based on Oracle Trace Analyzer, that displays the response time profile through HTMLDB. Perhaps could be used as the basis for the repository/reporting part of the project)
    http://www.niall.litchfield.dial.pipex.com/SimpleProfiler/SimpleProfiler.html
- Welcome to the utPLSQL Project - a PL/SQL unit testing framework by Steven Feuerstein. Could be useful for automating the process of unit testing mappings.
    http://utplsql.sourceforge.net/
    Relevant postings from the OTN OWB Forum
    - Bulk Insert - Configuration Settings in OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=291269&tstart=30&trange=15
    - Default Performance Parameters
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=213265&message=588419&q=706572666f726d616e6365#588419
    - Performance Improvements
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270350&message=820365&q=706572666f726d616e6365#820365
    - Map Operator performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=238184&message=681817&q=706572666f726d616e6365#681817
    - Performance of mapping with FILTER
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=273221&message=830732&q=706572666f726d616e6365#830732
    - Poor mapping performance
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=275059&message=838812&q=706572666f726d616e6365#838812
    - Optimizing Mapping Performance With OWB
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=269552&message=815295&q=706572666f726d616e6365#815295
    - Performance of the OWB-Repository
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=66271&message=66271&q=706572666f726d616e6365#66271
    - One large JOIN or many small ones?
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=202784&message=553503&q=706572666f726d616e6365#553503
    - NATIVE PL SQL with OWB9i
    http://forums.oracle.com/forums/thread.jsp?forum=57&thread=270273&message=818390&q=706572666f726d616e6365#818390
    Next Steps
Although this is something that I'll be progressing with anyway, I'd appreciate any comment from existing OWB users as to how they currently perform performance tuning and testing. Whilst these are perhaps two distinct subject areas, they can be thought of as the core of an "OWB Best Practices" framework, and I'd be prepared to write the results up as a freely downloadable whitepaper. With this in mind, do you have any existing best practices for tuning or testing, have you tried using SQL trace and TKPROF to profile mappings and process flows, or have you used a unit testing framework such as utPLSQL to automatically test the set of mappings that make up your project?
For any feedback, add it to this forum posting or send it directly to me at [email protected]. I'll report back on a proposed approach in due course.

    Hi Mark,
    interesting post, but I think you may be focusing on the trees, and losing sight of the forest.
    Coincidentally, I've been giving quite a lot of thought lately to some aspects of your post. They relate to some new stuff I'm doing. Maybe I'll be able to answer in more detail later, but I do have a few preliminary thoughts.
1. 'How efficient is the generated code' is a perennial topic. There are still some people who believe that a code generator like OWB cannot be in the same league as hand-crafted SQL. I answered that question quite definitively: "We carefully timed execution of full-size runs of both the original code and the OWB versions. Take it from me, the code that OWB generates is every bit as fast as the very best hand-crafted and fully tuned code that an expert programmer can produce."
    The link is http://www.donnapkelly.pwp.blueyonder.co.uk/generated_code.htm
    That said, it still behooves the developer to have a solid understanding of what the generated code will actually do, such as how it will take advantage of indexes, and so on. If not, the developer can create such monstrosities as lookups into an un-indexed field (I've seen that).
2. The real issue is not how fast any particular generated mapping runs, but whether or not the system as a whole is fit for purpose. Most often, that means: does it fit within its batch update window? My technique is to dump the process flow into Microsoft Project, and then to add the timings for each process. That creates a critical path, which I can then visually inspect for any bottleneck processes. I usually find that there are no more than one or two dogs. I'll concentrate on those, fix them, and re-do the flow timings. I would add this: the dogs I have seen, I have invariably replaced. They were just garbage. They did not need tuning at all - just scrapping.
    Gee, but this whole thing is minimum effort and real fast! I generally figure that it takes maybe a day or two (max) to soup up system performance to the point where it whizzes.
Fact is, I don't really care whether there are a lot of sub-optimal processes. All I really care about is the performance of the system as a whole. This technique seems to work for me. 'Course, it depends on architecting the thing properly in the first place. Otherwise, no amount of tuning is going to help worth a darn.
    Conversely (re. my note about replacing dogs) I do not think I have ever tuned a piece of OWB-generated code. Never found a need to. Not once. Not ever.
That's not to say I do not recognise the value of playing with deployment configuration parameters. Obviously, I set auditing=none and operating mode=set based, and sometimes I play with a couple of different target environments to fool around with partitioning, for example. Nonetheless, if it is not a switch or a knob inside OWB, I do not touch it. This is in line with my diktat that you shall use no other tool than OWB to develop data warehouses. (And that includes all documentation!) (OK, I'll accept MS Project.)
Finally, you raise the concept of a 'testing framework'. This is a major part of what I am working on at the moment, and it is a tough one. Clearly, the developer must unit test each mapping in a design-model-deploy-execute cycle, paying attention to both functionality and performance. When the developer is satisfied, that mapping will be marked as 'done' in the project workbook. Mappings will form part of a stream, executed as a process flow. Each process flow will usually terminate in a dimension, a fact, or an aggregate. Each process flow will be tested as an integrated whole. There will be test strategies devised, and test cases constructed. There will finally be system tests, to verify the validity of the system as a production-grade whole (stuff like recovery/restart, late-arriving data, and so on).
For me, I use EDM (TM). That's the methodology I created (and trademarked) twenty years ago: Evolutionary Development Methodology (TM). This is a spiral methodology based around prototyping cycles within Stage cycles within Release cycles. For OWB, a Stage would consist (say) of a Dimensional update. What I am trying to do now is to graft this onto a traditional waterfall methodology, and I am having the same difficulties I had when I tried to do it back then.
    All suggestions on how to do that grafting gratefully received!
To sum up, I'm kinda at a loss as to why you want to go deep into OWB-generated code performance stuff. Jeepers, architect the thing right, and the code runs fast enough for anyone. I've worked on ultra-large OWB systems, including validating the largest data warehouse in the UK. I've never found any value in 'tuning' the code. What I'd like you to comment on is this: what will it buy you?
    Cheers,
    Donna
    http://www.donnapkelly.pwp.blueyonder.co.uk

  • Inspiron 530 - Adding RAM

    Hello,
I would like to purchase some memory for my Inspiron 530, from here: http://accessories.euro.dell.com/sna/category.aspx?c=uk&l=en&s=dhs&cs=ukdhs1&category_id=4325&mfgpid=1226170&chassisid=-1 - the 1 GB module (28 pounds).
I have read in the documentation ( http://support.dell.com/support/edocs/systems/inspd530/en/OM/HTML/parts.htm#wp1580857 ) that when I install memory I need to remove the PCI Express x16 card... I don't understand. I thought adding RAM was simple: all you had to do is open the case, insert the RAM modules, fit them in properly, close the case, start the PC, and it's done.
I know that my Inspiron 530 can only take 4 GB of RAM. I have 2 GB at the moment.
    Is there something I need to know? Is there something different?

  • Adding RAM to 1.6 GHz G5

    I just finished a rather strange process of adding RAM to my G5 which I thought I would share, in case it helps others:
    First of all, I have a 1.6 GHz CPU machine, which has 4 DIMM slots. According to the Setup Guide, since there are only 4 slots (not 8), you need to use 333 MHz, PC2700 DIMMs (not 400 MHz, PC3200 DIMMs).
    When I first got my G5, it came with 256 MB of RAM (2 x 128 MB 333 MHz, PC2700). About a year ago, I added 512 MB (2 x 256 MB 333 MHz, PC2700), bringing the total up to 768 MB. This addition went smoothly, without any problems.
Recently, I decided to replace the original 256 MB of RAM with two 512 MB modules, bringing the total up to 1.5 GB, so I ordered "1 GB DDR400 PC3200 2x512 DIMMS" (Part # M9653G/A) directly from Apple. Note that the RAM Apple currently sells is 400 MHz, PC3200, not 333 MHz, PC2700. In fact, before placing the order online, I called Apple to ask about this, and the person I talked to assured me that this would not be a problem, and that my 1.6 GHz machine would work with the 400 MHz, PC3200, despite what the manual says, so I ordered them.
    Upon receipt, I removed the original pieces, installed the new ones, and my G5 would not acknowledge their presence. It said I only had 512 MB. I called Apple to report the problem, and they suspected it was simply faulty RAM, so they sent replacements. When I got them, I installed them, and got the same results. I called Apple to tell them it still wasn't working, and they had no idea why. Once again, I pointed out that they were sending me 400 MHz, PC3200, not 333 MHz, PC2700, and they still insisted that this was not the problem, but I started to suspect even more that it was, especially after trying to upgrade twice.
In order to test this theory, I ordered RAM from MacMall. This time I ordered 1 GB, in the form of 2 x 512 MB, 333 MHz, PC2700 (made by Kingston, Manufacturer Part No. KTA-G5333/1G). When I received these, I moved the original replacement RAM up into the top two slots, where the original modules were, and placed the two new ones in the bottom two slots. When I tried to turn the computer back on, it wouldn't. The light on the front would flash once, then pause, then flash once again, then pause, and so on. When I checked the manual's troubleshooting section, it described what two, three, four, and five or six flashes meant, but not one. URGH!
    At this point, I had no idea what was wrong, so I took my G5 to my local Apple store to ask them. The guy there had the idea of putting one of the original replacement RAM in one of the two top slots, and one of the new replacement RAM in the other top slot. Then he put one of the original ones in one of the two bottom slots, and one of the new ones in the other bottom slot. Believe it or not, not only did the computer turn back on, but it also recognized the new total of 1.5 GB of RAM!!!
    This was such a strange process that I thought I would share it, in case someone is also currently going through this process, or wants to in the future.
    Hope this helps, and I also hope I've made it clear exactly what happened. If anything isn't clear, please let me know, and I'll try again.
    Trent
    Power Mac G5   Mac OS X (10.4.4)  

    PC3200 is fine for that system. The problem was that you were installing the RAM in the wrong slots.
    The RAM must be installed in matched pairs, one in the top, one in the bottom, working out from the middlemost pair.
    Viewed from the side
    ======== Slot J14
    ======== Slot J12
    ======== Slot J11
    ======== Slot J13
    So, one pair of identical parts goes in J11/J12, and another pair goes in J13/J14. Doesn't matter if the bigger pair is first or second.
    This wasn't in the manual for your Mac?
    Quad G5 2.5Ghz 4.5GB 2x250G, PB 15" 1.5Ghz,80G,1.5G   Mac OS X (10.4.4)  

  • Does adding RAM help solve iMovie4 jerky playback

I have been reading threads about jerky playback in iMovie4 in large projects, after adding music from iTunes etc. My wife has a monster project, and playback during editing has become very erratic. I have emptied the cache, repaired the hard drive, repaired permissions etc. I have not tossed the iMovie preferences. I have 23 GB of space on the internal HD and a 250 GB external FireWire HD that I bought after she started the project on the internal drive. I would like to know from you experts: 1. Would it help to move the project to the larger external HD? 2. Would adding RAM help (currently an iMac G5 1.8 GHz with 768 MB of RAM)? 3. Is there any way to tell if the project will export to iDVD smoothly if it continues to look terrible in iMovie playback? I hate to go through a few hours of rendering for nothing. Thanks so much for your help. I always find good and useful info in discussions.
    Mike Ryan

Michael, there are some very useful applications that I highly recommend to keep your Mac running smoothly. As a defragmenter I use Alsoft's DiskWarrior, which also has a very good directory repair feature. Check out also Drive Genius, which has a host of useful features. Also get the iMovie file fixers, invaluable free tools that allow iMovie to import certain audio and photo file formats that are greyed out, and that have other features as well; I believe I got mine from Mr. Lennart Thelander in previous posts here. Cache Out is an excellent application for clearing out cache files. I also advise you to do a regular permission repair from Disk Utility, which is in the Utilities folder on your Mac. These are basically my must-have applications. As for copying your iMovie project to your external drive, just drag your entire project folder (the blue one with the media and star folder(s) in it) directly onto the external hard drive. I find that iMovie is one application that requires a clean (defragmented) drive with ample space to spare. I defrag my drives if I trash files often from them, or if I notice erratic and jerky performance or slowness in response. I hope that some of my suggestions will help you in your editing.

  • Does adding RAM help increase my laptop speed significantly?

    I have a 667MHz Powerbook G4 with 512MB RAM. It's getting really slow lately, especially after installing MacOS 10.4 and Adobe CS2.
    I tried not to open too many programs at a time, but it doesn't seem to help much.
Does anybody have the same problem with the newer PowerBook or MacBook Pro? I'm wondering if I need a new laptop, or just some extra RAM.

    Hi shintaoetama,
    first of all: WELCOME TO THE DISCUSSIONS!
Although adding RAM may increase speed a little, running some routine system maintenance will certainly speed things up much more radically: MacOS X 10.3/10.4: System maintenance
    If this answered your question please consider granting some stars: Why reward points?

  • Adding RAM - which slot has original 64?

When I bought my Flower Power iMac 500 MHz I added RAM. It was originally supposed to come with two 64 MB modules, but I had a 128 MB module installed, bringing it up to 192 MB. Thought back then this was enough : ) . I want to add RAM but don't know which one is the 64 MB module. Would it be the bottom or the top one? I tried to read the labels and it doesn't seem to say on them. One does have a blue sticky dot on it; I was thinking this is the larger one. It is on the top when you are looking into the slot from the back. Anyone know if there is any way to tell? Don't want to take the wrong one out.

    Hello Kathy,
Go into System Profiler and click on Memory. It will show the two slots. On my 400 MHz machine the two slots are J13 and J14; J13 is the TOP slot. Hope this helps.
    wtk

I have found that my MacBook Pro 15-inch (Late 2011) is slower than my Windows 7 PC for certain downloads and when working with iPhoto. Would adding RAM be helpful in speeding up this process? Currently I have the stock 4 GB of RAM and a 500 GB hard drive.

I have found that my Late 2011 15-inch MacBook Pro is slower than I expected with certain downloads and when working with iPhoto, compared to my previous Windows 7 PC. Would adding RAM be of any value in speeding up the process?

    how to tell if your mac needs more ram

  • My iTunes won't open. I recently added RAM to my computer. Other than that, nothing is different. Help!

    My iTunes won't open. I recently added RAM to my computer. Other than that, nothing is different. Help!

    Hello Kcook2484
Download iTunes again and reinstall it to troubleshoot:
    iTunes 11.1.2
    http://support.apple.com/kb/dl1614
    Regards,
    -Norm G.

  • HT201774 How do I delete excess email messages from my iPhone.  There are 300 messages listed, and every time I delete some, more are added from old messages.  How do I limit the number of messages that can appear on my phone?  How can I delete messages i

    How do I delete excess email messages from my iPhone.  There are 300 messages listed, and every time I delete some, more are added from old messages.  How do I limit the number of messages that can appear on my phone?  How can I delete messages in bulk,

    You can't.
    All photos transferred from your computer are stored in the Photo Library. The photos in the album or albums below are not duplicates - they include a pointer to the original photos stored in the Photo Library. This way you can view the photos in a particular album only by selecting the album, or you can view all photos available in all albums by selecting Photo Library.
    Just as with an iTunes playlist. A song cannot be in an iTunes playlist unless the song is in the main iTunes library. Placing a song in a playlist does not duplicate the song. Remove the main iTunes library or the songs from the main iTunes library that are in a playlist and the songs will no longer be in the playlist either. This way you can listen to the songs in the playlist only by selecting the playlist, or all songs in your entire iTunes library by selecting Music under Library in the iTunes source list.
The same goes for iPhoto on a Mac. A photo cannot be in an iPhoto Event or Album unless the photo is in the main iPhoto library. Placing a photo in an Event or Album does not duplicate the photo. This way you can view the photos in an iPhoto Event or Album only by selecting the Event or Album, or all photos in all Events or Albums.

  • Just some thoughts

    Just some thoughts.
Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically; why not add that process to the perpetual license for a nominal yearly fee? On the other hand, why can't a user subscribe to CC for a month to get the latest version and then go on the web to get a crack?
I also own Elements 10 (unless they move that to a subscription too). It has lots of the functions CS6 has for photo enhancement, and I found some plugins that really enhance Elements. E.g., ElementsXXL adds new menu items, icons, buttons, key shortcuts and dialogs, so they seamlessly integrate into the user interface of Photoshop Elements. ElementsXXL bridges the gap between Photoshop Elements and Photoshop and greatly enhances the image editing experience in Photoshop Elements.
    I am sure other plugins will appear to make Elements more like Photoshop.

    "Adobe says that piracy is one of the reasons for going to the cloud. CC has to verify the subscription periodically why not add that process to the perpetual license for a nominal yearly fee."
Why on earth would anyone want to pay a "nominal yearly fee" for another nuisance created by Adobe? They can already shut down a perpetual license holder's software if they pull the plug on the activation servers. Don't you know that their servers check our software now as it is?
    Just recently with the release of the CS2 software to previous owners, many people could no longer use their already-installed CS2 software because Adobe's CS2 servers were shut down.  What does that tell you?
    I'm not going the way of cruising the internet for cracks.  That's for disillusioned kids who think everything should be free.  When my CS6 software no longer works I'm going with another company.  Probably Corel for some of the programs.  CorelDRAW was always better than Illustrator anyway.  Industry standard doesn't always mean it's great.  It just means everyone is using it and they expect you to do as well.  Personally, I think Macromedia made better software before Adobe took them over and MM software was a lot cheaper.

Please note that I am seeking some assistance with adding the following resource: A maximum of 4 resources can be added at an additional cost of $500 per day per resource.

Hi, please note that I am seeking some assistance with adding the following resource to a task in Microsoft Project 2010:
    A maximum of 4 resources can be added at an additional cost of $500 per day per resource.

    Hi Clemence,
Your post has very few details, but as far as I understand, you want to add 4 resources to a task, each resource costing $500 a day. Since it is not mentioned whether these are generic resources (skill or role) or named resources, I'll assume they are generic resources. Based on this understanding, here is a proposal:
    Go to the resource sheet,
    Create a work generic resource ("type" field set to "work", "generic" field to "yes"), typing in the resource name field the skill name,
In the max units field, enter 400% for 4 resources (or just 4 if your Project 2010 is configured to display units in decimals),
Since the rate is an hourly rate, do the calculation: $500 per day / 8 hours a day = $62.50 per hour,
    Go back to the Gantt Chart, split the window (click "detail" in the view tab of the ribbon),
    Select your task and assign the newly created resource with a given % units.
    Hope this helps.
    Guillaume Rouyre - MBA, MCP, MCTS

  • Adding RAM for Oracle 9i

    Hi,
I have an Oracle 9iR2 database installed on my server (32-bit processor, 4 GB RAM), and I want to add more RAM for the Oracle database. Is this possible? And do I need to change any settings on the existing Oracle instance so it makes use of the newly added RAM (swap space)?
    Thank you.
    ~ vincent.

    Hello,
Yes, you can add more RAM without any problem. Take into account that if the OS is 32-bit Windows, it will only see about 3.3 GB (unless you use PAE). Linux has no problem with that.
You don't have to make any changes to the database beforehand; just add the RAM. After you've added it, the database will simply continue using the same amount of memory as before. To change this, you will have to modify the memory-related parameters (SGA, ...).
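For example, something along these lines, with placeholder sizes; in 9i, sga_max_size is static and only takes effect after an instance restart, while the individual pools can then be resized dynamically up to that ceiling (assuming an spfile is in use):

-- Raise the overall SGA ceiling (requires an instance restart)
ALTER SYSTEM SET sga_max_size = 2048M SCOPE = SPFILE;
-- After the restart, grow the individual pools within the new ceiling
ALTER SYSTEM SET db_cache_size = 1024M SCOPE = BOTH;
ALTER SYSTEM SET shared_pool_size = 512M SCOPE = BOTH;
ALTER SYSTEM SET pga_aggregate_target = 512M SCOPE = BOTH;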
    Regards

  • Added RAM and now Applications won't open

I just added RAM (8 GB) to my computer and now my InDesign will not open.
I did the Opt+Apple+P+R (PRAM) reset.
Do I need to do something after upgrading RAM so things run smoothly?

    Will the computer start up normally? If not, is the led on the front blinking? If it does start up normally how much RAM does it think it has?
    Have you tried going back to the original RAM?

  • Speed issue after adding RAM

    Hello all,
I recently upgraded the RAM in my iMac from 4 GB to 16 GB and see practically no difference. My FCPX version is 10.0.9, and rendering takes forever. Am I missing something? Do I need to reconfigure the iMac?
    Any suggestion is highly appreciated

    Am I missing something?
    Yes. Adding RAM may not provide any noticeable change.
    About OS X Memory Management and Usage
    Using Activity Monitor to read System Memory & determine how much RAM is used
    Understanding top output in the Terminal
The amount of available RAM for applications is the sum of free RAM and inactive RAM. This will change as applications are opened and closed, or change from active to inactive status. The Swap figure represents an estimate of the total amount of swap space required for VM if used, but does not necessarily indicate the actual size of the existing swap file. If you are really in need of more RAM, that will be indicated by how frequently the system uses VM. If you open the Terminal and run the top command at the prompt, you will find information reported for pageins and pageouts; pageouts is the important figure. If the value in the parentheses is 0 (zero), then OS X is not making instantaneous use of VM, which means you have adequate physical RAM for the system with the applications you have loaded. If the figure in parentheses is running positive and your hard drive is constantly being used (thrashing), then you need more physical RAM.
Adding RAM only makes it possible to run more programs concurrently. It doesn't speed up the computer or make games run faster. What it can do is prevent the system from having to use disk-based VM when it runs out of RAM because you are trying to run too many applications concurrently, or are using applications that are extremely RAM dependent. It will improve the performance of applications that run mostly in RAM, and will help when loading programs.
