iMac & MBP - The Efficient Way

Hi,
I have an old 20" iMac (early 2008) and a new 17" MBP (mid 2010).
I'm thinking about upgrading my setup but have no clue how to go about it.
Should I get a new 27" iMac and use it alongside my 17" MBP (but then I wouldn't be making use of both my desktop and my laptop, which seems like a waste), or should I get a 27" monitor and connect it to my 17" MBP?
What should I do to get the most efficient use out of my desktop (iMac) and laptop (MBP)?
Thanks in advance.
Ed

Hi, clintonfrombirmingham,
Check out the following link:
iMac & MacPro
https://discussions.apple.com/thread/4790368
Thanks.
Ed

Similar Messages

  • What is the efficient way of working with tree information in a table?

    Hi all,
    I have to design a database to store, access, add, or delete tree information (the tree nodes). What is an efficient way of accomplishing this?
    Let's assume the information to be stored in the table is parent, child, and type (optional). The queries should be very generic (they should work with any database).
    Does anybody have any suggestions? I have to work with large amounts of data.
    A quick response is highly appreciated.
    thanks in advance,
    rahul

    Did you check out this link?
    http://www.intelligententerprise.com/001020/celko1_1.shtml
    Joe Celko gives some really interesting ways to implement trees in an RDBMS.
    Best wishes
    Anubrata
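
    For what it's worth, here is a minimal JDBC sketch of the plain adjacency-list approach from the question (the table and column names tree_nodes/parent/child/type are assumptions, and Celko's article describes more scalable alternatives such as nested sets):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;

    public class TreeDao {
        private final Connection conn;

        public TreeDao(Connection conn) {
            this.conn = conn;
        }

        // Insert one parent-child edge; plain SQL that should work on most databases.
        public void addNode(String parent, String child, String type) throws Exception {
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO tree_nodes (parent, child, type) VALUES (?, ?, ?)")) {
                ps.setString(1, parent);
                ps.setString(2, child);
                ps.setString(3, type);
                ps.executeUpdate();
            }
        }

        // Return the direct children of a node; deeper traversals repeat this query per level,
        // which is why nested sets or materialized paths scale better for very large trees.
        public List<String> children(String parent) throws Exception {
            List<String> result = new ArrayList<String>();
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT child FROM tree_nodes WHERE parent = ?")) {
                ps.setString(1, parent);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        result.add(rs.getString("child"));
                    }
                }
            }
            return result;
        }
    }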

  • What is the efficient way of inserting some bytes into a file?

    Hello, everyone:
    If I want to insert some bytes into a file (for example, insert the bytes before all the original content of the file, or append the bytes to the file), and the size of the original file is very big, I am wondering what the most efficient way is. Where can I get some sample code?
    regards,
    George

    Thanks, DrClap.
    I have tried your method and you are correct. I have written a simple program which can be used to insert "Hello World " at the start of a file ("c:\\temp\\input.txt"), and I have verified that it works. Please take a look at whether it is correct and whether there is a more efficient way.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class TestDriver {
        public static void main(String[] args) {
            byte[] back_buffer = new byte[1024];
            byte[] write_buffer = new byte[1024];
            // Seed the write buffer with the bytes to be inserted at the start of the file.
            byte[] hello = "Hello World ".getBytes();
            System.arraycopy(hello, 0, write_buffer, 0, hello.length);
            int write_buffer_length = hello.length;
            int count = 0;
            FileInputStream fis = null;
            FileOutputStream fos = null;
            try {
                fis = new FileInputStream(new File("c:\\temp\\input.txt"));
                fos = new FileOutputStream(new File("c:\\temp\\output.txt"));
                // Alternate buffers: write whatever is pending, then hold back the block just read.
                while ((count = fis.read(back_buffer)) >= 0) {
                    fos.write(write_buffer, 0, write_buffer_length);
                    System.arraycopy(back_buffer, 0, write_buffer, 0, count);
                    write_buffer_length = count;
                }
                // Write the last block.
                fos.write(write_buffer, 0, write_buffer_length);
                fis.close();
                fos.close();
                // Copy the content back into the original file.
                fis = new FileInputStream(new File("c:\\temp\\output.txt"));
                fos = new FileOutputStream(new File("c:\\temp\\input.txt"));
                while ((count = fis.read(back_buffer)) >= 0) {
                    fos.write(back_buffer, 0, count);
                }
                fis.close();
                fos.close();
                // Remove the temporary file.
                File f = new File("c:\\temp\\output.txt");
                f.delete();
            } catch (IOException e) {
                // FileNotFoundException is a subclass of IOException, so one catch block is enough.
                e.printStackTrace();
            } finally {
                try {
                    if (fis != null) fis.close();
                } catch (IOException e1) {
                    e1.printStackTrace();
                }
                try {
                    if (fos != null) fos.close();
                } catch (IOException e2) {
                    e2.printStackTrace();
                }
            }
        }
    }
    regards,
    George
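
    For reference, a shorter sketch of the same prepend-via-temp-file idea using java.nio.file (available from Java 7 onwards); the c:\temp\input.txt path is just the example path from the post above:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class PrependExample {
        public static void main(String[] args) throws IOException {
            Path input = Paths.get("c:\\temp\\input.txt");   // example path from the post
            Path temp = Files.createTempFile("prepend", ".tmp");
            try (OutputStream out = Files.newOutputStream(temp)) {
                out.write("Hello World ".getBytes());        // bytes to insert at the front
                Files.copy(input, out);                      // then stream the original file after them
            }
            // Replace the original file with the rebuilt one.
            Files.move(temp, input, StandardCopyOption.REPLACE_EXISTING);
        }
    }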

  • The efficient way to display 100 rendered images?

    Hello, I need your help...
    Do you guys know an efficient way to display 100 rendered images (TiledImage) in a window frame?
    Thank you.
    :)

    Create a panel with a GridLayout. Create a JLabel using each image and add the label to the panel. Then add the panel to a JScrollPane (I assume that all the images are the same size), and finally add the JScrollPane to your frame. ^_^
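
    A minimal Swing sketch of that suggestion (the 10x10 grid and the tile_*.png file names are assumptions for illustration; in the original question the icons would come from the rendered TiledImages):

    import java.awt.GridLayout;
    import javax.swing.ImageIcon;
    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.JPanel;
    import javax.swing.JScrollPane;
    import javax.swing.SwingUtilities;

    public class ImageGridDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(new Runnable() {
                public void run() {
                    // 10 x 10 grid for 100 images of (roughly) the same size.
                    JPanel panel = new JPanel(new GridLayout(10, 10));
                    for (int i = 0; i < 100; i++) {
                        panel.add(new JLabel(new ImageIcon("tile_" + i + ".png")));
                    }
                    JFrame frame = new JFrame("100 images");
                    frame.add(new JScrollPane(panel));
                    frame.setSize(800, 600);
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                    frame.setVisible(true);
                }
            });
        }
    }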

  • The efficient way to repair broken EPM report

    Hi experts,
    I'm looking for an efficient way to repair broken EPM reports.
    In the following cases, the relevant EPM reports break: "Edit Report" is grayed out and the report loses its connection to the model...
    i) when removing a dimension from a model (e.g. renaming a dimension ID by adding a new one and deleting the old one)
    ii) when removing a member that is located on the page axis of an EPM report.
    To avoid this situation, I make a point of taking the following measures:
    i) cleaning up the dimension that I want to delete from the axes of the EPM reports before removing it from the relevant model
    ii) cleaning up the member from the EPM reports before removing it.
    However, these measures become terribly tedious when the number of affected EPM reports is large... If these reports could reconnect to the model, or "Edit Report" could be enabled again, the work would be much more efficient.
    Could you suggest a good approach?
    Regards,
    Masa

    Hi Vadim,
    Here's an example of issue ii) occurring.
    Issue i) was my misconception, sorry...
    When opening the broken EPM report, a dialog appears if a member that has already been removed is located on the page axis. Unfortunately, this broken report is not able to execute "Edit Report".
    My question is what I can do to repair the report. Please help me...
    Best regards,
    Masa

  • I recently purchased an iMac and learned the hard way that iDVD is not loaded. I tried to copy iDVD from my MacBook but came up short. Each time I launched iDVD I got a message that I have no themes, even though the Mac I took it from runs just fine.

    I tried to load iDVD from my MacBook onto my new iMac, but keep getting a message that it can't find any themes. iDVD runs just fine on the MacBook, but for some reason when I copied it from the MacBook using a flash drive and installed it on the iMac, the themes went away. Suggestions?

    The themes, when installed, will be in a Themes folder inside the HD/Library/Application Support/iDVD folder. Check there, and when you find it, copy it to the same folder on your new Mac.
    Then go to iDVD's advanced preference pane and point iDVD to the location of the Themes folder.
    If you can't find the Themes folder on your old Mac, you can do a custom install of iDVD and its resources from the disk that came with the old Mac.
    OT

  • What is an efficient way to migrate a large DB (over 1 TB) from 9i to 11g?

    Can anybody give a suggestion for migrating a large DB (over 1 TB) from 9i to 11g?

    Hi;
    Can anybody give a suggestion for migrating a large DB (over 1 TB) from 9i to 11g?
    Please check the notes below:
    Minimizing Downtime During Production Upgrade [ID 478308.1]
    Master Note For Oracle Database Upgrades and Migrations [ID 1152016.1]
    Different Upgrade Methods For Upgrading Your Database [ID 419550.1]
    I also suggest you check my blog:
    http://heliosguneserol.wordpress.com/2010/06/17/move-to-oracle-database-11g-release-2-wiht-mike-dietrich/
    In this PDF you can see the path to upgrade a database from version x to n, with many scenarios and all the related MetaLink notes, written by Oracle's Mike Dietrich.
    Regards
    Helios

  • What is the best way to transfer iPhoto from MacBook Pro to iMac?

    What is the best way to transfer iPhoto from MacBook Pro to iMac?

    The only way is to connect the two Macs together (network, FireWire target disk mode, etc.) and drag the iPhoto library intact, as a single entity, from the old Mac to the Pictures folder of the new Mac. Launch iPhoto on the new Mac and it will open the library and convert it as needed, and you will be ready to move forward.
    LN

  • Moving from 13" MBP to 15" MBP...  What is the best way to get data moved over?

    I have a 13" MBP (mid-2009 model) and am upgrading to a 15" MBP (late-2011 model). Both are on Lion OS 10.7.3. I use Time Machine to back up the old machine. If I use Migration Assistant, is restoring from the Time Machine backup taken on the 13" MBP the best way to bring all of the apps and data over to the 15" MBP? I know the 13" has different hardware drivers, so I'm unsure whether the restore will overwrite those drivers on the new 15" MBP. I'm just trying to make sure that I do not mess up the new system, but I have not been able to find out whether Time Machine will preserve the critical OS files during the restore.
    Thanks all!

    Some suggest you use Setup Assistant when you first turn the new MBP on.
    I have never used it to transfer any data from one computer to another, whether from a Mac or a PC to a new Mac.
    I simply don't trust it; I never have, on any platform.
    I find it easier to just network the computers together and copy the data over. That way only the data I want is copied to the new computer, and I can make all the interface changes myself.
    Oh, and as eww says, do NOT use a TM backup from the older Mac to move data over to the new one. You could end up having to reinstall the OS on the new unit.

  • Advice needed: Efficient way to scan users

    Hi all,
    I would like to know an efficient way to scan users in Lighthouse. I need to write a workflow that checks out all the users and performs some updates. This workflow should run every day at midnight.
    I have created a scanner myself. Basically, what it does is:
    1. Call the FormUtils.getUsers method to return all user names into a variable.
    2. Loop through this list and call a subprocess workflow to process every user. This subprocess checks out a user view, performs updates, and then checks the view back in.
    This solution is not efficient at all, since it causes my JVM to run out of memory (1 GB of RAM assigned to the JVM, with about 78,000 users).
    Any advice is highly appreciated. Thank you.
    Steve

    OK... I now understand what you are doing and why you need this.
    A long, long, long time ago (back in the 3.x days) the deferred task scanner was really bad. Its nightly scan would scan ALL users each time. That was fine when your client had 4k users... but not when it has 140k users.
    Additionally, the "set deferred task" function had problems with two tasks having the same name (e.g. "disable resource"), since it used the name as the XML object name, which cannot be duplicated.
    Soooo, to beat this I rewrote the deferred task handler to allow me to do all of this. Part of that was adding a searchable field called 'nextTaskDate' on the user object. After each workflow this date is updated, so it is always correctly populated with the user's next deferred task date.
    Each night the scanner runs and queries all users with a nextTaskDate of today. This gives us a result set that we can iterate over, instead of having to list every user and search each one for tasks. It's a billion times faster.
    Your best bet is to store the task date in milliseconds and make your query "all users with next task date BEFORE now"... this way, if the server is hosed, you can still execute tasks you may have missed.
    We have an entire reusable implementation framework that we have patented (of which this code is a part) that addresses most of the issues you are bringing up. It makes these implementations much simpler, faster, more scalable and more maintainable.
    Does this make sense?
    Dana Reed
    AegisUSA
    Denver, CO 80211
    [email protected]
    773.412.3782
    "Now hiring best-in-class IdM architects. Inquire via email."

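    For illustration only, a rough Java sketch of the approach Dana describes above (store the next task date in milliseconds and query for users whose date is before "now"); the UserStore interface and the nextTaskDate handling here are hypothetical stand-ins, not the actual Lighthouse/IdM API:

    import java.util.List;

    public class DeferredTaskScanner {

        /** Hypothetical repository abstraction; a real deployment would use the product's own query API. */
        public interface UserStore {
            // Names of users whose nextTaskDate attribute is before the given time (epoch milliseconds).
            List<String> findUsersWithNextTaskDateBefore(long epochMillis);
        }

        private final UserStore store;

        public DeferredTaskScanner(UserStore store) {
            this.store = store;
        }

        public void runNightlyScan() {
            // Querying "before now" (rather than "equal to today") means tasks missed
            // during an outage are still picked up on the next run.
            long now = System.currentTimeMillis();
            for (String userName : store.findUsersWithNextTaskDateBefore(now)) {
                processDeferredTasks(userName);
            }
        }

        private void processDeferredTasks(String userName) {
            // Check out the user view, run its due deferred tasks,
            // update nextTaskDate, and check the view back in.
            System.out.println("Processing deferred tasks for " + userName);
        }
    }
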
  • Efficient way for Continuous Creation of XML Content?

    Hi
    I have a requirement to create XML content from the data extracted from UDP packets.
    As each packet arrives, I have to generate the appropriate XML content from it and keep everything in the same single XML file.
    Problem:
    Since an XML file is not a flat file, I can't just append the new content at the end. So if I have to write into the XML file, I have to parse the content each and every time a packet arrives and insert the new content under the appropriate parent. I think this is not the most efficient way.
    Parsing the file every time costs CPU time, and as the file grows in size, memory will also become a constraint.
    Other options I could think of:
    * Hold the XML Document object in memory until a certain event (like a timeout waiting for packets) and write it into the XML file in one shot.
    * Serialize the objects containing the extracted packet content to a temp file and, after some event, parse them and create the XML file in one shot.
    Which is the efficient way, or is there any design pattern to handle this situation? I am worried about the memory footprint and performance under peak loads.
    I am planning to use JDOM / SAX Builder for the XML creation.
    Thank you...

    Lots of "maybe" and "I think" and "I'm worried about" in that question, and no "I have found" or "it is the case that". In short, you're worrying too much about problems you don't even know you have. XML is a verbose format anyway; efficiency isn't paramount when dealing with it. Even modestly powered machines can deal with quite a lot of disk I/O these days without noticeable impact. The most efficient thing you can do here is write something that works, and see if you can live with the performance.
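
    If it helps, here is a minimal sketch of that "write something that works" advice using the JDK's built-in StAX API rather than JDOM (the element names packets/packet and the payload string are made up for illustration): each packet is appended to the open stream and flushed as it arrives, without re-parsing the file, and the root element is only closed on shutdown.

    import java.io.FileOutputStream;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamException;
    import javax.xml.stream.XMLStreamWriter;

    public class PacketXmlWriter implements AutoCloseable {
        private final FileOutputStream out;
        private final XMLStreamWriter writer;

        public PacketXmlWriter(String fileName) throws Exception {
            out = new FileOutputStream(fileName);
            writer = XMLOutputFactory.newInstance().createXMLStreamWriter(out, "UTF-8");
            writer.writeStartDocument("UTF-8", "1.0");
            writer.writeStartElement("packets");   // root element stays open while packets keep arriving
        }

        // Called once per UDP packet: writes one <packet> element and flushes it to disk.
        public void append(String payload) throws XMLStreamException {
            writer.writeStartElement("packet");
            writer.writeCharacters(payload);
            writer.writeEndElement();
            writer.flush();
        }

        @Override
        public void close() throws Exception {
            writer.writeEndElement();   // close the root <packets> element
            writer.writeEndDocument();
            writer.close();
            out.close();
        }
    }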

  • Efficient way of updating data in a database table

    What is an efficient way of updating data in a database table?
    I have a huge amount of data in my internal table; how should I use the UPDATE statement in this case?
    1. The database table has 20 fields.
    2. One is the key field, and suppose I want to change the 20th field.
    3. I have data for only 2 fields, i.e. the 1st (key) field and the 20th field.
    I can't use the UPDATE statement in a loop, as that is not good practice (it hits the database several times).
    Does it affect all 20 fields for a particular record?

    Hi,
    Use the UPDATE statement; see the description below from SAP Help.
    UPDATE dbtab FROM TABLE itab. or UPDATE (dbtabname) FROM TABLE itab.
    Effect
    Mass update of several lines in a database table. Here, the primary key for identifying the lines to be updated and the values to be changed are taken from the lines of the internal table itab.
    The system field SY-DBCNT contains the number of updated lines, i.e. the number of lines in the internal table itab which have key values corresponding to lines in the database table.
    Regards
    L Appana

  • Alter primary key constraint - efficient way

    Please let me know an efficient way to alter the primary key constraint on a table (it has millions of records).
    Thanks.

    Do you want to have a nonclustered index (NCI) on the PK instead of a clustered index (CI)? You have to drop the constraint... You know, a constraint is a logical component, but the (unique) index that SQL Server creates behind the scenes is its physical implementation...
    Best Regards,
    Uri Dimant
    SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Efficient way to read CLOB data

    Hello All,
    We have a stored procedure in Oracle with a CLOB OUT parameter. When it is executed from Java, the stored procedure itself runs fast, but reading the data from the CLOB datatype using the 'subString' functionality takes much longer (approx. 6 seconds for 540 KB of data).
    Could someone please suggest an efficient way to read data from a CLOB? (We need to read the data from the CLOB and write it into a file.)
    Thanks & Regards,
    Prashant

    Hi,
    You can try buffered reading/writing of the data; it usually speeds the process up.
    See example here:
    http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/files/advanced/LOBSample/LOBSample.java.html
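
    A minimal sketch of that buffered approach (the connection details, the procedure name my_proc, and the output file name are placeholders), reading the CLOB through its character stream instead of repeated substring calls:

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.Reader;
    import java.sql.CallableStatement;
    import java.sql.Clob;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class ClobToFile {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details and procedure name.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 CallableStatement cs = conn.prepareCall("{call my_proc(?)}")) {
                cs.registerOutParameter(1, Types.CLOB);
                cs.execute();
                Clob clob = cs.getClob(1);
                char[] buffer = new char[8192];
                int read;
                // Buffered character streams avoid many small round trips to the database.
                try (Reader in = new BufferedReader(clob.getCharacterStream());
                     BufferedWriter out = new BufferedWriter(new FileWriter("clob_output.txt"))) {
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                }
            }
        }
    }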

  • Just got girlfriend a new iPad2. Her iMac is a PowerPC G5 (Tiger version 10.4.11) with 512 mb RAM. What's the simplest, most efficient way to get her iPad2 up and running and synced to her Mac?

    Just got girlfriend a new iPad2. Her iMac is a PowerPC G5 (Tiger version 10.4.11) with 512 mb RAM. What's the simplest, most efficient way to get her iPad2 up and running and synced to her Mac?

    Most of the Apple Store salespeople and some of the Genius Bar people are only knowledgeable about Apple's more recent offerings. They are not very knowledgeable, I found, about older PowerPC-based Apple computers, I'm afraid.
    Here's the real scoop.
    Your girlfriend's G5 can only install up to OS X 10.5 Leopard. This is the last compatible OS X version for PowerPC users.
    OS X 10.6 Snow Leopard and OS X 10.7 Lion are for newer Intel CPU Apple computers.
    Early iMac G5's can only have up to 2 GBs of RAM.
    Later iMac G5's (2005-2006) could take up to 2.5 GBs of RAM
    2 GBs of RAM will run OS X 10.5 Leopard just fine.
    The very latest iTunes (10.5.2) can be installed and runs on both PowerPC and Intel CPU Macs.
    However, there are certain new iTunes features that won't work without an Intel Mac.
    One such iOS and iTunes feature is syncing wirelessly over Wi-Fi.
    This will not work unless you have an iDevice running iOS 5 and an Intel Mac running 10.6 Snow Leopard or better.
    Although I was disappointed that I would not be able to do this with my G4 Mac, it's not a big problem for me.
    So, your girlfriend's computer should be fine for what she intends to use it for.
    The Apple people either just plain didn't know, or were trying to get you to think about buying a new Mac.
    At least as of now, that's not truly necessary.
    If Apple, at some later point, drops support for PowerPC users running 10.5, then that would be the time to consider a new or "newer" Intel CPU Mac.
    My own planned Mac upgrade is seeking out a "newer" last-version G5 as my "new" Mac.
    I can't afford, right now, to replace all of my core PowerPC software with Intel versions, so I need to stick with the older PowerPC Macs for the time being. The last of the G5's is what I seek.
