Early G5 disk size limitation

Hi everybody,
I'd like to upgrade my old G5 with a 750GB or even 1TB internal disk.
However, Apple specifications for the early PM G5 (i.e. issued in June 2004) say HERE that G5s with order number M9020LL/A, M9031LL/A, M9032LL/A can't support more than 500GB (maximum system capacity). My hard disk reseller also warns me about this limitation. Such a limit is not mentioned for models issued "Late 2004" and later.
My G5 was bought in December 2004, but since it was bought in France, I'm not sure whether or not it belongs to the "June 2004" category.
In order to make sure, I wonder if there is a way to derive the "order number" from the serial number.
Thanks for any help

Oh, and thanks to ALI BROWN, I've just discovered that the model (date of issue) and Technical Specifications may be derived from the serial number HERE.

Similar Messages

  • Photoshop archaeology -- when was the 1TB disk size limitation removed?

    I've just joined a photo lab as IT manager. They are running a mixture of Photoshop versions and have some horrible operational hacks to get around the 1TB drive size limitation in PS 7. We're a small shop and can't afford to upgrade all of our Photoshop licenses at once, so I want to start by upgrading everyone running a version that has the 1TB limitation.
    Does anyone know in which version that limitation was lifted? I have looked at the release notes for CS3 onward, but have not been able to find release notes for CS and CS2.
    TIA for any info.
    Stu

    I'm not sure of the answer to your question, but I would keep one or two "legacy" systems around for plug-ins that won't run in more modern versions. You may also want to do this to support legacy hardware.
    I say this as I am planning a new hardware purchase to build such a "throwback" system - a couple of scanners and a Xaos Tools plug-in don't like the modern era.

  • DV9000 Maximum Hard Disk Size Supported?

    I'm searching for a used HP Pavilion DV9000.  I need to know the maximum hard disk size the DV9000 series laptops support.  To be clear, could I put a 750G or 1TB disk in each bay?
    HP's web site isn't helpful. It gives only the "official" numbers at the time of release, which state it supports up to 240G, 120 in each bay. Those kinds of numbers are almost never an accurate statement of the limitation, as I have seen similar "official" numbers for other HP laptops (supposedly limited to 120G or 250G or whatever) that have no problem supporting 640G in the real world.
    The size of hard disk a computer supports is usually a function of the main board's chipset, and I don't know how to determine the exact chipset or find the chipset's hard disk support capability. Again, HP's web site isn't helpful here either. They may list that it has an nVidia chipset, but they don't mention which nVidia chipset.
    To be clear, I don't own a DV9000 yet and I need to know how to find this information, maybe from the exact model number. The model series may be DV9000, but there are dozens of more specific model numbers on the bottoms of the DV9000 series computers (example: DV9025ea). I don't want to purchase a DV9000 and then find that its limitation really is 120G per bay.

    Hard drive size is a limitation of the BIOS, not the chipset; this was overcome with the implementation of 48-bit Logical Block Addressing (LBA) in modern BIOSes in 2003.
    HP does not state which BIOSes on older laptops support this.
    Looking at the production dates of the 9000 series, I would be willing to bet it supports any size hard drive (48-bit LBA), since 48-bit LBA was introduced in 2003 (a quick capacity calculation is sketched at the end of this reply).
    http://en.wikipedia.org/wiki/Logical_block_addressing
    That being said, there could be brand compatibility issues, so be sure to buy the hard drives from a source that has a friendly return policy.
    Maybe someone who has actually installed a large hard drive in a 9000 series can post.
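    For context, here is a back-of-the-envelope sketch of where the two LBA ceilings land, assuming the traditional 512-byte sector size (an illustration only, not vendor documentation):

        // Back-of-the-envelope LBA capacity limits, assuming 512-byte sectors.
        public class LbaLimits {
            public static void main(String[] args) {
                final long SECTOR_BYTES = 512;

                // 28-bit LBA (pre-2003 ATA): 2^28 addressable sectors.
                long lba28 = (1L << 28) * SECTOR_BYTES;   // ~137 GB (128 GiB)

                // 48-bit LBA (ATA-6 and later): 2^48 addressable sectors.
                long lba48 = (1L << 48) * SECTOR_BYTES;   // ~144 PB (128 PiB)

                System.out.printf("28-bit LBA limit: %,d bytes (~%.0f GB)%n", lba28, lba28 / 1e9);
                System.out.printf("48-bit LBA limit: %,d bytes (~%.0f PB)%n", lba48, lba48 / 1e15);
            }
        }

    The ~137 GB figure is the familiar "128 GiB barrier" of pre-2003 BIOSes; a BIOS with 48-bit LBA is effectively unlimited for drives of this era.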

  • Maximum disk size for azure site recovery

    Hi everyone,
    I am looking into Azure Site Recovery, and I can't seem to find the maximum disk size I would be able to replicate into Azure. I have read some articles saying that 1TB is the maximum size, and some people have said that it is 64TB!! I have a file server that I would like to protect which is 4TB in size, and if the limit is 1TB I think it is very limiting...
    Any help would be greatly appreciated.
    Many Thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

    Hello Robert,
    The current size limits for a VM replicating to Azure are:
    For an OS VHD (the VHD that has the OS installation): 127 GB
    For a data VHD: less than 1 TB
    Is your file server running on a single 4 TB volume?
    Anoop KV
    Hi Anoop,
    Our File Server is currently running on a single 4TB volume.  Do I have any options with regards to replicating this VM to Azure using Site Recovery?
    Many thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

  • Mailbox and Library Size Limits and Reporting

    I have seen quite a few postings on size limits.
    I have a system that has two Post Offices - one for mail and one for doc mgt. Users log into the e-mail PO and get redirected to the doc mgt PO when they need to access files in the libraries (there are 6 libraries).
    I would like to find a utility other than GW Check - maybe a nice Windows GUI app that lists the amount of space each user is using in each PO (or total) that would include all e-mails, sent and received (with attachments) and documents in the libraries.
    I would also like to apply space limits to each user. "Client Options" sets it globally, but for each DOM or PO separately, and of course, you can be more granular on a per user basis. Setting the limit on a per user basis is daunting since it has to be done one at a time!
    Is there a utility that not only reports consumed space, but also allows you to set the limits?
    BTW, why are the disk space limits under the "Send" section of Client Options? It seems like it would be more of an "Environment" option.
    Using GW 7.02HP on NW 6.5.6
    Many thanks,
    Charlie Riale, B.E.E.,CNE6
    CARiale at bnetinc.com
    www bnetinc com
    610-645-7616 (Work)
    610-645-7617 (Fax)

    I'm not sure of any utilities that do a "report and set" of the limits. The GWCheck Log File Parser that Dave Parkes wrote (Caledonia Network Consulting) works well for seeing how much space a user has in use. That said, you CANNOT set limits on users in the DMS - it just isn't possible. It's a different system, and does not allow for size limits on documents, etc.
    As for why the size limit is located under "Send", I guess it's because it really affects the sending of mail the most. I.e., when a user reaches the size limit, incoming mail is not bounced; the limit only prevents the user from sending new mail until additional space is cleaned up.
    Danita

  • Max. disk size in A1000?

    Hi,
    Does anybody know what the max. disk size is that works in an A1000 array? We have so far used 76GB disks in an 8-slot A1000.
    Thanks

    Hello Erwin,
    SCSI is less limited than IDE. The next bigger size should work.
    The Seagate ST3146707LC (146.8GB - 10000 RPM Ultra-320) and Fujitsu MAT3147NC (146.8GB - 10000 RPM Ultra-320) are listed as compatible.
    http://sunsolve.sun.com/handbook_pub/Systems/A1000/components.html#Disks
    Michael

  • Does Labview RT 8.5 impose a limit on FAT32 partition / hard disk size?

    We will be using a PXI RT system (spec below) to collect, analyse and store a large amount of data (~30 to 40GB per experiment). We don't want to keep the Windows host PXI attached, so the RT PXI must be able to store all the data from an experiment. It would be desirable to keep data from all experiments on the H/D, so a capacity of ~300GB would be useful.
    Q: Is there a limit on the hard disk size imposed by the LabVIEW RT 8.5 operating system, and if so, what is it? An older post (~2005) suggests that the partitioning tool limited FAT32 partitions to 32GB. Thus my question: if I created a large FAT32 partition with a third-party tool, could I install and run LabVIEW RT 8.5?
    Q: Is there any other way of connecting a large-capacity H/D to the RT system? For example, using H/D partitions or a slave H/D. Other ideas?
    Thanks,
    Michael
    NI-1000B with NI-8176 embedded controller, PIII 1.266GHz, 128MB RAM, LabVIEW RT 8.5

    > The size limitation has nothing to do with the LabVIEW RT OS. FAT32 limits you to 32GB. This cannot be changed since it's a limitation of the file system format.

    Thanks for the reply.
    I have, however, gained the impression from a number of web sites that there isn't a 32GB partition limit on FAT32 -- see [**] below -- but that the limit is with the partitioning tool. It looks like the partitioning tool that comes with RT has a 32GB limit (is that correct?). Hence my question: if you were able to create a large FAT32 partition, could you install and run RT?
    Regards,
    Michael
    [**] The following was copied from http://en.wikipedia.org/wiki/File_Allocation_Table#FAT32
    Windows 2000 and Windows XP can read and write to FAT32 file systems of any size, but the format program included in Windows 2000 and higher can only create FAT32 file systems of 32 GiB or less. ... This limitation can be bypassed by using third-party formatting utilities.
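    To make the distinction concrete, here is a small sketch contrasting FAT32's actual on-disk limits with the 32 GiB cap that Windows-style format tools enforce at creation time (assuming 512-byte sectors; an illustration, not a spec citation):

        // FAT32's structural limits vs. the format tool's creation-time cap.
        public class Fat32Limits {
            public static void main(String[] args) {
                final long SECTOR_BYTES = 512;

                // Volume size: the total-sector count is a 32-bit field.
                long maxVolume = (1L << 32) * SECTOR_BYTES;   // 2 TiB

                // File size: stored as an unsigned 32-bit byte count.
                long maxFile = (1L << 32) - 1;                // 4 GiB - 1

                // The cap the Windows 2000/XP format utility enforces when
                // creating a volume (a tool policy, not a file-system limit).
                long formatCap = 32L << 30;                   // 32 GiB

                System.out.printf("FAT32 max volume: ~%d TiB%n", maxVolume >> 40);
                System.out.printf("FAT32 max file:   %,d bytes%n", maxFile);
                System.out.printf("format tool cap:  %d GiB%n", formatCap >> 30);
            }
        }

    Note the separate per-file limit: even on a large FAT32 partition, no single file can exceed 4 GiB, which matters for a data logger writing big files.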

  • Premiere Pro CC 2014 file size limits?

    Hi, a friend needs to create a 37-hour uncompressed AVI file (by combining an AVI of pictures and an MP3 of audio from a performance) and is wondering if it can be done using Adobe Premiere Pro CC 2014, i.e. are there any file size limits? Any comments much appreciated.

    Would be interesting to know how you are going to store that. 37 hours of HD uncompressed in an AVI wrapper requires around 24 TB of free disk space on a single volume (a rough calculation is sketched below). That means you would need something like 12 x 3 TB drives in RAID 6 + 1 hot spare, requiring at least a 16-port RAID controller for those 13 disks, just for the output file. Due to fill-rate degradation, that is cutting it thin. Additionally, at least an X79 or X99 motherboard with a 2011 socket is necessary.
    Next question is, who would be crazy enough to marathon-watch 37 hours of a performance?
    You may consider another workflow, not uncompressed and not 37 hours.
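    For the curious, the ~24 TB figure falls out of simple arithmetic once you fix the frame parameters. All values below are assumptions (1080p, 8-bit RGB, 29.97 fps); adjust them for the actual footage:

        // Rough storage estimate for uncompressed video in an AVI wrapper.
        public class UncompressedSize {
            public static void main(String[] args) {
                int width = 1920, height = 1080;   // assumed HD frame size
                double fps = 29.97;                // assumed frame rate
                int bytesPerPixel = 3;             // assumed 8-bit RGB
                double hours = 37.0;

                double bytesPerFrame = (double) width * height * bytesPerPixel;
                double totalBytes = bytesPerFrame * fps * hours * 3600;

                System.out.printf("Data rate: %.1f MB/s%n", bytesPerFrame * fps / 1e6);
                System.out.printf("Total:     %.1f TB%n", totalBytes / 1e12);
            }
        }

    With these assumptions the data rate is about 186 MB/s and the total lands around 24-25 TB, which is why the reply above talks about a dedicated multi-drive RAID just for the output file.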

  • E450 maximum disk size

    Hi,
    a customer of mine has an Enterprise 450 that they're using as a file and OPI server. At present, they're using an old external RAID on the machine, but they're running out of disk space.
    They don't have very many users, so the server is fine with regard to power, but they need more disk space.
    They also have a second 450 they're not using, with backplanes, SCSI planes and an extra PSU, as well as 18GB disks with third-party disk containers, so they basically have all the hardware they need to put extra disks inside the machine, except that the 18GB disks won't cut it (not big enough).
    Now, Sun only offered up to 36GB disks with the machine, but nowadays you can get up to 146GB generic SCA disks.
    The question is, will these disks work in the E450? How big a disk have any of you tried to put in a 450?
    Johan-Kr

    I know this is very late but I've just seen this posting and thought I'd mention I am using 300 GB Ultra 320 disks in Enterprise 450 servers with no problems at all. In fact, I'm using 300 GB disks in Enterprise 150 and 250 machines - they are SCSI disks and generally not subject to any size limits.
    Andy

  • Size Limits to xmlgen.insertXML and Steves XMLLoader

    Hi,
    What are the size limits for xmlgen.insertXML? Can you only insert a varchar-sized document, i.e. not a CLOB directly? It always seems to fail past a certain document size (about 1600 records of a table with one number column).
    I've also tried using XMLLoader from Steves book - but that failed on large tables too. (12,000 records and 20 columns).
    Any help much appreciated!
    Simon.

    Hi,
    As everyone knows ;-) there are no memory problems in Java, but...
    I was able to load CLOBs and long VARCHAR2 columns into the database, but it used to hang from time to time - I'm using the XSQL command line and servlet on both Unix and NT.
    I've found out that some of the problems are related to the JDK; the hanging problem on mass loads was solved when I moved to JDK 1.2.2 - and no version earlier. I also think I installed a Solaris patch relating to Java, but I don't really remember.
    By the way, if you are using CLOBs, try using the thin driver instead of the OCI8 driver; it solved some problems for me (a sketch of that approach follows below).
    It is always worth checking your rollback segments and so on.
    Hadar
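    Since the thin-driver tip comes up a lot, here is a hedged sketch of the general idea: stream a large XML document into a CLOB column with plain JDBC instead of binding it as a VARCHAR, which sidesteps the size ceiling the original poster hit. The connect string, table and column names are hypothetical, and it uses the two-argument setCharacterStream from JDBC 4.0 (older drivers need the variant that also takes a length):

        import java.io.FileReader;
        import java.io.Reader;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        // Sketch: stream a big XML document into a CLOB column over the
        // thin driver. Connection details and schema are made up.
        public class ClobLoad {
            public static void main(String[] args) throws Exception {
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger")) {
                    conn.setAutoCommit(false);
                    String sql = "INSERT INTO xml_staging (id, doc) VALUES (?, ?)";
                    try (PreparedStatement ps = conn.prepareStatement(sql);
                         Reader xml = new FileReader("big_document.xml")) {
                        ps.setInt(1, 1);
                        // Stream the reader: the driver feeds the CLOB in
                        // chunks, so the document never has to fit into a
                        // single varchar-sized bind buffer.
                        ps.setCharacterStream(2, xml);
                        ps.executeUpdate();
                    }
                    conn.commit();
                }
            }
        }

    From the staging table, the document can then be processed server-side without ever passing through a varchar bind.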

  • Building a dvd with custom disk size (disk, ISO, or dvd folder)

    My goal at the end of all this is to get a DVD format of video at 6 Mbps, with menus, burned to a 25GB Blu-ray disk.
    Using Encore CS6 on Windows 7 (4.7GHz 8-core, 16GB RAM @ 2400MHz)
    1st attempt: I used a Blu-ray disk and tried to burn a DVD format; it ejected my disk and said "please insert a blank DVD" (used a custom disk size of 25GB, showing extra space available).
    2nd attempt: I thought perhaps I would build it to an ISO (also tried a DVD folder) to burn later. It responded with an error message saying, "disk capacity is too small - building navigation failed".
    I don't care how I get this to work, but I would like to get all the video and menus onto a single disk. Encore limits the "custom disk size" to 99GB, so I assume it should work for 25GB (23.2GB).
    The reason I don't want to make a Blu-ray format is because it requires a high bitrate and the video is so old it would be a waste of space. I don't need 720p of video from 1935. (Unless I can make a Blu-ray format at 6 Mbps.)
    Thank you for any help you can provide
    bullfrogberry

    You can do this in Encore.
    I am assuming you are only picking presets, and not customizing one. You pick presets in the Transcode Settings dialog. Do you see the "Edit Quality Presets" button? Pick it, customize a preset by setting the bitrates to get the results you want, then SAVE IT as your own. Then pick that in the transcode setting. (In the transcode setting image, you can see my custom example "Stan Custom 4Mb CBR".) And yes, you can select all your video assets and apply this custom preset to all of them at once. I would do one short one first to see if you are in the ballpark. I would do 2 samples: one in MPEG2 Blu-ray and one in H.264 Blu-ray. (I'd follow Richard's recommendation to you and use H.264.)
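    As a sanity check on capacity: at DVD-style bitrates, a 25GB disc holds far more than its usual Blu-ray-bitrate runtime. A quick sketch (disc capacity and audio bitrate below are assumptions):

        // How much runtime fits on a BD-R 25 at a given combined bitrate?
        public class DiscRuntime {
            public static void main(String[] args) {
                double discBytes = 25_025_314_816.0; // nominal BD-R 25 capacity
                double videoMbps = 6.0;              // planned video bitrate
                double audioMbps = 0.448;            // assumed audio bitrate

                double seconds = discBytes * 8 / ((videoMbps + audioMbps) * 1e6);
                System.out.printf("Runtime at %.2f Mbps total: %.1f hours%n",
                        videoMbps + audioMbps, seconds / 3600);
            }
        }

    That works out to roughly 8.5 hours of footage, so raw disc space is unlikely to be the constraint behind the "disk capacity is too small" error.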

  • UsrVol_sftfs_v1.pkg size limitation (App-V 4.6 SP2)

    Hello
    I'm currently using the App-V 4.6 SP2 client, and the applications are installed using the MSI. The sequenced application is a trader application and has different functionalities.
    During one of the functions, which was working correctly previously, the application started giving a disk-full error even though there was 130 GB of free space on the machine.
    On checking further, it was found that the size of UsrVol_sftfs_v1.pkg in the folder under "%Appdata%\Soft Client\" had grown up to 1.97 GB.
    Is there any size limitation on the UsrVol_sftfs_v1.pkg file? If so, what is the limit, and is there any official documentation for it? I could not find it in the white paper.
    Thanks in advance
    Sujit Jadhav

    Apparently there is no public info about the .pkg's max size. As far as I can remember, discussed values ranged from about 120 MB up to 4 GB.
    Anyway, you could try to identify _what_ is blowing up the .pkg and avoid that being in the App-V package (either by telling the application to store certain data in a different location [like AppData] or by excluding certain folders from the App-V package [so that this data gets written to the local disk]).
    PkgView by Tim Mangan is small and free and should do the job: http://www.tmurgent.com/Tool_PkgView.aspx
    Application Virtualization Explorer Professional by Gridmetric / Kalle Saunamäki can open .pkg files as one of its features: http://www.gridmetric.com/products/ave_editions.html (not sure, but the free trial could be sufficient).
    Falko
    Twitter @kirk_tn | Blog kirxblog | Web kirx.org | Fireside appvbook.com

  • Pages size limitation

    I'm wondering what Pages' file size limitation might be. I put together an 83-page document that occupied around 750 MB of disk space when saved, and the next morning when I went to open it, the last 20 pages were trashed. It caused Pages to crash a couple of times on launch before I got it to stay open, and I was unable to replace/repair the damage in the file. Has anybody else suffered anything like this? Or have I run into a real limitation here?

    My G5 has 4 GB of RAM and plenty of empty space on the boot drive.
    Interestingly, when I've tried to reload the images on the blown pages, Pages has reported the image files as corrupted/unopenable, even though they open just fine in both Photoshop CS2 and InDesign. For lack of time, I rebuilt the entire project, all 134 pages, each with a 5x7 300 ppi photo on it, using the very same image files, and had no trouble with it at all.
    For what it is worth, I repair permissions regularly, and carefully keep up to date with all updates/patches.
    Activity Monitor showed RAM filling up completely when I tried to open the file, with free memory down to about 40 MB when Pages crashed while trying to open the damaged file. I did manage to open the damaged file when it first started misbehaving, and truncated it by chopping off the damaged 20 pages.
    This truncated file opens fine, but it too drives free memory down to about 40 MB before the document opens; afterwards, over 2 GB of memory is freed.

  • 4gb file size limitation using java.io.* package in java stored procedure

    Does anyone know about file size limitations when using the java.io.* package inside Java stored procedures? I have some Java stored procedures that read and write files. They error out when run as a Java stored procedure within Oracle on files over 4GB. I have no problems with these procedures when I test them outside of Oracle.

    I am using 10g Release 10.2.0.1.0; the Java version returned by the Oracle JVM is 1.4.2_04.
    When I tested it outside of Oracle, I ran it using the Java runtime in ORACLE_HOME\jdk\bin and it works perfectly on files larger than 4GB.
    Thank you for the UTL_FILE suggestion. I have considered using that, but there is one method in Java that I can't seem to find a corresponding procedure for in PL/SQL:
    RandomAccessFile.setLength()
    This allows me to truncate a file (the idiom is sketched below). I need to strip bytes off the end of a file without creating a copy, because I am working with large files and disk space may be limited. It is also much slower to read from the original file and write the same bytes to a new file (minus the trailing bytes).
    If anybody has any insight about the 4GB java.io limitation in the Oracle JVM, or has a workaround in PL/SQL for the RandomAccessFile.setLength() method in Java, I would really appreciate the help.
    Thanks,
    Thach
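    For reference, the setLength() idiom in question is essentially a one-liner; it truncates in place with no copy. The file name and byte count below are illustrative only, and the try-with-resources syntax is modern Java (the 1.4-era JVM inside Oracle would need a finally block instead):

        import java.io.IOException;
        import java.io.RandomAccessFile;

        // Strip trailing bytes from a file in place, without copying it.
        public class TruncateTail {
            public static void main(String[] args) throws IOException {
                long bytesToStrip = 128;   // illustrative trailing-byte count
                try (RandomAccessFile raf = new RandomAccessFile("data.bin", "rw")) {
                    long newLength = Math.max(0, raf.length() - bytesToStrip);
                    raf.setLength(newLength);   // truncate in place, no copy
                }
            }
        }

    UTL_FILE has no equivalent in-place truncation, which is exactly why the poster wants the java.io route to keep working.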

  • AEBS File Size Limitation?

    I have a FAT32 disk connected to a powered hub which is connected to my AEBS. I cannot copy or access files larger than 2GB. Any MP4 movie file over 2GB will not copy to/from the MacBook or play in iTunes, Front Row, or QuickTime.
    If I connect the disk directly to the MacBook, the files copy and play in Front Row and QuickTime without a problem.
    Is there a file size limitation on AEBS?

    OK. Here is what I got.
    After I rolled my AEBS firmware back to 7.0, I no longer see any problem copying files larger than 1GB or 2GB or whatever, even with FAT32.
    It is true that FAT32 limits individual file sizes (the cap is 4GB per file, though some software stops at 2GB), but I guess that is not the main problem here. To me, it seems to be a firmware 7.1 problem. Some may not see this problem, but most people do.
    Here is how to roll back the firmware:
    1. Make sure to unplug/disable all connections to clients & WAN, to be safe while the firmware is upgrading.
    2. Use a Mac to roll back: hold "Option" while clicking "Check for updates", then select firmware 7.0.
    3. After the download is done, select the base station and click the "Base Station" menu -> Upload Firmware.
    4. Choose 7.0 and wait until it is done.
    Congrats!! You've got the 7.0 rollback. Don't forget to disable automatic updates.
