Maximum disk size?

What's the maximum disk size or volume size supported? Thanks for any insights.

The maximum volume size, and maximum file size, in Mac OS X 10.4 or later is close to 8 EB (in binary units of 2**60 bytes; about 9.2 x 10**18 bytes). The exact figure is 2**63 - 2**31 bytes.
See "Mac OS X: Mac OS Extended format (HFS Plus) volume and file limits".
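For a sense of scale, that figure works out as follows (a quick arithmetic check in Python; nothing Apple-specific is assumed):

limit_bytes = 2**63 - 2**31      # the exact HFS Plus limit quoted above
print(limit_bytes)               # 9223372034707292160
print(limit_bytes / 10**18)      # ~9.22 decimal exabytes (10**18 bytes)
print(limit_bytes / 2**60)       # ~8.0 binary exbibytes -- the "8 EB" in the article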

Similar Messages

  • Maximum disk size for Z61m?

    I want to replace the original 80 GB HDD of my Z61m with a faster and larger one. What's the maximum disk size the controller/BIOS can cope with?
    Gurk 

    Any SATA laptop drive will do just fine, be it 160GB or 320GB. You know your needs and your budget best...
    Hope this helps.
    Cheers,
    George
    In daily use: R60F, R500F, T61, T410
    Collecting dust: T60
    Enjoying retirement: A31p, T42p,
    Non-ThinkPads: Panasonic CF-31 & CF-52, HP 8760W

  • Maximum disk size for azure site recovery

    Hi everyone,
    I am looking into Azure Site Recovery, and I can't seem to find the maximum disk size I would be able to replicate into Azure. I have read some articles saying that 1TB is the maximum size, and some people have said that it is 64TB!! I have a File Server that I would like to protect which is 4TB in size, and if the limit is 1TB I think it is very limiting...
    Any help would be greatly appreciated.
    Many Thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

    Hello Robert,
    The current size limits for a VM replicating to Azure are:
    For the OS VHD (the VHD that holds the OS installation): 127 GB
    For a data VHD: less than 1 TB
    Is your file server running on a single 4 TB volume?
    Anoop KV
    Hi Anoop,
    Our File Server is currently running on a single 4TB volume.  Do I have any options with regards to replicating this VM to Azure using Site Recovery?
    Many thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

  • T41 - Maximum disk size

    Does anyone know the biggest hard drive the T41 can handle?
    I'm looking into buying a 250GB disk for my T41 and can't find out if it can handle this size.

    @eluzion:
    Welcome to the forum!
    Any T4x series machine should be able to run a 320GB drive, since there's no known BIOS limit on them. The drive might be defective, or not sitting correctly in the slot; it takes well under 1mm of misalignment for it not to be "seen" by the machine.
    Try re-inserting the drive after you've checked for bent pins. You can also test it in another machine, just to be on the safe side.
    Good luck.
    Cheers,
    George

  • E450 maximum disk size

    HI,
    a customer of mine has an Enterprise 450 that they're using as a file and OPI server. At present they're using an old external RAID on the machine, but they're running out of disk space.
    They don't have very many users, so the server is fine in terms of power, but they need more disk space.
    They also have a second, unused 450 with backplanes, SCSI planes and an extra PSU, as well as 18GB disks in third-party disk containers, so they basically have all the hardware they need to put extra disks inside the machine, except that the 18GB disks won't cut it (not big enough).
    Now, Sun only offered up to 36GB disks with the machine, but nowadays you can get generic SCA disks of up to 146GB.
    The question is: will these disks work in the E450? How big a disk have any of you tried putting in a 450?
    Johan-Kr

    I know this is very late but I've just seen this posting and thought I'd mention I am using 300 GB Ultra 320 disks in Enterprise 450 servers with no problems at all. In fact, I'm using 300 GB disks in Enterprise 150 and 250 machines - they are SCSI disks and generally not subject to any size limits.
    Andy

  • How to allocate disk size for each user in iMac?

    Hi folks,
    I have an iMac (Mac OS X 10.7.5) for the family, and each family member has an account on it.
    Now the kids download many large files (several GB each) and the HD is getting full.
    So I'd like to set a maximum disk size for each user.
    Could you provide instructions on how to configure a disk limit for each user?
    Regards,
    Hiro

    I don't know of any way to do that by user.
    I used to partition my HD and that sets a hard limit by partition, but by user? I don't think it can be done.
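    If the immediate goal is just to see which account is using the space, a short script can total each home folder under /Users (a minimal sketch in Python, assuming the standard /Users layout and permission to read the home folders; it only reports usage, it does not enforce a limit):

    import os

    def folder_size(path):
        """Total size in bytes of all regular files under path."""
        total = 0
        for root, dirs, files in os.walk(path, onerror=lambda e: None):
            for name in files:
                try:
                    total += os.path.getsize(os.path.join(root, name))
                except OSError:
                    pass  # skip unreadable or vanished files
        return total

    for entry in sorted(os.listdir("/Users")):
        home = os.path.join("/Users", entry)
        if os.path.isdir(home) and not entry.startswith("."):
            print(f"{entry:15s} {folder_size(home) / 1e9:8.2f} GB")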

  • Cisco wave virtual blade disk size shrunk, unable to restore vblade

    Hi All,
    I applied the workaround listed by Cisco for bug CSCsy47235 on our WAVE device, which is also mentioned numerous times on the forum here: disk delete-data-partitions and reload. After the reload, we are unable to restore the backed-up blade image file; we get the error '162GiB image is too large! 160GiB image is allowed'. When I try to create a virtual blade, the maximum disk size has dropped from 162GB (before the change) to 160GB (after the change).
    Any idea or solutions would be greatly appreciated.
    Thank you
    Jack

    Hi Tim,
    These disk sizes are fixed and cannot be altered, so you will not be able to allocate the unused CIFS/DRE space to the VB disk, or the unused VB disk space to CIFS and DRE.
    There is no way to increase this either, as most of it is determined when you load the software and the partition table is fixed; hence you won't find any tunable maximums. 32 GB is what you get as part of the WAVE 274 and WAVE 474, and that is the maximum virtual blade disk space. You can split this across two virtual blades, allocating 16 GB each, but the total still cannot exceed the 32 GB maximum for all the virtual machines.
    Regards
    Abijith

  • DV9000 Maximum Hard Disk Size Supported?

    I'm searching for a used HP Pavilion DV9000. I need to know the maximum hard disk size the DV9000 series laptops support. To be clear, could I put a 750GB or 1TB disk in each bay?
    HP's web site isn't helpful. It gives only the "official" numbers at the time of release, which state it supports up to 240GB, 120GB in each bay. Those kinds of numbers are almost never an accurate statement of the limitation, as I have seen similar "official" numbers for other HP laptops (supposedly limited to 120GB or 250GB or whatever) that have no problem supporting 640GB in the real world.
    The size of hard disk a computer supports is usually a function of the main board's chipset, and I don't know how to determine the exact chipset or find the chipset's hard disk support capability. Again, HP's web site isn't helpful here either. They may list that it has an nVidia chipset, but they don't mention which nVidia chipset.
    To be clear, I don't own a DV9000 yet and I need to know how to find this information, maybe with the exact model number. The model series may be DV9000, but there are dozens of more specific model numbers on the bottoms of the DV9000 series computers (example: DV9025ea). I don't want to purchase a DV9000 and then find that its limitation really is 120GB per bay.

    Hard drive size is a limitation of the BIOS, not the chipset. This was overcome with the implementation of 48-bit Logical Block Addressing (LBA) in modern BIOSes in 2003.
    HP does not state which BIOSes on older laptops support this and which do not.
    Looking at the production dates of the 9000 series, I would be willing to bet it supports any size hard drive (48-bit LBA), since 48-bit LBA was introduced in 2003.
    http://en.wikipedia.org/wiki/Logical_block_addressing
    That being said, there could be brand compatibility issues, so be sure to buy the hard drives from a source that has a friendly return policy.
    Maybe someone who has actually installed a large hard drive in a 9000 series can post.
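    For context, the arithmetic behind those addressing limits is straightforward (a quick check in Python, assuming traditional 512-byte sectors):

    SECTOR_BYTES = 512                    # traditional sector size
    lba28_limit = 2**28 * SECTOR_BYTES    # pre-2003 28-bit LBA ceiling
    lba48_limit = 2**48 * SECTOR_BYTES    # 48-bit LBA ceiling
    print(lba28_limit / 10**9)            # ~137.4 GB -- the old "137 GB barrier"
    print(lba48_limit / 10**15)           # ~144 PB -- far beyond any laptop drive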

  • MS 6147 max memory and disk sizes

    For the MS-6147, can anybody confirm the max hard disk size and max memory?
    It will not recognise the 40GB HD I am trying to fit, and it only recognises half of the 256MB memory module I have fitted.
    The BIOS version is 1.9, but I think that is a 'special' by Packard Bell. The MSI BIOS download site makes no mention of disk problems rectified right up to V1.8, which is the latest, with the exception of one for the ZX chipset only, which addresses an EDMA 66 problem.
    Anybody got a definitive answer on this?

    It supports a maximum memory size of 256MB (8M x 8) or 512MB (16M x 4) registered DIMMs only.
    How many chips are on the DIMM is what counts with older boards.
    For the hard disk, go to the drive maker's web site, get the jumper settings to clip the drive to 32GB, and try that.

  • What is the maximum file size for CSV that Excel can open ? (Excel 2013 64bit)

    Hello,
    Before anyone jumps in, I am not talking about the maximum worksheet size of 1048576 rows by 16384 columns.
    I have a client who has a 1.5 GB CSV file, plus 1.9, 2.6, 5, 17 and 89 GB files (huge).
    If I open the 1.5 GB file, it opens (after waiting 5 minutes) and then a warning pops up that only the first 1048576 rows have loaded. That is fair enough.
    If I try to open any of the others, Excel comes up with a blank worksheet. No errors. It just seems to ignore the file I tried to open. This happens from within Excel (File - Open) or from double-clicking the file in Explorer.
    Excel goes to this blank page almost immediately. It does not even try to open the file.
    If I try with MS Access, I get a size warning and it refuses to load the file. (At least I get a warning.)
    I would have expected Excel to load at least the first 1048576 rows (if that is what there are in the file) and give an error.
    The computer is more than capable (Xeon processors, 16 GB RAM, SSD hard disks, a top-of-the-line HP Z820 workstation).
    With the 1.5 GB file loaded to 1048576 rows, it uses 15% RAM/pagefile. CPUs hit about 5%.
    I have confirmed it is Win 7 64-bit and Excel 64-bit. I am fairly confident we are over the file size limit, but without an error message I don't know what to tell my client, who is looking to me for answers.
    I have already discussed that the 89 GB file in Excel is unreasonable and they are looking at a stats package, but I need an answer on these smaller files.
    Anyone got any ideas?
    Michael Jenkin (Mickyj) www.mickyj.com (Community website) - SBS MVP (2004 - 2008) *5 times Microsoft MVP award winner *Previously MacWorld Australia contributer *Previously APAC Vice Chairman Culminis (Pro IT User group support system)* APAC chairman GITCA
    *Director Business Technology Partners, Microsoft Small Business Specialist, SMB150 2012 Member

    Hi,
    The 1,048,576 rows by 16,384 columns limit is the worksheet size limitation in Excel 2013. Thus, I recommend trying Mr. Bernie's suggestions for importing the large CSV file.
    1. Use VBA to read the file line by line and split/examine the import file in sections (a Python sketch of the same split-into-sections idea follows at the end of this reply). If you have further questions about the VBA, please post your question to the MSDN forum for Excel:
    http://social.msdn.microsoft.com/Forums/en-US/home?forum=exceldev&filter=alltypes&sort=lastpostdesc
    2. Use the Excel 2013 add-ins Power Pivot and Power Query. For more detailed information, please see the articles below:
    http://social.technet.microsoft.com/Forums/en-US/9243a533-4575-4fd6-b93a-4b95d21d9b10/table-with-more-than-1-048-576-rows-in-power-query-excel-2013?fo
    http://www.microsofttrends.com/2014/02/09/how-much-data-can-powerpivot-really-manage-how-about-122-million-records/
    Please Note: Since the web site is not hosted by Microsoft, the link may change without notice. Microsoft does not guarantee the accuracy of this information.
    Thanks
    George Zhao
    Forum Support
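    The reply above suggests VBA for the split; as an illustration of the same split-into-sections idea, here is a minimal sketch in Python (the input and output file names are hypothetical, and each part is kept under Excel's 1,048,576-row sheet limit):

    import csv

    ROWS_PER_PART = 1_000_000            # stays under the 1,048,576-row sheet limit
    SOURCE = "huge.csv"                  # hypothetical input file name

    with open(SOURCE, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)            # keep the header for every part
        part, rows, writer, out = 0, 0, None, None
        for row in reader:
            if writer is None or rows >= ROWS_PER_PART:
                if out:
                    out.close()
                part += 1
                out = open(f"part_{part:03d}.csv", "w", newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)  # repeat the header in each part
                rows = 0
            writer.writerow(row)
            rows += 1
        if out:
            out.close()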

  • Export-csv output maximum row size

    Hi all. May I know the maximum number of rows that can be output to a .csv using 'Export-Csv'?

    There's no limit on writing to a CSV file.
    You can try:
    # max rows to export to CSV -- keep appending until the disk fills up:
    $i = 0
    while ($true) {   # loop forever
        $i++
        $i            # echo the current count to the console
        # Piping a string writes its Length property as the CSV column
        "Some text" | Export-Csv -Path .\test12.csv -Append -NoTypeInformation
    }
    until you run out of disk space, which may take a while. A million lines using the above script makes a roughly 5 MB file.
    Now reading it back is an entirely different story. Try Delimit.
    Sam Boutros, Senior Consultant, Software Logic, KOP, PA http://superwidgets.wordpress.com (Please take a moment to Vote as Helpful and/or Mark as Answer, where applicable)

  • Maximum file size in picture ring?

    Hello folks!
    I am planning to use a picture ring with quite a big amount of data.
    My question: is there a maximum data size that I can embed in a picture ring (number of pictures or overall file size)?
    Thanks!

    If you have enough memory to keep all the images open simultaneously, then something like this might help. Put all your images in the same directory on disk and have no other files in that directory. Then use List Folder from the Advanced File palette to get an array of the filenames. Feed that array to a for loop where you open all the files and place the images into the pict ring (the same pattern is sketched below). I have written a "slide show" program which does this. Never tried it with 400 images though.
    If you do not have enough memory for all the images, then you need to manage the images much more carefully.
    Lynn
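    For readers outside LabVIEW, the same list-the-folder-then-loop pattern can be sketched in Python (a minimal illustration, not Lynn's program; it assumes the Pillow library is installed and that the hypothetical 'slides' directory contains only image files):

    import os
    from PIL import Image                # Pillow; assumed installed

    IMAGE_DIR = "slides"                 # hypothetical folder containing only image files

    # List the folder, then loop over the files and load each image --
    # the equivalent of List Folder feeding a for loop in LabVIEW.
    images = []
    for name in sorted(os.listdir(IMAGE_DIR)):
        path = os.path.join(IMAGE_DIR, name)
        with Image.open(path) as img:
            images.append(img.copy())    # copy() forces a full decode into memory

    print(f"Loaded {len(images)} images")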

  • Maximum file size for export into MP4?

    Hello,
    I am not able to export a 2-hour HD video to a standard MP4 file. It seems that on reaching 100% the export algorithm gets into a loop. I waited for hours and still saw progress sitting at exactly 100%, with the final file on the hard disk being 0 bytes. I am using CS5 on Mac OS X. I had to split my timeline into 2 parts and export them separately (which is embarrassing). Is there something like a maximum file size for export? I guess a 2h video would be about 25-35GB.
    Thank you
    jiri

    You are right.
    So I am running AP Pro 5.0.4 and Adobe Media Encoder 5.0.1.0 (64-bit). Operating system Mac OS X 10.7.3. All applications are up to date.
    MacBook Pro Intel i5 2.53GHz, 8GB RAM, nVidia GT 330M 256 MB, 500GB HDD.
    Video is 1920x1080 (AVCHD) 25fps in a .MTS container (major part of the timeline), 1280x720 30fps in a .MOV container (2 mins), and still images 4000x3000 in .JPG.
    No error message is generated during export - everything finishes without any problem... just the file created is 0 bytes in size (as described above).
    This is my largest video project (1h 54min); I don't have this problem with any other project.
    I don't run any other special software; at the moment of export all the usual applications are closed so that the MacBook's "power" can go to Media Encoder. No extra codecs installed, using VLC Player or QuickTime.
    Attached please find a screenshot of the Export settings (AP Pro). While writing this post I tried to export only the first 4 mins of the timeline, where every kind of media is used... and it was OK.
    As a next step I will try to export (same settings) 1h 30min, as I still believe the problem comes with the length of the video exported.
    Let me know your opinion.

  • Maximum file size of 2 GB exceeded please choose a shorter bounce time

    I have done a thorough search online (with Google), trying several combinations of words, but I seem to be the only person on the planet with this error. I guess I will try to remake the project, but I don't think it will fix the problem. I will also update my OS to 10.4.10. Anyway, here is the error message:
    "Maximum file size of 2 GB exceeded please choose a shorter bounce time"
    I just bought and installed iLife 08 and the 8.1 update for GarageBand. I have a 3 hr 50 min track on the timeline and I have added chapter marks with 16k pics. The original combined track size (WAV files) did exceed 2GB, but I edited the files in iTunes (to mono WAV files) and swapped in the new files, so the total file size should now be only 1.5GB. I still get the error message above when I try to export the podcast to disk using the AAC w/Mono Podcast setting. Your help is appreciated.

    I am now the third person in the world to get this error message, except my GB project is a mere 1 hour and 45 minutes long. I've outputted longer projects before and never got this error before. It's a new one to me, and frustrating. How exactly am I supposed to choose a "shorter bounce time" if there is no explanation anywhere in Appleworld of what a bounce time is or how to set it shorter?
    Again, I'm not doing anything different with this project than others that shared successfully, and the dialog box estimates my outputted file size to be approx. 100 MB.
    Can anybody help me get through this error blockage? Please?

  • Maximum file size cfolders

    Hi all,
    I am searching for a hint concerning the maximum file size that cFolders can handle.
    Is there any limit on the size of a file that cFolders 4.0 can handle?
    Our customer wants to work with files of about 50-80 MB.
    Thanks a lot
    Best regards
    Andreas

    Hi,
    Is there any limit on the size of a file that cFolders 4.0 can handle?
    There is no fixed limit in cFolders; it purely depends on your hard disk size, and based on that you can size your cFolders installation.
    https://websmp203.sap-ag.de/~sapidb/012003146900000184472008E/Sizing_cProjects_40_V4.pdf
    http://service.sap.com/notes
    Note 900310 - cProjects 4.0 FAQ: General notes
    Note 994727 - Sizing Guide for cProjects 4.0
    Regards,
    Srini Nookala
