Maximum disk size for Z61m?

I want to replace the original 80 GB HDD of my Z61m with a faster and larger one. What's the maximum disk size the controller/BIOS can cope with?
Gurk 
Thinkpad Tablet
Thinkpad T431s
ThinkPad Yoga S240 with OneLink Dock

Any 2.5-inch SATA laptop drive will do just fine, be it 160 GB or 320 GB. You know your needs and your budget best...
Hope this helps.
Cheers,
George
In daily use: R60F, R500F, T61, T410
Collecting dust: T60
Enjoying retirement: A31p, T42p,
Non-ThinkPads: Panasonic CF-31 & CF-52, HP 8760W

Similar Messages

  • Maximum disk size for Azure Site Recovery

    Hi everyone,
    I am looking into Azure Site Recovery, and I can't seem to find the maximum disk size I would be able to replicate into Azure. I have read some articles saying that 1 TB is the maximum size, and some people have said that it is 64 TB! I have a File Server that I would like to protect which is 4 TB in size, and if the limit is 1 TB I find that very limiting...
    Any help would be greatly appreciated.
    Many Thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

    Hello Robert,
    The current size limits for a VM replicating to Azure are:
    For an OS VHD (the VHD that holds the OS installation): 127 GB
    For a data VHD: less than 1 TB
    Is your file server running on a single 4 TB volume?
    Anoop KV
    Hi Anoop,
    Our File Server is currently running on a single 4 TB volume. Do I have any options with regard to replicating this VM to Azure using Site Recovery?
    Many thanks.
    Robert Milner | MCITP: Virtualization Administrator | Website: http://www.remilner.co.uk | Twitter: @robm82

  • How to allocate disk size for each user on an iMac?

    Hi folks,
    I have an iMac (Mac OS X 10.7.5) for the family, and each family member has an account on it.
    Now the kids download many large files (several GB each) and the HD is getting full.
    So I'd like to set a maximum disk size for each user.
    Could you provide instructions on how to configure a disk-size limit for each user?
    Regards,
    Hiro

    I don't know of any way to do that by user.
    I used to partition my HD and that sets a hard limit by partition, but by user? I don't think it can be done.

  • Maximum file size for export into MP4?

    Hello,
    I am not able to export a 2-hour HD video into a standard MP4 file. It seems that on reaching 100% the export algorithm gets into a loop: I waited for hours with the progress stuck at exactly 100% and the final file on disk at 0 bytes. I am using CS5 on Mac OS X. I had to split my timeline into two parts and export them separately (which is embarrassing). Is there something like a maximum file size for export? I guess a 2-hour video would be about 25-35 GB.
    Thank you
    jiri

    You are right.
    So I am running AP Pro 5.0.4, Adobe Media Encoder 5.0.1.0 (64-bit). Operating system Mac OS X 10.7.3. All applications are up to date.
    MacBook Pro Intel i5 2.53 GHz, 8 GB RAM, nVidia GT 330M 256 MB, 500 GB HDD
    Video is 1920x1080 (AVCHD) 25 fps in a .MTS container (major part of the timeline), 1280x720 30 fps in a .MOV container (2 mins), still images 4000x3000 in .JPG
    No error message is generated during export - everything finishes without any problem... it's just that the created file is 0 bytes in size (as described above).
    This is my largest video project (1h 54min); I don't have this problem with any other project.
    I don't run any other special software; at the moment of export all the usual applications are closed so that the MacBook's "power" can go to Media Encoder. No extra codecs installed, just VLC Player and QuickTime.
    Attached please find a screenshot of the Export settings (AP Pro). While writing this post I tried to export only the first 4 mins of the timeline, where every kind of media is used... and it was OK.
    As a next step I will try to export (same settings) 1h 30min, as I still believe the problem comes from the length of the exported video.
    Let me know your opinion.

  • Maximum HD size for MDD G4?

    Hi there
    I had a real pain in the proverbial when trying to do a clean install of Panther (or Tiger) on a 1 GHz G4 with a new 200 GB HD. There were instabilities, forced shut-downs/restarts, etc., before running any updates or 3rd-party apps.
    Is there a maximum HD size for these Macs? It came with an 80 GB HD.
    Thanks for any help...

    Some RAM that worked under prior versions of OS X has been known to fail; Tiger making much more use of all available RAM for cache is part of the reason. Memtest would give it a good workout overnight.
    MDDs support Cable Select; whether it works depends in part on the IDE cable itself. In fact, you could use it on older Macs and controllers most of the time.
    Never apply an update without first doing a backup and some disk repair and maintenance. Even flushing the cache folders with something like Applejack. Never use your Mac while an update is taking place. Reboot prior to applying an update, might want to use Safe Boot Mode. And use the standalone updates, and the combo update if possible. On some occasions it was necessary to reapply the combo update.
    With Tiger, I found that while it was possible to update, a full erase and install resulted in a better OS. So backup.
    If you keep your home directory or backup on another drive, so much the easier and better.

  • Event ID: 4, Source: Microsoft-Windows-Kernel-EventTracing, maximum file size for session "ReadyBoot" has been reached.

    Hello,
    I upgraded my machine to Win7 x64 Pro about 3 weeks ago. My HW is an Asus mobo, Intel Q9450 w/8GB RAM. The boot drives are two Raptors configured as RAID01. All the drivers are the latest available from Intel, Asus and 3rd party vendors. My WEI is 5.9, limited by the disk transfer rates, otherwise 7.1 and 7.2 on the other indexes.
    I've been receiving these errors at boot:
    Log Name:      Microsoft-Windows-Kernel-EventTracing/Admin
    Source:        Microsoft-Windows-Kernel-EventTracing
    Date:          11/10/2009 7:51:03 AM
    Event ID:      4
    Task Category: Logging
    Level:         Warning
    Keywords:      Session
    User:          SYSTEM
    Computer:      herbt-PC
    Description:
    The maximum file size for session "ReadyBoot" has been reached. As a result, events might be lost (not logged) to file "C:\Windows\Prefetch\ReadyBoot\ReadyBoot.etl". The maximum files size is currently set to 20971520 bytes.
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Microsoft-Windows-Kernel-EventTracing" Guid="{B675EC37-BDB6-4648-BC92-F3FDC74D3CA2}" />
        <EventID>4</EventID>
        <Version>0</Version>
        <Level>3</Level>
        <Task>1</Task>
        <Opcode>10</Opcode>
        <Keywords>0x8000000000000010</Keywords>
        <TimeCreated SystemTime="2009-11-10T12:51:03.393985600Z" />
        <EventRecordID>28</EventRecordID>
        <Correlation />
        <Execution ProcessID="4" ThreadID="164" />
        <Channel>Microsoft-Windows-Kernel-EventTracing/Admin</Channel>
        <Computer>herbt-PC</Computer>
        <Security UserID="S-1-5-18" />
      </System>
      <EventData>
        <Data Name="SessionName">ReadyBoot</Data>
        <Data Name="FileName">C:\Windows\Prefetch\ReadyBoot\ReadyBoot.etl</Data>
        <Data Name="ErrorCode">3221225864</Data>
        <Data Name="LoggingMode">0</Data>
        <Data Name="MaxFileSize">20971520</Data>
      </EventData>
    </Event>
    The image for PID 4 is listed as System.
    My searches have turned up similar events listed but no solutions.
    Any help would be appreciated.
    Cheers!

    Session "Circular Kernel Context Logger" failed to start with the following error: 0xC0000035
    As suggested above, I assume this is a Microsoft issue? It has been discussed here and on other forums for quite some time, and I have never seen a fix. I wish that when we receive errors of this nature Microsoft would tell us what they are. How is this related to Superfetch? What is Superfetch? Why would Superfetch have changed?
    BY THE WAY... Superfetch is on (started), set to Automatic, and logs on as Local System, so it is not the cause of my issue. Also, what is ReadyBoot? Does the average computer user really know what these programs/services or unique Microsoft words/terms are?
    System
      Provider Name: Microsoft-Windows-Kernel-EventTracing
      Provider Guid: {B675EC37-BDB6-4648-BC92-F3FDC74D3CA2}
      EventID: 2
      Version: 0
      Level: 2
      Task: 2
      Opcode: 12
      Keywords: 0x8000000000000010
      TimeCreated SystemTime: 2010-04-11T14:35:49.829600000Z
      EventRecordID: 25
      Correlation:
      Execution ProcessID: 4, ThreadID: 48
      Channel: Microsoft-Windows-Kernel-EventTracing/Admin
      Computer: Daddy-PC
      Security UserID: S-1-5-18
    EventData
      SessionName: Circular Kernel Context Logger
      FileName:
      ErrorCode: 3221225525
      LoggingMode: 268436608
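    The replies above do not reach a fix. One workaround that is often suggested for this Event ID 4 warning is to raise the ReadyBoot session's maximum file size above the default 20971520 bytes (20 MB), for example through Performance Monitor's Startup Event Trace Sessions. On Windows 7 the session is an autologger, commonly reported to live under HKLM\SYSTEM\CurrentControlSet\Control\WMI\Autologger\ReadyBoot with a MaxFileSize value; treat that path as an assumption and verify it on your own build. The sketch below only queries the current value by shelling out to reg.exe:
        import java.io.BufferedReader;
        import java.io.InputStreamReader;

        public class ReadyBootMaxSize {
            public static void main(String[] args) throws Exception {
                // Query the ReadyBoot autologger's current MaxFileSize via reg.exe.
                // Key and value name are the commonly cited location on Windows 7
                // (an assumption here, not taken from the thread above).
                ProcessBuilder pb = new ProcessBuilder("reg", "query",
                        "HKLM\\SYSTEM\\CurrentControlSet\\Control\\WMI\\Autologger\\ReadyBoot",
                        "/v", "MaxFileSize");
                pb.redirectErrorStream(true);
                Process p = pb.start();
                BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(line);   // e.g. "MaxFileSize  REG_DWORD  0x1400000" (= 20971520)
                }
                p.waitFor();
            }
        }
    Changing the value (or the equivalent setting in Performance Monitor) requires Administrator rights, and it only applies from the next boot, since ReadyBoot runs during startup.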

  • Maximum disk size?

    What's the maximum disk size or volume size supported? Thanks for any insights.

    The maximum volume size, and file size, in Mac OS X 10.4 or later, is close to 8 EB (exabytes, counting 1 EB as 2^60 bytes, so roughly 9.2 x 10^18 bytes in total). The actual number is 2^63 - 2^31 bytes.
    See the Apple article "Mac OS X: Mac OS Extended format (HFS Plus) volume and file limits".
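    As a quick arithmetic check of that figure (an editorial aside, not from the original thread), the plain Java below evaluates 2^63 - 2^31 and expresses it both in binary exabytes and in decimal 10^18-byte units:
        import java.math.BigInteger;

        public class HfsPlusLimit {
            public static void main(String[] args) {
                BigInteger two = BigInteger.valueOf(2);
                // The figure quoted above: 2^63 - 2^31 bytes
                BigInteger limitBytes = two.pow(63).subtract(two.pow(31));
                System.out.println("HFS+ limit in bytes: " + limitBytes);   // 9223372034707292160

                // Just under 8 in binary exabytes (1 EiB = 2^60 bytes), which is where the
                // "close to 8 EB" figure comes from; about 9.22 in decimal exabytes (10^18 bytes).
                double eib = limitBytes.doubleValue() / Math.pow(2, 60);
                double decimalEb = limitBytes.doubleValue() / 1e18;
                System.out.printf("= %.9f binary exabytes (EiB)%n", eib);                 // 7.999999998
                System.out.printf("= %.2f decimal exabytes (10^18 bytes)%n", decimalEb);  // 9.22
            }
        }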

  • What is a reasonable maximum file size for a film in Captivate?

    Hi there,
    I'm creating an e-learning course in Captivate 7, and it is being published as HTML5. This means the films I've imported are being converted to MP4s, and they are around 10-20 MB in size once converted. They seem very slow to load on some computers - do you think the file size is too big? Or could it be another issue? Does anyone have a recommendation for a maximum file size for films? They are 572 x 322 px and around 1-2 mins in length.
    Thanks in advance

    Probably better guidelines that dictate the 'size' of a VI are:
    typically no more than 1 video screen in size
    is it legible
    and are its function and operation clear.
    I have seen examples of '1 VI does it all' that were many screens wide and tall, totally a flustercuck, and nearly 1 MB in size.
    Globals and local variables (except for LV2 style) are typically shunned, for they can create a host of problems (race conditions, indeterminate data).
    Use the connector pane to wire controls and indicators to, then use wires between VIs to transfer data. I tend to use clusters to hold shared data.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
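    Coming back to the Captivate film sizes in the question, one rough way to judge whether 10-20 MB per film is reasonable is the implied bitrate and the time a slow connection needs to fetch the file. A minimal sketch in plain Java; the 15 MB, 90-second and 2 Mbit/s figures are illustrative assumptions, not taken from the post:
        public class FilmSizeCheck {
            public static void main(String[] args) {
                // Illustrative figures only: a 15 MB MP4 that runs for 90 seconds,
                // downloaded over a 2 Mbit/s connection (assumed values).
                double fileMegabytes = 15.0;
                double durationSeconds = 90.0;
                double connectionMbps = 2.0;

                // Implied average bitrate of the clip in megabits per second
                double bitrateMbps = fileMegabytes * 8 / durationSeconds;
                // Time needed to pull the whole file over the slow link
                double downloadSeconds = fileMegabytes * 8 / connectionMbps;

                System.out.printf("Average bitrate: %.2f Mbit/s%n", bitrateMbps);  // 1.33
                System.out.printf("Full download:   %.0f s at %.0f Mbit/s%n",
                        downloadSeconds, connectionMbps);                          // 60 s
            }
        }
    At roughly 1.3 Mbit/s the encode itself is modest for 572 x 322 video, so slow loading is at least as likely to be a delivery or buffering issue as an oversized file.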

  • Maximum font size for auto-sized form text

    You need to let the user select a maximum font size for automatically sized text. If most of your fields are set at 10pt type, but you have one field for which you want the text to resize automatically, it looks stupid if that cell contains a small amount of text that’s set at 16pt or whatever. If you could set the maximum font size to 10pt for that field, then the text would be consistent with the text in the other fields unless there is so much text that it needs to scale down.

    Can anyone please advise?
    Cheers

  • Maximum file size for importing into Premiere Pro CS6

    Hi,
    I'm considering recording raw video, no compression, at 720p, 30 fps for 20-30 min. This would result in a file that is close to 80 GB. I'm wondering what the maximum file size for importing into Premiere Pro is and whether I need to split this file up into smaller pieces. Any recommendations on how I may deal with uncompressed raw video that is pretty long in length?
    Thanks,
    Serena

    There is bound to be some limit somewhere in the program, but you should be well below that. The biggest issue might be whether your computer is up to the task of handling it. If you post your computer specs, someone will comment on that aspect.
    Good luck,
    Hunt
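    As a rough sanity check on the ~80 GB estimate in the question above: uncompressed size is just frame size x bytes per pixel x frame rate x duration. A minimal sketch, assuming 8-bit 4:2:2 sampling (2 bytes per pixel); the real figure depends on the pixel format the recorder actually writes:
        public class UncompressedSize {
            public static void main(String[] args) {
                // Assumed capture parameters: 1280x720, 30 fps, 8-bit 4:2:2 (2 bytes/pixel).
                long width = 1280, height = 720;
                long bytesPerPixel = 2;   // RGB 4:4:4 would be 3
                long fps = 30;
                long minutes = 20;

                long bytesPerFrame = width * height * bytesPerPixel;
                long totalBytes = bytesPerFrame * fps * minutes * 60;

                System.out.printf("Per frame:  %,d bytes%n", bytesPerFrame);             // 1,843,200
                System.out.printf("For %d min: %.1f GB%n", minutes, totalBytes / 1e9);   // 66.4
            }
        }
    That comes to about 66 GB for 20 minutes and about 100 GB for 30 minutes, so the ~80 GB estimate is in the right range.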

  • Maximum heap size for 64bit JVM

    Hi,
    I am trying to set the maximum heap size for a Java process on a 64-bit JVM. I am not able to set more than 3 GB.
    command line config:
    java -Xms64m -Xmx3g -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.port=8000 com.superpages.puboptions.CampaignFeedStarter >> publisher.out 2>&1 &
    Hardware / software configs
    $ uname -a
    SunOS labsbear 5.9 Generic_122300-19 sun4u sparc SUNW,Sun-Fire-V440
    16 GB total physical memory
    4-processor machine
    64-bit JVM
    JDK 1.6
    Where is this limitation coming from? How do I set the heap size to 6 GB?
    Thanks for your time
    Meena

    You need to use the -d64 switch to request the 64-bit JVM. E.g.:
    $ java -showversion -Xmx6g HelloWorld
    Invalid maximum heap size: -Xmx6g
    The specified size exceeds the maximum representable size.
    Could not create the Java virtual machine.
    $ java -showversion -d64 -Xmx6g HelloWorld
    java version "1.6.0_07"
    Java(TM) SE Runtime Environment (build 1.6.0_07-b04)
    Java HotSpot(TM) 64-Bit Server VM (build 10.0-b23, mixed mode)
    Hello world!
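    Not part of the original reply, but a small stand-alone class like the one below can confirm which data model the launcher actually selected and how much heap the running VM accepted; sun.arch.data.model is Sun/Oracle-specific, so os.arch is printed as a portable fallback:
        public class DataModelCheck {
            public static void main(String[] args) {
                // "32" or "64" on Sun/Oracle JVMs; may be null on other vendors' VMs
                System.out.println("sun.arch.data.model = "
                        + System.getProperty("sun.arch.data.model"));
                // Portable hint: the architecture the JVM was built for
                System.out.println("os.arch             = " + System.getProperty("os.arch"));
                // The heap the running VM will actually grow to
                System.out.println("Runtime.maxMemory() = "
                        + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
            }
        }
    Running it once as java -Xmx3g DataModelCheck and once as java -d64 -Xmx6g DataModelCheck makes the 32-bit ceiling visible directly.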

  • Please can you tell me the default maximum file size for an attachment in Case Management v12?

    Hi,
    Please can you tell me the default maximum file size for an attachment in Case Management v12+? I am able to define a maximum attachment size but I am not able to see what the default is set to.
    Thank you
    Regards,
    Anthony

    Hi,
    The default max attachment size is 8MB.
    Regards.
    Mike

  • Maximum package size for data packages was exceeded and Process terminated

    Hello Gurus,
    When I execute the process chain I get the message "Maximum package size for data packages was exceeded" and the process terminates. Can anybody help me with how to proceed in this case?
    Thanks & Regards,
    Suresh.

    Hi,
    When the load is not getting processed due to a huge volume of data, or a large number of records per data packet, please try the options below.
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome the issue.
    You can also try RSCUSTV* (where * is an integer) to change the data load settings.
    To change the data package size for extraction, use transaction RSCUSTV6.
    To change the data package size for uploads from an R/3 system, set the value in R/3 Customizing (SBIW -> General settings -> Control parameters for data transfer).
    In R/3: T-code SBIW -> General settings -> Maintain Control Parameters for Data Transfer (source-system specific).
    Hope this helps.
    Thanks,
    JituK

  • "Maximum package size for data packages was exceded".

    Hi,
    We are getting the error below:
    "Maximum package size for data packages was exceeded".
    In our scenario we load the data product-key-wise (the product key is also a semantic key) into the DSO through a start routine.
    The logic in the start routine calculates the unique product counts per product key. Hence we are trying to group by the product key through semantic groups.
    Ex: In this example the product counts should be A = 1, B = 2, C = 1.
      Product Key | Products
      A           | 1000100
      B           | 2000100
      C           | 3000100
      B           | 2000300
      C           | 3000100
    For some product keys the data is so huge that we cannot load it, and we get the error.
    Please suggest any alternative way to handle this through code or by introducing another flow.
    Regards,
    Barla

    Hi,
    We can solve the issue by opening up the system setting for the data package size. As shown below, we create two programs: one to open up the system settings and one to close (restore) them.
    1. Start program:
    * Set the control parameters for data transfer (table ROIDOCPRMS) for the source system
    data: z_roidocprms like table of roidocprms,
          wa           like line  of z_roidocprms.
    wa-slogsys  = 'system_client'.  " logical system name of the source system
    wa-maxsize  = '50000'.          " maximum size of a data packet
    wa-statfrqu = '10'.             " frequency of status IDocs
    wa-maxprocs = '6'.              " maximum number of parallel processes
    wa-maxlines = '50000'.          " maximum number of lines per data packet
    insert wa into table z_roidocprms.
    modify roidocprms from table z_roidocprms.
    2. Close program:
    data: z_roidocprms like table of roidocprms,
          wa           like line  of z_roidocprms.
    wa-slogsys  = 'system_client'.
    wa-maxsize  = '50000'.
    wa-statfrqu = '10'.
    wa-maxprocs = '6'.
    wa-maxlines = '50000'.
    insert wa into table z_roidocprms.
    modify roidocprms from table z_roidocprms.
    The data load InfoPackage settings have to be maintained accordingly.
    We then create the process chain as follows:
    1. Start program
    2. Data load InfoPackage
    3. Close program
    This might fix the problem.
    Regards,
    Polu.

  • Maximum package size for data packages was exceeded?

    Hi Experts,
    I am facing the problem "Maximum package size for data packages was exceeded" when I try to load. I even tried to reduce the data packet size and change the DTP extraction setting to "Get All New Data Request by Request", but the same error still occurs. Can you please shed some light on this?
    Thanks,
    Krishna

    You can refer to the OSS note below:
    Note 1144332 - Consulting note: Message RSBK 250: Package size exceeded
    And other related notes: 352038, 417307
    Hope this helps.
    Murali
