Generate large swap file (use QTParted to generate the partition)

I want to generate a 30GB swap area in Linux, using the full capacity of a hard drive as swap.
Can this be done with the Arch 0.5 install CD?
I assume I must skip the auto feature to do so. I do not know whether the CD installer will allow skipping the package-install steps and just install LILO.
Perhaps someone has done a similar thing?

i3839;
Thanks for the comments.
The desired setup writes not to the hdd but to RAM, such that all operations take place in RAM... no hdd used... no CD or DVD used (except to start the RAM install).
While running in RAM, all video is improved when uncompressed video is loaded (.vob, for instance), and the result is higher quality and better resolution.
The basic question is in reference to the Arch install CD and whether it can be used to set up an HDD with just swap (one or more partitions).
My experience with the Arch CD indicates it doesn't like the install to skip any steps. Perhaps I can try a partition install and then abort the rest. I think it will allow three separate swap (type 82) partitions: at least one of 2GB. Perhaps it is programmed, however, to require a root and a boot partition as well as a swap.
I suggest you read the previous post, which describes the install to RAM, called 'toram' in Knoppix. This doesn't use the hdd, just the DVD reader. It can, however, use an HDD copy made with the 'tohd' cheat code during a DVD install, combined with the 'toram' cheat code.
The end result is a RAM-only OS which can use USB hotplug devices for input and storage, turning them off when not in use.
The present system here has 2.5GB of RAM, but it is not able to load the 2GB DVD when attempting 'toram'. The same 'toram' cheat code is available for the CD version, which runs well in RAM (only 680MB). Using swap to augment RAM is recommended.
DVD versions hold close to 5GB of programs, compressed onto the 2.1GB DVD media.
I hope this post describes the need for the swap a little better.

Similar Messages

  • Large swap file (900MB), but no pageouts

    My MacBook Air 13" with 4GB of RAM accumulates a large swap file over time, but pageouts are 0 (pageins: 1.1 million). Does anyone know why? Is it possible to see which process/application currently has pages stored in the swap file?

    Sorry to dig this old thread up, but I am seeing identical behavior to the original poster, and I just wanted to say—you did an excellent job of explaining how pageins can be very large with no pageouts, but I don't think this explains the real mystery, which is that there is a large amount of swap space, and a large amount that the system says is used, but there are no pageouts. You have not explained how a swap file can grow in usage with no pageouts, and if I understand things correctly, this should not be possible.
    I'm having the same issue on my new MacBook Pro with Retina display. I have 16GB of RAM and for the most part I don't use more than 4-6GB of that—I bought it for the occasional times I need to do a lot of VM testing, but I haven't needed to do that yet. I consistently see my swap usage grow to be as large as 2-3GB with a total size for all the swapfiles in /var/vm being 3-4GB.
    I don't need the space, and the system isn't slow or anything. I just want to know how this is possible. I have been using Mac OS X for 10 years now, and working on linux servers for 5 years or so. I've never seen swap usage be more than 0KB when there are no page outs.
    I've attached some screenshots of what I am seeing:
    Screen capture from Activity Monitor.
    Screen capture from Terminal executing 'du -hsc /var/vm/swapfile*' to tally the total size of the swapfiles.
    I should note that it tends to take a day or two of use to start to see this, over a series of sleep cycles here and there. I put my laptop to sleep at night as well as to and from work, etc. It probably sleeps/wakes 5-7 times a day in all. I tend to notice that the usage creeps up, starting around 50 MB, then I will notice it being a few hundred some time later. It really makes me wonder if this has to do with some kind of integrated vs. discrete graphics switching or something, perhaps a very low-level operation that is somehow avoiding getting counted by the system's resource-tracking facilities. I have no idea, but I would love it if there were someone out there who could explain it or point me in the right direction.
    Thanks for your time.

  • Large Swap file

      Model Name:    MacBook Pro
      Model Identifier:    MacBookPro6,2
      Processor Name:    Intel Core i5
      Processor Speed:    2.53 GHz
      Number Of Processors:    1
      Total Number Of Cores:    2
      L2 Cache (per core):    256 KB
      L3 Cache:    3 MB
      Memory:    8 GB
      Processor Interconnect Speed:    4.8 GT/s
    Running the latest version of Parallels and Windows 7 64 bit.
    Using the Activity Monitor I find that I always have a large swap file (even when Parallels is not running), a huge amount of inactive memory, and very little free memory. It slows down my system. Any suggestions?

    First, upgrade your system to 10.6.7 or 10.6.8. Second, large swap files don't necessarily mean a thing. You might see the following:
    About OS X Memory Management and Usage
    Reading system memory usage in Activity Monitor
    Memory Management in Mac OS X
    Performance Guidelines- Memory Management in Mac OS X
    A detailed look at memory usage in OS X
    Understanding top output in the Terminal
    The amount of available RAM for applications is the sum of Free RAM and Inactive RAM. This will change as applications are opened and closed or change from active to inactive status. The Swap figure represents an estimate of the total amount of swap space required for VM if used, but does not necessarily indicate the actual size of the existing swap file. If you are really in need of more RAM, that would be indicated by how frequently the system uses VM. If you open the Terminal and run the top command at the prompt, you will find information reported on Pageins and Pageouts. Pageouts is the important figure: if the value in the parentheses is 0 (zero), then OS X is not making instantaneous use of VM, which means you have adequate physical RAM for the system with the applications you have loaded. If the figure in parentheses is running positive and your hard drive is constantly being used (thrashing), then you need more physical RAM.
    Adding RAM only makes it possible to run more programs concurrently.  It doesn't speed up the computer nor make games run faster.  What it can do is prevent the system from having to use disk-based VM when it runs out of RAM because you are trying to run too many applications concurrently or using applications that are extremely RAM dependent.  It will improve the performance of applications that run mostly in RAM or when loading programs.
    Bear in mind you are running Parallels and a VM concurrently with some other OS X applications. Too many concurrent applications will result in using too much memory and increasing swapping.

  • Reading large XML file using a file event generator and a JPD process

    I am using a FileEventGenerator and a JPD subscription process to read a large XML file. The large XML file basically contains repeated XML elements. My understanding is that the file-subscription method reads the whole file into memory, which causes lots of problems for larger files (even around 1MB). Is there a way to read the file size-wise, or a way to read chunks of data from a large file, or any other alternative? I would like to process the file in a loop, iteration by iteration.

    Hitejain,
    Here are a couple of pointers you could try. One is that the file event generator has a pass-by-reference (filename) functionality, which you could use to do the following inside of your process:
    1) Read the file name from the reference.
    2) Move the file to a processed directory (so it doesn't get picked up again). Note: I don't know how the embedded archive methods of the file event generator play with pass-by-reference.
    3) Open a stream to the file.
    4) Use a SAX or combined SAX-DOM approach to parse your XML while managing the memory usage inside of your process.
    There is another possibility which might fit your needs, related to the RawData object that BEA provides. If I understand it correctly, it provides wrapping functionality around a stream object, but depending on your parsing method it might just postpone the problem.
    Hope this helps
    Chris Falling
    Stormforge Software
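    To make the SAX suggestion in step 4 concrete, here is a minimal, self-contained sketch. The element name `record` and the inline document are hypothetical; in practice the input would be a stream over the large file. It counts repeated elements without ever building a full DOM:

    ```java
    import java.io.ByteArrayInputStream;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class SaxCount {
        public static void main(String[] args) throws Exception {
            // A small inline document stands in for the large file here.
            String xml = "<records><record/><record/><record/></records>";
            final int[] count = {0};
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), new DefaultHandler() {
                @Override
                public void startElement(String uri, String local, String qName, Attributes atts) {
                    // Process one record at a time; nothing accumulates in memory.
                    if ("record".equals(qName)) count[0]++;
                }
            });
            System.out.println("records seen: " + count[0]);
        }
    }
    ```

    Because the handler sees one element at a time, memory use stays flat regardless of how many repeated elements the file contains.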

  • Reading and Writing large Excel file using JExcel API

    hi,
    I am using JExcelAPI for reading and writing Excel files. My problem is that when I read a file with 10000 records and 95 columns (file size about 14MB), I get an out-of-memory error and the application crashes. Can anyone tell me whether there is any way to read large files using JExcelAPI through streams, or in any other way? Jakarta POI is also showing this behaviour.
    Thanks in advance

    Sorry, when the out-of-memory error occurs no stack trace is printed, as the application crashes. But I will quote some lines taken from JProfiler where the problem occurs:
              reader = new FileInputStream(new File(filePath));
              workbook = Workbook.getWorkbook(reader);
              sheet = workbook.getSheet(0); // out-of-memory error occurs here
    JProfiler tree:
    jxl.Workbook.getWorkBook
          jxl.read.biff.File
                 jxl.read.biff.CompoundFile.getStream
                       jxl.read.biff.CompoundFile.getBigBlockStream
    Thanks
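    The common thread here is loading the entire workbook into memory at once. Whatever the spreadsheet library, the bounded-memory alternative is to process the input as a stream. A library-agnostic sketch in plain Java (the file name is illustrative; real code would parse each chunk rather than just count bytes):

    ```java
    import java.io.BufferedInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class ChunkReader {
        public static void main(String[] args) throws IOException {
            // Write a small sample file so the sketch is self-contained.
            File f = new File("large-input.bin");
            try (FileOutputStream out = new FileOutputStream(f)) {
                out.write(new byte[100_000]);
            }
            // Read it back in fixed-size chunks instead of loading it whole;
            // peak memory stays at the buffer size, not the file size.
            byte[] buf = new byte[8192];
            long total = 0;
            try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
                int n;
                while ((n = in.read(buf)) != -1) {
                    total += n; // process each chunk here
                }
            }
            System.out.println("bytes processed: " + total);
        }
    }
    ```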

  • Error uploading large txt file using GUI_UPLOAD

    Hi everyone.
    The situation is as follows: I have to process (batch input) an extremely large text file of about 80 MB. As you can imagine, all I receive when I run my program is a dump somewhere inside the GUI_UPLOAD function module due to excessive memory usage.
    Does anybody know of a way to deal with this kind of file? Is there any function that allows partial processing or something like that?
    Thanks a lot,
    Fernando.

    If you have to process it all at once, you can have Basis FTP it to your application server and then use OPEN DATASET and TRANSFER rather than GUI_UPLOAD. This would probably be the quickest solution.
    Rob

  • Loading Large XML files  using plsql

    I have a process where there is a need to load large XML files (easily 500k or more) into Oracle via an interface. The preference would be to use PL/SQL or some PL/SQL-based utility if possible. I am looking for any suggestions on the best method to accomplish this. Currently running on 9.2.0.6. Thanks in advance.


  • Issue with generating large pdf file using cfdocument tags in CF9

    We are in the process of upgrading our code to use CF9 and the cfdocument tag (from the old cfx_pdf tags). We have successfully gotten one piece of our code to work, but the file size of the PDF that we are generating now is huge in comparison to what it was using the CFX_PDF tags (i.e., with the new code the file is 885 KB, versus only 11 KB with the old code). We are not embedding fonts, so fontEmbed = "no" didn't make a difference for us. We do have all of our images as .jpgs, but unfortunately, due to the volume of images that we have, we cannot switch all these files into another format. Is there a way to shrink or optimize the PDF file size that we are generating?
    Thanks so much for your help.
    Claudia


  • Editing a large PHP file, using {, }, (, ), or comma causing DW to lock-up

    I have a PHP file totalling around 7,000 lines of code. When I type {, }, (, ), or a comma at any point in the file, DW locks up for a few seconds, then continues. This has been happening ever since I started using Dreamweaver CS4; it never occurred with older versions.
    Any ideas?
    [Post moved to more appropriate forum -Forum Moderator]

    I don't have PHP files that are as long as that, so have no way of testing. However, one possibility is that Syntax Error Alerts are turned on. Dreamweaver CS4 doesn't check for PHP syntax errors; the syntax checker is for JavaScript. However, having the option turned on might cause the problem, because PHP and JavaScript both use braces and parentheses. Try turning off real-time syntax checking. It's toggled on and off by clicking the button in the Coding toolbar, as shown in the following screenshot:
    When turned on, the button looks pressed in like the two immediately above it in the screenshot.
    If that doesn't improve the situation, I suggest that you file a report to the Dreamweaver engineering team through the bug report form at http://www.adobe.com/cfusion/mmform/index.cfm?name=wishform. Give as much detail as possible to help the engineers try to reproduce the problem.

  • Handling large xml files using tree

    hi,
    I have an issue with a tree having XML data as the dataprovider. The front end makes an HTTP service call to a servlet on a server to get the string representation of an XML file. The string is converted to XML in the Flex front end and displayed as a tree. Everything works fine when the XML data is small, but as the XML data grows in size, the handling of data in the tree becomes very slow and the application sometimes does not respond. Can anyone tell me how I could get this issue solved, or if there is a better way to solve it? Or is there a way that the tree gets displayed only when all the nodes are loaded properly, so there are no lags while the user tries to navigate through it?
    Thanks in advance...

    Hi,
    Did you get a reply for this? If so, can you please share it with me too?
    I need to send XML and also set two parameters over HTTP POST to a servlet...
    Thanks,
    -uday.
    [email protected]

  • How to diagnose large swap usage

    Hi,
    We have a Java application that connects to our Oracle 10g database. When the application is not connected, swap file usage on the Solaris box is about 1GB. But as soon as we start up the Java application and connect it to Oracle, swap usage jumps to a massive 16GB (and stays like that until we shut down the application; it does not really grow any more).
    The application actually looks fine and works without problems, except that the database is hammering the machine with the swap file. Can anyone give me some hints on where I can look on the database side to try to diagnose what is actually causing this large swap file creation, in an attempt to understand what this application is doing? At the moment I am confused as to what could actually cause this.
    thx.
    S.

    Hi, I'd use the iosnoop, iotop, rwsnoop and rwtop commands available in the DTrace toolkit (e.g. http://users.tpg.com.au/adsln4yb/DTrace/dtracevstruss.html). They can tell you a lot about I/O operations, including swap activity. But this only applies at the OS level; I can't help you with the Oracle side...

  • VAR and Swap Files - Taking up 3 gigs of space!

    My 60 gig HD has been getting close to its size limit so I ran DiskSweeper to see if there were some unneeded files etc. that I could throw away. I discovered that the VAR folder was using 1.95 gigs and in the VM folder, the swap files were taking up about 1 gig...
    I guess these are essential system files and can’t be deleted or reduced in size, or can they?
    Thanks in advance!

    To clear the swap file(s), restart your computer. I've never had more than two swap files, but then I have enough RAM for what I do. If your system keeps creating large swap files, it is a sign you need more RAM. If you can't get more, you'll need to limit the number of RAM-hungry programs you have running at the same time. Because of the things that get dynamically created as you work, such as swap files and temp files, you should have a minimum of 10% of your drive free, and 15% would be safer. If I were you, I would make sure I always had 10GB free. You might look into getting an external drive and moving some of your files over to it. For instance, you can move your iTunes music library to another drive--I've done that because my startup drive is also a "mere" 60GB.
    Francine
    Schwieder

  • Is it safe to have the swap file on a separate partition?

    I've just bought a second hand MBP 2012 (9,2 I think) with a 500GB hard drive and Yosemite pre-installed. Not going to get into a debate about it here, but I want to regress to Lion or Mountain Lion until Apple improves Yosemite's bugs and software manufacturers improve compatibility. However, I wouldn't mind also getting to know Yosemite. So my plan is to partition my drive and keep Yosemite on one partition, have Lion on the other, and use Lion for day to day stuff for the time being.
    Obviously this involves a shrinking hard drive. I have an image of Lion's installer on another partition in case I lose the DVD (which happens to me far too often), so at the moment my partition scheme breaks down as 10GB for Mountain Lion's installer, 100GB for Yosemite, and the remaining 390 (roughly) for Mountain Lion.
    Since the Yosemite partition is quite small, would it work if I made a symlink from /private/var/vm to the same folder on the Lion partition? The two folders will never be in use at the same time, so I can't think of any reason this wouldn't work--which would mean that the sleepimage and the swap files would all be located on the bigger partition, and the small size of my Yosemite partition wouldn't be a problem. Obviously, when the time comes, I would get rid of the Mountain Lion partition altogether and make the Yosemite one a lot bigger, but would that be an OK setup for now? Would it degrade performance for any reason if the swap files were on a separate partition?
    Anyone ever tried this?

    Thanks for the answer. The project is stored, saved, or burned to a DVD.
    When I put the burnt DVD in my "E" drive (a DVD player/burner installed in my computer), it comes up as my project in the "E" drive.
    It is a complete Premiere Pro CS4 project. It has all my edits and effects, just like it does when I open it from my HDD.
    I can put the DVD in my wife's computer and it shows it is there, but it will not open because she does not have CS4 installed.
    So it is not a movie. It is an exact copy of what the project looks like on my HDD.
    I can edit on it and do everything I did from the HDD copy.
    Hope this helps. Please feel free to question my responses.
    I really do want to clean up my HDD and start over with a single file, hopefully generated from the DVD.
    It seems like I could delete my files. It would be like I made this copy and sent it to you to do a final edit and add menus. You would not have the original files on your HDD.
    After you loaded it, you could then send it to your HDD and do whatever - right?
    Jim

  • How do I break a pdf into smaller pdf files using Pro?

    How do I make a smaller PDF (e.g. pages 17-22) from a larger PDF file using Adobe Acrobat Pro?

    It would help if you told us what version of Acrobat Pro you have. (The interface changed substantially in Acrobat X Pro.)
    If you're using Acrobat 9 Pro or Acrobat 8 Professional, choose Document > Extract Pages.

  • Changing /updating an xml file using JAXP(DOM)

    Hello,
    I am fairly new to XML and am using it in my degree project. I am able to retrieve and read data from a fairly large XML file using JAXP (DOM) and/or XMLBeans. I am having difficulties updating the XML document. Any update, I believe, is to be saved into a new XML document, but I don't know how to proceed with it. Any help would be appreciated.
    Following is a snippet of my code using JAXP. Here I am able to retrieve data from the source file.
    File document = new File("C:\\tester.xml");
    try {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        DocumentBuilder parser = factory.newDocumentBuilder();
        Document doc = parser.parse(document);
        System.out.println(document + " is well-formed.");
        NodeList n2 = doc.getElementsByTagName("Top");
        NodeList n3 = doc.getElementsByTagName("Base");
        int x = n2.getLength();
        System.out.println("There are " + x + " players");
        for (int g = 0; g < x; g++)
            System.out.println("Top is " + n2.item(g).getFirstChild().getNodeValue()
                + " Base is " + n3.item(g).getFirstChild().getNodeValue());
    } catch (Exception e) {
        e.printStackTrace();
    }
    --------------------------------------------------------------------------------

    Following is my update code for the DOM tree:
    NodeList list = doc.getElementsByTagName("Information");
    for (int i = 0; i < list.getLength(); i++) {
        Node thissampnode = list.item(i);
        Node thisNameNode = thissampnode.getFirstChild();
        if (thisNameNode == null) continue;
        if (thisNameNode.getFirstChild() == null) continue;
        // if (!(thisNameNode.getFirstChild() instanceof org.w3c.dom.Text)) continue;
        String data = thisNameNode.getFirstChild().getNodeValue();
        if (!data.equals("0.59")) continue;
        Node newsampNode = doc.createElement("Samp");
        Node newsampTopNode = doc.createElement("Top");
        Text tnNode = doc.createTextNode("0.50");
        newsampTopNode.appendChild(tnNode);
        Element newsampRef = doc.createElement("Ref");
        Text tsr = doc.createTextNode("0");
        newsampRef.appendChild(tsr);
        Element newsampType = doc.createElement("Type");
        Text tt = doc.createTextNode("z");
        newsampType.appendChild(tt);
        Element newsampbase = doc.createElement("Base");
        Text sb = doc.createTextNode("0.55");
        newsampbase.appendChild(sb);
        newsampNode.appendChild(newsampTopNode);
        newsampNode.appendChild(newsampRef);
        newsampNode.appendChild(newsampType);
        newsampNode.appendChild(newsampbase);
        thissampnode.getParentNode().insertBefore(newsampNode, thissampnode); // was rootNode, which is undefined here
    }
    Here I don't see any changes to the original XML source file.
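    A likely explanation for the unchanged file: JAXP DOM edits happen purely in memory, and nothing reaches disk until the tree is explicitly serialized. A minimal sketch of that missing step using the standard JAXP Transformer API (the file names and the tiny document are illustrative):

    ```java
    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class SaveDom {
        public static void main(String[] args) throws Exception {
            // Create a tiny source document so the sketch is self-contained.
            Files.write(Paths.get("tester.xml"),
                    "<Samples><Samp><Top>0.59</Top></Samp></Samples>".getBytes("UTF-8"));

            // Parse and modify the DOM tree in memory.
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(new File("tester.xml"));
            Element extra = doc.createElement("Ref");
            extra.appendChild(doc.createTextNode("0"));
            doc.getDocumentElement().appendChild(extra);

            // Serialize the modified tree back to disk -- the step that was missing.
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.transform(new DOMSource(doc), new StreamResult(new File("tester-updated.xml")));

            System.out.println(new String(Files.readAllBytes(
                    Paths.get("tester-updated.xml")), "UTF-8").contains("<Ref>0</Ref>"));
        }
    }
    ```

    Writing to a different file (tester-updated.xml here) keeps the original intact; pointing the StreamResult back at tester.xml itself would also work.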
