Difficulty compressing a large file - compact command fails

I am trying to compress a 419 GB SQL Server database, following the instructions in the compression example described here:
http://msdn.microsoft.com/en-us/library/ms190257(v=sql.90).aspx
I have set the filegroup to read-only, and taken the database offline.  I logged into the server that hosts the file, and ran this from a command prompt:
===================================================== 
F:\Database>compact /c CE_SecurityLogsArchive_DataSecondary.ndf
Compressing files in F:\Database\
CE_SecurityLogsArchive_DataSecondary.ndf [ERR]
CE_SecurityLogsArchive_DataSecondary.ndf: Insufficient system resources exist to
complete the requested service.
0 files within 1 directories were compressed.
0 total bytes of data are stored in 0 bytes.
The compression ratio is 1.0 to 1.
F:\Database>
===============================================
As you can see, it gave me an error: "Insufficient system resources exist to complete the requested service."  The drive has 564 GB free, so I doubt that is the issue.  Here are some specs on the server:
MS Windows Server 2003 R2, Enterprise x64 Edition, SP2
Intel Xeon E7420 CPU @ 2.13 GHz; 8 logical processors (8 physical)
7.99 GB RAM
Any suggestions on how to handle this?  I really need to get this large file compressed, and this method seems appealing if I can make it work, because the database can supposedly still be queried even after the file has been compressed.  If I use a compression utility like 7-Zip, I doubt that I'll be able to query the compressed file.

Hi,
Based on my knowledge, NTFS compression of a file larger than 30 gigabytes may not succeed, so the compact command is unlikely to work on a 419 GB file.
For detailed information, see:
http://msdn.microsoft.com/en-us/library/aa364219(VS.85).aspx
Regards,
Yan Li

Similar Messages

  • Difficulty compressing a huge database - compact command fails

    (This is the same question as the thread above: compact /c on the 419 GB CE_SecurityLogsArchive_DataSecondary.ndf file, run on a Windows Server 2003 R2 x64 server with 8 GB of RAM, fails with "Insufficient system resources exist to complete the requested service.")

    I hope you observed that this operation is only supported on read-only filegroups. Don't do this if you plan to write to the tables in that filegroup.
    I don't know the fine print of the COMPACT command, and a Windows forum is probably a better place to ask. But my guess is that the command works in memory, in which case 419 GB is far more than a mouthful for a machine with 8 GB of memory.
    There are compression facilities in SQL Server itself, but you need Enterprise Edition for this (and SQL 2008 or later). They are likely to give a better result than NTFS compression, and you don't have to set the filegroup to read-only. A rough sketch follows below.
    By the way, while 419 GB is not exactly a small database, it does not qualify as huge. There are databases a thousand times that size out there.
    Erland Sommarskog, SQL Server MVP, [email protected]
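    As an illustration of the built-in compression Erland mentions, here is a minimal sketch that rebuilds one table with page compression from Node.js. The table name, connection details, and the use of the mssql package are assumptions, not anything from the thread; only the ALTER TABLE statement itself is the documented SQL Server feature (Enterprise Edition, SQL 2008 or later).

    const sql = require('mssql');   // npm install mssql -- any SQL Server client would do

    async function compressArchiveTable() {
      // Placeholder connection details -- point these at the archive database.
      const pool = await sql.connect({
        server: 'myserver',
        database: 'CE_SecurityLogsArchive',
        user: 'someUser',
        password: 'somePassword',
        options: { trustServerCertificate: true }
      });
      // PAGE compression gives the best ratio; ROW is the lighter alternative.
      await pool.request()
        .query('ALTER TABLE dbo.SecurityLogArchive REBUILD WITH (DATA_COMPRESSION = PAGE)');
      await pool.close();
    }

    Unlike NTFS compression, the data stays fully readable and writable afterwards, which is exactly the point made above about not needing a read-only filegroup.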

  • BT Cloud - large file ( ~95MB) uploads failing

    I am consistently getting upload failures for any files over approximately 95MB in size.  This happens with both the Web interface, and the PC client.  
    With the Web interface the file upload gets to a percentage that would be around the 95MB mark, then fails, showing a red icon with an exclamation mark.
    With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0%, and repeats this continuously.  I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64 bit), and also my work PC (Win 7, 64 bit).  I've also verified it with multiple different types and sizes of files.  Everything from 1KB to ~95MB uploads perfectly, but anything above this size (I've tried 100MB, 120MB, 180MB, 250MB, 400MB) fails every time.
    I've completely uninstalled the PC Client, done a Windows "roll-back", reinstalled, but this has had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices), and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote control my PC, but he was completely unfamiliar with the application and after fumbling around for over two hours, he had no suggestion other than trying to wait for longer to see if the failure would clear itself !!!!!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.

    Hi,
    I too have been having problems uploading a large file (362Mb) for many weeks now and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client Backup.... after about 4 hrs of "progress" it reached 100% and an icon appeared.  I selected it and tried to Share it by email, only to have the share fail and no link.   Cloud backup thinks it's there but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant  things such as cleaning up my temporary internet files and cookies I stopped him.
    We did together successfully upload a small file and sharing that was successful - trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce again with a different tech.  After which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem.
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!

  • FTP and HTTP large file ( 300MB) uploads failing

    We have two IronPort Web S370 proxy servers in a WCCP transparent proxy cluster.  We are experiencing problems with users who upload large video files where the upload will not complete.  Some of the files are 2GB in size, but most are in the hundreds of megabytes.  Files smaller than a hundred meg seem to work just fine.  Some users are using the FTP proxy and some are using HTTP methods, such as YouSendIt.com. We have tried explicit proxy settings with some improvement, but it varies by situation.
    Is anyone else having problems with users uploading large files and having them fail?  If so, any advice?
    Thanks,
       Chris

    Have you got any maximum sizes set in the IronPort Data Security Policies section, under Web Security Manager?
    Thanks
    Chris

  • Compressing a large file into a small file.

    So I have a pretty large file that I am trying to make very small with good quality. The file before exporting is about 1 GB, and I need to make it 100 MB. Right now I've tried compressing it with the H.264 codec, and I am having to go as low as 300 kbit/s. I use AAC 48 for the audio. It is just way too pixelated to submit something like this. I guess I could make the actual video a smaller size, something like 720x480, and just letterbox it to keep it widescreen? Any hints on a good way to make this 21-minute video around 100 MB?

    There are three ways to decrease the file size of a video.
    1. Reduce the image size. For example, changing a 720x480 DV image to 320x240 will decrease the size by roughly a factor of 4.
    2. Reduce the frame rate. For example, changing from 30 fps to 15 fps will decrease the size by a factor of 2.
    3. Increase the compression / change the codec. This is the black magic part of online material. Only you can decide what's good enough.
    x
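    Working backwards from the target size gives a useful sanity check on the bitrate to aim for. A rough sketch in plain JavaScript; the 64 kbit/s audio allowance is an assumption, not something stated in the post:

    // Budget for a 100 MB file holding a 21-minute clip.
    const targetMegabytes = 100;
    const durationSeconds = 21 * 60;                   // 1260 s
    const audioKbps = 64;                              // assumed AAC audio bitrate

    const totalKbps = (targetMegabytes * 8 * 1000) / durationSeconds;  // ~635 kbit/s overall
    const videoKbps = totalKbps - audioKbps;                           // ~571 kbit/s left for video

    console.log(`Total budget ${Math.round(totalKbps)} kbit/s, video around ${Math.round(videoKbps)} kbit/s`);

    At roughly 600 kbit/s for 720x480 the picture will look rough, which is why the advice above about shrinking the frame size or frame rate matters: fewer pixels per second means each remaining pixel gets more bits.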

  • Downloading via XHR and writing large files using WinJS fails with message "Not enough storage is available to complete this operation"

    Hello,
    I have an issue that some users are experiencing but I can't reproduce it myself on my laptop. What I am trying to do is grab a file (a zip file) via XHR. The file can be quite big, like 500 MB. Then, I want to write it to the user's storage.
    Here is the code I use:
    DownloadOperation.prototype.onXHRResult = function (file, result) {
        var status = result.srcElement.status;
        if (status == 200) {
            var bytes = null;
            try {
                // This is where the "Not enough storage" error is thrown (see below).
                bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
            } catch (e) {
                try {
                    Utils.logError(e);
                    var message = "Error while extracting the file " + this.fileName + ". Try emptying your windows bin.";
                    if (e && e.message) {
                        message += " Error message: " + e.message;
                    }
                    var popup = new Windows.UI.Popups.MessageDialog(message);
                    popup.showAsync();
                } catch (ignored) { }
                this.onWriteFileError(e);
                return;
            }
            Windows.Storage.FileIO.writeBytesAsync(file, bytes).then(
                this.onWriteFileComplete.bind(this, file),
                this.onWriteFileError.bind(this)
            );
        } else if (status > 400) {
            this.error(null);
        }
    };
    The error happens at this line:
    bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
    With the description "Not enough storage is available to complete this operation". The user has only a C drive, with plenty of space available, so I believe the error message given by IE might be a little misleading. Maybe in some situations Uint8Array can't handle such a large file? The program fails on an "ASUSTek T100TA" but not on my laptop (a standard one).
    Can somebody help me with that? Is there a better way to write a downloaded binary file to disk without going through a Uint8Array?
    Thanks a lot,
    Fabien

    Hi Fabien,
    If Uint8Array works fine on the other computer, it should not be a problem with the API; instead it could be a setting or configuration issue on that device.
    Actually, using XHR for a 500 MB zip file is not suggested. Based on the documentation ("How to download a file"), the XHR helper wraps an XMLHttpRequest call in a promise, which is not a good approach for downloading big items; please use Background Transfer instead, which is designed to receive big items.
    A quick search on the Internet suggests the "not enough storage" error is a known issue when using XMLHttpRequest:
    http://forums.asp.net/p/1985921/5692494.aspx?PRB+XMLHttpRequest+returns+error+Not+enough+storage+is+available+to+complete+this+operation, though I'm not familiar with how to solve the XMLHttpRequest issues.
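    A minimal sketch of the Background Transfer approach, assuming the zip should land in the app's local folder; the URL and file name are placeholders, and reusing the poster's Utils.logError is an assumption:

    var uri = new Windows.Foundation.Uri("http://example.com/archive.zip");   // placeholder URL
    var localFolder = Windows.Storage.ApplicationData.current.localFolder;

    localFolder.createFileAsync("archive.zip", Windows.Storage.CreationCollisionOption.replaceExisting)
        .then(function (destinationFile) {
            // BackgroundDownloader streams straight into the destination file,
            // so the 500 MB payload never has to fit into a single Uint8Array.
            var downloader = new Windows.Networking.BackgroundTransfer.BackgroundDownloader();
            var download = downloader.createDownload(uri, destinationFile);
            return download.startAsync();
        })
        .done(function () {
            // The file is already on disk; no FileIO.writeBytesAsync call is needed.
        }, function (error) {
            Utils.logError(error);
        });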
    --James

  • Uploading large files to share fails - advice appreciated

    I wondered if anyone could help me with uploading large files. 
    The ability to distribute files to clients with the included 20 GB of server space was part of the reason I signed up to Creative Cloud; unfortunately I'm having very mixed results, and I'm constantly having to fall back on the free DropBox service, which seems much more reliable.
    I'm working on a MacPro - 10.6.8 with an optical-fibre-based 20 Mb/s ISP service; the service is, dare I say, usually very quick and reliable.  Before uploading a large file I set Energy Saver to "never sleep" in case I wander off for a cup of tea, but inevitably I come back to what looks like a "stuck" upload.  I've tried three times today to send a 285 MB file and they all failed, so I reverted to sending the file by DropBox... with no problem - I've also sent files in excess of 1 GB by DropBox in the past without an issue.
    Today's file had reached about 50% according to the little progress bar... it then stopped. (It's a shame the progress bar can't relay the actual upload speed and a percentage like DropBox, so you can see what's going on.)
    I have to state, I'm not a DropBox "fanboy" but it just seems to work...
    Are there any tricks or settings that could make Adobe's 20 GB of space a functioning part of my workflow?  It would be great to feel confident in it.
    Thanks in advance for any advice.

    Either Firefox or Chrome would be a good alternative.
    I would see if one of these browsers works for upload but we can also debug the issue on Safari.
    Here are the steps to debug on Safari:
    Add the Develop menu to Safari. From the menu choose Safari > Preferences. Select the Advanced tab and check the box for "Show Develop menu in menu bar".
    Once logged onto the Creative Cloud Files page (https://creative.adobe.com/files) Ctrl + Click anywhere on the page and from the browser context menu choose Inspect Element.
    In the window that opens in the bottom half of the browser click on the Clock icon (mouse over info tip says Instrument) and on the left select Network Requests.
    Try the upload and you will see various network requests. Look for those whose Name is "notification". When you select one in the list there will be an arrow next to "notification" and clicking on it will take you to the details. See screen shot. If successful you will have a Status of 200. Some of these "notification" items are for your storage quota being updated, but the one we want is for the successful upload.

  • Adobe Acrobat X Pro ZIP Compression Creates Larger Files

    Hello,
    I made a flyer in InDesign CS6 which contains a fairly complicated vector image made with Illustrator. When I export to PDF, InDesign creates a file of about 26 MB. I then run Preflight in Acrobat, which finds that the vector image isn't compressed.
    Note that my PDF settings in InDesign are set to bicubic subsampling for colour images that are above 450 px and JPG compression.
    When I tell Acrobat to apply fixes, it tries to apply ZIP compression but the file size is then 38 MB. Could anyone shed light on this for me please? There is obviously something basic about vector images and compression that I am missing.
    It also finds elements that are completely off the board, although I didn't put anything there with InDesign.

    Test Screen Name: After seeing your response I ran a little test. I made a PDF straight from Illustrator using just that image, then preflighted it, and I got the same error.
    I then opened that same PDF in Illustrator; a lot of the artwork had been turned into images. I think it's the transparency flattener, as I am converting to PDF/X-1a with compatibility for Acrobat 4 (which I should probably have mentioned in the first post, but I didn't make the connection between the two). As far as I understand, the transparency flattener converts an image into a mix of paths and rasters (according to Adobe's own help files).
    Preset used for preflight is sheetfed offset CMYK btw.

  • Large File Upload ~50mb Fails using Firefox/Safari. Upload Limit is 1gig set at Web App. IIS timeout set at 120 seconds.

    Strange issue.  The user can upload the file using IE9, but cannot do the same using Safari, Chrome, or Firefox.  The library resides in a site collection that was migrated from a 2010 site collection.  The timeout on IIS is 120 seconds. Very strange indeed.
    Haven't been able to trace the Correlation ID at this point.

    Try using the REST API for an upload this large if you are using a custom solution; a rough sketch follows the links below.
    Try the links below for reference:
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    https://angler.wordpress.com/2012/03/21/increase-the-sharepoint-2010-upload-file-size-limit/
    http://social.technet.microsoft.com/wiki/contents/articles/20299.sharepoint-2010-uploading-large-files-to-sharepoint.aspx
    http://stackoverflow.com/questions/21873475/sharepoint-2013-large-file-upload-1gb
    https://answers.uchicago.edu/page.php?id=24860
    http://sharepoint.stackexchange.com/questions/91767/multiple-file-upload-not-working
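    As a rough sketch of the REST approach (plain JavaScript; the library path "Shared Documents", the file name handling, and the digest variable are placeholders, not anything from this thread):

    // Upload an ArrayBuffer or Blob to a document library via the SharePoint REST API.
    function uploadLargeFile(fileContents, fileName, formDigest, onDone, onError) {
        var endpoint = _spPageContextInfo.webAbsoluteUrl +
            "/_api/web/GetFolderByServerRelativeUrl('Shared Documents')" +
            "/Files/add(url='" + fileName + "',overwrite=true)";
        var xhr = new XMLHttpRequest();
        xhr.open("POST", endpoint, true);
        xhr.setRequestHeader("Accept", "application/json;odata=verbose");
        xhr.setRequestHeader("X-RequestDigest", formDigest);   // obtain this from /_api/contextinfo first
        xhr.onload = function () { (xhr.status < 400 ? onDone : onError)(xhr); };
        xhr.onerror = function () { onError(xhr); };
        xhr.send(fileContents);
    }

    Because the request goes straight to the REST endpoint it sidesteps the library's upload page, while still being subject to the 1 GB Web Application limit and the IIS timeout mentioned above.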
    Please mark it as the answer if you find it useful. Happy SharePointing!

  • How to compress a large file before downloading it?

    Hi guys,
    This is my first post.
    I have a large Excel sheet, or a number of Excel sheets, that are generated using Jakarta POI. Before downloading them I have to compress them into a zipped format like Zip or GZip.
    I don't know how to start and am really confused at this stage about how to do that.
    Please help me complete my task.
    Thanks and regards,
    Chinnu

    Hey, thanks very much, man.
    You had the solution in a flash.
    If there is a problem I will get back to you.
    Thanks,
    Chinnu

  • Compress/segment large file for DVD data burn?

    I have a file that is about 10 meg (uncompressed video) and I would like to compress it or segment it into 4.5-4.7 GB chunks that I can burn to DVD and reassemble fairly easily.
    any ideas?
    thanks.

    Greetings,
    I use StuffIt Deluxe 12 for that sort of thing - it works really well for all forms of compression.
    Cheers,
    M.

  • Is there a way to "zip" or compress a large file so I can email it?

    I did not buy any zip software, so I'm just wondering if there is something that comes with OS X Snow Leopard that allows me to compress a file.

    Hi,
    How big is this file that you would like to send by mail?
    There is very good third-party software that you can use for free to zip or compress bigger files.
    I use StuffIt Expander; I think most of us do.
    You can download a copy from their website, or go to VersionTracker or MacUpdate.
    Dimaxum

  • Compressing large files

    How do I compress a large file (25 MB) to send via email? I have StuffIt, but no clue how to use it!
    Rose

    I used yousendit.com. Problem solved.

  • Large File Copy fails in KDEmod 4.2.2

    I'm using KDEmod 4.2.2 and I often have to transfer large files on my computer.  Files are usually 1-2 GB each and I have to transfer about 150 GB of them.  The files are stored on one external 500 GB HD and are being transferred to another identical 500 GB HD over FireWire.  I also transfer about 20-30 GB of the same type of 1-2 GB files from another external FireWire drive to the internal HD of my laptop.  If I try to drag and drop in Dolphin, it gets through a few hundred MB of the transfer and then fails.  If I use cp in the terminal, the transfer is fine.  When I was still distro hopping and using Fedora 10 with KDE 4.2.0 I had this same problem.  When I use GNOME this problem is nonexistent.  I do this often for work, so it is a very important function to me.  All drives are FAT32 and there is no option to change them, as they are used on several different machines/OSes before all is said and done, and the only file system that all of the machines will read is FAT32 (thanks to one machine, of course).  In many cases time is very important for the transfer, and that is why I prefer to do the transfer in a desktop environment, so I can see progress and ETA.  This is a huge deal breaker for KDE and I would like to fix it.  Any help is greatly appreciated, and please don't reply "just use GNOME".

    You can use any other file manager under KDE that works, you know? Just stop Nautilus from taking command of your desktop and you should be fine.
    AFAIR the display of the remaining time for a transfer comes at the cost of even more transfer time. And wouldn't some file synchronisation tool work for this task too? (Someone with more knowledge, please tell me if this would be a bad idea.)

  • Compressing large file into several small files

    What can I use to compress a 5 GB file into several smaller files that can be easily rejoined at a later date?
    thanks

    Hi, Simon.
    Actually, what it sounds like you want to do is take a large file and break it up into several compressed files that can later be rejoined.
    Two ideas for you:
    1. Put a copy of the file in a folder of its own, then create a disk image of that folder. You can then create a segmented disk image using the segment verb of the hdiutil command in Terminal. Disk Utility provides a graphical user interface (GUI) to some of the functions in hdiutil, but unfortunately not the segment verb, so you have to use hdiutil in Terminal to segment a disk image.
    2. If you have StuffIt Deluxe, you can create a segmented archive. This takes one large StuffIt archive and breaks it into smaller segments of a size you define.
    2.1. You first make a StuffIt archive of the large file, then use StuffIt's Segment function to break this into segments.
    2.2. Copying all the segments back to your hard drive and unstuffing the first segment (which is readily identifiable) will unpack all the segments and recreate the original, large file.
    I'm not sure if StuffIt Standard Edition supports creating segmented archives, but I know StuffIt Deluxe does, as I have that product.
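    If you'd rather script it than use a GUI tool, here is a rough sketch of the same segment-and-rejoin idea in Node.js; the file names, the 8 MB copy buffer, and the segment size in the usage note are made-up values, and error handling is omitted:

    const fs = require('fs');

    // Split a big file into numbered segments of roughly segmentBytes each
    // (a segment may overshoot by at most one copy buffer), reading a fixed-size
    // buffer at a time so the whole file never sits in RAM.
    function splitFile(path, segmentBytes) {
      const input = fs.openSync(path, 'r');
      const buffer = Buffer.alloc(8 * 1024 * 1024);          // 8 MB copy buffer
      let part = 0, bytesInPart = 0, read;
      let output = fs.openSync(path + '.0', 'w');
      while ((read = fs.readSync(input, buffer, 0, buffer.length, null)) > 0) {
        if (bytesInPart >= segmentBytes) {                    // start the next segment
          fs.closeSync(output);
          output = fs.openSync(path + '.' + (++part), 'w');
          bytesInPart = 0;
        }
        fs.writeSync(output, buffer, 0, read);
        bytesInPart += read;
      }
      fs.closeSync(output);
      fs.closeSync(input);
      return part + 1;                                         // number of segments written
    }

    // Rejoining is just concatenating the segments back in order.
    // (Fine while each individual segment fits comfortably in memory.)
    function joinFile(path, segments, rejoinedPath) {
      const output = fs.openSync(rejoinedPath, 'w');
      for (let part = 0; part < segments; part++) {
        fs.writeSync(output, fs.readFileSync(path + '.' + part));
      }
      fs.closeSync(output);
    }

    For example, splitFile('bigfile.mov', 1024 * 1024 * 1024) writes bigfile.mov.0, bigfile.mov.1, ... in roughly 1 GB pieces, and joinFile('bigfile.mov', 5, 'bigfile.rejoined.mov') puts them back together.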
    Good luck!
    Dr. Smoke
    Author: Troubleshooting Mac® OS X
