Empty trash with large file causes immediate restart screen

I had a 200+ GB iMovie project which I dragged to the Trash. During the delete, the system "crashed" with a gray screen (a kernel panic) telling me to perform a forced hard restart. After rebooting, the file shows as 70+ GB, so perhaps some bad sectors or something? Anyway, I'm unable to trash what is left of this file, as I get the gray restart screen every time.
Any help would be appreciated. Thanks!!

Hi, and a warm welcome to the forums!
How much free space on what sized HD please?
If you haven't already, start with these two steps, which may fix it as well...
"Try Disk Utility
1. Insert the Mac OS X Install disc that came with your computer, then restart the computer while holding the C key.
2. When your computer finishes starting up from the disc, choose Disk Utility from the Installer menu. (In Mac OS X 10.4 or later, *you must select your language first.)*
*Important: Do not click Continue in the first screen of the Installer. If you do, you must restart from the disc again to access Disk Utility.*
3. Click the First Aid tab.
4. Click the disclosure triangle to the left of the hard drive icon to display the names of your hard disk volumes and partitions.
5. Select your Mac OS X volume.
6. Click Repair. Disk Utility checks and repairs the disk."
Then Safe Boot off the HD and use Disk Utility from there to Repair Permissions, then reboot once more.
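If you're comfortable in Terminal, the same checks can be made from there; a minimal sketch (the repairPermissions verb assumes an older OS X release, since it was removed in 10.11, and the output labels vary a little by version):
df -h /                                   # free space on the boot volume
diskutil info / | grep -i "space"         # volume size / free space details
diskutil verifyVolume /                   # read-only check while booted normally
sudo diskutil repairPermissions /         # same as Repair Permissions in Disk Utility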

Similar Messages

  • Can't Empty Trash With Large Number of Files

    Running OS X 10.8.3
    I have a very large external drive that had a Time Machine backup on the main partition. At some point I created a second partition, then started doing backups on the new partition. On Wednesday, I finally got around to doing some "housecleaning" tasks I'd been putting off. As part of that, I decided to clean up my external drive. So... I took the old, unused and unwanted Backups.backupdb that used to be the Time Machine backup and dragged it to the Trash.
    Bad idea.
    Now I've spent the last 3-4 days trying various strategies to actually empty the trash and reclaim the gig or so of space on my external drive.  Initially I just tried to "Empty Trash", but that took about four hours to count up the files just to "prepare to delete" them. After the file counter stopped counting up, and finally started counting down... "Deleting 482,832 files..." "Deleting 482,831 files..." etc, etc...  I decided I was on the path to success, so left the machine alone for 12-14 hours.
    When I came back, the results were not what I expected. "Deleting -582,032 files..."  What the...?
    So after leaving that to run for another few hours with no results, I stopped that process.  Tried a few other tools like Onyx, TrashIt, etc...  No luck.
    So I finally decided to say the **** with the window manager, pulled up a terminal, cd'ed to the .Trash directory for my UID on the USB volume and ran rm -rfv Backups.backupdb
    While it seemed to run okay for a while, I started getting errors saying "File not found..." and "Invalid file name..." and various other weird things. So now I'm doing a combination of rm -rf'ing individual directories and using the Finder to rename/clean up individual folders when OS X refuses to delete them.
    Has anyone else had this weird overflow issue with deleting large numbers of files in 10.8.x? Doesn't seem like things should be this hard...

    I'm not sure I understand this bit:
    If you're on Leopard 10.5.x, be sure you have the "action" or "gear" icon in your Finder's toolbar (Finder > View > Customize Toolbar). If there's no toolbar, click the lozenge at the upper-right of the Finder window's title bar. If the "gear" icon isn't in the toolbar, select View > Customize Toolbar from the menubar.
    Then use the Time Machine "Star Wars" display:  Enter Time Machine by clicking the Time Machine icon in your Dock or select the TM icon in your Menubar.
    And this seems to defeat the whole purpose:
    If you delete an entire backup, it will disappear from the Timeline and the "cascade" of Finder windows, but it will not actually delete the backup copy of any item that was present at the time of any remaining backup. Thus you may not gain much space. This is usually fairly quick.
    I'm trying to reclaim space on a volume that had a time machine backup, but that isn't needed anymore. I'm deleting it so I can get that 1GB+ of space back. Is there some "official" way you're supposed to delete these things where you get your hard drive space back?
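    For reference, the supported way to remove old backups on 10.7 and later (so it applies to the 10.8.3 system above) is tmutil delete rather than dragging Backups.backupdb to the Trash; a hedged sketch, with made-up volume, machine and snapshot names:
    # delete one dated snapshot from the old backup set
    sudo tmutil delete "/Volumes/MyExternal/Backups.backupdb/MyMac/2012-01-15-083000"
    # or loop over every snapshot in that backup set
    for snap in /Volumes/MyExternal/Backups.backupdb/*/*; do
        sudo tmutil delete "$snap"
    done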

  • Photoshop CS6 keeps freezing when I work with large files

    I've had problems with Photoshop CS6 freezing on me and giving me RAM and Scratch Disk alerts/warnings ever since I upgraded to Windows 8. This usually only happens when I work with large files; however, once I work with a large file, I can't seem to work with any file at all that day. Today, though, I received my first error in which Photoshop says that it has stopped working. I thought that if I post the event info about the error, it might help someone to help me. The log info is as follows:
    General info
    Faulting application name: Photoshop.exe, version: 13.1.2.0, time stamp: 0x50e86403
    Faulting module name: KERNELBASE.dll, version: 6.2.9200.16451, time stamp: 0x50988950
    Exception code: 0xe06d7363
    Fault offset: 0x00014b32
    Faulting process id: 0x1834
    Faulting application start time: 0x01ce6664ee6acc59
    Faulting application path: C:\Program Files (x86)\Adobe\Adobe Photoshop CS6\Photoshop.exe
    Faulting module path: C:\Windows\SYSTEM32\KERNELBASE.dll
    Report Id: 2e5de768-d259-11e2-be86-742f68828cd0
    Faulting package full name:
    Faulting package-relative application ID:
    I really hope to hear from someone soon. My job requires me to work with Photoshop every day, I run into errors and bugs almost constantly, and all of the help I've received so far from people in my office doesn't seem to make much difference. I'll be checking in regularly, so if you need any further details or need me to elaborate on anything, I should be able to get back to you fairly quickly.
    Thank you.

    Here you go Conroy.  These are probably a mess after various attempts at getting help.

  • Wpg_docload fails with "large" files

    Hi people,
    I have an application that allows the user to query and download files stored in an external application server that exposes its functionality via webservices. There's a lot of overhead involved:
    1. The user queries the file from the application and gets a link that allows her to download the file. She clicks on it.
    2. Oracle submits a request to the webservice and gets an XML response back. One of the elements of the XML response is an embedded XML document itself, and one of its elements is the file, encoded in base64.
    3. The embedded XML document is extracted from the response, and the contents of the file are stored into a CLOB.
    4. The CLOB is converted into a BLOB.
    5. The BLOB is pushed to the client.
    Problem is, it only works with "small" files, less than 50 KB. With "large" files (more than 50 KB), the user clicks on the download link and about one second later gets:
    The requested URL /apex/SCHEMA.GET_FILE was not found on this server
    When I run the webservice outside Oracle, it works fine. I suppose it has to do with PGA/SGA tuning.
    It looks a lot like the problem described at this Ask Tom question.
    Here's my slightly modified code (XMLRPC_API is based on Jason Straub's excellent [Flexible Web Service API|http://jastraub.blogspot.com/2008/06/flexible-web-service-api.html]):
    CREATE OR REPLACE PROCEDURE get_file ( p_file_id IN NUMBER )
    IS
        l_url                  VARCHAR2( 255 );
        l_envelope             CLOB;
        l_xml                  XMLTYPE;
        l_xml_cooked           XMLTYPE;
        l_val                  CLOB;
        l_length               NUMBER;
        l_filename             VARCHAR2( 2000 );
        l_filename_with_path   VARCHAR2( 2000 );
        l_file_blob            BLOB;
    BEGIN
        SELECT FILENAME, FILENAME_WITH_PATH
          INTO l_filename, l_filename_with_path
          FROM MY_FILES
         WHERE FILE_ID = p_file_id;
        l_envelope := q'!<?xml version="1.0"?>!';
        l_envelope := l_envelope || '<methodCall>';
        l_envelope := l_envelope || '<methodName>getfile</methodName>';
        l_envelope := l_envelope || '<params>';
        l_envelope := l_envelope || '<param>';
        l_envelope := l_envelope || '<value><string>' || l_filename_with_path || '</string></value>';
        l_envelope := l_envelope || '</param>';
        l_envelope := l_envelope || '</params>';
        l_envelope := l_envelope || '</methodCall>';
        l_url := 'http://127.0.0.1/ws/xmlrpc_server.php';
        -- Download XML response from webservice. The file content is in an embedded XML document encoded in base64
        l_xml := XMLRPC_API.make_request( p_url      => l_url,
                                          p_envelope => l_envelope );
        -- Extract the embedded XML document from the XML response into a CLOB
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/methodResponse/params/param/value/string/text()').getclobval(), 1 );
        -- Make a XML document out of the extracted CLOB
        l_xml := xmltype.createxml( l_val );
        -- Get the actual content of the file from the XML
        l_val := DBMS_XMLGEN.convert( l_xml.extract('/downloadResult/contents/text()').getclobval(), 1 );
        -- Convert from CLOB to BLOB
        l_file_blob := XMLRPC_API.clobbase642blob( l_val );
        -- Figure out how big the file is
        l_length    := DBMS_LOB.getlength( l_file_blob );
        -- Push the file to the client
        owa_util.mime_header( 'application/octet', FALSE );
        htp.p( 'Content-length: ' || l_length );
        htp.p( 'Content-Disposition: attachment;filename="' || l_filename || '"' );
        owa_util.http_header_close;
        wpg_docload.download_file( l_file_blob );
    END get_file;
    /
    I'm running XE, PGA is 200 MB, SGA is 800 MB. Any ideas?
    Regards,
    Georger

    Script: http://www.indesignsecrets.com/downloads/MultiPageImporter2.5JJB.jsx.zip
    It works great for files up to ~400 pages; when I have more pages than that, I get the crash at around page 332.
    Thanks

  • Unable to empty trash with data from Time Machine Drive

    Hi there,
    Cause of the current issue:
    I am hoping the experts can help me with this one. I have an external hard drive which is used for my Time Machine backups. Recently there has been an occasional error when backing up to this drive. The error only occurs when the system does the backup automatically, never when I do the backup manually. One of the threads I read suggested deleting some older backup files from the Time Machine drive to solve the issue, and all would be good. That worked for a while... but now a new problem:
    New Problem:
    I have been able to empty all of the trash save one particular set of folders. So the trash always appears with content.
    The structure of the trash content appears as such...
    Macintosh HD
    +-Users
    +--<username>
    +---Library
    +----FK6T0KGc9y4KBB <-- This changes each time I try to empty the trash.
    The library folder always has the time/date of the last time I tried to empty the trash.
    What I have tried:
    1. I have tried to empty the trash Securely. No good.
    2. I have tried the terminal command: sudo rm -rf ~/.Trash with no success
    3. I have tried this terminal command: sudo rm -rf ~/.Trash/* with no success
    4. I have tried just plain old Empty Trash, but I get the message "This operation cannot be completed because the item 'FK6T0KGc9y4KBB' is in use."
    Any thoughts, recommendations?
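    Something worth checking before anything more drastic: lsof can usually tell you which process is holding an "in use" item open. A rough sketch, using the randomly named folder from the message above (the exact path will depend on where the stuck item actually lives):
    sudo lsof +D ~/.Trash              # list open files under your home Trash, if that's where it is
    sudo lsof | grep FK6T0KGc9y4KBB    # or search all open files for the stubborn folder name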

    Hey thanks for the help anyway VK.
    I tried that command and no go either.
    Last login: Fri Jul 25 08:13:39 on console
    iMac:~ tmlohnes$ sudo rm -rf /Volumes/Time\ Machine\ Backups/.Trashes/501/*
    Password:
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634/Macintosh HD/Users/tmlohnes/Library/8xMlmPGUHzilET: Directory not empty
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634/Macintosh HD/Users/tmlohnes/Library: Directory not empty
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634/Macintosh HD/Users/tmlohnes: Directory not empty
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634/Macintosh HD/Users: Directory not empty
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634/Macintosh HD: Directory not empty
    rm: /Volumes/Time Machine Backups/.Trashes/501/2008-03-28-160634: Directory not empty
    iMac:~ tmlohnes$
    I am wondering, would it make sense to try this same command if I were to do a safe boot?
    What about formatting the external hard drive and starting fresh?
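    If you do decide to format the external drive and start fresh, erasing it and letting Time Machine begin a new backup set is the cleanest way out. A rough Terminal sketch, assuming (hypothetically) that the external shows up as disk2; double-check with diskutil list first, because this wipes everything on that disk, backups included:
    diskutil list
    diskutil eraseDisk JHFS+ "Time Machine Backups" disk2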

  • Is it possible to create a secure empty trash shortcut in file menu?

    I have secondary (two-finger) click enabled so that I can quickly move files to the trash bin from the contextual menu. This is fine and dandy; however, for files containing sensitive information, I'd like to have a "Secure File Delete" option available (the file bypasses the trash bin and is wiped after selecting said option).
    I've searched the threads and checked System Prefs. The only option is to change "Empty Trash" to "Secure Empty Trash". This wouldn't work, as most of the files I delete are typically just trashed, not wiped. It'd be nice to have it available as a convenience, I suppose.
    I know there's an app in the App Store that offers this functionality. It's $3-4.
    So my question... is it possible to create such a contextual menu option on my own?
    Here's an example of what I'm talking about...

    Create a new Service in Automator.
    Set it to receive files and folders in the Finder.
    Drag in an Ask for Confirmation action if you want it.
    Drag in a Run Shell Script action from the Utilities section of the Library.
    Set it to Pass Input as Arguments and Replace the code with:
    srm -r "$@"
    --That's ess-ar-em for Secure ReMove (just copy and paste)
    Save it and it will show up in the Services menu when you right-click on an item in the Finder.
    There are options you can add to it, like -s for simple or -m for medium. The default (without options) is the 35-pass Gutmann wipe.
    You can see what options are available by opening Terminal and typing
    man srm
    Hit space to scroll down. Q to quit the man page.
    The Verbose and Interactive options won't work, since you can't respond.
    Note that depending on the algorithm chosen, it may take some time for the file/folder to disappear from the Finder view.
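    To get a feel for those options before wiring them into the Service, you can run the same command by hand on a throwaway folder (the path below is just an example):
    mkdir -p /tmp/srm-test && touch /tmp/srm-test/dummy.txt
    srm -m -r -v /tmp/srm-test     # medium (7-pass) secure removal, recursive, verbose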

  • Sample editor display jumps to file begin on zoom with large files

    Hey - does anyone else have this problem:
    When I zoom in to the maximum possible zoom level in the sample editor in Logic (e.g. to edit single samples with the pencil), in any audio file longer than 12:41 the waveform display suddenly jumps to the very beginning of the file.
    It happens using either the zoom-in key, the zoom slider or the zoom tool. It is somewhat infuriating, because I have to guess when to stop pressing zoom-in to get as close as I can without triggering the jump. (If I go one zoom level too far, I have to go back, zoom out, re-find my place and try again.)
    I did some investigation, and the bug in zoom behavior starts happening with audio files a little shy of 12 min 41 sec (12:40.871, to be more exact).
    Here are the results in "length in samples" of a test audio file (AIFF 24-bit Stereo, 44100Hz):
    33554453 samples and greater => sample editor jumps to beginning when zoomed in to max
    33554432 - 33554452 samples => sample editor jumps to END of file when zoomed to max (bizarre, eh? a 20-sample window in which the bug works in the OPPOSITE direction!)
    33554431 samples and less => sample editor zoom is normal and zooms in perfectly to the proper location at max zoom.
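    (Worth noting: 33,554,432 is exactly 2^25 samples, and 33,554,432 / 44,100 ≈ 760.871 s = 12:40.871, which matches the threshold above and suggests some fixed-width limit in the zoom code. A quick check in Terminal:)
    echo $((1 << 25))                       # 33554432
    echo "scale=3; 33554432 / 44100" | bc   # 760.871 seconds = 12:40.871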
    I also tested other things, like trashing my Logic prefs and starting from an empty song with nothing in it, none of which made any difference. This bug is present in Logic 8.0.2 on both my MacBook Pro with OS X 10.5.7 and my PowerBook G4 with 10.4.11.
    Maybe it's time to report this to Apple. Can anyone corroborate by continually pressing the zoom-in key with the cursor in the middle of an audio file longer than 12:41 and seeing if the display jumps to the beginning?
    Thanks!

    bump?

  • IPhoto won't quit! Empty trash with 8000 +....

    Cleaned up iPhoto, deleting, and now have 8000+ images in the iPhoto Trash. Click iPhoto Trash > Empty Trash, and the separate upper-right "Empty Trash" button... neither one will dump all these photos. And the program freezes with the spinning beach ball after the box shows it is emptying the iPhoto Trash... NOT. Please help; after many tries this is becoming ugly.

    Hello Old Toad:
    Thank you for your suggestion. I've seen it in other discussions.
    I've been using iPhoto for many years. I've been using a 2 GHz Intel Core Duo 2007 era iMac the whole time. I'm now using iPhoto v8.1.2.
    My iPhoto library is now quite large, with about 40,000 photos and about 175 GB in total size. I have put a lot of time into organizing the photos. Everything is religiously backed up.
    Over the past year or so the performance of iPhoto, in terms of speed, has been slowly declining. It takes a long time to open, but the most cumbersome part is the spinning beach ball when I "commit" enhancements I've made to a photo (contrast, saturation, sharpness etc). Recently that beach ball could actually take 3-5 minutes. I find this intolerable.
    So I started researching this to see if others have the problem. As best I can make out after a few hours of research, my library is too large and my computer too old. I don't want to replace my computer because of iPhoto, so I started researching the idea of splitting my library into smaller libraries.
    I found several 3rd party programs (iPhoto Buddy, iPhoto Library Manager) that made it seem they may be able to help me easily split my library into several libraries. Unfortunately neither actually does this but instead tells you how you need to do that manually.
    The methods to split your library manually they suggest are horrendously tedious for a 40,000 photo library with over 1,000 carefully constructed Events.
    And this brings me to where I stand today. I copy my entire library to an external hard drive, open the copied library, and what I WANT to do is prune it to, say, 1/10 of the size (say, two years of my photos). (I realize I'll have to do this process of copying my entire library 10 times, pruning each time.)
    But then I find out that iPhoto totally chokes when you try to trash more than "80-100" photos at a time! Moreover, I come to find out that emptying the trash of even 80 photos takes 3-5 minutes!
    Although I appreciate the help and advice, perhaps you can understand my frustration with Apple. I'm literally an indentured servant to iPhoto if I want to preserve all the work I've done on my library. I'd switch to another program if it would preserve all my events but I can not find one that does.
    1. The performance of their software declines with no warning to the point where it can't be used unless you're unemployed and have nothing else to do.
    2. You look for a fix and find you have to split your library.
    3. When you try to split your library you find it will take days and days of tedious work to complete.
    So that is my story. That is why I'm so angry with Apple.
    Have I missed anything or am I doomed to this dreary work?
    Thank you for any help and guidance you can provide.

  • Dealing with large files, again

    Ok, so I've looked into using BufferedReaders and can't get my head around them; or, more specifically, I can't work out how to apply them to my code.
    I have inserted a section of my code below and want to change it so that I can read in large files (of over 5 million lines of text). I am reading the data into different arrays and then processing them. Obviously, when reading in such large files, my arrays are filling up and failing.
    Can anyone suggest how to read the file into a buffer, deal with a set amount of data, process it, empty the arrays, then read in the next lot?
    Any ideas?
    void readV2(){
        String line;
        int i = 0, lineNo = 0;
        try {
            // Create input stream
            FileReader fr = new FileReader(inputFile);
            BufferedReader buff = new BufferedReader(fr);
            while ((line = buff.readLine()) != null) {
                if (line.substring(0,2).equals("V2")) {
                    lineNo = lineNo + 1;
                    IL[i] = Integer.parseInt(line.substring(8,15).trim());
                    // Other processing here
                    NoOfPairs = NoOfPairs + 1;
                }//end if
                else {
                    break;
                }//end else
            }//end while
            buff.close();
            fr.close();
        }//end try
        catch (IOException e) {
            log.append("IOException error in readESSOV2XY" + e + newline);
            proceed = false;
        }//end catch IOException
        catch (ArrayIndexOutOfBoundsException e) {
            arrayIndexOutOfBoundsError(lineNo);
        }//end catch ArrayIndexOutOfBoundsException
        catch (StringIndexOutOfBoundsException e) {
            stringIndexOutOfBoundsError(e.getMessage(), lineNo);
        }//end catch StringIndexOutOfBoundsException
    }//end readV2
    Many thanks for any help!
    Tim

    Yeah, ok, so that seems simple enough.
    But once I have read part of the file into my program, I need to call another method to deal with the data I have read in and write it out to an output file. How do I get my file reader to "remember" where I am up to in the file I'm reading?
    An obvious way, but possibly not too good technically, would be to set a counter and, when I go back to the file reader, skip that number of lines in the input file. This just doesn't seem too efficient, which is critical when it comes to dealing with such large files (i.e. several million lines long).

    I think you might need to change the way you are thinking about streams. The objective of a stream is to read and process data at the same time.
    I would recommend that you re-think your algorithm: instead of reading the whole file and then doing your processing, think about how you could read a line and process a line, then read the next line, etc...
    By working on just the pieces of data that you have just read, you can process huge files with almost no memory requirements.
    As a rule of thumb, if you ever find yourself creating huge arrays to hold data from a file, chances are pretty good that there is a better way. Sometimes you need to buffer things, but very rarely do you need to buffer such huge pieces.
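    Just to make the shape of that loop concrete, here is the idea sketched in shell (the same pattern applies in Java with readLine(); the V2 prefix and column positions come from the code above, the file name is invented):
    # read one line at a time and handle it immediately, keeping nothing in memory
    while IFS= read -r line; do
        case "$line" in
            V2*) printf '%s\n' "${line:8:7}" ;;   # process the V2 record right away
            *)   break ;;                         # stop at the first non-V2 line
        esac
    done < bigfile.txt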
    - K

  • Adobe Illustrator crashes with large files

    When I open Illustrator to view or edit a larger file I get
    once I hit OK, this is what I get
    This particular file is 588 MB and it is an AI file.
    The computer I'm running this on is spec'd out as follows:
    Win7 Ult 64bit
    i7-3770K CPU
    32 GB RAM
    WD 500 GB VelociRaptor
    Samsung SSD 840 EVO 120 GB scratch drive
    paging file set up as min 33068 / max 98304
    I tried with and without the extra scratch disk, with the paging file set manually and with the paging file set to system managed.
    I have uninstalled AI and reinstalled; same response.

    A 588 MB AI file? From where? CS5 is 32-bit, and expanding such a large file may simply go way beyond the measly 3 GB you have in AI; plus, if it was generated in another program, there may be all sorts of other issues at play. This may be a hopeless cause and require you to at least install CS6 or CC 64-bit as a trial...
    Mylenium

  • Working with Large files in Photoshop 10

    I am taking pictures with a 4X5 large format film camera and scanning them at 3,000 DPI, which is creating extremely large files. My goal is to take them into Photoshop Elements 10 to cleanup, edit, merge photos together and so on. The cleanup tools don't seem to work that well on large files. My end result is to be able to send these pictures out to be printed at large sizes up to 40X60. How can I work in this environment and get the best print results?

    You will need to work with 8-bit files to get the benefit of all the editing tools in Elements.
    I would suggest resizing at a resolution of 300 ppi, although you can use much lower resolutions for really large prints that will be viewed from a distance, e.g. hung on a gallery wall.
    That should give you an image size of 12,000 x 18,000 pixels (40 in x 300 ppi by 60 in x 300 ppi) if the original aspect ratio is 2:3.
    Use the top menu:
    Image >> Resize >> Image Size

  • Probs with large files in adapter engine

    Hi all,
    in a scenario I'm currently working on, I have to process files (from the file adapter). These files range from 1 KB up to 15 MB. So I set up the tuning parameters so that files larger than 2 MB are queued in the special queue for large messages. This works fine: small messages are processed directly, and the larger messages are queued and processed in sequence by the integration engine.
    My problem is now at the point where the integration engine sends these large files to the adapter engine. There are always 4 messages sent in parallel to the adapter engine, and this slows down the whole system due to an additional self-developed module.
    So my question is: can I restrict the sending of messages from the integration engine to the adapter engine to only 1 at a time?
    The time of processing is not important. I think I can handle this with EOIO, but is there another way? Perhaps depending on the queue?
    Thanks
    Olli

    Hi William,
    thx for your reply.
    Yes, I think it is the easiest way. But the problem is, if a message fails, the others will not be processed until the failure is fixed or processing is cancelled.
    So I hoped to find a solution where I can restrict the sending to the adapter engine to one message at a time without affecting the processing of the other messages, as happens with EOIO.
    Regards
    Olli

  • Bug report - Finder crashes in Cover Flow mode with large files

    I just came back from the Apple store and was told that I have discovered another bug in Leopard. When in Cover Flow view in the Finder, trying to browse directories with large (multiple-GB) files, the Finder continually crashes and relaunches, oftentimes with 2 new Finder windows.
    I created a new user on my MBP to remove the effect of any preferences and the problem repeated itself.
    Come on Apple... get on top of these bugs and issue some patches...
    Not the kind of new OS I expected from a top notch company like Apple.

    Ah... that'll be it then, they are 512 x 512. Well, I guess that bug's been hanging around for quite some time now; anyone got any ideas if anyone's trying to fix it? I guess making 128 x 128 icons wouldn't be the end of the world, but it does seem like a step backwards.
    One thing that still confuses me... an icon file contains icons of all sizes, so why does Cover Flow not select the right size icon to display?
    thanks for that info V.K., much obliged.
    regards, Pablo.

  • ColdFusion Builder 3 - Open a large file causing CF Builder 3 to Hang

    When I try to open a large file with 2700 lines of code in CF Builder 3, it becomes completely unusable and shows as Not Responding in Task Manager.

    Verified this with a few other guys to make sure this is not just my environment.
    After a closer look, it looks like this file with 3000 lines of code is including (cfinclude) many other files, and we think CF Builder 3 is trying to load them as well.
    In Dreamweaver this behavior could be turned off, but I do not see any option in CF Builder 3 to do that.

  • Mounting CIFS on MAC with large file support

    Dear All,
    We are having issues copying large files (> 3.5 GB) from the Mac to a CIFS share (SMB-mounted on the Mac): the copy fails whenever a file is larger than 3.5 GB. Is there any special way to mount CIFS shares (a special option in the mount_smbfs command, perhaps) to support large file transfers?
    Currently we mount the share using the command below
    mount_smbfs //user@server/<share> /destinationdir_onMAC
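    A quick way to reproduce the failure without waiting on real data is to generate a ~4 GB test file and copy it to the mounted share (the server, share and mount point names below are placeholders):
    mkdir -p ~/cifs_share
    mount_smbfs //user@server/share ~/cifs_share
    dd if=/dev/zero of=/tmp/bigtest.bin bs=1m count=4096   # ~4 GiB of zeros
    cp /tmp/bigtest.bin ~/cifs_share/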

    If you haven't already, I would suggest trying an evaluation of DAVE from Thursby Software. The eval is free, fully functional, and supported.
    DAVE is able to handle large file transfers without interruption or data loss when connecting to Windows shared folders. If it turns out that it doesn't work as well as you'd like, you can easily remove it with the uninstaller.
    (And yes, I work for Thursby, and have supported DAVE since 1998)
