Dense Restructure 1070020 Out of disk space. Can't create new data file

During a Dense Restructure we receive: Error(1070020) Out of disk space. Cannot create a new [Data] file.
Essbase 6.5.3 32-bit
Windows 2003 32-bit with 16 GB RAM
The database is on the E: drive, which has 660 GB total space; the database itself is ~220 GB.
All cubes are set to unlimited.
We tried restoring from backup; same problem.
Over the years the database has never been recalculated, never exported and reimported, and never validated; only new data is loaded, followed by a dense restructure.
Towards the end of a dense restructure (roughly 89 temporary .pan files written against about 101 2 GB .pag files), we get the error: Error(1070020) Out of disk space. Cannot create a new [Data] file.
There are still several hundred GB of free space on the volume, and we can write to that free space from Windows outside of Essbase.
The server's file system is consistent and defragmented, and we can demonstrably use the additional space; neither the disk controller nor the operating system reports any hardware issues.
Essbase.cfg file
; The following entry specifies the full path to JVM.DLL.
JvmModuleLocation C:\Hyperion\Essbase\java\jre13\bin\hotspot\jvm.dll
;This statement loads the essldap.dll as a valid authentication module
;AuthenticationModule LDAP essldap.dll x
DATAERRORLIMIT 30000
;These settings are here to deal with error 1040004
NETRETRYCOUNT 2000
NETDELAY 1600
App log
[Sat Oct 17 13:59:32 2009]Local/removedfrompost/removedfrompost/admin/Info(1007044)
Restructuring Database [removedfrompost]
[Sat Oct 17 15:48:42 2009]Local/removedfrompost/removedfrompost/admin/Error(1070020)
Out of disk space. Cannot create a new [Data] file. [adIndNewFile] aborted
[Sat Oct 17 15:48:42 2009]Local/removedfrompost///Info(1008108)
Essbase Internal Logic Error [7333]
[Sat Oct 17 15:48:42 2009]Local/removedfrompost///Info(1008106)
Exception error log [C:\HYPERION\ESSBASE\app\removedfrompost\log00002.xcp] is being created...
log00002.xcp
Assertion Failure - id=7333 condition='((!( dbp )->bFatalError))'
- line 11260 in file datbuffm.c
- arguments [0] [0] [0] [0]
Additional log info from database start to restructure failure
Starting Essbase Server - Application [removedfrompost]
Loaded and initialized JVM module
Reading Application Definition For [removedfrompost]
Reading Database Definition For [removedfrompost]
Reading Database Definition For [TempOO]
Reading Database Definition For [WTD]
Reading Database Mapping For [removedfrompost]
Writing Application Definition For [removedfrompost]
Writing Database Definition For [removedfrompost]
Writing Database Definition For [TempOO]
Writing Database Definition For [WTD]
Writing Database Mapping For [removedfrompost]
Waiting for Login Requests
Received Command [Load Database]
Writing Parameters For Database [removedfrompost]
Reading Parameters For Database [removedfrompost]
Reading Outline For Database [removedfrompost]
Declared Dimension Sizes = [289 125 2 11649 168329 1294 622 985 544 210 80 2016 11 9 9 8 8 1 1 6 1 3 1 2 2 1 2 1 2 77 2 65 1 1 1 1 1 1 1 1 1 1 1 260 4 3018 52 6 39 4 1577 6 ]
Actual Dimension Sizes = [289 119 1 1293 134423 1294 622 985 544 210 80 2016 11 9 9 8 8 1 1 6 1 3 1 2 2 1 2 1 2 77 2 65 1 1 1 1 1 1 1 1 1 1 1 260 4 3018 52 6 39 4 1577 5 ]
The number of Dynamic Calc Non-Store Members = [80 37 0 257 67 ]
The number of Dynamic Calc Store Members = [0 0 0 0 0 ]
The logical block size is [34391]
Maximum Declared Blocks is [1960864521] with data block size of [72250]
Maximum Actual Possible Blocks is [173808939] with data block size of [17138]
Formula for member [4 WK Avg Total Sls U] will be executed in [CELL] mode
Formula for member [Loc Cnt] will be executed in [CELL] mode
Formula for member [OH Str Cnt] will be executed in [CELL] mode
Formula for member [Current Rtl] will be executed in [CELL] mode
Essbase needs to retrieve [1017] Essbase Kernel blocks in order to calculate the top dynamically-calculated block.
The Dyn.Calc.Cache for database [removedfrompost] can hold a maximum of [76] blocks.
The Dyn.Calc.Cache for database [removedfrompost], when full, will result in [allocation from non-Dyn.Calc.Cache memory].
Writing Parameters For Database [removedfrompost]
Reading Parameters For Database [removedfrompost]
Unable to determine the amount of virtual memory available on the system
Index cache size ==> [1048576] bytes, [128] index pages.
Index page size ==> [8192] bytes.
Using buffered I/O for the index and data files.
Using waited I/O for the index and data files.
Unable to determine the amount of virtual memory available on the system
Reading Data File Free Space Information For Database [removedfrompost]...
Data cache size ==> [3145728] bytes, [22] data pages
Data file cache size ==> [0] bytes, [0] data file pages
Missing Database Config File [C:\HYPERION\ESSBASE\APP\removedfrompost\removedfrompost\removedfrompost.cfg], Query logging disabled
Received Command [Get Database Volumes]
Received Command [Load Database]
Writing Parameters For Database [TempOO]
Reading Parameters For Database [TempOO]
Reading Outline For Database [TempOO]
Declared Dimension Sizes = [277 16 2 1023 139047 ]
Actual Dimension Sizes = [277 16 1 1022 138887 ]
The number of Dynamic Calc Non-Store Members = [68 3 0 0 0 ]
The number of Dynamic Calc Store Members = [0 0 0 0 0 ]
The logical block size is [4432]
Maximum Declared Blocks is [142245081] with data block size of [8864]
Maximum Actual Possible Blocks is [141942514] with data block size of [2717]
Essbase needs to retrieve [1] Essbase Kernel blocks in order to calculate the top dynamically-calculated block.
The Dyn.Calc.Cache for database [TempOO] can hold a maximum of [591] blocks.
The Dyn.Calc.Cache for database [TempOO], when full, will result in [allocation from non-Dyn.Calc.Cache memory].
Writing Parameters For Database [TempOO]
Reading Parameters For Database [TempOO]
Unable to determine the amount of virtual memory available on the system
Index cache size ==> [1048576] bytes, [128] index pages.
Index page size ==> [8192] bytes.
Using buffered I/O for the index and data files.
Using waited I/O for the index and data files.
Unable to determine the amount of virtual memory available on the system
Reading Data File Free Space Information For Database [TempOO]...
Data cache size ==> [3145728] bytes, [144] data pages
Data file cache size ==> [0] bytes, [0] data file pages
Missing Database Config File [C:\HYPERION\ESSBASE\APP\removedfrompost\TempOO\TempOO.cfg], Query logging disabled
Received Command [Get Database Volumes]
Received Command [Load Database]
Writing Parameters For Database [WTD]
Reading Parameters For Database [WTD]
Reading Outline For Database [WTD]
Declared Dimension Sizes = [2 105 2 11649 158778 1279 609 971 531 208 78 2017 11 9 9 1 1 1 1 6 1 2 1 1 2 1 1 1 2 77 1 1 1 1 1 1 1 1 1 1 1 1 1 260 3 2954 52 6 39 4 1581 6 ]
Actual Dimension Sizes = [1 99 1 1293 127722 1279 609 971 531 208 78 2017 11 9 9 1 1 1 1 6 1 2 1 1 2 1 1 1 2 77 1 1 1 1 1 1 1 1 1 1 1 1 1 260 3 2954 52 6 39 4 1581 5 ]
The number of Dynamic Calc Non-Store Members = [0 29 0 257 57 ]
The number of Dynamic Calc Store Members = [0 0 0 0 0 ]
The logical block size is [99]
Maximum Declared Blocks is [1849604922] with data block size of [420]
Maximum Actual Possible Blocks is [165144546] with data block size of [70]
Formula for member [Loc Cnt] will be executed in [CELL] mode
Formula for member [OH Str Cnt] will be executed in [CELL] mode
Formula for member [Current Rtl] will be executed in [CELL] mode
Essbase needs to retrieve [1017] Essbase Kernel blocks in order to calculate the top dynamically-calculated block.
The Dyn.Calc.Cache for database [WTD] can hold a maximum of [26479] blocks.
The Dyn.Calc.Cache for database [WTD], when full, will result in [allocation from non-Dyn.Calc.Cache memory].
Writing Parameters For Database [WTD]
Reading Parameters For Database [WTD]
Unable to determine the amount of virtual memory available on the system
Index cache size ==> [1048576] bytes, [128] index pages.
Index page size ==> [8192] bytes.
Using buffered I/O for the index and data files.
Using waited I/O for the index and data files.
Unable to determine the amount of virtual memory available on the system
Reading Data File Free Space Information For Database [WTD]...
Data cache size ==> [3145728] bytes, [5617] data pages
Data file cache size ==> [0] bytes, [0] data file pages
Missing Database Config File [C:\HYPERION\ESSBASE\APP\removedfrompost\WTD\WTD.cfg], Query logging disabled
Received Command [Get Database Volumes]
Received Command [Set Database State]
Writing Parameters For Database [removedfrompost]
Writing Parameters For Database [removedfrompost]
Received Command [Get Database State]
Received Command [Get Database Info]
Received Command [Set Database State]
Writing Parameters For Database [TempOO]
Writing Parameters For Database [TempOO]
Received Command [Get Database State]
Received Command [Get Database Info]
Received Command [Set Database State]
Writing Parameters For Database [WTD]
Writing Parameters For Database [WTD]
Received Command [Get Database State]
Received Command [Get Database Info]
Received Command [SetApplicationState]
Writing Application Definition For [removedfrompost]
Writing Database Definition For [removedfrompost]
Writing Database Definition For [TempOO]
Writing Database Definition For [WTD]
Writing Database Mapping For [removedfrompost]
User [admin] set active on database [removedfrompost]
Clear Active on User [admin] Instance [1]
User [admin] set active on database [removedfrompost]
Received Command [Restructure] from user [admin]
Reading Parameters For Database [Drxxxxxx]
Reading Outline For Database [Drxxxxxx]
Reading Outline Transaction For Database [Drxxxxxx]
Declared Dimension Sizes = [289 126 2 11649 168329 1294 622 985 544 210 80 2016 11 9 9 8 8 1 1 6 1 3 1 2 2 1 2 1 2 77 2 65 1 1 1 1 1 1 1 1 1 1 1 260 4 3018 52 6 39 4 1577 6 ]
Actual Dimension Sizes = [289 120 1 1293 134423 1294 622 985 544 210 80 2016 11 9 9 8 8 1 1 6 1 3 1 2 2 1 2 1 2 77 2 65 1 1 1 1 1 1 1 1 1 1 1 260 4 3018 52 6 39 4 1577 5 ]
The number of Dynamic Calc Non-Store Members = [80 37 0 257 67 ]
The number of Dynamic Calc Store Members = [0 0 0 0 0 ]
The logical block size is [34680]
Maximum Declared Blocks is [1960864521] with data block size of [72828]
Maximum Actual Possible Blocks is [173808939] with data block size of [17347]
Formula for member [4 WK Avg Total Sls U] will be executed in [CELL] mode
Formula for member [Loc Cnt] will be executed in [CELL] mode
Formula for member [OH Str Cnt] will be executed in [CELL] mode
Formula for member [Current Rtl] will be executed in [CELL] mode
Essbase needs to retrieve [1017] Essbase Kernel blocks in order to calculate the top dynamically-calculated block.
The Dyn.Calc.Cache for database [Drxxxxxx] can hold a maximum of [75] blocks.
The Dyn.Calc.Cache for database [Drxxxxxx], when full, will result in [allocation from non-Dyn.Calc.Cache memory].
Reading Parameters For Database [Drxxxxxx]
Unable to determine the amount of virtual memory available on the system
Index cache size ==> [1048576] bytes, [128] index pages.
Index page size ==> [8192] bytes.
Using buffered I/O for the index and data files.
Using waited I/O for the index and data files.
Unable to determine the amount of virtual memory available on the system
Data cache size ==> [3145728] bytes, [22] data pages
Data file cache size ==> [0] bytes, [0] data file pages
Performing transaction recovery for database [Drxxxxxx] following an abnormal termination of the server.
Restructuring Database [removedfrompost]
Out of disk space. Cannot create a new [Data] file. [adIndNewFile] aborted
Essbase Internal Logic Error [7333]
Exception error log [C:\HYPERION\ESSBASE\app\removedfrompost\log00002.xcp] is being created...
Exception error log completed -- please contact technical support and provide them with this file
RECEIVED ABNORMAL SHUTDOWN COMMAND - APPLICATION TERMINATING

To avoid these problems, as a best practice we do not allow a dense restructure on cubes larger than 30 GB.
As an alternative, we export the level-0 data, clear the database, and reload the exported data; afterwards we aggregate the cube so that data is stored at all consolidation levels (a sketch of this cycle follows).
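A minimal MaxL sketch of that level-0 export / clear / reload / aggregate cycle, assuming a hypothetical Sample.Basic application and database, an export small enough to fit in one file, and the export syntax of later Essbase releases (on 6.5.x the equivalent ESSCMD commands such as EXPORT and RESETDB may be needed instead):

login admin identified by 'password' on localhost;
/* export only level-0 data */
export database Sample.Basic level0 data to data_file 'lev0.txt';
/* clear all data; the outline and database settings are kept */
alter database Sample.Basic reset data;
/* reload the level-0 export; export-format files need no rules file */
import database Sample.Basic data from data_file 'lev0.txt' on error write to 'lev0.err';
/* rebuild the upper-level blocks with the database's default calculation */
execute calculation default on Sample.Basic;
logout;

Because the reload writes fresh .pag and .ind files, this cycle also clears out the fragmentation that accumulates in a cube that is only ever incrementally loaded and dense-restructured.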

Similar Messages

  • "error: out of disk space message" while creating 20gig QT file

    Got an "error: out of disk space message" while exporting a 94min hdv project to the quicktime conversion (format:QT mvie, use mose recent settings / Options-dv/dvcpro -ntsc, size-dimensions compressor native) I also tried using compressor to creat a QT movie and got same error message. It gives me an estimate of 4-6 hrs, runs and at the end of 4hrs it gives me the error message. It is a large project, transitions filters 4+vid tracks 9 audio tracks, but Im using a dual4core, 4 gigs of ram.
    Is this a ram or hard drive problem. I asume its the disk but cant imagine Im running out of space. I am sending it to an external hard drive with over 80gigs of free space. This makes no sense, please help.
    Any help would be great since We have our sound mixer waiting for this as a video reference. This has been over a years worth of work and our sound mixing is the last thing to do. Please help us over the hump and I'll thank you on our website, www thelaughingplanet com, maybe the movie credits and maybe even send you a copy of the movie. Thanks, Marc

    The 20 GB QT file was what the mixer asked for. I guess he wants it instead of a DVD because it can be imported into his Pro Tools. He asked for the 16:9-size DV/DVCPRO NTSC format. He may be wrong and a half-frame size might work; he did say he wanted the precision info provided by the QT file. Will the half frame give him the same frame and precision movie info as the 20 GB file, only with a lower-quality display image?
    The problem with dropping everything into a DV sequence is that we have many sections with multiple layers of video with movement and size changes. What is the problem with making a 16:9 QT file?

  • "error: out of disk space message" while exporting Quick time file

    Got an "error: out of disk space message" while exporting a 94min hdv project to the quicktime conversion (format:QT mvie, use mose recent settings / Options-dv/dvcpro -ntsc, size-dimensions compressor native) I also tried using compressor to creat a QT movie and got same error message. It gives me an estimate of 4-6 hrs, runs and at the end of 4hrs it gives me the error message. It is a large project, transitions filters 4+vid tracks 9 audio tracks, but Im using a dual4core, 4 gigs of ram.
    Is this a ram or hard drive problem. I asume its the disk but cant imagine Im running out of space.
    I've tried sending it to the "hard drive bay 1" with 30 free gigs and to the "hard drive bay2" which has 22 gigs open. can't imagine we are making a file bigger than that. I'm thinking about sending it to an external drive with 60 gigs free.
    Any help would be great since We have our sound mixer waiting for this as a video reference. Also trying to export OMF files but they are greater than 2 gigs so we're gonna try and split the track up with 4 tracks, 3 tracks & 2 tracks. This has been over a years worth of work and our sound mixing is the last thing to do. Please help us over the hump and I'll thank you on our website, thelaughingplanet and maybe even send you a copy of the movie. Marc

    ""hard drive bay 1" with 30 free gigs and to the "hard drive bay2" which has 22 gigs"
    Sounds like you're getting perilously low on drive space, you should always leave about 15% free on a drive for the OS to do it's thing. Either backup and clear some room, or get another drive.

  • How can I create a data file?

    I'm back. I found the CFA & PEF files and I am getting ready to go to work and delete all of these on all my drives.
    What I need to know now is how to create a data-only file on a DVD for backup. I have successfully burnt a DVD in Encore, but I do not want to use that as a backup;
    I feel it would be degraded somewhat, and besides, it is an Encore file.
    What I want is a Premiere file from CS4. This way I can always go back to it and make any changes or edits in the original time-line.
    If I can do this I was planning on using a DVD-RW so I could make changes later on.
    I also want to make a transfer DVD to install my projects on another computer. This will be a Video only computer with no internet connection to mess with my OS and slow things down.
    Thanks Again

    For the Projects, the trick will be the asset files. If they will fit onto DVD DL discs, you're OK. If they are too large (DV-AVI is ~13 GB per hour), then you'll need some type of backup or spanning software that can write files across multiple discs. A better choice, IMO, would be a 1-2 TB external drive, where you can use Project Manager, or just Windows Explorer and Copy, to back up the entire Project folder with its assets. I'd recommend against USB and would go at least for FW-400. FW-800 would be better, but you'll need FW-800 connections on your computer. eSATA would be best of all, but then you'll need eSATA connectors.
    Good luck,
    Hunt

  • Urgent: Function GUI_DOWNLOAD can't create any data file!

    We are using the function GUI_DOWNLOAD to download internal table data to a flat file, but after running the program no data file is created. The code is as follows:
    data : v_file type string.
    PARAMETERS: p_file(128) TYPE c OBLIGATORY LOWER CASE.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      PERFORM get_local_file_name USING p_file.
      v_file = p_file.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = v_file
        write_field_separator = '|'
      TABLES
        data_tab = itab.  "Our internal table, filled with data
    FORM get_local_file_name USING p_p_file.
      CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
        CHANGING
          file_name     = p_p_file
        EXCEPTIONS
          mask_too_long = 1
          OTHERS        = 2.
    ENDFORM. " get_local_file_name
    Please help and we will give you reward points!

    Hi Buddy,
    KD_GET_FILENAME_ON_F4 is used to locate an existing file in a directory (on value request); it is meant for uploading data from the local PC into an internal table.
    Instead, you can use the simpler file locator F4_FILENAME, though that is also just a file locator.
    I think you don't need to locate an existing file; you need to save the data to a new file, so you can use the standard class and method:
    Class - CL_GUI_FRONTEND_SERVICES
    Method - FILE_SAVE_DIALOG
    I think your problem will be resolved. If it is not, go through the following code:
    PARAMETERS : pa_fname LIKE rlgrap-filename.
    DATA : lf_fname TYPE string.

    INITIALIZATION.
    * Initialize the file name
      pa_fname = 'C:\temp\'.

    AT SELECTION-SCREEN ON VALUE-REQUEST FOR pa_fname.
    * F4 help for the file name
      CALL FUNCTION 'F4_FILENAME'
        EXPORTING
          program_name  = syst-cprog
          dynpro_number = syst-dynnr
          field_name    = ' '
        IMPORTING
          file_name     = pa_fname.

    START-OF-SELECTION.
    * The download runs when the report is executed
      lf_fname = pa_fname.
    * ltab is the internal table holding the data to be downloaded
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename              = lf_fname
          filetype              = 'ASC'
          append                = ' '
          write_field_separator = 'X'
        TABLES
          data_tab              = ltab.

  • Flash builder 4.5 can't create new mysql file

    Need help: I have set up the database and can get data onto the datagrid, but when I add 'create' it adds the fields and a button, and when I
    press the button to save, it does not save.
    I tried the same setup in Flash Builder 4 and it works. What do I have to set up for it to work in 4.5?
    Thanks.
    Paul

    For god's sake will anybody from Adobe say something about this?
    I'm having the same issue and it's f***ing annoying.
    "Please uninstall and reinstall the product" is not an acceptable error message.
    Please provide a workaround.

  • Exporting 90 min project into Q/T. Error out of disk space??

    After an attempted export reaching 99%: message "Error out of disk space".
    Can't export the full movie.
    Tried exporting in smaller chunks and that seems to work, but obviously that is not the result I'm after.

    OK, let's play 20 questions!
    Here are the first four:
    Where is your media stored?
    What codec are you exporting?
    What drive are you exporting to?
    How full is that drive?

  • Tried installing Illustrator CC but ran out of disk space on primary drive... how can I install on secondary drive?

    Tried installing Illustrator CC but ran out of disk space on primary drive... how can I install on secondary drive?

    Run_Esperanza, how much space do you still have available on your primary drive? It is possible to customize the installation destination within the Creative Cloud Desktop application. You can find more details at Install and update apps - https://helpx.adobe.com/creative-cloud/help/install-apps.html.
    Please be aware that you will still need disk space on the primary drive to install shared components and to hold the temporary files used during installation.

  • "AE error: render can't continue -- out of disk space on all overflow volumes."

    Hi !
    I get this message when trying to render (no matter what format I use):
    "AE error: render can't continue -- out of disk space on all overflow volumes."
    So I googled it, and the only thing I could find was that I should change my preferences (overflow volumes and select a drive), BUT I can't find overflow volumes in the preferences (CS5.5, Windows).
    And why this error all of a sudden? I haven't changed anything, and normally AE works fine.

    At the risk of stating the obvious, that sounds like the hard drive you're trying to render to is full or almost full. Did you check whether there's enough space for your render on that drive? Depending on the codec you choose and the length of the comp you're trying to render, the file size could be huge, and it might not fit on your hard drive even if it looks like you have enough space. If your render is stopping in the middle, that could be the reason why. How much space is on the drive you're trying to render to, and what are your render settings? A screenshot of the following info as it shows up in your Render Queue might be useful:

  • "Error:Out of Disk Space" message. Why?

    I've been using an Apple G4 with FCP2 to capture footage through a Sony DSR-11 deck. I've set up a LaCie 500GB 7200, with most of its 500GB available, as capture disk. As late as last summer I used this system to capture 70+ hours of footage which was then edited on another system.
    This time I've met only problems. I've tried capturing 5 different DV tapes. They seem to be capturing but at the end of the hour I get the message "Error: Out of Disk Space".
    Does anyone know why this might be happening? I've checked the setup and everything seems in order. (These tapes are 4:3; the ones captured last year were 16:9 DV, so the only adjustment I had to make was unchecking "anamorphic".) I even tried capturing to the Mac hard drive to see if the problem was the LaCie drive, but I get the same message. (The LaCie I've been using had been full with last year's capture scratch and the disk had been erased through LaCie's Silverlining program, so for a while I began to think that this might be the problem, that maybe I'd have to reformat the disk, etc., but this test seems to rule that out.)
    Thanks.
    John

    Since I last wrote I've changed hard drives and run tests on them, as well as trashing my preferences, which I had also done earlier. Still, if I try to capture a tape (to disks with up to 460 GB available) I get the message "Error: Out of Disk Space". The only thing I can think of doing now is deleting and then reinstalling the Final Cut software.
    Here I'm kind of dumb and don't want to make any mistakes. This Apple has been a stand-alone computer used only for Final Cut, and the software was installed when I got it. The rest of my life has been conducted on a PC. So is there anything I should do before deleting Final Cut? Should I be doing anything to protect existing project files, captured clips, the thumbnail cache, etc.? Will Final Cut just recognize these when I re-install? What do I do? Just open the Mac hard drive, identify the Final Cut application and drag it to Trash? Do I need to empty the trash first, or should I leave it there for safety?
    I look at the instructions that came with my ancient Final Cut 2 disks. Here’s what they say in full:
    To install QuickTime, follow these steps
    1) Insert the Final Cut Pro CD in your CD drive. (I see I have two disks: one "Final Cut Pro 2", the other "Final Cut Pro 2: Tutorial Media". Do I need to install the second?)
    2) Double-click the CD’s icon, then double-click the QuickTime Installer folder.
    3) Double-click the QuickTime Installer icon and follow the onscreen instructions.
    4) When the Choose Installation Type window appears, select Custom, then click Continue.
    5) In the next window, click Select All, then continue
    This replaces any previous versions of QuickTime and ensures a complete installation of all components required by Final Cut Pro.
    6) When a message appears saying installation was successful, click Restart.
    Is this all straightforward or might there be any potential pitfalls in the above instructions for someone unfamiliar with installing software on a Mac (In my case old software and an old Mac – a G4)? I guess I should also ask if you folks out there agree that re-installing Final Cut may be the answer to my problem, that this error message is the result of a glitch that has arisen in the installed version of Final Cut.
    Thanks in advance for any help.
    John

  • My iMac is running out of disk space. Do I need to buy external drives? How much storage should I get?

    My iMac is running out of disk space. I download a lot of media from iTunes, and the HD movies and TV shows use so much space. Should I switch to the cloud, clean up my storage, or buy an external drive? If I get an external drive, how much capacity is suggested? 1 TB, 2 TB or more?

    Buy an external hard drive on which to save your media. The drive should be large enough to handle your entire iTunes Library plus how much you would anticipate adding to the library over the next few years. Then backup your existing library to the external drive: iTunes- Back up your iTunes library by copying to an external hard drive.
    After you have backed up the library to the external drive you can configure iTunes to use the library on the external drive in iTunes > Preferences > Advanced. Change the path to the external drive. Once you do that quit and relaunch iTunes. It should now be using the library on the external drive. Now you can delete the library on the internal hard drive to free up the space.

  • IBook g4 ran out of disk space, internet doesn't work

    Hi.
    I have an iBook G4, running OS X 10.3. Yesterday we were looking at some pictures on the machine and put too much stuff on the hard drive; the iBook said it ran out of space (presumably for virtual memory) and then froze.
    We rebooted and did a big archive/clean-up (ten gigs free now). It ran sluggishly, I zapped the PRAM and NVRAM, and now it runs well. But we can't get the internet to work through our AirPort. Here's the weird thing:
    -We have 4 bars. Airtunes works perfectly.
    -Our other iBook (G3, 10.4) connects to the internet just fine.
    -If we connect to our neighbor's airport, we get internet but it's really slow, slower than usual.
    Any thoughts?
    Gortmend

    Here is the info from the user under normal operation:
    1. HDD size: 74 GB
    2. Data used: 63 GB
    3. Free disk space: 11 GB
    4. Two OS versions running: OS X 10.3.9 and OS 9.2
    Here is the user's report:
    When he accesses the internet or checks email, an out-of-disk-space message alerts him after a while, and he notices the HDD has only a few megabytes of free space. At that point he needs to restart the computer, after which the free disk capacity returns to 11 GB. Thanks.
    PowerBook G4   Mac OS X (10.3.9)   Memory: 1.5 GB

  • How to set the limit for the warning 'running out of disk space'

    I am wondering if there's a way to set the number of GB before I get the system warning 'running out of disk space'.
    I saw this question asked in 2010 and it did not get an answer. It was archived so I am asking again. I know how to monitor the available space on my startup disk, and how to empty the trash to clear up disk space. I am looking for an OS X setting that I can change.

    My guess is that you want to lower the limit. IMHO the limit is already too low.
    If you are seeing this warning regularly, then your disk is severely overfull and needs to be replaced with a larger one.
    Running free space low enough to get the warning greatly increases the chance of disk corruption, and once the disk gets corrupted the next thing that happens is data loss.
    You have a serious problem with your Mac that should be dealt with immediately.
    Backing up your data would also be a good idea, if you are not already doing that.
    Allan

  • [nQSError: 46118] Out of disk space.

    Hello,
    I'm getting this error after launching several (heavy) reports: [nQSError: 46118] Out of disk space.
    The software configuration is Oracle BI 10.1.3.3.3 with an Oracle 10g database.
    What does it mean? I know it means there isn't space left, but where? On the database machine or on the BI server (presentation server?) machine?
    Is it a database problem? I don't think so, because there is no ORA-XXXX error code, so I think it is not an Oracle error caught by the BI server; we also discovered that clearing all cursors from the administration prevents this error from appearing for a while.
    Can someone tell me whether this is an Oracle BI (Server? Presentation?) problem and how to solve it? Do I have to set a parameter somewhere to limit the cursors or to increase the cache?
    Thank you
    Maurizio

    Are you on Windows or Unix? If Unix, which flavour?
    If you're on Unix, try this:
    1) Just before running the reports, enter this command:
    while [ 1 -lt 2 ]; do date; ls -lrt; bdf; echo ""; sleep 5; done
    (if you are not on HP-UX, substitute df, or whatever reports the state of your filesystems, for bdf)
    2) Run your report
    You'll get a snapshot every five seconds of how much space is available on your disk eg:
    Filesystem 1K-blocks Used Available Use% Mounted on
    /dev/sda3 18374532 8232104 9209052 48% /
    /dev/sda1 194442 11691 172712 7% /boot
    none 1037444 0 1037444 0% /dev/shm
    If it were a permissions issue, I'd have thought you'd get a permissions error rather than a space error.
    Is 10 GB the total free space on the BI server? That's not a great deal of space. I've seen tmp files grow to 2 GB each, so you'd potentially only need a handful of users running big queries to break things.

  • DPM is out of disk space for the replica

    Trying to make a bare-metal backup of a physical server running 2008 R2 SP1. I've checked that I'm not out of space, I've uninstalled and reinstalled the agent, and I've manually tried to make the disk allocation larger; same error.
     DPM is out of disk space for the replica. (ID 58 Details: Internal error code: 0x809909FF)
    This is in the event log on the Client machine
    The backup operation attempted has failed to start, error code '2155348040' (There is not enough free space on the backup storage location to back up the data.). Please review the event details for a solution, and then rerun the backup operation once the issue is resolved.
    I have several other machines doing bare-metal backups; they are virtual, but I can't see why that would make any difference, and they have no problems doing this. I have seen hacks on the internet that edit XML files to get this to work, changing AllCritical to "C:".
    I'd like an actual fix for this, please.

    Mike, quick question: how can I determine which share my particular server is backing up to? Here is what I get when I try to do a DIR on the folder as SYSTEM:
    C:\Windows\system32>whoami
    nt authority\system
    C:\Windows\system32>dir \\kdnap-util1\0ae3c77eaab54ea09b0724737a125f48
    Access is denied.
    C:\Windows\system32>dir \\kdnap-util1\12068c1992634c92addf44bfbee5cada
    Access is denied.
    I was just trying to poke around and browse the folders to see which one my server backs up to, but with the access-denied errors that isn't helpful :)
    Here is my list of shares
    C:\Windows\system32>net view \\kdnap-util1
    Shared resources at \\kdnap-util1
    I also tried doing a DIR from a server that is backing up correctly, on the shares listed below, after running psexec, and I get access denied from them as well, so I guess that's normal.
    Share name                        Type  Used as  Comment
    0ae3c77eaab54ea09b0724737a125f48  Disk           0ae3c77eaab54ea09b0724737a125f48
    12068c1992634c92addf44bfbee5cada  Disk           12068c1992634c92addf44bfbee5cada
    32d87ea3f0ea4570bd1ae215c818686d  Disk           32d87ea3f0ea4570bd1ae215c818686d
    3429bef9926347babbf20bbb258fdec0  Disk           3429bef9926347babbf20bbb258fdec0
    5c9c98bbc69c4f0ab75df48e89f52fc3  Disk           5c9c98bbc69c4f0ab75df48e89f52fc3
    600460aa29bb4f198d0d0fb6b1636068  Disk           600460aa29bb4f198d0d0fb6b1636068
    62abc840b38444abaa9d8dc9bd43b852  Disk           62abc840b38444abaa9d8dc9bd43b852
    6833cf7dec3e4f19a3bf7961d53c5ced  Disk           6833cf7dec3e4f19a3bf7961d53c5ced
    68437b942573463b9437a15056bccf7c  Disk           68437b942573463b9437a15056bccf7c
    805320dcb886401fa3dc6a8213b3deec  Disk           805320dcb886401fa3dc6a8213b3deec
    8404f7468dea47ce9e92a99536ebe13d  Disk           8404f7468dea47ce9e92a99536ebe13d
    8bcbc4deae44458da77fbe59661f60fb  Disk           8bcbc4deae44458da77fbe59661f60fb
    9bc5182a9d6b43478b7c09062ee1c9b0  Disk           9bc5182a9d6b43478b7c09062ee1c9b0
    b588fc67397e4a5a81a15bb5b52dbe31  Disk           b588fc67397e4a5a81a15bb5b52dbe31
    Backups                           Disk
    bf941a02f6e847a095b03d01963b8df1  Disk           bf941a02f6e847a095b03d01963b8df1
    bfc06209ce514b939098a9fd701e05d7  Disk           bfc06209ce514b939098a9fd701e05d7
    BLAZELG                           Disk
    d35de342510248a0a34dac84ecd55d08  Disk           d35de342510248a0a34dac84ecd55d08
    df1658496f364923b25f615625254195  Disk           df1658496f364923b25f615625254195
    dfc6d6061f314fabbed0fb144c18df7c  Disk           dfc6d6061f314fabbed0fb144c18df7c
    e8fcdd5155c940bb8ecddd0e7a99be7b  Disk           e8fcdd5155c940bb8ecddd0e7a99be7b
    ee23ec6bc709410a9cfce4759c935f8f  Disk           ee23ec6bc709410a9cfce4759c935f8f
    f0f1272aeab7417d82a6473b3f8b3a91  Disk           f0f1272aeab7417d82a6473b3f8b3a91
    library.old                       Disk
    The command completed successfully.
    OK, I figured out which share it is by viewing the share name in the Shared Folders snap-in (mmc); it lists which server has access on the share. It's "600460aa29bb4f198d0d0fb6b1636068".
