Oradata folder size over 6.5 GB
What should I do in this case? One temp.dbf file is over 3.5 GB. Which files can I delete from the oradata folder?
Amitesh
You should also check dba_temp_files to see if the temp file belongs to your temporary tablespace, and v$logfile to see if the redo log files are used by the instance. Also make sure that the files do not belong to another database instance running on the same server.
John
Similar Messages
-
Mystery folder size of over 400GB
Whilst cleaning up my boot drive I found the User account/Library folder size is over 400 GB, but when I go to check which folder has the most data I find the biggest folder (out of approx 20) only holds 11 GB and all the rest are tiny. So how come the Library folder is stating 410 GB?
Appreciate any help as to where these mystery files are and how to simply reduce the size of the library folder.
Can I also state I am no techy, so please keep it simple.

Do you have your boot drive allowed as the default primary volume for scratch?
Use a dedicated scratch drive and erase the drive between projects and have it be primary - no matter how much RAM CS6 at least is still using disk scratch.
As for your user account folder on the boot drive, another thing you can do (though maybe not always for the ~/Library) is to separate the user account onto a 2nd disk drive to help performance and to manage the system boot drive as well. Doing that makes it easier to move the system to an SSD and enjoy its performance; the new Samsung 840s perform well even on SATA II.
Booting from PCIe 6G is another option with Sonnet Tempo Pro PCIe SSD device, and use a 2nd for your scratch SSD volume too.
With those changes you will see a marked improvement in processing and smooth as silk (or butter) fluid operations.
In any event, you do not want Premiere using the boot drive or home account as its primary cache/scratch. -
Hello every one! I have this script:
$log = ".\logfile.log"
$startFolder = "C:\VMs"
$colItems = Get-ChildItem $startFolder | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object
foreach ($i in $colItems){
    $itemSum = Get-ChildItem ("$startFolder\" + $i.Name) -Recurse | Measure-Object -Property length -Sum
    "$startFolder\$i -- " + "{0:N2}" -f ($itemSum.Sum / 1KB) + " KB" >> $log
}
Import-Csv $log -Delimiter "`t" | Export-Csv .\TEST.csv
$body = Get-Content $log | Sort-Object | Out-String
Send-MailMessage -From [email protected] -To [email protected] -Subject "Test Theme" -Attachment .\TEST.csv -Encoding ([System.Text.Encoding]::UTF8) -Body $body -SmtpServer test123.test.local
Remove-Item $log
Remove-Item .\TEST.csv
When this script is done, I receive something like this in the email message:
C:\VMs\Backup -- 0,00 KB
C:\VMs\Vbox -- 82 874 750,42 KB
C:\VMs\VMWARE_BACKUP -- 182 818,77 KB
How can I change this script to sort by folder size?
Not alphabetically.
Thanks in advance!

Hello Darkwind,
you can do this, but it'll require some reformatting of your script. I've done the honors:
$startFolder = "C:\VMs"
# Get folders
$colItems = Get-ChildItem $startFolder | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object
# This one will do the measuring for us
$fso = New-Object -ComObject Scripting.FileSystemObject
# Iterate over each folder and measure its size
$FolderReport = @()
foreach ($i in $colItems){
    # 1: Get size
    $folder = $fso.GetFolder($i.FullName)
    $size = $folder.Size
    # 2: Set properties in a hashtable
    $props = @{
        name = $i.FullName
        size = $size
        displaysize = (("{0:N2}" -f ($size / 1KB)) + " KB")
    }
    # 3: Add to results
    $FolderReport += New-Object PSObject -Property $props
}
# Sort results
$FolderReport = $FolderReport | Sort-Object size
# Build CSV attachment
$FolderReport | Export-Csv .\TEST.csv
# Build body
$lines = @()
$FolderReport | ForEach-Object {$lines += ($_.name + " -- " + $_.displaysize)}
$body = $lines -join "`n"
# Send mail
Send-MailMessage -From [email protected] -To [email protected] -Subject "Test Theme" -Attachment .\TEST.csv -Encoding ([System.Text.Encoding]::UTF8) -Body $body -SmtpServer test123.test.local
# Cleanup file
Remove-Item .\TEST.csv
As you can see, some slight modifications :)
Now, just what did I change ...
I used the Scripting.FileSystemObject COM object to measure size, since that's a bit faster.
When collecting the size, I built custom objects containing the information (name & size), since if we want to sort by size, we need to keep it as a separate property.
Then I sorted by size and exported the result into a CSV file for the attachment.
I built the body from strings in memory; there's no point writing it to a file just to reread it. (That "`n" should work as a line break.)
Cheers,
Fred
There's no place like 127.0.0.1
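A footnote on the sort, for later readers (a sketch, not from the thread): PowerShell's Sort-Object also accepts a -Descending switch for largest-first output. The same report logic looks like this in Python, with purely illustrative folder names:

```python
import os

def folder_size(path):
    # Recursively sum file sizes, like Measure-Object -Property length -Sum.
    return sum(
        os.path.getsize(os.path.join(root, name))
        for root, _dirs, files in os.walk(path)
        for name in files
    )

def size_report(start_folder):
    # One (path, size) row per immediate subfolder.
    rows = [(e.path, folder_size(e.path))
            for e in os.scandir(start_folder) if e.is_dir()]
    # reverse=True gives largest-first, i.e. the -Descending behaviour.
    return sorted(rows, key=lambda r: r[1], reverse=True)
```

Dropping reverse=True gives the smallest-first order the original script produced.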
Thank you! It works for me!
But this script first shows the folders from the smallest to the largest size. Is it possible to do the opposite? -
Hi All,
My machine crashed. I was using a Windows OS, and I never took any type of backup, i.e. Import/Export or RMAN. All I have right now is a copy of the ORADATA folder. Now I have installed Windows again and I want to recover my databases from the ORADATA folder. Could you please guide me step by step so that I can recover my lost data?
Any immediate help will be highly appreciated.
Thanks in advance.

Hi,
I have got only data files in my ORADATA folder.
You have no redo log files.
I don't have a control file there.
You have to create a control file, like:
STARTUP NOMOUNT
CREATE CONTROLFILE REUSE DATABASE "taj"
NORESETLOGS ARCHIVELOG
MAXLOGFILES 5
MAXLOGMEMBERS 3
MAXDATAFILES 10
MAXINSTANCES 1
MAXLOGHISTORY 113
LOGFILE
GROUP 1 'D:\ORACLE\PRODUCT\10.1.0\ORADATA\TAJ\REDO01.LOG' SIZE 10M,
GROUP 2 'D:\ORACLE\PRODUCT\10.1.0\ORADATA\TAJ\REDO02.LOG' SIZE 10M,
GROUP 3 'D:\ORACLE\PRODUCT\10.1.0\ORADATA\TAJ\REDO03.LOG' SIZE 10M
DATAFILE
'D:\ORACLE\PRODUCT\10.1.0\ORADATA\TAJ\SYSTEM01.DBF' SIZE xxx,
'D:\ORACLE\PRODUCT\10.1.0\ORADATA\TAJ\USERS01.DBF' SIZE xxx
CHARACTER SET xxxxxxxxx;
After mounting your database this way, you have the datafiles and redo log files needed to open it.
Just try it on your test machine and post here if you face any error.
regards
Taj
-
I read on some site that there is an app that allows one to increase the desktop folder size limit in iOS 5.0.1 on an iPad 2. Does anyone have any info, please?
Thanks for any thoughts in advance...
Dave

Firstly, thanks for taking the time to reply :-)
It's not that I want gazillions of apps, I just wanted to put more in each folder so I don't have to have multiple folders with similar names... Weather1, 2, 3, etc. But I see your point and appreciate the thoughts.
Dave -
Outlook 2010 - Data File Properties/Folder Size versus Windows Explorer pst file size
I am running Outlook 2010 32bit Version 14.0.6129.5000 on a Windows PC running Windows 7 Professional. All updates from MS are up to date.
I have several pst files I open with Outlook 2010. The size of the files displayed in Outlook is very different from what is displayed in Windows Explorer.
For example, for one pst file called "business.pst", Outlook displays under "Data File Properties -> Folder Size" that the Total Size (including subfolders) is 764,354 KB, while Windows Explorer says the file size is 1,190,417 KB.
For some reason MS Outlook 2010 is displaying the wrong folder size. Any ideas why this is the case?
Thanks,
Pat

An Outlook mailbox grows as you create and receive items. When you delete items, the size of the Outlook Data File (.pst and .ost) might not decrease in proportion to the data that you deleted until it has been compacted.
Normally, after you have deleted items from an Outlook Data File (.pst), the file will be automatically compacted in the background when you're not using your computer and Outlook is running.
As an exception, when the Outlook Data File (.pst) is somehow corrupt, the compaction might not finish correctly, so the size of the file might remain the same as before compaction.
To solve this, try running scanpst.exe to fix the Outlook Data File (.pst) first; after that, we can manually start the compact command.
When finished, compare the file size again.
Max Meng
TechNet Community Support -
Finder won't display folder size in "Size" column or Info window
Normally, the Finder window will display a folder's size in the Size column. However, I have one folder which shows as "Zero KB" in the Size column. When I open the General Info window, the Size reads as "Zero KB on disk (Zero bytes) for 0 items."
In order to get the folder size I have to go into the folder, select all folders (there are many), and then either the Summary Info or the Inspector window will show the folder size ("185.32 GB on disk").
Why is this occurring in only this folder? Is it because it's a large folder or a folder with many subfolders?
I'm running Snow Leopard 10.6.2. The folder is on an external drive in FAT32 format. I've already tried deleting the "com.apple.finder.plist" file and rebooting.
Thanks for any help.

Go to the View menu -> Show View Options and check the box "Calculate all sizes", then click "Use as Defaults".
-
Finder Folder Size Calculations
When I'm archiving, I have to do a lot of Get Info to calculate the folder sizes for DVD burning. The problem I'm having is that the file size calculations start taking forever after working for a few minutes, or they don't calculate at all. It usually works fine when I start, but after a minute or so it's like the computer gets tired and just stops. Even if it's a 1 MB folder it will have trouble.
Is this normal? Seems really odd that something so incredibly simple can make my archiving process so difficult.
Files reside on a network HD.
Thanks for any help.

I'd create a disk image of appropriate size, mount it, and copy items to it until it tells you there's no more room. Then, burn it.
-
File / Folder size... How does Finder get it?
Hi there!
While checking some of my hard drives for successful shell-script backup runs, I noticed that Finder took a loooot of time to provide the info about the folder/file size in the Information window, while a simple ls -l could provide on-the-fly values regarding file or folder size. As always: fast, simple and faithful Unix mindset!
Well, that makes me wonder: how does Finder calculate those values? Anyone got a hint?
Does it do it all by itself? If so, why? Being a "POSIX" system, shouldn't it be using what's already provided by its *nix heritage?
In this case, the value is already there, so why not use it and simply parse it into the desired format (MB / GB / TB)?
Maybe this is only one of several areas where the GUI could improve its performance by being more "conscious" of the system's natively provided toolkit.
Please, these are only my humble thoughts on this, and please do shut me up if I'm totally wrong!
Hope to get some nice and interesting feedback from the OS X community!
If there's a better spot to start this kind of discussion, please do mention it; I will appreciate it!
Best regards.

Properties, Shortcuts, Change Icon resolved the issue after restarting. Problem resolved. Thank you for all the help. Dwain
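An aside on why the two commands behave so differently (a sketch, not Apple's actual implementation): ls -l reports only the size of the directory entry itself, which is a single stat call, while a folder's content size requires a recursive walk over every file underneath, which is what takes Finder so long. In Python, with a throwaway directory:

```python
import os, tempfile

d = tempfile.mkdtemp()
with open(os.path.join(d, "data.bin"), "wb") as f:
    f.write(b"x" * 100_000)

# What `ls -ld` shows: the size of the directory *entry* itself
# (a few KB at most), not of the contents. This is why it is instant.
entry_size = os.stat(d).st_size

# What Finder reports: a full recursive walk, O(number of files).
content_size = sum(
    os.path.getsize(os.path.join(root, name))
    for root, _dirs, files in os.walk(d)
    for name in files
)
print(entry_size, content_size)
```

The "value already there" that ls prints is thus not the folder's content size at all; no filesystem field holds that number, so any tool has to compute it.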
-
Share Point folder size limit - File Services
I created a new folder inside a share point using Server Admin. Is there a way to set a folder size limit (quota) for that folder?
P.S. I am NOT talking about user account quotas for home folders created using Workgroup Manager! ...Just any new folder created, to be used on any volume: is there a way to set a maximum size?
For example I have a 1TB volume on my Xserve RAID. I create a new folder but want to set its maximum capacity to 200GB. Is that possible?
Thank you very much in advance for any feedback.

There's no direct way to set a limit on a folder size.
The 'simplest' method I can think of is to create partitions on your disk of the appropriate sizes and share these - they will have inherent size limits based on the size of the partition. It's a little messier but should solve your issue.
:: thinking :: you might be able to use disk images rather than partitions, although I've never tried sharing a mounted disk image, and you'd need to address automounting the disk images when the server starts, so it might not be viable. -
Maximum folder size in 10.5.2?
What is the maximum folder size now in 10.5.2?
For around 10 years I've been using numerous partitions to organize my files and for easy backup, but I know that quite a while ago (years) the maximum folder size was dramatically increased (2GB > ?).
Has anyone had problems with these larger folders?
My files are extremely important (clients' files) so safety and stability are more important to me than anything else (i.e. the simplicity of fewer partitions perhaps).
Of course I could simply do a search for specs regarding this on Apple's website, but it would not indicate as to just how stable such huge folders have been for everyone.
Thanks for any info on this.

I went to
http://en.wikipedia.org/wiki/HierarchicalFileSystem
and I did carefully read the Apple white paper, but still nowhere do I see a maximum folder size within a volume.
So I can only assume there is no maximum... that is, that you can actually have literally all of your files and folders in a volume within one folder, even if it takes up 90% of the volume's/partition's space.
I originally posted this because, rather than reformatting a couple of my hard drives, I was considering moving a couple of the partitions' contents (each with around 40 GB of important files) to a larger partition and placing each partition's contents into individual folders. In other words: moving two 40 GB partitions to two separate folders within another 100 GB+ partition, thereby freeing up two 40 GB partitions without having to reformat my drives.
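Assuming the target partition has the room, the move itself is a plain copy. A minimal Python sketch of the copy-then-verify step, using temporary stand-in paths for the real volumes (verify before erasing anything on the source):

```python
import filecmp, os, shutil, tempfile

src = tempfile.mkdtemp()          # stands in for one 40 GB partition
dest_parent = tempfile.mkdtemp()  # stands in for the 100 GB+ partition
with open(os.path.join(src, "job.txt"), "w") as f:
    f.write("client file")

# Copy the whole partition's contents into its own folder
# on the bigger partition.
dest = os.path.join(dest_parent, "Partition1")
shutil.copytree(src, dest)

# Verify before erasing the source: no missing or mismatched files.
cmp = filecmp.dircmp(src, dest)
ok = not (cmp.left_only or cmp.right_only or cmp.diff_files)
print("copy verified" if ok else "MISMATCH")
```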
Anyone see a problem with that? -
Why does Time Machine folder sizes exceed physical capacity?
I have a Mac G3 server serving multiple Macs for storage, media serving, and Time Machine storage. The other external Macs are all running version 10.8.3; only the storage server is running 10.5.8.
All of the external Macs use Time Machine to back up to the served storage media volume called "Time Machine".
The physical size of the "Time Machine" volume is 2.0 TB.
Yet when I am on the G3 server and perform a "Get Info" on the "Time Machine" volume, and also on the folder that contains the server's backups, the stored capacities exceed the physical capacity of the drive on which the volume resides.
The single physical drive is 2.0 TB and only has one partition called "Time Machine". In the root directory of this volume there is the one folder called Backups.backupdb, and a .sparsebundle DMG file for each of the remote systems that are being backed up by Time Machine.
Each of the sparsebundles is well below the drive's physical capacity, but both the local server's backup folder (Backups.backupdb) and the G3 server's local volume "Time Machine" show a used capacity that well exceeds the physical capacity of the drive on which it resides.
For example, here is the information from "Get Info" on the Backups.backupdb folder on the "Time Machine" volume when viewed from all systems:
Kind: Folder
Size: 3,456,930,478,415 bytes (3.46 TB on disk) for 122,254 items
But the Physical Volume "Get Info" shows this:
Kind: Volume
Capacity: 2.0 TB
Available: 169.92 GB
Used: 3,456,930,478,415 bytes (3.46 TB on disk) for 122,254 items
(NOTE: this is identical to the size reported for the Backups.backupdb folder above.)
My guess is that there is some sort of deduplication involved.
If so, I would like to know the type of deduplication is performed?
Is it File Level deduplication, Block Level, or Content Level?
Also, is the deduplication inherent at the volume (file system) level, or only specific to Time Machine, via symbolic links?
Thank you in advance.
Jeff Cameron

This is due to the way Time Machine stores files. While there is only one copy of each version of each file stored, there are multiple links to that file, and each link is reported in Get Info at its restored size.
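A tiny sketch of that multiple-links effect (hypothetical snapshot names, not real Time Machine paths): two directory entries share one copy of the data, so a per-entry sum double-counts it.

```python
import os, tempfile

base = tempfile.mkdtemp()
snap1 = os.path.join(base, "2013-05-01-120000")
snap2 = os.path.join(base, "2013-05-02-120000")
os.makedirs(snap1)
os.makedirs(snap2)

# One real file in the first "snapshot"...
original = os.path.join(snap1, "movie.mov")
with open(original, "wb") as f:
    f.write(b"x" * 50_000)

# ...and a hard link to it in the second, as Time Machine does for
# files that did not change between backups.
os.link(original, os.path.join(snap2, "movie.mov"))

# A naive walk (what Get Info effectively does) counts the data twice,
naive_total = sum(
    os.path.getsize(os.path.join(root, name))
    for root, _dirs, files in os.walk(base)
    for name in files
)
# ...even though only one copy exists on disk (its link count is 2).
print(naive_total, os.stat(original).st_nlink)  # 100000 2
```

With thousands of snapshots, the naive total easily grows past the drive's physical capacity while actual usage stays within it.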
How Time Machine Works its Magic -
Folder size is twice that of its contents????
Startup Disk > Users > Username
My Username folder shows a folder size that is just about double the size of all of its visible folders. I can see this when viewed in the Finder in List View with Folder Size Shown as my default View setting. Any idea how I can figure out what is going on?
thanks
Brad

Topher,
Before I saw your reply, I emptied my trash and that brought the folder size down to what it should be to match its contents. I should have remembered that. Nonetheless, I did what you suggested once I read your message, and here it is:
total 2240
drwxr-xrwx 34 brad 503 1156 Feb 27 16:13 .
drwxr-xr-x 7 root admin 238 Dec 1 08:00 ..
-r-------- 1 brad 503 7 Dec 1 07:48 .CFUserTextEncoding
-rw-r--r--@ 1 brad 503 12292 Mar 1 16:22 .DS_Store
drwx------ 5 brad 503 170 Mar 1 16:42 .Trash
drwxr-x--x 4 brad 503 136 Apr 19 2011 .adobe
drwxr-xr-x@ 3 brad 503 102 Apr 27 2011 .autodesk
-rw------- 1 brad 503 546 Feb 15 2012 .bash_history
drwxr-xr-x 6 brad 503 204 Aug 28 2014 .blurb
drwx------ 3 brad 503 102 Feb 27 2014 .cache
drwxr-xr-x 4 brad 503 136 Feb 27 2014 .config
drwx------ 3 brad 503 102 Dec 6 2010 .cups
-rw-r--r-- 1 brad 503 1109647 Sep 23 2011 .fonts.cache-1
drwxr-xr-x 2 brad 503 68 Jun 18 2013 .gs5
drwxr-xr-x 3 brad 503 102 Sep 6 2011 .local
-rw-r--r-- 1 brad 503 239 Sep 6 2011 .mailcap
-rw-r--r-- 1 brad 503 368 Sep 6 2011 .mime.types
drwxr-xr-x 3 brad 503 102 Dec 18 2012 .viewcd
-rw-r--r-- 1 brad 503 90 Apr 17 2013 .vuescanrc
drwxr-xr-x 8 brad 503 272 Jun 26 2014 .wapi
drwxr-xr-x 4 brad 503 136 Dec 4 12:23 Applications
drwx------@ 3 brad 503 102 Dec 20 11:44 Creative Cloud Files
drwx---rwx@ 40 brad 503 1360 Mar 1 16:48 Desktop
drwx---rwx+ 7 brad 503 238 Nov 12 18:30 Documents
drwx------+ 3 brad 503 102 Feb 20 16:42 Downloads
drwx------@ 4 brad 503 136 Feb 22 22:43 Google Drive
drwxrwxr-x 5 brad admin 170 Dec 7 18:07 Incompatible Software
drwx------+ 66 brad 503 2244 Mar 1 15:07 Library
drwx------+ 5 brad 503 170 Sep 21 19:42 Movies -
Hi, everyone.
When I use Command-F to search in Finder, the Folder Size option doesn't seem to work at all. Whenever I type some number (i.e., greater than 1 KB), nothing is found. Is this function broken? Neither the Application Size nor the File Pathname option works either.
Thanks for any suggestions.

Hi Tempura,
I believe that both Mike and I have yet to experience a corrupt index. Certainly in one case, I initiated it myself.
Although it appears a number have, if you analyse the postings you find that, more often than not, the issues are not related to corrupt indexes.
Many issues are simply not understanding Spotlight, setting the Preference pane improperly, not letting Spotlight do a proper and complete indexing, messing around with the System prefs, not running Disk Utility to repair permissions after installing, updating or revising applications and the OS, insufficient ram, trying to index CDs, DVDs, servers, etc.
Personally, I find Spotlight amazingly fast and accurate, except when I have forgotten that I had adjusted the Preference pane. I have tested various third-party products and have tossed them out. Not that I don't appreciate their unique functionalities; I just haven't found it advantageous to have them on board. Note that if you do install any third-party app, check to see how it may affect Tiger and all its utilities. There are some that will reset preferences, including Spotlight's and its ability to index.
If you haven't as yet, Apple and The XLab (the book especially) have some great stuff on Spotlight that is well worth reading. I say this, because no doubt, Spotlight is not going away and the introduction of Leopard will advance it even further, e.g., as described (http://www.apple.com/macosx/leopard/spotlight.html), and certainly in Time Machine.
Good luck
Daniel -
SCCM client cache folder size growing beyond the limit set
Hi Everyone.
Greetings.
We have noticed that in our SCCM environment, the cache folder size (under the CCM path) on clients keeps increasing. However, it should not, as I believe older content is automatically removed while new content is downloaded.
The cache folder for downloaded content is set to 5 GB by default, and we haven't modified that.
Please suggest us on this as it would be much appreciated.
Thanks in advance
Regards,
Thangaraj

Hi Jason,
Yes. However, we are using software distribution as well.
We noticed this large CCM folder (19 GB) recently on one of our servers, as its C drive is running out of space.
But when I looked at the contents inside that CCMCache folder, below are the findings:
The content size of all the advertisements is around 18 GB.
The content size of software updates is around 1 GB.
MS KB patches and service packs are pushed through software advertisements (present inside folders named with the package ID), so they cannot be considered software-update deployments, right?
Please suggest. Thanks.
Regards,
Thangaraj