Compress the database

Hi all,
I want to know how I can compress my database. Can I delete the .trc files? Please help!
Thank you very much, and greetings from Germany!
rabbit

Trace files are written to a few different locations, controlled by initialization parameters. The information below explains the purpose of trace files and where they are located.
BACKGROUND_DUMP_DEST
Parameter type: String
Syntax: BACKGROUND_DUMP_DEST = {pathname | directory}
Default value: Operating system-dependent
Parameter class: Dynamic (ALTER SYSTEM)
Range of values: Any valid local path, directory, or disk
BACKGROUND_DUMP_DEST specifies the pathname (directory or disc) where debugging trace files for the background processes (LGWR, DBWn, and so on) are written during Oracle operations.
An alert file in the directory specified by BACKGROUND_DUMP_DEST logs significant database events and messages. Anything that affects the database instance or global database is recorded here. The alert file is a normal text file. Its filename is operating system-dependent. For platforms that support multiple instances, it takes the form alert_sid.log, where sid is the system identifier. This file grows slowly, but without limit, so you might want to delete it periodically. You can delete the file even when the database is running.
See Also:
Oracle9i Database Administrator's Guide for more information on setting this parameter and on alert files
Your operating system-specific Oracle documentation for the default value of this parameter
"USER_DUMP_DEST" for information on setting a destination for server process trace files
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96536/ch114.htm#REFRN10008
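Because BACKGROUND_DUMP_DEST is dynamic, you can check and change it without restarting the instance. The following is only a sketch; the path is a made-up example and must point to a directory that already exists on your server.
-- Show the current destination of background trace files and the alert log (SQL*Plus).
SHOW PARAMETER background_dump_dest
-- Point it somewhere else; the path below is purely illustrative.
ALTER SYSTEM SET background_dump_dest = '/oracle/admin/ORCL/bdump';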
USER_DUMP_DEST
Parameter type: String
Syntax: USER_DUMP_DEST = {pathname | directory}
Default value: Operating system-dependent
Parameter class: Dynamic (ALTER SYSTEM)
Range of values: Any valid local path, directory, or disk
USER_DUMP_DEST specifies the pathname for a directory where the server will write debugging trace files on behalf of a user process.
For example, this directory might be set as follows:
On MS-DOS: C:\ORACLE\UTRC
On UNIX: /oracle/utrc
On VMS: DISK$UR3:[ORACLE.UTRC]
See Also:
Oracle9i Database Performance Tuning Guide and Reference for more information about the use of trace files
Your operating system-specific Oracle documentation for the range of values
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96536/ch1220.htm#REFRN10229
If you connect as SYS or SYSTEM in SQL*Plus, you can find the values of these init parameters.
SQL> show parameters user_dump_dest
NAME                  TYPE     VALUE
user_dump_dest        string   /opt/ora9/admin/LQACNS1/udump
SQL>
SQL> show parameters background_dump_dest
NAME                  TYPE     VALUE
background_dump_dest  string   /opt/ora9/admin/LQACNS1/bdump
SQL>
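Since the original question asks about .trc files: server process trace files end up in USER_DUMP_DEST and usually carry the operating-system process id (SPID) of the session that wrote them in their name (for example <sid>_ora_<spid>.trc on UNIX). As a rough sketch, assuming you are connected as a normal user, you can find the SPID for your own session like this and then look for the matching trace file before deciding whether it can be deleted:
-- Find the OS process id (SPID) of the current session; the matching
-- trace file in USER_DUMP_DEST contains this number in its name.
SELECT p.spid
  FROM v$process p, v$session s
 WHERE s.paddr  = p.addr
   AND s.audsid = USERENV('SESSIONID');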
Joel Pérez

Similar Messages

  • Will compressing the database table affect system performance?

    There is a very big database table in my system called MSEG. It is about 910G, and the wasted tablespace is about 330G. I want to compress the table, but the table is written to and deleted from frequently. Will it affect system performance if I compress the table? Or should I only reorganize the database table so that system performance is not affected? Thanks.

    Hi Huiyong,
    If you are talking about table compression, it cannot be done online; it needs a planned downtime. Table compression also carries some CPU overhead. Refer to SAP Note 1289494 (FAQ: Oracle compression) and SAP Note 1436352 (Oracle Database 11g Advanced Compression for SAP Systems).
    If you are talking about an online table reorg, then yes, there would definitely be an impact on user performance, and because the table is very big it may take hours or even days to complete.
    Another, faster method is a table export/import, which is quicker than an online reorg, but it again requires downtime. A sketch of the offline compression commands is shown at the end of this reply.
    Hope this helps.
    Regards,
    Deepak Kori
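    For reference, a minimal sketch of the offline approach mentioned above, in plain SQL. It assumes basic table compression is available on your Oracle release and that a maintenance window exists; on an SAP system you would normally drive this through BR*Tools rather than by hand, and the index name below is just a placeholder.
    -- Rebuild the table in compressed form (offline; the table is not fully
    -- available while this runs).
    ALTER TABLE mseg MOVE COMPRESS;
    -- Moving a table invalidates its indexes, so rebuild each one afterwards.
    ALTER INDEX mseg_idx REBUILD;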

  • How to compact the database?

    Hi! I'm new to Java programming. I wish to seek help on how to compact the database using Java.
    Any help will be greatly appreciated.
    thanks.

    You wish to compress the database in Java? Most databases have support for compression.

  • Exclusive Access To The Database

    Is there a way for my code to know if I have exclusive access to the database? If so, what would the compress-database command be?
    Application: I would like my update script to include a command to compress the database when no one else is in the database.

    Hi Mark,
    >> Is there a way for my code to know if I have Exclusive Access To The Database?
    I think you could check whether the access is exclusive with VBA code; some key code is below:
    Function IsCurDBExclusive() As Integer
    ' Purpose: Determine if the current database is open exclusively.
    ' Returns:  0 if the database is not open exclusively.
    '          -1 if the database is open exclusively.
    '          Err if any error condition is detected.
        Dim db As Database
        Dim hFile As Integer
        hFile = FreeFile
        Set db = DBEngine.Workspaces(0).Databases(0)
        If Dir$(db.Name) <> "" Then
            On Error Resume Next
            Open db.Name For Binary Access Read Write Shared As hFile
            Select Case Err
                Case 0
                    IsCurDBExclusive = False
                Case 70
                    IsCurDBExclusive = True
                Case Else
                    IsCurDBExclusive = Err
            End Select
            Close hFile
            On Error GoTo 0
        Else
            MsgBox "Couldn't find " & db.Name & "."
        End If
    End Function
    For more information, you could turn to the link below:
    # ACC: How to Determine If a Database Is Open Exclusively
    http://support.microsoft.com/kb/117539
    >> If so, what would the Compress Database Command be?
    For compressing the database, I think you could use
    # Application.CompactRepair Method (Access)
    Best Regards,
    Edward

  • Image compressed and stored in the database.

    Hi,
    We have a TIFF file stored in the database; it is compressed using the following interMedia Oracle method and then saved back into the database:
    process('compressionFormat=FAX4, maxScale=1696 2200');
    The Oracle database is 8.1.7.4. The TIFF file gets processed fine when we use a TIFF generated by the old scanner, but it errors out with TIFF files generated by the latest scanners.
    The errors I get are ORA-29400 and IMG-00704.
    There is a difference in TIFF version between the outputs generated by the two scanners: the compression option is TIFF modified G3 in the old one, while the new one uses Lempel-Ziv.
    Also, the TIFF loads and saves fine, but the image gets corrupted after the interMedia process method is run.
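    For context, the usual interMedia processing pattern looks roughly like the sketch below. This is only an illustration: the table scanned_docs and column image are invented names, and the process() options are simply the ones quoted above.
    DECLARE
      img ORDSYS.ORDImage;
    BEGIN
      -- Lock the row and fetch the image object to be processed.
      SELECT image INTO img FROM scanned_docs WHERE id = 1 FOR UPDATE;
      -- Recompress/rescale in place; this is the call that raises
      -- ORA-29400 / IMG-00704 for the newer scans.
      img.process('compressionFormat=FAX4, maxScale=1696 2200');
      -- Write the processed image back.
      UPDATE scanned_docs SET image = img WHERE id = 1;
      COMMIT;
    END;
    /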

    Thank you for your reply. I am glad to find that I did not miss an option. I was aware that I could move my pictures into some other folder, but you have forgotten the solution that I chose, which was to go back to using ZoomBrowser; it works to access photos in any folder I choose, and in addition it loads promptly. I have only spent a brief time using ImageBrowser, but I don't recall seeing any enhancements over ZoomBrowser. Perhaps if I were attempting to interface with other software, but as a stand-alone product, perhaps you could enlighten me as to what makes ImageBrowser EX a better product. Regarding "Accept as Solution": are you implying that Canon restricts their search function to only those responses that the OP marks as accepted?

  • Job output is no longer available in the database control file

    I'm running my RMAN backup with Oracle Enterprise Manager Grid Control 10.2.0.1, in NOCATALOG mode.
    When I look for the job output in the "View Backup Report" section, it says
    Job output is no longer available in the database control file.
    Here is my rman setup:
    RMAN> show all;
    using target database control file instead of recovery catalog
    RMAN configuration parameters are:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 7;
    CONFIGURE BACKUP OPTIMIZATION OFF; # default
    CONFIGURE DEFAULT DEVICE TYPE TO DISK; # default
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '%F'; # default
    CONFIGURE DEVICE TYPE DISK BACKUP TYPE TO COMPRESSED BACKUPSET PARALLELISM 2;
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE CHANNEL DEVICE TYPE DISK MAXPIECESIZE 5 G;
    CONFIGURE MAXSETSIZE TO UNLIMITED; # default
    CONFIGURE ENCRYPTION FOR DATABASE OFF; # default
    CONFIGURE ENCRYPTION ALGORITHM 'AES128'; # default
    CONFIGURE ARCHIVELOG DELETION POLICY TO NONE; # default
    CONFIGURE SNAPSHOT CONTROLFILE NAME TO '/opt/oracle/app/oracle/product/10.2.0/db_1/dbs/snapcf_mus_prod.f'; # default
    Does anyone know why I cannot see my RMAN job output?

    show parameter control_file_record_keep_time;
    NAME                            TYPE     VALUE
    control_file_record_keep_time   integer  7
    The backup in question is usually the most recent one, from the day before.
    If I go to the Job tab in OEM and click on the job detail, I can see the detail of my RMAN backup, but I cannot see the backup detail in the View Backup Report section of the database. It shows the error message: Job output is no longer available in the database control file.
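    One thing worth checking, as a sketch only: the job output kept in the control file is aged out according to CONTROL_FILE_RECORD_KEEP_TIME (7 days here), so reports older than that retention can disappear. The value can be raised; 30 below is just an example, and SCOPE=BOTH assumes the instance uses an spfile.
    -- Keep RMAN/backup records in the control file longer (value is in days).
    ALTER SYSTEM SET control_file_record_keep_time = 30 SCOPE=BOTH;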

  • What is the best way to copy an Aperture library onto an external hard drive? I am getting a message that says "There was an error opening the database. The library could not be opened because the file system of the library's volume is unsupported".

    What is the best way to copy an Aperture library onto an external hard drive? I am getting a message that says "There was an error opening the database. The library could not be opened because the file system of the library's volume is unsupported". What does that mean? I am trying to drag libraries (with metadata) to an external HD and am wondering what the best way to do that is.

    Kirby Krieger wrote:
    Hi Shane.  Not much in the way of thoughts - - but fwiw:
    How is the drive attached?
    Can you open large files on the drive with other programs?
    Are you running any drive compression or acceleration programs (some drives arrive with these installed)?
    Can you reformat the drive and try again?
    Hi Kirby,
    I attached the UltraMax Plus with a USB cable. The UltraMax powers the cable so power is not an issue. I can open other files. Also, there is 500GB of files on the drive so I cannot re-format it. Although, I noted I could import the entire Aperture Library. However, I do not want to create a duplicate on my machine because that would be defeating the purpose of the external drive.
    Thanks,
    Shane

  • Is it better to store the image in the database or to use BFile

    Hi all,
    I have a doubt regarding the handling of images. For example, I have images of persons amounting to 50,000 (and the number will increase in the future). So I just want to know which is the better method: is it better to store all the images in the database, or to store them in the local OS and use the BFILE concept? I'm using Forms 6i and Forms 10g (10.1.2.0.2). OS: Windows XP.
    Regards,
    Alok Dubey
    Edited by: Alok Dubey on Nov 14, 2008 5:43 PM

    >> But the total number (i.e. 50,000) is scaring me about using the BLOB
    I don't think the database will cause problems just due to the amount of data stored in BLOBs. Of course you would have to check which parameters to set for the BLOBs to make storage efficient.
    You didn't mention your DB version; in 11g you could also think of using compression and deduplication to deal with the data volume.
    In general I agree with Francois that the best way depends, but in terms of data consistency and easy backup/restore there might be some advantages in using the BLOB version.
    hope this helps.
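    To illustrate the 11g option mentioned above, here is a minimal sketch only: it assumes Oracle 11g with SecureFiles and the Advanced Compression option, and the table and column names are invented.
    -- Store person photos as SecureFile BLOBs with compression and deduplication.
    CREATE TABLE person_photos (
      person_id NUMBER PRIMARY KEY,
      photo     BLOB
    )
    LOB (photo) STORE AS SECUREFILE (
      COMPRESS MEDIUM
      DEDUPLICATE
    );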

  • Rman Backupset size is exceeding the database size

    Hi
    My RMAN backup size is exceeding the database size and filling up the whole mountpoint, and finally, due to the space issue, the backup is failing.
    Below is the RMAN script
    run {
      backup
        incremental level 0
        tag ASCPLVL0
        database plus archivelog;
      delete noprompt backupset completed before 'sysdate - 4/24';
    }
    Please look into this
    Regards
    M. Satyanvesh

    Are you using compression?
    Ans: No
    The size of an RMAN backup isn't always proportional to the size of the database.
    Ans: Yes, I know, but it should be somewhere near the database size (say, within 100 GB or 150 GB of it).
    How many archivelogs are you backing up with your database?  This is possibly a factor in the size of your db backup.
    Ans: The archive log count per day is 18, and their total size is 34 GB.
    Have you got a retention policy in place and do you regularly delete obsolete backups/archivelogs?
    Ans: yes
    Are you taking this backup as part of a backup strategy? Or is this just a one-off for some other purpose, which would seem to be the case?
    Ans: This is a production system, so it is part of the backup strategy.
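    As a rough sketch of how to see where the volume is going (assuming Oracle 10g or later and a DBA connection), you can compare the input and output sizes of recent RMAN jobs:
    -- Input vs. output volume of recent RMAN jobs, newest first.
    SELECT start_time,
           input_type,
           ROUND(input_bytes  / 1024 / 1024 / 1024, 1) AS input_gb,
           ROUND(output_bytes / 1024 / 1024 / 1024, 1) AS output_gb,
           status
      FROM v$rman_backup_job_details
     ORDER BY start_time DESC;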

  • How to compress FM database

    Hello
    How do I compress the FM database?
    FM 4.2(7b) is used with Postgres, and the database is currently at 11.2 GB.
    Is it true that there is a vacuum command that can be run against SQL databases?

    Hello Chetan,
    Here is how to compress PostgreSQL when FM Server is running slowly with Postgres.
    Check the database size ($INSTALLDIR/dcm/db/data); it should be less than 3 GB. (On Windows, right-click the "data" folder and select the "Properties" menu item; a dialog will pop up and display the folder size. On Linux/Solaris, just run the "du -k $INSTALLDIR/dcm/db/data" command.)
    Run the following to reduce the size:
    1. Stop FMServer.
    2. cd $POSTGRES_DIR/bin
    3. Run ./psql.exe -U db_username dcmdb (where dcmdb is the database name).
    4. Run these commands at the psql prompt:
       vacuum full analyze verbose;
       \q
    5. You should see that the database size is reduced. (The vacuum might take a while.)
    To slow down the growth, change the following parameters in postgresql.conf:
    1. Stop the postgresql services.
    2. Edit $INSTALLDIR/db/data/postgresql.conf and change the following parameters (remove the leading '#' if it is there):
       checkpoint_segments = 6
       max_fsm_pages = 532000
       shared_buffers = 128MB
       temp_buffers = 32MB
       work_mem = 4MB
       maintenance_work_mem = 16MB
       log_min_duration_statement = 200
    3. Restart the postgresql services.
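    If you want to see where the space is going before and after the vacuum, here is a quick sketch (assuming PostgreSQL 8.1 or later, run from psql against dcmdb):
    -- Ten largest tables in the current database, by total size including indexes.
    SELECT relname,
           pg_size_pretty(pg_total_relation_size(oid)) AS total_size
      FROM pg_class
     WHERE relkind = 'r'
     ORDER BY pg_total_relation_size(oid) DESC
     LIMIT 10;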

  • Final Cut Server not storing clip proxy in the database

    FCS generates a clip proxy but doesn't store it in the database unless the user that installed FCS is logged into the server.
    The error message I get is:
    ERROR: EREQERROR
    Encountered error using Compressor: <Error>: kCGErrorFailure: Set a breakpoint @ CGErrorBreakpoint() to catch errors as they are logged.
    I checked that the UID stored in the configuration file for FCS matches the UID of the user that installed FCS per the following:
    http://support.apple.com/kb/HT3282
    Note that the clip proxy is stored in the db as long as the user that installed FCS is logged into the server.
    I am running FCS on a Mini and have a cluster of three minis running the compression. I've confirmed that it doesn't matter which node in the cluster does the transcode.
    Has anybody else experienced this problem ? Have a solution ?

    Sorry about the delay.
    Yes I can run compressions on the host using the GUI and the same cluster. I tried several different compressions and they worked without a hitch.
    When I use FCS the compression also runs (I can find the transcoded clip file in the spool directory), and batch monitor says it was successful. The problem appears to be that FCS doesn't store it in the db unless I'm logged in to the host server.

  • Reducing the database size

    I have upgraded from a G5 desktop to a macbook pro. Wow, big speed increase. I have been increasingly going into photography and my aperture 3 (love it compared to version 2!) is at 500 or so gigs. That would include the imported iPhoto database.
    My MBP has a 750 gb hard drive.
    Obviously, memory is going to be a problem soon. I could upgrade to 1TB hard drive, but that won't really address the problem for long.
    I did upgrade to 8 GB of memory, which gave a huge speed boost to AP3.
    I am looking over my entire library and am wondering how many of the pictures I really need on the laptop, which is intended more for a working type of deal. My G5 may be slow, but it runs. Hot. But it runs and runs and runs.
    So my thoughts are what if I were to keep the iphoto library on the G5 and just keep current pics on the laptop? The new raw images are chomping through drive space quick and I'm just wondering how to streamline the library.
    I've read many pros and cons about one database vs. referenced files. I'm a man on the go with my camera and laptop strapped to my back, and I just don't want to deal with another device. I tried it, and it worked with FW 800, but really, I hope to not go that route.
    Therefore, having given that background info, here's what my thoughts are. I'm open for suggestions and discussions please.
    #1 - is it possible to extract/remove just the iphoto portion of the library that is now embedded in the database? I don't need to so much as export it out as it is on multiple backups, TM backups, and on the G5. I would like to simply remove the iphoto database for now.
    #2 - I would also maybe like to do a similar approach to various other photos that are part of ap3. Is it possible to remove "chunks" (so to speak) of the database through ap3?
    #3 - What other options might exist? I do want to keep it all in the one ap3 database on the 750 gb hard drive for now. I can always go referenced later if I should choose that route.
    #4 - Are there compression programs? Any other options I've not mentioned?
    Thanks

    I do appreciate the information. I am super hesitant to have to connect to an external drive as I am constantly doing adjustments. It is just what I do. I take pics, load em in, and then run them through all manner of changes through CS4 and photoframe 4.6.4 and photomatix HDR. I love editing. Odd thing is, I can finish with a clients pictures and go back over them a month later and see so many things to change.
    I love doing this as much as I love to take pictures.
    I suppose the best thing is to get the max internal drive. I did try to use an external FW 800 drive for a while, but carrying around another hard drive, power cord, etc was a pain and my backpack went from full to overly stuffed. Just talking about the reality of it.
    I live on the move, so mobility has become everything for me. I'll consider your words and keep trying to figure out what to do.
    I'm guessing a 1 TB drive is as large as I can get right now?
    Thanks

  • Size of the database

    HI All,
    I have a 10 GB database. When I back up the database using RMAN, what will be the size of the database, approximately?
    Thanks

    Dj3 wrote:
    HI All,
    I have a 10GB database, when I backup the database using RMAN. What will be the size of the database approx.
    Thanks
    The size of the database will be exactly what it was before the backup.
    Oh, you meant what will be the size of the backup?
    It depends.
    Are you taking image copies or backup sets?
    With or without compression?
    A 10 GB database really isn't very big to start with. Why don't you just run a backup and see for yourself?
    If you are trying to determine how much space you need to hold your backups, don't forget that it's more than "just a backup". You've got the db files, the control files, the archivelogs .... and multiple generations to allow PITR throughout your specified recovery window.
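    As a rough sketch (run as a DBA user), you can compare the size of the datafiles RMAN has to read with the size of the backup pieces it actually produced:
    -- Total size of the datafiles, in GB.
    SELECT ROUND(SUM(bytes) / 1024 / 1024 / 1024, 1) AS datafile_gb
      FROM dba_data_files;
    -- Total size of backup pieces still recorded in the control file, in GB.
    SELECT ROUND(SUM(bytes) / 1024 / 1024 / 1024, 1) AS backup_gb
      FROM v$backup_piece
     WHERE status = 'A';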

  • DB2 change the database recovery mode

    Hi Friends,
    We are doing an upgrade, and in the process it will be necessary to change the DB recovery mode.
    The database is DB2 9.5 and the platform is UNIX (AIX). I would like to clarify the process of changing the recovery mode. I understand that it means switching off log retention, but I have found in some forums that, in order to switch log retention back on when the upgrade is finished, I have to take a full backup of this database.
    I need to plan this activity and its effects.
    Please do you know something about it, can you clarify me the process?
    Thanks a lot.
    Regards
    Enrique Sánchez

    The process is pretty straightforward:
    1. Shutdown SAP and deactivate the database.
    2. Enable recovery mode by setting logarchmeth1 (as user db2<sid>)
    db2 update db cfg for <SID> using logarchmeth1 disk:<archivepath> immediate
    3. Perform an offline backup of the database (as user db2<sid>)
    db2 backup database <sid> to <backuplocation> compress without prompting
    The backup time depends on the size of your system. Once the backup is complete, you can activate the database and bring up SAP.
    - Sameer

  • How can I change the database & schema used by an Application?

    Hi community,
    I am very new to Essbase, and I need help to find out how I can change the database connection (and schema) used by the database of an application. The situation is that I need to point this database to a QA environment (currently it points to DEV).
    If I do a right click over the properties of the database, I can see the following tabs: [ General ][ Dimensions ][ Statistics ][ Modifications ][ Compression ]. However, there is no option to change the database connection.
    Is there a way to do this?
    Thanks in advance for any help! Cheers!

    Are you trying to change the database connection of a Planning application?
    Essbase applications do not have a relational connection.
    Regards
    Celvin
    http://www.orahyplabs.com

Maybe you are looking for

  • How to change the data type of an item in administrator

    Hi, A summary folder was created. One of the item in the summary folder is a sum. That sum item was created in the business area (based on another item). I can't refresh that summary folder because it gives me an oracle error (Ora-12008 and Ora-01401

  • JAXB to generate java classes for XSD.

    Hi I have a XSD, which is importing other couple of xsds in it, tried to generate java classes using xjc, it is throwing error. C:\vittal\Project\received\development-1\da_xsd>xjc -p com daAuthoring.xsd parsing a schema... [ERROR] Property "Alt" is a

  • Explain the output of compareTo( )?

    Hello, I am hoping someone can explain exactly how the int output of the following code is obtained: String s = "bam"; String t = "boom"; //int i = s.compareTo(t); // -14 int i = t.compareTo(s); // 14 System.out.println("i is: " + i);I don't understa

  • Recovery disc not recognized

    product hp pavilion g72  i installed windows 8 premier which windows 7 was deleted now the recovery disc i have says its not for the computer. now what should do

  • Cannot Install to Mac OS X

    When attempting to install applications from Adobe Application Manager, I get the following error with all applications: The download appears corrupted.  Press Cancel, wait a few minutes and try again.  (-60) The operating system is OS X 10.7.3.  I h