Database seems to be fragmented

I have a scenario on our live production DB. It looks like a performance problem, and I would like your advice on the best way to overcome it.
1. On our live production system we carried out a year-end activity: purging (deleting) old data, done through the GUI. We did not delete all records from the database, only those that are no longer in use at all. We would have preferred the TRUNCATE command, but unfortunately the tool does not provide TRUNCATE syntax.
2. After this activity we found that the database tables, as well as the indexes, have become heavily fragmented.
3. It has also hurt database response time; these days some queries are taking more than 5 minutes to execute.
4. Moreover, I have checked that almost all indexes and tables are fragmented. As a workaround we rebuilt the indexes, but we still face the same problem because the deletion activity recurs every month or two.
5. For this live production database we have no downtime window.
Any suggestion will be much appreciated.
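For reference, the rebuild workaround mentioned in point 4 can be sketched as follows, assuming Microsoft SQL Server (the question does not name the product); the object name dbo.MyTable is a placeholder, and ONLINE = ON requires Enterprise Edition, which matters given the no-downtime constraint in point 5:

```sql
-- Check fragmentation first (placeholder object name)
SELECT i.name, s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyTable'), NULL, NULL, 'LIMITED') s
JOIN sys.indexes i ON i.object_id = s.object_id AND i.index_id = s.index_id;

-- Under roughly 30% fragmentation: reorganize (always online, safely interruptible)
ALTER INDEX ALL ON dbo.MyTable REORGANIZE;

-- Above roughly 30%: rebuild; ONLINE = ON keeps the table available during the rebuild
ALTER INDEX ALL ON dbo.MyTable REBUILD WITH (ONLINE = ON);

-- Refresh optimizer statistics afterwards
UPDATE STATISTICS dbo.MyTable;
```

Running this on a schedule right after each monthly purge, rather than waiting for queries to slow down, would address the recurring nature of the problem.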

Hi!
I'm sorry to tell you this but you're not in the right forum!
You must go to the DB forum!
Warm regards!
Max.
P.S.
Have you already updated the statistics?

Similar Messages

  • Lite database seems to crash after creating a database adapter

    Hello,
    I'm using the SOA suite 10.1.3.3.0 and Oracle Lite database that comes with it.
    I'm designing in JDeveloper with the same version number.
    After creating a database adapter, the database gives back the exception:
    "Internal Error: Invalid Connect String".
    A restart fixes it, but I don't feel like restarting each time, obviously.
    Please help.

    Hello,
    Is this a Bug ? Has anyone else faced this error?
    It's not a bug, it's working as intended. Contained users don't have instance level permissions and cannot "login" to the instance (which is what the "browse" button is attempting). In order for it to work, the database name must be in the connection string
    (which with the browse button, it will not be).
    Welcome to contained users, they aren't for everyone.
    Sean Gallardy | Blog | Microsoft Certified Master

  • How to save CLOB domain data into the database from a page fragment

    Hi,
    I am using JDev 11.1.1.2.1.
    The database I am using has a CLOB domain attribute, and I want to edit it.
    I display it in <af:richTextEditor>, but when I edit it and save, it throws an error saying that the String cannot be converted to CLOB domain data.
    Please let me know how to save it.
    Thanks,
    hari

    Check this blog post:
    http://baigsorcl.blogspot.com/2010/05/adding-boolean-checkbox-to-table.html
    af:selectBooleanCheckbox can't submit value to VO?
    Thanks,
    Navaneeth

  • Apple TV Seems Slow/Fragmented

    Hi,
    My 30GB Apple TV has been behaving strangely lately, especially when using YouTube. Every now and then it freezes and you can hear a lot of hard drive activity. It reminds me of using a computer that doesn't have enough RAM and is using the hard drive for Virtual Memory.
    I only have 4.83GB of free space left on my Apple TV. Could this be the problem? When it was new it was very fast and responsive. But now it freezes a lot and crunches away at the hard drive.
    Is there a way to defragment the Apple TV's hard drive?

    In the PC world it's not good to use more than 90% of any drive (can't say for the Mac world, though).
    Same thing: it causes the directory to get fragmented and even corrupted. On the issue of defragging, OS X doesn't need defragmenting, unlike OS 9 and Windows, although there are companies out there still providing defrag software for the Mac. Whether or not the diet version of OS X on the TV works the same way I don't know; however, as Mike points out, there isn't actually any way of defragging it, so it doesn't really matter.
    If you are interested in why the issue is occurring, you might want to follow a diagnostic route; otherwise, I'd try a restart first and a restore if problems persist.

  • Huge SharePoint_Config Database/Timer Job History Overflow

    Hello All,
    I know there are quite a few sites about this issue, but none of them seem to work for me so I was hoping for some help.
    Just for reference, this is in my dev environment on a VM running:
    SP2010
    SQL Server 2008 R2
    Windows Server 2008 R2
    It is the same old story of SharePoint's "Delete Job History" timer job not running and therefore the “TimerJobHistory” database table growing massive. For the last couple of months
    my drive keeps running out of space and putting a halt to my work. So I went to the Net looking for a solution. I found a few that looked promising, but either caused more problems or simply did not work. Now that I have my environment back up and running
    I was hoping for some help in solving this.
    Does anyone know how to purge the “TimerJobHistory” table so my SharePoint_Config database will not keep filling up my space? From there, is there any way to prevent this from happening again?
    The “Delete Job History” will not run currently even though I have adjusted the days to keep history and increased the duration that the job should run.
    So far here is what I tried and the results:
    http://sharepoint.it-professional.co.uk/?p=228 – Runs for a while and seems promising, but keeps growing the log and database
    file until out of space, then crashes. I have to spend a lot of time freeing up space, detaching, and reattaching the config database. At this point I have removed everything from that drive that I can so no more space can be freed.
    Shrinking files – This is only a temporary fix and I heard that it is a bad idea to do this more than once every now and again anyway. I am sure my database is already fragmented terribly
    (which also if anyone can point me to a working solution for defragging in SQL Server 2008 R2 I would greatly appreciate it.)
    There are a number of other sites that make recommendations that seem to be… well… less than what would be considered a “good practice” so I avoided those.
    For the time being I have done what I had hoped not to and turned the SharePoint Timer service off to prevent the database from filling up again. If I turn it back on, the log file will be full
    in a matter of seconds, which will put me back to no space and in a bad position.
    This seems to be happening to all the devs here, but I got a bit too brave and tried to find a solution… I learn a lot from my mistakes though!
    If any other information is needed, please let me know.
    Any help would be GREATLY appreciated.
    Thanks,
    J.

    Hi,
    We need to reduce the amount of data being deleted by the timer job during each run, so modify the DaysToKeepHistory value of the timer job.
    The following PowerShell script is for reference:
    # Find the "Delete Job History" timer job
    $history = Get-SPTimerJob | Where-Object {$_.Name -eq "job-delete-job-history"}
    # Keep fewer days of history so each run has less to delete
    $history.DaysToKeepHistory = 25
    $history.Update()
    # Kick off the job immediately
    $history.RunNow()
    More information:
    http://convergingpoint.blogspot.com/2012/05/sharepoint-configuration-database-too.html
    http://www.techgrowingpains.com/2012/01/out-of-control-sharepoint_config-database/
    Best Regards
    Dennis Guo
    TechNet Community Support

  • How to calculate the size of database (a different one)

    Hello Friends,
    I am told to move the data from one server to another machine (cloning is one very good option, I agree, but I expect a different answer involving exports and imports). How should I go about the task? The destination machine has unlimited space, so that's not a problem. My questions are:
    1) How should I start the task? (I generally start by studying the structures and their sizes. Is that OK?)
    2) If I am using a Unix machine and there is a limitation that my server will not support file sizes of more than 2 GB, what should I do?
    3) Should I go for a full database backup or fragment the task? If I do the latter, there are many schemas, so it will become tedious; but a full backup will exceed the OS size limitation. What should be done?
    4) Is there any way I can go through a dump file to find out the database objects present inside it and note the related dependencies?
    Please respond.
    Please respond.
    Regards,
    Ravi Duvvuri

    1) They are Unix machines. So will the size problem occur (if there is a limitation)? How to overcome that?
    R.- If the OS versions are the same you will not have any problem. Regarding storage, you only need enough disk space to store the datafiles, redo logs and controlfiles, and nothing else.
    2) I am trying the export/import measure. Though there are other good methods, I just want to explore the full capabilities of exp/imp.
    R.- Recreating the controlfiles is more effective and faster if the OS versions are the same.
    3) And the Oracle version is 9iR2. (If I have to perform this on 8i (both source and destination), will the method differ?)
    R.- The method is the same for 8i/9i/10g.
    How should I go about doing this export/import?
    R.- To use this method you have to have the original database started.
    To recreate the controlfile without the datafile that you mentioned, you have to leave it out of the CREATE CONTROLFILE statement, and that's it.
    For example: if your database has 8 datafiles and you have only 7, you include only 7 in the CREATE CONTROLFILE statement.
    Joel Pérez
    http://otn.oracle.com/experts
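    A minimal sketch of that CREATE CONTROLFILE statement, assuming Oracle; every name, path, and size below is a placeholder, not taken from the thread:

    ```sql
    CREATE CONTROLFILE REUSE DATABASE "PROD" NORESETLOGS NOARCHIVELOG
        MAXLOGFILES 16
        MAXDATAFILES 100
        LOGFILE
          GROUP 1 '/u01/oradata/prod/redo01.log' SIZE 50M,
          GROUP 2 '/u01/oradata/prod/redo02.log' SIZE 50M
        DATAFILE
          '/u01/oradata/prod/system01.dbf',
          '/u01/oradata/prod/undotbs01.dbf',
          '/u01/oradata/prod/users01.dbf'
          -- the lost datafile is simply left out of this list
        CHARACTER SET WE8ISO8859P1;
    ```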

  • Resizing tablespace is taking too much time and freezing the database

    This is my database setup:
    3 node rac on 32 bit linux (rh 4)
    2 SANs (3gb + 2 gb)
    asm config: everything is default from installation
    current datafiles: 3.2T out of 5.0T (autoextend on)
    this is how I do to resize the tablespace manually to test how long it takes:
    alter tablespace my_tbs resize 3173562M; -- from 3172538M (adding another 1 GB)
    And it took 33 minutes to complete.
    Could someone tell me what is wrong?
    PS: when I check instance performance, the "SMON" user accounts for about 97% of DB activity, and while the resize is running the database seems to freeze.
    Thanks,
    Chau

    Sorry, it is 5 terabytes total.
    There are 2 SANs; one is 1.7T (one partition), the other SAN has 3 partitions of 1.1T each. I have only 1 disk group.
    I used a bigfile tablespace, so there is only one big file for this tablespace. The tablespace is 3.2T used, with autoextend on (increment size is 1G).
    This is the command I used to manually resize the tablespace:
    alter tablespace my_table_space resize 3229923M; -- current size + 1G
    Thanks,
    Chau
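    For reference, a bigfile tablespace can be left to grow via autoextend instead of large manual resizes; a sketch, assuming Oracle 10g or later (the tablespace name and sizes are placeholders):

    ```sql
    -- Manual grow by 1 GB (bigfile tablespaces are resized at the tablespace level)
    ALTER TABLESPACE my_tbs RESIZE 3230947M;

    -- Or let Oracle extend on demand in 1 GB steps, avoiding big one-off resizes
    ALTER TABLESPACE my_tbs AUTOEXTEND ON NEXT 1G;
    ```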

  • I need information about a database

    Hi
    I am making an application, and the database I am using is dbf. I have found two JDBC drivers to connect with. One is DBF_JDBC20.jar, but it is a trial version; I began the application with it. Then I found tinySQL.jar and tried to switch, but some functions do not work.
    I need to know: does anybody know of a database similar to dbf? For example, I could use mdb (MS Access), but my program has to run on Linux and no free JDBC driver exists for it.
    I don't know if another serverless database exists; if it does, please tell me.
    Thanks in advance.

    Yes, but that is a server, no? I need something where I do not have to install a server; that is why I use dbf.
    You can install MySQL on any PC. Is that what you mean?

  • Console.app does not update the LOG DATABASE

    After a forced shutdown of the MacBook Pro, Console.app displays all logfiles and also LOG DATABASE QUERIES, but the content of the LOG DATABASE only dates back to the forced-shutdown point in time. Afterwards, system.log etc. are updated, but the LOG DATABASE is not.
    I have checked for /var/log/asl.db, which I do not have (touching it also did not help). I have a /var/log/asl directory containing the following files (the LongTTL.asl was updated after the forced shutdown; 14:19 was restart time):
    drwxr-xr-x 11 root wheel 374 4 Jan 15:55 .
    drwxr-xr-x 53 root wheel 1802 4 Jan 15:41 ..
    -rw------- 1 marcus wheel 81500 2 Jan 23:42 2009.01.02.U501.asl
    -rw-r--r-- 1 root wheel 51312 2 Jan 23:40 2009.01.02.asl
    -rw------- 1 marcus wheel 102955 3 Jan 22:10 2009.01.03.U501.asl
    -rw-r--r-- 1 root wheel 61440 4 Jan 00:09 2009.01.03.asl
    -rw------- 1 marcus wheel 41827 4 Jan 14:19 2009.01.04.U501.asl
    -rw-r--r-- 1 root wheel 38431 4 Jan 14:18 2009.01.04.asl
    -rw-r--r-- 1 root wheel 3547 4 Jan 15:55 LongTTL.U0.asl
    -rw-r--r-- 1 root wheel 113923 4 Jan 15:55 LongTTL.asl
    -rw-r--r-- 1 root wheel 8 4 Jan 14:19 StoreData
    Any more ideas how to repair/enable LOG DATABASE in console.app?
    thx.

    Somehow the forced shutdown corrupted the data files in the /var/log/asl subdirectory. Restarting the machine the following day showed new messages within the LOG DATABASE, but messages from the day before are not displayed any more. So the LOG DATABASE seems to work again.

  • I cannot find the new data in the database

    Hi
    I'm newbie at SQL.
    I read and watched a lot about the SQL. But still can't figure out the problem.
    THE PROBLEM:
    We are using accounting software for our business.
    The software keeps records of client names and other information.
    When I open the software I can access everything, but I can't edit the File Number, for example.
    So I tried to access the files from within Windows. Then I learned that the files I'm looking for are called a database, which brought me to SQL Server.
    Short story: I opened the database using SQL Server, but it contains only data that is three years old.
    How can I find the new data? Right now the database seems to have only 97K records.

    Well, what I did is open SQL Server, select the database, then scroll to the Clients table and open it.
    I can only see the ID and Name; the other information is always NULL.
    I scrolled down to the last one, which has ID 97000, for example.
    Then I went back to the application; from there I can see the other information about every client, and I can see that this client's file was created about three years ago.
    I started to think that the application is using a different database.
    Even though I searched for any MDF file, or any file that was recently edited, I can't find anything else.

  • The Best Way to Check a Database.

    Hello All,
    I'm working on an app where I'll retrieve upwards of 1 million items.
    For each retrieval, I have to make sure that the item was not previously retrieved and stored. That means for the 1-millionth item, I have to test against the previous 999,999 items. If a match is found, I ignore it; otherwise I add it to the store.
    I thought about storing all items in a simple hashtable, but 1 million items in a hashtable probably wouldn't be supported on a standard 128 MB PC.
    So my only other option seems to be a database.
    But the time required to retrieve and compare against the database seems like it would take forever.
    My best guess would be to retrieve, say, 3000 items, do all my tests at that point, store the results, clear memory, and repeat. Is that the strategy you would try?
    Does that seem doable? I'll be using the standard JDBC-ODBC bridge, by the way.
    stev
    Then I'm going to dump it into a database (MSAccess 2000) using the batch method -if the addBatch method is supported by

    If you create your database table to have an index which requires unique values, then an insert of a duplicate will throw an exception. I don't know whether addBatch will be practical here -- it might throw the exception and not add the rest of the batch, but then again it might not. Of course, adding 1 million items is going to take some time, but the other advantage of the database is that the data is permanently stored in a form that's easy to access in a variety of ways.
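    The batch-and-check strategy above can be sketched in plain Java; here a HashSet stands in for the database's unique index, and the class name is made up for illustration:

    ```java
    import java.util.*;

    // In-memory stand-in for the "store" that rejects duplicates,
    // the same way a unique index would reject a duplicate insert.
    class DedupStore {
        private final Set<String> store = new HashSet<>();

        // Process one batch: returns how many items were new and kept.
        public int addBatch(Collection<String> batch) {
            int added = 0;
            for (String item : batch) {
                if (store.add(item)) {  // add() returns false for a duplicate
                    added++;
                }
            }
            return added;
        }

        public int size() {
            return store.size();
        }
    }
    ```

    With a real database, addBatch would instead attempt the inserts and count those that did not raise a duplicate-key error, so each batch costs one round trip rather than one lookup per item.
    
    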

  • Database or text file

    Since I was able to get my first question answered so quicky (thanks again), I wonder if I could get an opinion from the experienced programmers here.
    I am creating a computer forensics application to verify file signatures found in the header against the extension. For example, a .pdf file has a hex file signature of 25 50 44 46, a .com file has a signature of 4d 5a, etc. Each file type has its own file signature.
    My question is two-fold. First, what would be considered an optimal means of storing and accessing the data for each file type? Second, what would be the optimal means of adding new data?
    With my limited knowledge, I have two means available. One would be to create a text file with the necessary information; then all I would need to do is find the correct string and parse the information necessary to make the comparison. My other thought was to make an individual class for each type of file, and add classes as necessary. Since each type of file has a different signature, found in different places in the header, and sometimes even skipping bytes, essentially each type has to be handled a little differently. Then I would just need constructors to access the proper class.
    Any thoughts?

    A database seems like too much for your problem, and searching a
    text file for each entry seems very resource/time consuming.
    I would propose making a Hashtable with the extension
    as the key and the signature as the value, and then having
    loadHashtable and saveHashtable methods, so you would just have to
    create the hashtable when you start your app and save it when you end it:
    class FileHeader {
        String extension, hex_signature;

        public FileHeader(String ext, String hex_sig) {
            extension = ext;
            hex_signature = hex_sig;  // the original had this assignment reversed
        }

        // other methods
        public String getExtension() {
            return extension;
        }

        public String getSignature() {
            return hex_signature;
        }
    }

    public class MyApp {
        java.util.Hashtable files_hash;

        public MyApp() {
            files_hash = new java.util.Hashtable();
            loadHashtable();
        }

        public void loadHashtable() {
            // load the hashtable with FileHeader objects covering every file type
        }

        public void saveHashtable() {
            // save all the extension/signature pairs in a file
        }

        public boolean checkSignature(String extension, String proposed_signature) {
            FileHeader file = (FileHeader) files_hash.get(extension);
            // guard against unknown extensions
            return file != null && file.getSignature().equals(proposed_signature);
        }
    }

  • Rebuild Database did not reduce the database size

    I just completed a database rebuild. The rebuild completed much quicker than anticipated, and in the end the database did not reduce in size.
    This is my first rebuild and since original installation I have deleted a number of masters. From my understanding thumbnails and previews are not deleted when the master is deleted. In order to delete the thumbnails and previews the db must be rebuilt. Is this correct?
    The db is over 31.1gb. The rebuild took approx 10 minutes and when completed the db size was still 31.1gb. There was a message during the rebuild that read "Recovering projects". I was expecting to see stats showing previews and thumbnails being generated in the activity window but don't see it. Has the db been rebuilt?
    Thanks for any assistance.

    Rebuilding the database is for disaster-recovery, not file maintenance. I certainly wouldn't do it unless something was fundamentally broken and the application didn't run.
    "rebuilding" the database seems to do the following:
    (1) remove the database file completely (not even to the trash, it destroys the database utterly)
    (2) run through the plist files and reconstruct a brand-new database from them all
    I checked this (on an empty library) using hard-links from the commandline, and step (1) above scared me sufficiently that I'd only rebuild the database as an absolute last resort.
    Now, 31GB seems a lot for a database that only took 10 minutes to rebuild - are you sure you're not telling us the size of the library (to find the database size, right-click on the library and select view-package-contents, then open the 'Aperture.aplib' folder and find the size of the 'Library.apdb' file. That's the file you're changing when you 'rebuild database'.)
    If you're telling us the size of the entire aperture library instead, then that size will be completely dominated by the master RAW files you've imported. I'd guess your library will be less than 10MB.
    My understanding is that previews are deleted (since they're stored as separate files alongside the master image, if you look in the folder hierarchy), but thumbnails aren't. The thumbnails are aggregated together into one file per project (presumably for speed of access), and I'd guess the space is reclaimed for future use, but the file doesn't get any smaller. That's a classic trade-off for speed of access, and users like us are normally very keen on things being fast.
    -=C=-

  • Requirements for database applications

    Hey all,
    I have a couple of questions:
    Is it possible to create applications with databases in a standalone way?
    If I made an application that uses JDBC what would any users of the application require to use it? (if they could install it on their own computers) would they need database software installed?
    Is there any alternative methods you would recommend using rather than a database? for example if I were to store family tree data, a database seems logical, but would you use an alternative storage method?
    Thank you!

    If I made an application that uses JDBC what would any users of the application require to use it? (if they could install it on their own computers) would they need database software installed?
    Usually you'll need a JDBC driver, which is a single jar file. No other DB software is usually needed on a DB client.
    (Fine print: there are types of drivers that need e.g. .dll files installed, or other cruft like that. But mainstream DBs usually do it the easy way: one .jar file. The driver jar connects to the DB server over the network, sends SQL commands to it, and receives results.)
    You'll need a DB server program. It can run on the same computer as the client, or a different one. The DB vendor will have an install procedure for the server. There are small ("embedded") DB servers, and big ("Oracle" :-) ones. The smallest DBs can even be a single jar file that contains the JDBC API and the code that stores the data in files - no other components needed.
    Is there any alternative methods you would recommend using rather than a database? for example if I were to store family tree data, a database seems logical, but would you use an alternative storage method?
    There are a number of different ways to store data. Files come to mind. For the example of a family tree, a relational database seems a pretty good fit. The devil is in the details.

  • Camera raw in Elements 12 - Using the database instead of xmp sidecar files

    PSE12 ACR
    Is there somewhere some documentation about using the option to save the ACR settings in a database rather than in xmp sidecar files ?
    I have found the location of the 'Database' file in C:/Users/[my name]/AppData/Roaming/Adobe/CameraRaw/
    Several questions:
    - how to ensure that the 'Database' is backed up with the built-in backup process in the Organizer ?
    - that Database seems to be independent of the catalogs
    - How is the Database updated if you move or rename files from the Organizer ?
    - What happens if you already have xmp sidecar files when you choose the option of the Database in the ACR dialog ?
    - If you upgrade catalogs from Elements to Lightroom, are the settings saved in the database recognized by Lightroom ?

    What is your screen resolution on the computer you're running PSE8 on?
    ACR requires a vertical screen resolution of 768.  If you're on a small laptop or a normal-sized netbook, PSE will skip the ACR-user-interface and go directly into the editor.
    The update for ACR will be on http://www.adobe.com/downloads/updates/ and choose Camera RAW - Windows/Mac at the top, and then after choosing the ACR version, the page that comes up should have a link for a PSE-specific install process.  The text on the main ACR update page will have the Photoshop instructions so you have to look for a link for the PSE instructions.  It should involve downloading a ZIP, extracting the files, and replacing the plug-in file manually.

Maybe you are looking for

  • JAX-WS: How to choose from multiple client certificates on the fly?

    I have a webapp that is calling a web service supplied by a vendor. The vendor requires the use of client certificates for authentication, and I have successfully called their service using the PKCS#12 keystore they gave us with JAX-WS 2.2 using code

  • Portal error on KM Content tab

    Hi Friends, We added the new patch NW2004s SP10, then we got an error on the KM Content tab. Please help regarding this. Portal Runtime Error: An exception occurred while processing a request for: iView: pcd:portal_content/super_admin/com.sap.portal.content_admini

  • Cannot downgrade my Flashbuilder to 4.6

    Hi, I installed Flash Builder 4.7 an hour ago and I found that there's no design view anymore. I'd like to use Flash Builder 4.6 since it has the design view. However, the system does not allow me to install 4.6. It always displays this error (please

  • Notes Folder Empty

    I copied .txt files to the notes folder and they worked fine. Today I added a file, and now when I access the notes folder on the iPod it shows blank. I checked the iPod on the computer and it still shows the files in the folder. This is the second t

  • Copy and paste border style

    How can I copy and paste border style in the new Numbers? In the previous version, I was able to do this by allowing border selection; now I can't.